Plan 9: Real Time Streaming
Stream a single window or the whole screen via the Real Time Streaming Protocol (RTSP).
The video is carried as an MJPEG stream over interleaved TCP. Plan 9's /dev/screen and /dev/window produce an RGBA byte stream prefixed with a Plan 9 image header; this is converted to JPEG using tojpg and streamed over RTSP.
In the video, host port 12554 is forwarded to port 554 in the Plan 9 VM:
ffplay -rtsp_transport tcp rtsp://localhost:12554/stream1
Code
- https://gitlab.com/atamariya/plan9front/-/blob/dev/sys/src/cmd/rtsp/server.c
- https://gitlab.com/atamariya/plan9front/-/blob/dev/sys/src/cmd/jpg/tojpg.c
- https://gitlab.com/atamariya/plan9front/-/blob/dev/sys/src/cmd/jpg/writejpg.c
YouTube Live
Use the Real-Time Messaging Protocol (RTMP) to feed into YouTube / Twitch Live.
video/hj264 -f 25 /dev/screen | video/rtmp -a /dev/audio rtmp://.... rtmp://...
Note:
- The encoder and the publisher programs work with the Inter Video Format (IVF) container.
- If you are looking for a simple H264 encoder to port to another platform, minih264e is a good place to start.
Demo Live URL: https://youtube.com/live/dvoWNjdwMCo
Lessons Learnt
- MJPEG and H.264 are the most widely supported formats.
- For MJPEG streaming, YUV422 (type 0 or 64) and YUV420 (type 1 or 65) with q=50 are the most widely supported combinations. Don't use YUV444.
- The q factor is used to calculate the quantization table (DQT). You must use the formula defined in RFC 2435, since the table itself is NOT transmitted.
- An RTSP client displays a frame only AFTER the arrival of the next frame, NOT after receiving the end-of-frame marker (e.g. FF D9 for MJPEG). Be mindful when testing a single frame: you need to close the connection to view it.
- Human eyes are more sensitive to brightness than to color; this is the basis of chroma subsampling.
- We use XRGB instead of simply RGB for 32-bit alignment. X might be alpha or be unused.
- In ffmpeg, YUV is called the rawvideo format, and the subsampling is specified as the pixel_format.
- The XRGB to YUV420 compression ratio is 4:1.5, i.e. 4 bytes per pixel down to 1.5 (one Y byte per pixel plus a quarter byte each for U and V).
- YouTube Live NEEDS BOTH audio and video streams.
- Use readn() to read fixed-size data, e.g. a complete frame, since a plain read() may return fewer bytes than requested.
References
- RTP Payload Format for JPEG-compressed Video (RFC2435)
- Anatomy of a JPEG
- JPEG Syntax and structure
- Chroma Subsampling
- FFmpeg Implementation
- GStreamer Implementation