Plan 9: Real Time Streaming
Stream a single window or the whole screen via the Real Time Streaming Protocol (RTSP).
This uses an MJPEG stream over RTSP's TCP-interleaved transport. Plan 9's /dev/screen and /dev/window produce an RGBA byte stream prefixed with a Plan 9 image header. Each frame is converted to JPEG using tojpg and streamed over RTSP.
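As a rough illustration of the capture stage, here is a minimal sketch in Plan 9 C (illustrative only, not taken from the linked sources) that reads the textual header preceding the pixel data when /dev/screen or /dev/window is opened: five space-padded 12-byte fields giving the channel descriptor and the image rectangle, from which the frame dimensions follow before the raw pixels are handed to the JPEG encoder.

#include <u.h>
#include <libc.h>

void
main(void)
{
	int fd, w, h;
	char hdr[5*12+1], chan[12+1];

	fd = open("/dev/screen", OREAD);
	if(fd < 0)
		sysfatal("open /dev/screen: %r");
	/* five space-padded 12-byte fields: chan, min.x, min.y, max.x, max.y */
	if(readn(fd, hdr, 5*12) != 5*12)
		sysfatal("short header read");
	hdr[5*12] = '\0';
	memmove(chan, hdr, 12);
	chan[12] = '\0';
	w = atoi(hdr+3*12) - atoi(hdr+1*12);	/* max.x - min.x */
	h = atoi(hdr+4*12) - atoi(hdr+2*12);	/* max.y - min.y */
	print("chan %s, %d x %d\n", chan, w, h);
	/* the raw pixel data for the rectangle follows on fd and can be
	   fed to tojpg frame by frame */
	exits(nil);
}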
In the video, host port 12554 is forwarded to port 554 in the Plan 9 VM.
ffplay -rtsp_transport tcp rtsp://localhost:12554/stream1
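Assuming the Plan 9 VM runs under QEMU with user-mode networking (an assumption; the VM setup isn't shown here), the forwarding can be configured by adding a flag such as the following to the usual QEMU command line:
-nic user,hostfwd=tcp::12554-:554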
Code
- https://gitlab.com/atamariya/plan9front/-/blob/dev/sys/src/cmd/rtsp/server.c
- https://gitlab.com/atamariya/plan9front/-/blob/dev/sys/src/cmd/jpg/tojpg.c
- https://gitlab.com/atamariya/plan9front/-/blob/dev/sys/src/cmd/jpg/writejpg.c
YouTube Live
Use the Real-Time Messaging Protocol (RTMP) to feed into YouTube / Twitch Live.
video/hj264 -f 25 /dev/screen | video/rtmp -a /dev/audio rtmp://.... rtmp://...
Lessons Learnt
- MJPEG and H.264 are the most widely supported formats.
- For MJPEG streaming, YUV422 (RFC 2435 type 0 or 64) and YUV420 (type 1 or 65) with q=50 are the most widely supported combinations. Don't use YUV444. See the header sketch after this list.
- An RTSP client displays a frame only AFTER the next frame arrives and NOT after receiving the end-of-frame marker (e.g. FF D9 for MJPEG). Be mindful of this when testing a single frame: you need to close the connection to view it.
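For context on the type and q values above, here is a minimal sketch (Plan 9 C; illustrative, not taken from the linked server.c) of packing the 8-byte RTP/JPEG main header defined in RFC 2435. The RTP header and network code are assumed to live elsewhere.

#include <u.h>
#include <libc.h>

/* pack the 8-byte RFC 2435 main header that precedes each JPEG
   fragment in an RTP packet; returns the number of bytes written */
int
packjpeghdr(uchar *p, ulong off, int type, int q, int w, int h)
{
	p[0] = 0;		/* type-specific: 0 = progressively scanned */
	p[1] = off>>16;		/* 24-bit fragment offset into the JPEG frame */
	p[2] = off>>8;
	p[3] = off;
	p[4] = type;		/* 0/1 = YUV422/YUV420, 64/65 = same with restart markers */
	p[5] = q;		/* 1-99 scale the default tables; >=128 means tables are sent in-band */
	p[6] = w/8;		/* width and height in 8-pixel units */
	p[7] = h/8;
	return 8;
}

RFC 2435 also requires the RTP marker bit to be set on the last packet of each frame; as noted in the list above, clients in practice still tend to render a frame only once the next frame starts arriving.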
References
- RTP Payload Format for JPEG-compressed Video (RFC 2435)
- Anatomy of a JPEG
- JPEG Syntax and structure
- Chroma Subsampling
- FFmpeg Implementation
- GStreamer Implementation