Plan 9: Real Time Streaming

Stream a single window or the whole screen via the Real Time Streaming Protocol (RTSP).

This uses an MJPEG stream carried over TCP-interleaved RTSP. Plan 9's /dev/screen and /dev/window produce an RGBA byte stream prefixed with a Plan 9 image header. The pixels are converted to JPEG using tojpg and streamed over RTSP.
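The byte stream read from /dev/screen begins with the textual Plan 9 image(6) header: five 12-byte fields holding the channel descriptor (e.g. x8r8g8b8) and the rectangle coordinates, with the raw pixel rows following. A minimal sketch in Go of peeling off that header before the pixels are handed to a JPEG encoder (run on the host against a saved copy of the stream; the file name screen.bit is only an illustration):

package main

import (
	"fmt"
	"io"
	"os"
	"strconv"
	"strings"
)

func main() {
	f, err := os.Open("screen.bit") // a saved copy of /dev/screen (name is illustrative)
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// The image(6) header is 5 fields of 12 bytes each: chan, min.x, min.y, max.x, max.y.
	hdr := make([]byte, 60)
	if _, err := io.ReadFull(f, hdr); err != nil {
		panic(err)
	}
	field := func(i int) string { return strings.TrimSpace(string(hdr[i*12 : (i+1)*12])) }

	chanDesc := field(0) // e.g. "x8r8g8b8" or "r8g8b8a8"
	minx, _ := strconv.Atoi(field(1))
	miny, _ := strconv.Atoi(field(2))
	maxx, _ := strconv.Atoi(field(3))
	maxy, _ := strconv.Atoi(field(4))
	fmt.Printf("chan=%s, %dx%d pixels\n", chanDesc, maxx-minx, maxy-miny)
	// The raw pixel rows follow the header; with a 32-bit channel
	// descriptor each pixel is 4 bytes, ready for a JPEG encoder such as tojpg.
}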

In the video, host port 12554 is forwarded to port 554 in the Plan 9 VM.
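The VM setup is not shown here; assuming the VM runs under qemu with user-mode networking, the forward could be configured like this (the disk image name is only illustrative):

qemu-system-x86_64 -m 1024 -nic user,hostfwd=tcp::12554-:554 9front.qcow2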

ffplay -rtsp_transport tcp rtsp://localhost:12554/stream1 
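With TCP interleaving, every RTP packet travels on the same TCP connection as the RTSP dialogue, prefixed with a 4-byte frame: an ASCII '$', the channel id agreed during SETUP (interleaved=0-1), and a 16-bit big-endian length (RFC 2326 §10.12). A small Go sketch of that framing; the packet bytes below are placeholders, not a real RTP packet:

package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
	"io"
)

// writeInterleaved frames one packet for channel ch (0 for RTP, 1 for
// RTCP in the usual "interleaved=0-1" SETUP) onto the RTSP connection.
func writeInterleaved(w io.Writer, ch byte, pkt []byte) error {
	hdr := []byte{'$', ch, 0, 0}
	binary.BigEndian.PutUint16(hdr[2:], uint16(len(pkt)))
	if _, err := w.Write(hdr); err != nil {
		return err
	}
	_, err := w.Write(pkt)
	return err
}

func main() {
	var conn bytes.Buffer // stand-in for the RTSP TCP connection
	writeInterleaved(&conn, 0, []byte{0xde, 0xad}) // placeholder payload
	fmt.Printf("%x\n", conn.Bytes()) // 24 00 00 02 de ad
}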
 

 

Code

 

YouTube Live

Use the Real-Time Messaging Protocol (RTMP) to feed into YouTube / Twitch Live.

video/hj264 -f 25 /dev/screen | video/rtmp -a /dev/audio rtmp://.... rtmp://...     
 

 

Lessons Learnt

  • MJPEG and H.264 are the most widely supported codecs.
  • For MJPEG streaming, YUV422 (type 0 or 64) and YUV420 (type 1 or 65) with Q=50 are the most widely supported settings. Don't use YUV444 (see the RFC 2435 header sketch after this list).
  • An RTSP client displays a frame only AFTER the next frame arrives, NOT after receiving the end-of-frame marker (e.g. FF D9 for MJPEG). Keep this in mind when testing with a single frame: you have to close the connection to see it.
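The type and Q values above come from the RFC 2435 main header that sits in front of the JPEG scan data in every RTP packet. A sketch in Go of building that 8-byte header (the values passed in main are for illustration only):

package main

import "fmt"

// jpegHeader builds the RFC 2435 main header that precedes the JPEG
// scan data in every RTP packet of a frame.
func jpegHeader(fragOffset uint32, typ, q byte, width, height int) []byte {
	return []byte{
		0,                      // type-specific
		byte(fragOffset >> 16), // fragment offset, 24 bits:
		byte(fragOffset >> 8),  // byte position of this fragment
		byte(fragOffset),       // within the frame's scan data
		typ,                    // 0=YUV422, 1=YUV420; +64 => restart marker header follows
		q,                      // 1..99: tables derived from Q (e.g. 50); >=128: tables sent in-band
		byte(width / 8),        // frame width in 8-pixel blocks
		byte(height / 8),       // frame height in 8-pixel blocks
	}
}

func main() {
	fmt.Printf("%x\n", jpegHeader(0, 1, 50, 1024, 768)) // first fragment, YUV420, Q=50
}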
 

References

  1. RTP Payload Format for JPEG-compressed Video (RFC2435) 
  2. Anatomy of a JPEG 
  3. JPEG Syntax and structure
  4. Chroma Subsampling
  5. FFMPEG Implementation
  6. GStreamer Implementation    

 
