Does WebM have its own delivery method? How does it relate to RTMP and HTTP Live Streaming?

When I read up on streaming media formats and packaging methods, I am confused by the way WebM is described as mutually exclusive with RTMP and HTTP Live Streaming.

From my understanding, WebM is a video format (a way of encoding a video file, with the .webm extension), whereas both RTMP and HLS are ways of sending video across the web in a live stream (ways of packaging bits of information and sending them over some web protocol, such as HTTP over TCP).

Does WebM have its own proprietary method for sending itself across the web? Is it the case that RTMP and HLS cannot send WebM-formatted videos?


Solution 1:

WebM is in essence a container format for carrying VP8 or VP9 video and Vorbis or Opus audio. It does not specify how it should be streamed, and it does not enjoy wide streaming support compared to other container formats such as MPEG-2 TS or MP4. It can, however, still be used for streaming.
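
As a small illustration of the container aspect: WebM is a restricted profile of the Matroska (EBML) container, so a .webm file can be recognised from its EBML header. Below is a minimal TypeScript sketch for Node.js; the file name sample.webm and the 64-byte read are just illustrative assumptions, and this is only a heuristic check, not a full parser.

    import { openSync, readSync, closeSync } from "node:fs";

    // WebM files start with the 4-byte EBML magic number; the DocType string
    // "webm" follows within the first few dozen bytes of the header.
    const EBML_MAGIC = Buffer.from([0x1a, 0x45, 0xdf, 0xa3]);

    function looksLikeWebM(path: string): boolean {
      const fd = openSync(path, "r");
      try {
        const header = Buffer.alloc(64);                   // illustrative size, not from any spec
        readSync(fd, header, 0, header.length, 0);
        const hasMagic = header.subarray(0, 4).equals(EBML_MAGIC);
        const hasDocType = header.includes(Buffer.from("webm")); // crude check, not a full EBML parse
        return hasMagic && hasDocType;
      } finally {
        closeSync(fd);
      }
    }

    console.log(looksLikeWebM("sample.webm"));             // "sample.webm" is a hypothetical local file

A proper inspection would parse the EBML structure (tools like mkvinfo or ffprobe do this), but the leading bytes are enough to tell a WebM/Matroska file apart from, say, an MP4.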

Note that by "streaming", I do not mean just downloading a single file from a website, or embedding a single (long) clip in an HTML5 <video> tag. Over the last few years, several more advanced streaming technologies have been used:

  • RTMP streaming (Real Time Messaging Protocol) requires an RTMP server such as Adobe Flash Media Server, which streams Flash-supported file formats (MP4, FLV) to the client. This is still quite widespread but, like all Flash-based technologies, slowly but surely dying out.

    As WebM is not supported in Flash, you cannot use it there.

  • RTSP streaming (Real Time Streaming Protocol) is a control protocol for streaming servers such as the QuickTime Streaming Server or Helix Server. The client and server exchange control messages through this protocol, while the media data itself is transmitted as RTP (Real-time Transport Protocol) payloads. This is rarely found on the Web; it is more common in IPTV deployments.

    There are specifications on how to encapsulate WebM in RTP.

  • HTTP Live Streaming and MPEG-DASH are adaptive streaming technologies in which the client requests chunks of a video from a server through simple HTTP requests, based on an M3U8 playlist file (in the case of HLS) or an MPD manifest (in the case of DASH). This playlist or manifest indexes the chunks that contain the actual audio and video data.

    In HLS the video must be stored in MPEG-2 TS or, since 2017, fragmented MP4 (ISO base media format) files. MPEG-DASH has broader container support; there, segmented WebM can also be used, as the sketch below illustrates.
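
To make that last point concrete: segmented WebM fetched over plain HTTP can be fed into an HTML5 <video> element through the Media Source Extensions API, which is essentially what DASH players such as dash.js or Shaka Player do under the hood. A minimal browser-side TypeScript sketch, assuming a hypothetical video element with id "player" and hypothetical segment URLs (a real player would derive the segment list and codec string from the MPD manifest):

    const video = document.getElementById("player") as HTMLVideoElement;

    // Hypothetical URLs: a real DASH player would read these from the MPD manifest.
    const segments = ["/media/init.webm", "/media/seg-1.webm", "/media/seg-2.webm"];
    const mimeType = 'video/webm; codecs="vp9,opus"';

    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener("sourceopen", async () => {
      const sourceBuffer = mediaSource.addSourceBuffer(mimeType);

      for (const url of segments) {
        const data = await (await fetch(url)).arrayBuffer();
        sourceBuffer.appendBuffer(data);
        // Wait until the SourceBuffer has ingested this chunk before appending the next one.
        await new Promise<void>((resolve) =>
          sourceBuffer.addEventListener("updateend", () => resolve(), { once: true })
        );
      }

      mediaSource.endOfStream();
    });

MediaSource.isTypeSupported(mimeType) can be called beforehand to check that the browser actually supports the chosen container and codecs.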