CMAF solves two problems at once: fast delivery with low latency and efficient storage of video segments.
Typically, a video stream passes through many transformation stages on its way from the recording site to the final viewer. At each stage, the video is divided into segments a few seconds long, so that each segment can be converted, encoded, and packaged independently for transmission over the public Internet. We have worked to reduce latency from the traditional 30-50 seconds to 4-5 seconds. The main idea of low-latency HTTP-based streaming (DASH, HLS) is to use chunked transfer encoding (CTE), a standard feature of HTTP/1.1. CMAF lets us divide each video segment into small parts (chunks) and send new parts instantly as they arrive from the transcoding servers. At the same time, delivery remains possible to any viewer on any platform.
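The chunked framing itself is simple: each HTTP/1.1 chunk is its payload length in hex, a CRLF, the payload, and a trailing CRLF, with a zero-length chunk marking the end of the response. A minimal sketch (the `moof+mdat` payloads are stand-ins for real CMAF chunks):

```python
def encode_chunk(data: bytes) -> bytes:
    # HTTP/1.1 chunked framing: hex size, CRLF, payload, CRLF.
    return f"{len(data):x}\r\n".encode() + data + b"\r\n"

def end_of_stream() -> bytes:
    # A zero-length chunk terminates the chunked response.
    return b"0\r\n\r\n"

# A multi-second CMAF segment can be sent as several sub-second
# chunks as soon as the encoder produces them, instead of waiting
# for the whole segment to finish.
parts = [b"moof+mdat #1", b"moof+mdat #2"]
wire = b"".join(encode_chunk(p) for p in parts) + end_of_stream()
print(wire)
```

This is why the player can start rendering a segment before the origin has finished producing it, which is where most of the latency reduction comes from.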
Two well-known video delivery protocols are HTTP Live Streaming (HLS) and MPEG Dynamic Adaptive Streaming over HTTP (MPEG-DASH). They are used to deliver video to different platforms. Although both represent data in a similar way, they are incompatible with each other: the same original audio and video data has to be packaged and stored twice.
CMAF solves this problem by defining a more abstract format that allows different manifests to reference the same encoded data. The encoded segments only need to be stored once, with one manifest for HLS and one for DASH. For devices that do not support CMAF, a fallback to classic HLS with MPEG-TS is possible.
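To make the "one set of segments, two manifests" idea concrete, here is a simplified sketch that emits an HLS media playlist and a DASH SegmentList pointing at the same hypothetical fMP4 files (the segment names and durations are invented for illustration; real manifests carry many more attributes):

```python
# Hypothetical fMP4 segment names shared by both manifests.
SEGMENTS = ["seg_0001.m4s", "seg_0002.m4s"]
INIT = "init.mp4"

def hls_playlist(segments, init, duration=2.0):
    # Simplified HLS media playlist; fMP4 segments require an
    # EXT-X-MAP tag pointing at the initialization segment.
    lines = ["#EXTM3U", "#EXT-X-VERSION:7",
             f"#EXT-X-TARGETDURATION:{int(duration)}",
             f'#EXT-X-MAP:URI="{init}"']
    for s in segments:
        lines += [f"#EXTINF:{duration:.1f},", s]
    return "\n".join(lines)

def dash_segment_list(segments, init):
    # Simplified MPD SegmentList fragment for the same files.
    urls = "\n".join(f'  <SegmentURL media="{s}"/>' for s in segments)
    return (f'<SegmentList>\n  <Initialization sourceURL="{init}"/>\n'
            f'{urls}\n</SegmentList>')

print(hls_playlist(SEGMENTS, INIT))
print(dash_segment_list(SEGMENTS, INIT))
```

Both outputs reference the same `.m4s` files, so the storage cost is paid once regardless of how many manifest formats are offered.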
Comparison of traditional video delivery media and Gcore:
How does it work?
You can send your live streams to us over RTMP or SRT. We transcode them using adaptive bitrate (ABR) technology into several renditions for the end user: 4K UHD, FullHD, HD, SD, and LQ. The number and quality of bitrates can be tailored to the client's needs. The number of simultaneously transcoded streams is not limited.
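On the player side, ABR boils down to picking the best rendition that fits the measured throughput. A minimal sketch with an invented bitrate ladder (actual bitrates are configured per client):

```python
# Hypothetical ABR ladder: (name, resolution, video bitrate in kbit/s).
LADDER = [
    ("4K UHD", "3840x2160", 12000),
    ("FullHD", "1920x1080", 5000),
    ("HD",     "1280x720",  2500),
    ("SD",     "854x480",   1000),
    ("LQ",     "426x240",    400),
]

def pick_rendition(bandwidth_kbps: float, headroom: float = 0.8) -> str:
    # Choose the highest rendition that fits within a safety margin
    # of the measured throughput; fall back to the lowest one.
    for name, _res, kbps in LADDER:
        if kbps <= bandwidth_kbps * headroom:
            return name
    return LADDER[-1][0]

print(pick_rendition(8000))  # → FullHD (12000 > 6400, 5000 <= 6400)
```

The headroom factor is one common way to absorb throughput jitter without constantly switching renditions.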
The encoder and packager dynamically assemble the CMAF video stream into fragmented MP4 chunks. The protocols for delivery to the end viewer are MPEG‑DASH CMAF or HLS with fragmented MP4 (fMP4).
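Each CMAF chunk is an ISO BMFF fragment: a `moof` box followed by an `mdat` box, where every box starts with a 4-byte big-endian size and a 4-byte type. A small sketch that builds a synthetic chunk and walks its top-level boxes (headers only; real boxes carry full track and sample metadata):

```python
import struct

def parse_boxes(data: bytes) -> list:
    # Walk top-level ISO BMFF boxes: 4-byte big-endian size,
    # then a 4-byte ASCII box type.
    boxes, pos = [], 0
    while pos + 8 <= len(data):
        size, btype = struct.unpack_from(">I4s", data, pos)
        boxes.append(btype.decode("ascii"))
        pos += size
    return boxes

def box(btype: bytes, payload: bytes = b"") -> bytes:
    # Build a minimal box: size field covers header + payload.
    return struct.pack(">I", 8 + len(payload)) + btype + payload

# A CMAF chunk = moof (movie fragment metadata) + mdat (media data).
chunk = box(b"moof") + box(b"mdat", b"\x00" * 16)
print(parse_boxes(chunk))  # → ['moof', 'mdat']
```

Because each `moof`/`mdat` pair is independently decodable once the player has the initialization segment, the packager can push a chunk downstream the moment it exists.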
We use our own powerful CDN for stable video content delivery and load balancing between servers.
For the demo, we looped the video in Live mode, simulating an online broadcast.
The stream has a built-in clock at the bottom that shows the video encoding time (in the UTC time zone). The clock in the stream is no more than the above-mentioned 4-5 seconds behind real time.
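Measuring the end-to-end latency this way is just a subtraction of the two UTC clocks; a sketch with invented readings:

```python
from datetime import datetime, timezone

def stream_latency(encoded_utc: datetime, now_utc: datetime) -> float:
    # Latency = the viewer's wall-clock time minus the encode
    # timestamp burned into the picture (both in UTC).
    return (now_utc - encoded_utc).total_seconds()

# Hypothetical readings: the clock in the video shows 12:00:00.0
# while the viewer's clock shows 12:00:04.5.
shown = datetime(2023, 1, 1, 12, 0, 0, 0, tzinfo=timezone.utc)
now = datetime(2023, 1, 1, 12, 0, 4, 500000, tzinfo=timezone.utc)
print(stream_latency(shown, now))  # → 4.5
```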
Manifests, if you want to check how it works:
– HLS is available here
– MPEG-DASH is available here
Want to try your own stream? Register and use your RTMP or SRT streams.