Streaming latency is the time between the moment a frame is captured and the moment that frame is displayed on the viewer's screen. Latency occurs because each stream is processed several times during broadcasting before it is delivered worldwide:
Each step adds latency, so the total delay can grow to 30–40 seconds, especially if stream processing isn't optimized. For some use cases (such as sports or metaverse events, news releases, etc.), such latency is too large, and reducing it is crucial.
The Gcore Streaming Platform receives live streams over the RTMP or SRT protocol, transcodes them to ABR (adaptive bitrate), and delivers them via CDN using the LL-HLS and LL-DASH protocols.
Gcore also uses CMAF (Common Media Application Format) as the basis for LL-HLS/DASH. CMAF allows segments to be divided into chunks (video fragments) for faster delivery over HTTP networks.
LL-HLS and LL-DASH reduce latency to 3–4 sec, depending on the network conditions.
The standard video delivery approach sends a fully created segment to the CDN; only once the CDN has received the complete segment does it transmit it to the player. With this approach, video latency depends on segment length. For example, with a 7-second segment, by the time the first segment is requested and processed, the player displays a frame that is already 7 seconds behind real time.
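As a rough sketch of why latency scales with segment length: players typically buffer a few whole segments before starting playback (the three-segment buffer below is an illustrative assumption, not a Gcore-specific value):

```python
# Rough model of startup latency for whole-segment delivery.
# Assumption: the player buffers several complete segments before playback,
# which is why latency grows with segment duration.

def segment_latency(segment_seconds: float, buffered_segments: int = 3) -> float:
    """Minimum startup latency when the player buffers whole segments."""
    return segment_seconds * buffered_segments

print(segment_latency(7))  # 7-second segments, 3-segment buffer -> 21
print(segment_latency(2))  # shorter segments reduce latency -> 6
```

This is only a back-of-the-envelope model; real latency also includes encoding, network transit, and player behavior.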
The Low Latency approach uses the CMAF-CTE extension (Chunked Transfer Encoding), which divides live stream segments into small, non-overlapping, independent fragments (chunks) 0.5–2 seconds long. Because the chunks are independent, the encoder doesn't have to wait for a segment to be fully complete; it sends ready-made small fragments to the CDN and the player as soon as they are encoded.
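On the wire, CMAF-CTE delivery relies on standard HTTP/1.1 chunked transfer encoding: each ready fragment is framed and sent immediately, and a zero-length chunk terminates the response. A minimal sketch of that framing (the fragment payloads here are placeholders, not real CMAF data):

```python
# Minimal sketch of HTTP/1.1 chunked transfer encoding framing.
# Each chunk is: size in hex, CRLF, payload, CRLF; a zero-length chunk ends the body.

def encode_chunked(fragments):
    """Frame each ready fragment as an HTTP/1.1 chunk."""
    out = b""
    for frag in fragments:
        out += f"{len(frag):x}\r\n".encode() + frag + b"\r\n"
    out += b"0\r\n\r\n"  # zero-length chunk terminates the response
    return out

# Three chunks of one segment, sent as soon as each is encoded:
body = encode_chunked([b"moof+mdat#1", b"moof+mdat#2", b"moof+mdat#3"])
print(body[:14])  # b'b\r\nmoof+mdat#1'
```

In a real server the chunks are written to the open connection one by one as the encoder produces them, which is what lets the player start rendering before the segment is finished.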
This approach eliminates segment duration as the factor driving video latency in the standard delivery method: latency for 10-second and 2-second segments will be the same and minimal. The total latency between the CDN server and the viewers will be at most 4 seconds.
Compared to the standard approach, a 7-second segment is divided into 2–3-second chunks, so the total latency is lower.
We support Low Latency streaming by default. This means your live streams are automatically transcoded to the LL-HLS v6 or LL-DASH protocol when you create and configure a live stream. Links for embedding the live stream in your own player contain the /cmaf/ part and look as follows:
https://12345.gvideo.io/cmaf/12345_111/index.mpd (LL-DASH, which is supported by any device but does not work on iOS).
https://12345.gvideo.io/cmaf/12345_111/master.m3u8 (LL-HLS v6, which is supported by iOS (Safari browser) but doesn't work on non-Apple devices).
where 12345 is your unique account ID and 111 is the unique live stream ID.
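If you generate embed links programmatically, the pattern above can be sketched like this (a minimal helper under the URL scheme shown above; `embed_links` is a hypothetical name, not a Gcore SDK function):

```python
# Sketch: build LL-DASH and LL-HLS embed links from account and stream IDs,
# following the https://<account>.gvideo.io/cmaf/<account>_<stream>/ pattern.

def embed_links(account_id: str, stream_id: str) -> dict:
    base = f"https://{account_id}.gvideo.io/cmaf/{account_id}_{stream_id}"
    return {
        "ll_dash": f"{base}/index.mpd",    # any device except iOS
        "ll_hls":  f"{base}/master.m3u8",  # iOS / Safari
    }

links = embed_links("12345", "111")
print(links["ll_dash"])  # https://12345.gvideo.io/cmaf/12345_111/index.mpd
```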
We also support legacy modes for full backward HLS compatibility across all devices and infrastructures.
To switch to a legacy mode, add the following query string to the end of the embed link:
To return to using LL-HLS, delete the query parameter or replace it with the parameter ?HLS_version=ll (these actions are equivalent).
The switch to the legacy format will be reflected in the URL: