Low Latency Streaming: Improve Video Latency & Live Stream With No Delay!

In order to deliver a top-notch experience in live video, you need to focus on low latency streaming to enhance the experience for all your viewers.

Keeping your video latency low in your live streams is one of the main ways to keep your audience tuned in throughout your program in real-time.


If you are a gamer – or have a child who is – you may already be familiar with the concept of latency. Regardless of how good gamers are, if it takes a long time for their computers to communicate with the servers in play and receive data describing what’s happening at any given moment in the game, they could be blindsided by an opponent with lower network latency and wind up in a virtual Valhalla.

Similarly, latency is a fact of life in television. Comedian Super Dave Osborne (the late Bob Einstein) even did a comedy bit around it that illustrates the effect of latency in a live satellite feed better than words can describe.

What is video latency?

When live streaming, you want to deliver the lowest possible latency. For broadcasting and live video production, low latency video streaming is essential to video delivery. Latency shows up in many places; in general, latency is the time it takes data to travel and be processed over a network connection.

Video latency is the delay between the moment a video frame is captured and the moment that frame is displayed after crossing the network. This applies to any kind of live streaming, whether it is video calling, delivering video from a remote studio at home, or live production on set. When delivering video in real time, you want the delay of your live stream to be so small that it is unnoticeable to viewers watching the event on any platform and on any device.
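
One simple way to reason about that delay is to stamp each frame with the wall-clock time at capture and compare it with the time at display. The sketch below is only an illustration of the idea; the frame dictionary and the simulated 233ms of processing are assumptions, not any product’s pipeline.

    import time

    def capture_frame(frame_id):
        # Stamp the frame with the wall-clock time at the moment of capture.
        return {"id": frame_id, "captured_at": time.time()}

    def display_frame(frame):
        # Glass-to-glass latency: time between capture and display of the same frame.
        latency_ms = (time.time() - frame["captured_at"]) * 1000
        print(f"frame {frame['id']} shown with {latency_ms:.0f} ms glass-to-glass latency")

    frame = capture_frame(1)
    time.sleep(0.233)   # stand-in for encoding, transport and decoding delay
    display_frame(frame)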

The comedy sketch works because it exaggerates the lag time, or latency, TV viewers have come to expect when a correspondent in a faraway land is interviewed by the talent in the studio.

Like the gamer suffering from network latency, the latency Super Dave and Jimmy encounter in their faux satellite feed prevents them from having real-time interaction.

That is not simply because of the roughly quarter of a second it takes to send a video signal 23,000 miles up and down via a geostationary satellite and then the other quarter second to return a signal. Other factors contribute as well, including all of the signal processing, routing and earth station hops that happen between the correspondent and the studio.

Similarly, anyone who contributes video by streaming a signal via the internet must contend with latency—the same sort of lag, which results from multiple slight delays introduced into the stream by the computers and routers that handle streaming IP packets and direct them to their final destination.

Further delay is introduced before the video is even streamed, during the encoding phase when large video files are compressed, and again at the receiver when that IP packet stream is decoded.

What is low latency streaming?

Streaming latency is measured from the moment an image is captured to the point a viewer experiences it on their device or screen. The total time difference between the source and the viewer is described as glass-to-glass. Low latency streaming means a glass-to-glass delay of five seconds or less.

Low latency video streaming gives viewers an experience where they can watch your live stream without a noticeable delay. In broadcasting, latency accumulates at every step from processing the live video to delivering it, because many factors come together to complete a live stream. That doesn’t have to add up to much, however, as there are ways to optimize the delay of each process.

Some live streams have quite high latency, ranging from 45 to 120 seconds, and others can be even higher.
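
Putting those figures together, a quick way to classify a measured glass-to-glass delay might look like the sketch below. The thresholds for ultra-low and low latency come from this article; the middle "standard" band is an assumption added for illustration.

    def classify_latency(glass_to_glass_s):
        # <1 s ultra-low and <=5 s low come from this article; the 5-45 s "standard" band is assumed.
        if glass_to_glass_s < 1:
            return "ultra-low latency"
        if glass_to_glass_s <= 5:
            return "low latency"
        if glass_to_glass_s < 45:
            return "standard latency"
        return "high latency"

    for delay in (0.8, 4.0, 20.0, 90.0):
        print(f"{delay:>5.1f} s glass-to-glass -> {classify_latency(delay)}")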

Is low latency good for live streaming?

Low latency is good for broadcasting and live streaming, but how low you can go depends on the steps a captured image must go through and on the use case of the live stream. Your live stream may involve high-quality video, wireless remote production, live audience interaction, thousands of viewers, or a cloud-based production, all of which add value as the stream is distributed and displayed across many different devices and screens.

News reporters sometimes appear delayed in their responses due to latency

In these use cases, low latency live streaming is very important. Most professional broadcasts and live streams take advantage of cloud-based technology to deliver the highest quality live stream possible. Even for amateur YouTube live streaming, audience interaction has become commonplace, and that requires low latency to keep the conversation feeling live.

When breaking into real-time streaming, there are many features you can add to your live video stream to keep viewers and users coming back for more. Whether you want to grow an audience that interacts in real time or you are a professional producing live, real-time content, your main focus should be having the proper setup and remote production equipment to process and deliver video across a low latency streaming network.

High Latency vs Low Latency

High latency can have a negative impact on any live video broadcast because of the lag it introduces. Viewers who tune into a live stream want to feel connected and present in real time. High latency hurts the viewer’s experience and can disconnect them from the stream completely.

In the United States, raw latency on a cellular network is about 250 milliseconds (ms)

When content delivery is delayed or the video is constantly loading, it takes away from the experience viewers look for in a live stream. Lower latency and fewer delays bring value and credibility to your broadcast by delivering the content in real time, giving viewers exactly what they need to stay engaged and connected to the live event.

For most actions taken over the internet, like browsing webpages, latency has little effect on the experience, which depends mainly on bandwidth. There is a big difference, though, when it comes to video, streaming, broadcasting, or any real-time application that requires large amounts of data to be processed and delivered over a network. Here are cases where low latency live streaming is very important for your viewers:

  • Live remote productions
  • Cloud-based live production
  • Live sports broadcasting and streaming
  • Live remote collaborations
  • Remote commentary
  • Live video distribution or multistreaming to multiple platforms
  • Real-time chat or audience participation
  • Online gaming

With video over IP, users viewing live video over the public internet may notice even the smallest amount of delay. The same applies to online gaming, where high latency causes lag and delays.

What causes video latency?

In general, a few factors contribute to network latency, including buffering in network devices such as the routers and gateways between the source and destination, as well as the way the Transmission Control Protocol (TCP) itself works.
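
As one small illustration of transport-level buffering, TCP by default may hold small writes back (Nagle’s algorithm) so they can be coalesced into fewer packets, which adds delay. Latency-sensitive senders often disable that behavior. This is a generic sketch; the host and port are placeholders, not a real endpoint.

    import socket

    DEST_HOST, DEST_PORT = "198.51.100.10", 9000   # placeholder endpoint

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Disable Nagle's algorithm so small packets go out immediately
    # instead of being buffered and coalesced, trading efficiency for latency.
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    sock.settimeout(5)   # don't hang forever on a congested or unreachable link

    try:
        sock.connect((DEST_HOST, DEST_PORT))
        sock.sendall(b"small latency-sensitive payload")
    except OSError as exc:
        print("connection failed (placeholder endpoint):", exc)
    finally:
        sock.close()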

Live signals often have very far to travel, which creates latency

Another factor is network congestion – perhaps at a news event with many other TV station news crews competing for wireless bandwidth on the same cell tower, or at a sporting event or large political gathering where thousands of people are accessing the internet via the same towers. In these instances, there may be IP packet loss that could cause latency to drift.

1. Image Capture

Capturing a live image and converting it to digital takes time, whether it comes from the camera on your phone or a high-end live production system. At 30fps, a single captured frame lasts 1/30th of a second, about 33ms (milliseconds), which sets a floor on capture delay. Live video mixers introduce other stages that can affect video latency, from decoding to processing, transcoding, and transmitting. The quality of the image capture and processing determines the quality and value of the stream.

  • Time range for Image Capturing: 5ms to 700ms
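
The 33ms figure is simply the frame interval at 30fps; the same back-of-the-envelope arithmetic for a few common frame rates looks like this:

    # Minimum capture delay is one frame interval: 1000 ms divided by the frame rate.
    for fps in (24, 30, 50, 60):
        frame_interval_ms = 1000 / fps
        print(f"{fps} fps -> one frame every {frame_interval_ms:.1f} ms")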

2. Live Video Encoding

Encoding a raw image into a compressed format for transmission over the internet takes time, whether you use encoding software or hardware. The best live video encoders can reach the level needed for low latency live streaming while keeping the encoded video high quality.

  • Time Range for live video encoding: 1ms to 50ms.
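
As one illustrative example of low-delay encoder settings (not a tool endorsed by this article), a software encoder such as ffmpeg with libx264 can be tuned to avoid lookahead buffering. The capture device, bitrate, and ingest URL below are assumptions.

    import subprocess

    # Minimal sketch: push a webcam feed to an RTMP ingest with low-delay x264 settings.
    cmd = [
        "ffmpeg",
        "-f", "v4l2", "-i", "/dev/video0",   # Linux webcam input (assumption)
        "-c:v", "libx264",
        "-preset", "ultrafast",              # trade compression efficiency for encode speed
        "-tune", "zerolatency",              # drop lookahead/B-frame buffering inside x264
        "-b:v", "2500k",
        "-f", "flv", "rtmp://example.com/live/streamkey",   # placeholder ingest URL
    ]
    subprocess.run(cmd, check=True)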

3. Live Video Transmission

At this point, the video has been encoded and is ready to transmit over the internet. Video latency is affected by the video bitrate and by the bandwidth of the internet connection; using a lower video bitrate generally provides lower latency.

  • Time Range for video transmission: 5ms to 750ms
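
To see why bitrate and bandwidth matter, a rough calculation of how long one second of encoded video takes to push through an uplink (ignoring round trips and protocol overhead) might look like this. The bitrates and the 5000 kbps uplink are illustrative.

    def send_time_ms(video_bitrate_kbps, uplink_kbps):
        # Time to push one second of encoded video through the uplink.
        # Values above 1000 ms mean the encoder is outrunning the connection.
        return 1000 * video_bitrate_kbps / uplink_kbps

    for bitrate in (1500, 2500, 6000):   # kbps, illustrative encoder settings
        print(f"{bitrate} kbps over a 5000 kbps uplink: "
              f"{send_time_ms(bitrate, 5000):.0f} ms per second of video")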

4. Buffering

Because encoded video can take many different routes across the internet, packets arrive with varying delay. The receiver buffers incoming data to smooth out that variation, so it is important to maintain a sensible time boundary: if the buffer (and therefore the latency) is too small, you risk losing data that arrives late, while a higher latency gives late-arriving data time to be recovered.

Several resources are available online that further explain what network latency is and what affects it, the causes of internet latency, what buffer bloat is, what’s been done to remedy it, and IP packet flows and routers.

  • Time Range for Buffering: hundreds of milliseconds to several seconds
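
A receive buffer is essentially a queue that holds each packet until a fixed delay after it was captured, so late arrivals can still be played in order. The sketch below is a simplified illustration of that trade-off, not any particular product’s buffering logic; the half-second buffer and the sample packets are assumptions.

    BUFFER_DELAY_S = 0.5   # illustrative: each packet is played this long after capture

    def count_late_drops(packets, buffer_delay_s=BUFFER_DELAY_S):
        # packets: list of (capture_time_s, arrival_time_s) pairs. Returns how many miss their slot.
        dropped = 0
        for capture_time, arrival_time in packets:
            play_deadline = capture_time + buffer_delay_s
            if arrival_time > play_deadline:
                dropped += 1   # arrived after its play-out deadline: too late to use
        return dropped

    # The second packet arrives 0.7 s after capture: a 0.5 s buffer drops it, a 1 s buffer saves it.
    sample = [(0.000, 0.100), (0.033, 0.733), (0.066, 0.200)]
    print("drops with 0.5 s buffer:", count_late_drops(sample))
    print("drops with 1.0 s buffer:", count_late_drops(sample, buffer_delay_s=1.0))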

5. Transcoding

To provide a high-quality viewer experience across multiple devices, transcoding must adapt the encoded video for the different screens and networks on which viewers will receive it. Cloud production tools like TVU Producer allow encoded video to be transcoded and transrated at multiple quality levels for different platforms and different networks.

  • Time Range for Transcoding: 1 to 10 seconds
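
Conceptually, transcoding produces a ladder of renditions at different resolutions and bitrates, and each viewer is served the highest rung their connection can sustain. The ladder and headroom factor below are invented for illustration, not TVU Producer’s actual output profiles.

    # Illustrative bitrate ladder: (label, video bitrate in kbps).
    LADDER = [("1080p", 6000), ("720p", 3000), ("480p", 1500), ("360p", 800)]

    def pick_rendition(viewer_bandwidth_kbps, headroom=0.8):
        # Leave some headroom so the stream never fully saturates the viewer's connection.
        for label, bitrate in LADDER:
            if bitrate <= viewer_bandwidth_kbps * headroom:
                return label, bitrate
        return LADDER[-1]   # fall back to the lowest rung

    for bandwidth in (8000, 2500, 900):
        print(f"{bandwidth} kbps connection -> {pick_rendition(bandwidth)}")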

6. Transmission to Viewer’s Device

When transmitting live video to viewers there are two important families of protocols to consider: non-HTTP and HTTP protocols.

Non-HTTP protocols (RTSP and RTMP) use TCP and UDP transport and can reach very low latency. But they run into issues with adaptive streaming, and scaling them to many viewers can increase costs.

HTTP protocols (HLS, HDS, MSS, and MPEG-DASH) use standard web servers and content delivery networks (CDNs), enabling them to scale to many thousands of simultaneous viewers. These protocols also include support for adaptive streaming and mobile devices.

HTTP protocols are best for live streaming because of their support and scalability, but their latency runs higher than that of non-HTTP protocols, as the rough sketch after the list below illustrates.

  • Time Range for Non-HTTP protocols: 5ms to 10ms
  • Time Range for HTTP protocols: 2 seconds
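
The main reason segment-based HTTP protocols sit at the higher end is that players typically buffer several whole segments before starting playback, so latency scales with segment duration. The segment lengths and three-segment buffer below are illustrative assumptions.

    def segmented_latency_s(segment_duration_s, segments_buffered):
        # With segment-based HTTP delivery, the player usually waits for a few whole
        # segments before playing, so latency is roughly segment length x buffer depth.
        return segment_duration_s * segments_buffered

    for seg in (6.0, 2.0, 0.5):   # illustrative segment durations in seconds
        print(f"{seg} s segments, 3 buffered -> ~{segmented_latency_s(seg, 3):.1f} s of protocol latency")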

7. Decoding for Device Display

Decompressing live streaming video can take as little as the duration of a single frame (1/30th of a second at 30fps), although it is usually around three times the frame duration. Ultimately this depends on the capabilities of the viewer’s device.

  • Time Range for Decoding: 33ms to hundreds of milliseconds
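
Putting the seven stages together, a simple latency budget just adds up the delay of each one. The low and high figures below come from the time ranges quoted above, with the open-ended ranges (buffering and decoding) pinned to assumed values for the sake of the arithmetic.

    # Per-stage latency ranges in milliseconds, based on the figures quoted above.
    STAGES_MS = {
        "image capture":     (5, 700),
        "encoding":          (1, 50),
        "transmission":      (5, 750),
        "buffering":         (200, 5000),    # "hundreds of ms to several seconds" (assumed bounds)
        "transcoding":       (1000, 10000),
        "delivery protocol": (5, 2000),
        "decoding":          (33, 300),      # "33ms to hundreds of ms" (assumed upper bound)
    }

    best = sum(low for low, _ in STAGES_MS.values())
    worst = sum(high for _, high in STAGES_MS.values())
    print(f"best case:  {best / 1000:.2f} s glass-to-glass")
    print(f"worst case: {worst / 1000:.2f} s glass-to-glass")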

Low Latency Live Streaming through Cellular Networks

When it comes to live streaming via a cellular network, the biggest factor affecting latency is the type of connection in use. For instance, in the United States, the raw latency introduced by an LTE network is about 250 milliseconds (ms), at least for connections between most cities and many towns.

If lots of people are competing for the same signal, this can create network congestion, causing further latency to drift

As 5G wireless networks are deployed and become more widely available, lower latency will be possible. However, anyone depending on low latency 5G connections in the early days of deployment should remember that the initial wave of early 5G adopters may actually exceed the capacity of available 5G network resources, causing modems to fall back to existing LTE networks to handle excess traffic, and thus fall back to LTE latency.

Some companies choose to go live with a satellite if the cellular connection isn’t strong enough. Photo credit: newtek.eu

In more rural, remote areas where the availability of an LTE network is limited or non-existent, other wireless connections, such as a dedicated satellite uplink like Inmarsat’s BGAN network, may be the only practical way to live stream video. In such cases, round-trip network latency will be in the range of 900ms.

Still other networks used to contribute streaming video, such as Wi-Fi, Gigabit Ethernet, and 10GbE, have their own latency characteristics.

How can I improve video latency when live streaming?

If you are contributing your live stream via a cellular network, which is quite common in newsgathering, there isn’t a lot you can do to lower, or improve, your latency. As the phrase goes, it is what it is.

However, that doesn’t mean there are not steps you can take to manage the effect of network latency on your live stream.

In order to reduce video latency, high-quality hardware and cloud-based solutions are usually necessary. There are, however, numerous ways to make sure low latency streaming is achieved, meaning your video latency stays below 5 seconds and reaches the level of low latency live streaming.

Improve video latency with high-end live streaming hardware

From a hardware point of view, when using a wireless IP transmitter like the TVU One, the first step is to dial in the latency you expect from your wireless network connection. For instance, if you are live streaming via an LTE wireless network, dial in 0.8 seconds. This should accommodate the 250ms of raw LTE connection latency as well as the latency introduced by encoding and decoding.
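
As a back-of-the-envelope check on why 0.8 seconds is a sensible starting point for LTE, the raw network figure comes from this article, while the encode/decode and safety-margin numbers below are illustrative assumptions rather than TVU’s internal values.

    # Rough budget behind a 0.8 s dial-in for an LTE connection (all values in ms).
    raw_lte_latency = 250    # raw LTE network latency quoted above
    encode_decode = 300      # assumed combined encoder + decoder delay
    safety_margin = 250      # assumed headroom for congestion and routing variation

    target_delay_ms = raw_lte_latency + encode_decode + safety_margin
    print(f"suggested dial-in: {target_delay_ms / 1000:.1f} s")   # -> 0.8 s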

Products such as the TVU One factor in latency as part of the production

Next, before going live to contribute your shot, conduct a test transmission to confirm you selected the appropriate latency for your network connection. If not, dial in more and repeat the test until you’ve nailed the right latency.

TVU One and the company’s other products, such as the TVU Remote Production System (RPS), were designed to treat latency as a production value, not a technical limitation. As such, once the test to verify the correct network latency value has successfully been conducted, TVU’s streaming solutions lock it in to ensure latency never drifts.

Improve Video Latency in Live Streaming Production

From a production point of view, both the reporter in the field and the talent behind the desk in the studio should understand before going live how much latency is in play. Doing so will make it easier for them to finesse their live two-way conversation.

Latency can be particularly problematic with multi-camera productions, but TVU Timelock can lock up to six TVU transmissions to the same delay to achieve synchronization

Another production consideration is determining whether network latency is low enough to allow, for example, a remote camera operator contributing video via an IP network to execute a pan, zoom or other camera movements as instructed by the director in a control room before the situation on the field changes.

Low Latency Video Streaming Requirements

Most professional television broadcasts consider low latency streaming to be anything under 10 seconds. Others that operate at a very high level, like TVU Networks, go into ultra-low latency territory, reaching under 1 second with speeds measured in milliseconds. The following list can help you achieve low latency streaming.

Internet Connection

The most important way to reduce video latency is to have a powerful internet connection or use a 5G cellular network. For professional live production crews, cellular bonding over a 5G network with cloud-based technology is the best approach to transmitting live video over the internet.

5G cellular bonding solutions can enhance video transmission and distribution by bonding multiple networks together, giving you increased bandwidth and a more stable internet connection while helping to reduce video latency. The bonus is that you can be totally remote and transmit live video productions across the public internet, making low latency streaming much more affordable and effective.
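
Conceptually, bonding splits the outgoing stream across every available connection in proportion to what each link can carry, then reassembles it at the far end. The sketch below is a generic illustration of that idea, with made-up link capacities; it is not TVU’s IS+ algorithm.

    # Illustrative link capacities in kbps for a bonded cellular kit.
    LINKS = {"modem-A (5G)": 20000, "modem-B (LTE)": 8000, "modem-C (LTE)": 6000, "wifi": 10000}

    def split_bitrate(total_kbps, links):
        # Give each link a share of the stream proportional to its capacity.
        capacity = sum(links.values())
        return {name: round(total_kbps * kbps / capacity) for name, kbps in links.items()}

    print("bonded capacity:", sum(LINKS.values()), "kbps")
    print("per-link share of a 12000 kbps stream:", split_bitrate(12000, LINKS))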

High-End Live Streaming Encoder

Using a quality live streaming encoder that can capture and encode efficiently will help achieve low latency streaming. This can require high-end hardware on set to encode live video. You can also use a more efficient codec or decrease the frame rate of the video being encoded.

High-Quality Live Video Transmitter

You also need a camera or video pack that can capture high-quality video and process it for high-end video transmission. Video packs like TVU One are live video transmitters able to capture and encode live video for transmission over the public internet.

TVU One processes live video using the state-of-the-art IS+ video/audio transmission algorithm built into the unit. This reduces data usage and latency while maximizing transmission reliability, and it combats packet loss, making it possible to achieve latency as low as 0.8 seconds even when mobile.

HLS (HTTP Live Streaming) Protocol

Adaptive streaming protocols help reduce video latency. The HLS protocol is the best fit for live video streaming platforms on the internet: although its latency runs higher, it brings support for HTML5 video players and mobile devices. HLS can provide a better streaming experience while maintaining a high level of reliability, avoiding delays and lag.

Content Delivery Network (CDN)

Reducing latency requires reducing the distance between the live source and the viewer. You can’t physically move them closer together, but a Content Delivery Network effectively does just that. CDNs help achieve low latency streaming by making it easy to distribute live video globally to many different platforms.
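
The way a CDN shortens the path is by serving each viewer from the nearest edge location rather than from the origin. A toy illustration of that routing decision follows, with made-up round-trip times and location names.

    # Hypothetical round-trip times (ms) from one viewer to the origin and a few CDN edges.
    RTT_MS = {"origin (San Jose)": 180, "edge-frankfurt": 12, "edge-london": 25, "edge-paris": 19}

    def pick_edge(rtts):
        # Serve the viewer from whichever location answers fastest.
        return min(rtts, key=rtts.get)

    print("serving viewer from:", pick_edge(RTT_MS))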

Start Live Streaming with No Delay!

Network latency is a fact of life for anyone contributing live video streams—whether those streams are destined for a local or network TV newscast or a social media platform.

With an understanding of your connection’s network latency, the appropriate settings dialed in to the IP transmitter, reporters and talent that understand what the delay will be and a bit of finesse on their part, viewers may notice the brief lags in their back-and-forth, but they won’t be laughing like the Kimmel audience who saw Super Dave’s schtick.
