Why Is Low Latency Streaming Important?

If you are a gamer – or have a child who is – you may already be familiar with the concept of latency.
Regardless of how good gamers are, if it takes too long for their computers to exchange data with the game servers describing what’s happening at any given moment, they can be blindsided by an opponent with lower network latency and wind up in a virtual Valhalla.
Similarly, latency is a fact of life in television. Comedian Super Dave Osborne (the late Bob Einstein) even did a comedy bit around it that illustrates the effect of latency in a live satellite feed better than words can describe.
What Is Low Latency Streaming?
The comedy sketch works because it exaggerates the lag time, or latency, TV viewers have come to expect when a correspondent in a faraway land is interviewed by the talent in the studio.
Like the gamer’s network lag, the latency Super Dave and Jimmy Kimmel encounter in their faux satellite feed prevents them from interacting in real time.
That is not due simply to the roughly quarter of a second it takes a video signal to travel the approximately 22,300 miles up to a geostationary satellite and back down, plus another quarter second for the return signal. Other factors contribute as well, including all of the signal processing, routing and earth station hops that happen between the correspondent and the studio.
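The quarter-second figure follows directly from the satellite’s altitude and the speed of light. A quick back-of-the-envelope check (the altitude and speed-of-light values are standard physical constants, not figures from this article):

```python
# Back-of-the-envelope propagation delay for a geostationary satellite hop.
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum, km/s
GEO_ALTITUDE_KM = 35_786        # geostationary orbit altitude, km (~22,236 miles)

# One hop: ground -> satellite -> ground (signal goes up and comes down).
one_hop_s = (2 * GEO_ALTITUDE_KM) / SPEED_OF_LIGHT_KM_S

# A two-way interview needs a hop in each direction.
round_trip_s = 2 * one_hop_s

print(f"One hop:    {one_hop_s * 1000:.0f} ms")    # roughly a quarter second
print(f"Round trip: {round_trip_s * 1000:.0f} ms") # roughly half a second
```

Note that this is pure propagation delay; as the article points out, signal processing, routing and earth station hops add still more.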
Similarly, anyone who contributes video by streaming a signal via the internet must contend with latency—the same sort of lag, which results from multiple slight delays introduced into the stream by the computers and routers that handle streaming IP packets and direct them to their final destination.
Further delay is introduced before the video is even streamed, during the encoding phase when the video is compressed, and again at the receiver when the IP packet stream is decoded.
What factors affect latency?
In general, a few factors contribute to network latency, including buffering in the network devices, such as routers and gateways, between the source and destination, as well as the way the Transmission Control Protocol (TCP) itself works: its acknowledgments and retransmissions each add round trips to the exchange.
Another factor is network congestion – perhaps at a news event with many other TV station news crews competing for wireless bandwidth on the same cell tower, or at a sporting event or large political gathering where thousands of people are accessing the internet via the same towers. In these instances, there may be IP packet loss that could cause latency to drift.
Several resources are available online that further explain what network latency is and what affects it, the causes of internet latency, what bufferbloat is, what’s been done to remedy it, and IP packet flows and routers.
When it comes to live streaming via a cellular network, the biggest factor affecting latency is the type of connection in use. For instance, in the United States the raw latency introduced by an LTE network is about 250 milliseconds (ms), at least for connections in most cities and many towns.
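If you want a rough sense of the raw latency of your own connection, one simple (admittedly coarse) check is to time a TCP handshake to a server near your destination. This is an illustrative sketch, not a tool the article describes, and the measurement includes DNS lookup but excludes encode/decode delays:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Estimate round-trip latency from the time to complete a TCP handshake.

    A rough proxy for network RTT: it includes DNS resolution and socket
    setup, but none of the application-level delays (encoding, decoding,
    buffering) that also contribute to glass-to-glass streaming latency.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000

# Example usage (substitute a host near your stream's destination):
# print(f"{tcp_rtt_ms('example.com'):.0f} ms")
```

Running this a few times over a cellular modem versus wired Ethernet makes the difference between connection types discussed here very concrete.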
As 5G wireless networks are deployed and become more widely available, significantly lower latency will be possible. However, anyone depending on low-latency 5G connections in the early days of deployment should remember that the first wave of 5G adopters may exceed the capacity of available 5G network resources, causing modems to fall back to existing LTE networks, and thus to LTE latency, to handle the excess traffic.
In more rural, remote areas where LTE coverage is limited or non-existent, other wireless connections, such as a dedicated satellite uplink like Inmarsat’s BGAN network, may be the only practical way to live stream video. In such cases, round-trip network latency will be in the range of 900 ms.
Still other networks used to contribute streaming video, such as Wi-Fi, Gigabit Ethernet and 10 Gigabit Ethernet (10GbE), have their own latency characteristics.
Can I improve my latency when live streaming?
If you are contributing your live stream via a cellular network, which is quite common in newsgathering, there isn’t a lot you can do to lower, or improve, your latency. As the phrase goes, it is what it is.
However, that doesn’t mean there aren’t steps you can take to manage the effect of network latency on your live stream.
From a hardware point of view, when using a wireless IP transmitter like the TVU One, the first step is to dial in the latency you expect from your wireless network connection. For instance, if you are live streaming via an LTE wireless network, dial in 0.8 seconds. This should accommodate the 250ms of raw LTE connection latency as well as the latency introduced by encoding and decoding.
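The 0.8-second figure above can be thought of as a simple latency budget: raw network delay plus encode and decode time plus some headroom. A minimal sketch of that arithmetic (the encode, decode and headroom figures here are illustrative assumptions, not TVU specifications):

```python
def latency_budget_ms(network_ms: float,
                      encode_ms: float = 200,    # assumed encoder delay
                      decode_ms: float = 150,    # assumed decoder delay
                      headroom_ms: float = 200   # margin for jitter/retransmits
                      ) -> float:
    """Sum the delay components to pick a glass-to-glass latency setting."""
    return network_ms + encode_ms + decode_ms + headroom_ms

# LTE example: 250 ms of raw network latency works out to the 0.8 s
# setting suggested above.
print(latency_budget_ms(250))  # 800.0

# BGAN satellite example: ~900 ms of raw network latency needs a larger value.
print(latency_budget_ms(900))  # 1450.0
```

The point of the sketch is simply that the value you dial in must cover every delay component, not just the raw network figure.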
Next, before going live to contribute your shot, conduct a test transmission to confirm you selected the appropriate latency for your network connection. If not, dial in more and repeat the test until you’ve nailed the right latency.
TVU One and the company’s other products, such as the TVU Remote Production System (RPS), were designed to treat latency as a production value, not a technical limitation. As such, once the test to verify the correct network latency value has successfully been conducted, TVU’s streaming solutions lock it in to ensure latency never drifts.
From a production point of view, both the reporter in the field and the talent behind the desk in the studio should understand before going live how much latency is in play. Doing so will make it easier for them to finesse their live two-way conversation.
Another production consideration is whether network latency is low enough for, say, a remote camera operator contributing video via an IP network to execute a pan, zoom or other camera move as instructed by the director in the control room before the situation in the field changes.
Network latency is a fact of life for anyone contributing live video streams—whether those streams are destined for a local or network TV newscast or a social media platform.
With an understanding of your connection’s network latency, the appropriate settings dialed in to the IP transmitter, and reporters and studio talent who know what the delay will be and handle it with a bit of finesse, viewers may notice brief lags in the back-and-forth, but they won’t be laughing like the Kimmel audience that saw Super Dave’s schtick.