Why is latency the next big thing in networking?

Posted on September 10, 2018


Remember latency? Around five years ago, as improvements to WordPress and other blogging platforms led to a mini-explosion in website numbers, latency was all the rage.

The speed at which your website would load was crucial to your online presence, businesses were told. This was old news for many larger companies, such as financial services players, which had relied on split-second online reaction times for years.

Since then, however, IT managers seem to have picked up new concerns. If you look at a list of strategic priorities for chief information officers in 2018, you’re likely to find items such as digitisation, cybersecurity, and the Internet of Things.

Does this mean latency no longer counts? Not at all: latency is as important as ever. And it could be about to become even more so, for a number of reasons. Take ongoing changes in the nature of network traffic, for example.

Once upon a time, data networks only had to deal with email and the sluggish traffic from dial-up modems. In those days, getting a connection at all was a blessing. Getting a low-latency connection was all but unheard of.

Networks have got faster

Since then, networks, software, and endpoints have all got much, much faster. The number of network users has ballooned, and the types of traffic we see on the Internet have changed dramatically.

Traffic volumes have gone up, as you would expect, but in particular there has been an explosion in the kinds of traffic that are highly sensitive to latency. The first of these is voice communications, particularly via cloud-based services.

Whereas early voice communications were carried on networks separate from data, today the two types of traffic share the same infrastructure. Voice over IP means phone calls are just another form of data, albeit one that has to be delivered with very low latency: call quality typically starts to suffer once one-way delay exceeds around 150 milliseconds.

Dwarfing the latency requirements of today’s voice calls, though, is video. This is one of the most latency-sensitive applications you can run over a network, and its use is mushrooming.

According to the Cisco Visual Networking Index, by 2021, 82 percent of all IP traffic will be video. Alongside this, we are seeing massive growth in cloud applications: Cisco says 95 percent of Internet traffic will come from the cloud in 2021.

Mission-critical applications

Not all cloud traffic is sensitive to latency. But as more and more mission-critical applications migrate to the cloud, there is a growing need to make sure traffic does not get delayed in transit—particularly if your cloud provider is based at a remote location.

The upshot is that even if you’re not hearing much talk about latency, you should be paying plenty of attention to it when you speak to your network provider.
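One practical way to keep latency on the agenda is to measure it yourself before and after those conversations. The short Python sketch below is a minimal illustration, not a tool from any particular provider: it times how long it takes to open a TCP connection to an endpoint, which is a rough proxy for round-trip network latency. The hostname and port are placeholders you would swap for your own cloud provider's address.

    # Minimal sketch: estimate round-trip latency by timing TCP connection setup.
    # The hostname below is a placeholder, not an endpoint named in this article.
    import socket
    import time

    def tcp_connect_latency(host: str, port: int = 443, samples: int = 5) -> float:
        """Return the average time, in milliseconds, to open a TCP connection."""
        timings = []
        for _ in range(samples):
            start = time.perf_counter()
            # Open the connection, then close it immediately; we only care about setup time.
            with socket.create_connection((host, port), timeout=5):
                pass
            timings.append((time.perf_counter() - start) * 1000)
        return sum(timings) / len(timings)

    if __name__ == "__main__":
        # Replace 'example.com' with your cloud provider's endpoint.
        print(f"Average connect latency: {tcp_connect_latency('example.com'):.1f} ms")

Averaging over several samples smooths out one-off spikes; for a fuller picture you would also want to track jitter and packet loss over time, not just a single round-trip figure.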

Bear in mind that it can be hard for some operators to provide the kind of latency you need in today's fast-reacting, video-heavy digital world. Telcos that do not own their own networks, or that lack the ability to route traffic optimally across them, can find it hard to guarantee low latency, for example.

Similarly, for traffic that travels to or from overseas, which may account for a large portion of your applications or production workloads, you ideally need a direct link to major global interconnectors such as the Australia Singapore Cable.

Taking the time to seek out a great low-latency network may seem unnecessary unless you are engaged in algorithmic trading or the like, but it’s not.

Even getting a hosted corporate video to play smoothly, or making your website responsive enough to drive up sales, could hinge on getting the latency question right.
