Jitter — a real phenomenon

Posted on October 03, 2018


It’s never a good sign, and it always seems hard to avoid, particularly on voice and video devices: jitter.

It probably gives you the shivers just thinking about it — your new technology getting a bout of the jitters.

Much like the jitters that overcome people, jitter on devices is a very real phenomenon with real-world implications.

Jitter is that feeling when you’ve just tuned in, started hoeing into the popcorn, and noticed something is not quite right on screen, such as lips totally out of sync with the audio. Or when you’re geared up for that new online game and packet loss starts setting in, derailing your quality of experience (QoE). Worse still, it’s catching up with an old friend only to hit a lag on two-way consumer services like Skype or FaceTime.

For these reasons, among others, there has been an onslaught of academic literature published about the topic of jitter in recent years. Some go as far as to propose entirely new solutions to packet streaming issues.

The internet jitterbug

Think of it this way. Jitter is caused by a ‘traffic jam’ in an IP network. Depending on the situation, this traffic jam can occur either at the router interfaces, or in a provider or carrier network if the circuit has not been provisioned correctly.

By definition, jitter is the variation in the delay of received packets. For an optimum viewing experience, packets that are sent at a steady rate should also arrive in a continuous, evenly spaced stream.
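To make that definition concrete, here is a minimal sketch of how jitter can be quantified, using the smoothed interarrival-jitter estimator described in RFC 3550 (the RTP specification). The function name and the timestamps are illustrative, not taken from any real capture.

```python
def interarrival_jitter(send_times, recv_times):
    """Return the RFC 3550 running jitter estimate, in the same
    time units as the inputs (seconds here)."""
    jitter = 0.0
    for i in range(1, len(send_times)):
        # D = change in one-way transit time between consecutive packets
        transit_prev = recv_times[i - 1] - send_times[i - 1]
        transit_curr = recv_times[i] - send_times[i]
        d = abs(transit_curr - transit_prev)
        # Smooth with a 1/16 gain, as the RFC specifies
        jitter += (d - jitter) / 16.0
    return jitter

# Packets sent every 20 ms, but received with slightly uneven spacing
send = [0.000, 0.020, 0.040, 0.060]
recv = [0.050, 0.072, 0.088, 0.112]
print(f"jitter ~ {interarrival_jitter(send, recv) * 1000:.2f} ms")
```

If every packet took exactly the same time to arrive, the estimate would stay at zero; the uneven arrivals above nudge it upward, which is exactly the variation viewers experience as jitter.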

Naturally, as with all things ICT, there is an acceptable level of jitter, and it can be hard to know where that line sits without a qualified professional. Much of it comes down to accurate measurement: specialised test equipment can measure and report jitter down to the millisecond, which is the scale that matters.

Too much jitter will lead to packet loss through buffer overflow or underflow, and that impacts user experience. But how much is too much? That’s the question technical professionals rack their brains over, and one that some are adamant about solving.

While it’s still not exactly clear-cut, the answer is somewhat two-fold, depending on whether the jitter is happening through networked devices or video equipment.

Buffering, please wait.

At home, jitter is a little annoying, but like other things, you can learn to live with it. A lag on Netflix or Stan isn’t the end of the world. Plus, because these streams are unidirectional, where you’re merely a viewer and not a participant, the player can buffer several seconds of video ahead of playback. That buffer absorbs variations in packet arrival and, essentially, stops jitter in its tracks.

However, in a business environment jitter can be more than a little annoying. It’s bad enough to have the jitters in an important teleconference; it’s worse when the stream has them too. Then there’s the extra-awkward scenario of a job interview over video, where both candidate and employer experience the lag.

How to maintain QoS

According to Cisco, there are general parameters for acceptable jitter. At the most basic level, jitter should be below 30 milliseconds, packet loss shouldn’t be more than 1 percent, and network latency shouldn’t go over 150ms.
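These three thresholds can be expressed as a simple check. The sketch below is my own illustration of the rule-of-thumb figures quoted above; the function and parameter names are hypothetical, not from any Cisco tool.

```python
def meets_qos(jitter_ms, loss_pct, latency_ms):
    """True if all three measurements sit inside the quoted limits:
    jitter below 30 ms, packet loss at most 1%, latency at most 150 ms."""
    return jitter_ms < 30 and loss_pct <= 1 and latency_ms <= 150

print(meets_qos(jitter_ms=12, loss_pct=0.4, latency_ms=95))   # a healthy link
print(meets_qos(jitter_ms=45, loss_pct=0.4, latency_ms=95))   # jitter too high
```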

These figures are regarded as ‘best practice’ for maintaining quality of service (QoS), and ICT professionals should keep these top of mind.

So, when looking at your enterprise network configuration, ask your provider how they can implement QoS solutions so the only jitter you experience is in this article.
