
AV-over-IP – Compression and Its Implications

Mike Maniscalco | Jun 24, 2020
The third in a series of articles on Audio Video-over-Internet Protocol
 
The last AV-over-IP article in this series discussed the IP network components of an AV-over-IP system. Today's high-quality 4K and 8K video have high bitrate requirements that can easily overwhelm an unprepared network. For example, an uncompressed 4K stream can require more than 12 Gbps of bandwidth, and an uncompressed 8K stream can require nearly 48 Gbps. While certain networking equipment could handle those bitrates, the cost of a system that meets those requirements would likely be prohibitive. Luckily, there is another key player in an AV-over-IP system — video compression.
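Those headline figures are easy to sanity-check with simple arithmetic. The short Python sketch below computes raw bitrate from resolution, frame rate, and color depth; the 60 fps, 8-bit 4:4:4 assumptions are ours for illustration, and real transports carry additional overhead (blanking intervals, 10-bit color) that pushes the numbers higher still.

```python
# A back-of-the-envelope check of the uncompressed bitrates cited above.
# Assumptions (ours, for illustration): 60 fps, 8-bit 4:4:4 color, and
# active pixels only -- real transports add blanking and often 10-bit color.

def uncompressed_bitrate_gbps(width, height, fps=60, bits_per_pixel=24):
    """Raw video payload in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K: {uncompressed_bitrate_gbps(3840, 2160):.1f} Gbps")  # ~11.9 Gbps
print(f"8K: {uncompressed_bitrate_gbps(7680, 4320):.1f} Gbps")  # ~47.8 Gbps
```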
 
Without compression, the demands of high-quality video on a network would be challenging to satisfy. The good news is that the evolution of compression technologies and the processing power of today's silicon enable visually lossless compression at various levels of quality and performance. A variety of compression technologies also gives designers and engineers flexibility in meeting client requirements. Each of these technologies has tradeoffs that should be well understood.
 
Generally, different compression techniques offer many options with regard to latency, quality, and bandwidth requirements. These three factors pull against one another in what is often described as a “performance paradigm”: improving one typically comes at the expense of another.
 
The most natural benchmark for a distribution system is video quality. AV-over-IP systems use a variety of compression techniques, each with a different impact on video quality. The best quality would be uncompressed video, but at 4K and 8K bitrates it is often not practical. Alternatively, heavier compression reduces bitrate but can have a noticeable impact on video quality. Finally, frame look-ahead techniques can reduce bitrates with less impact on video quality, but they introduce latency.
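To make the bandwidth leg of that tradeoff concrete, the sketch below estimates the minimum compression ratio needed to carry a raw 4K stream on common Ethernet tiers. The link speeds and the raw bitrate carried over from the earlier calculation are illustrative assumptions, not vendor figures.

```python
# Minimum compression ratio needed to fit a raw 4K stream onto common
# Ethernet tiers. Link speeds and the raw bitrate (carried over from the
# earlier calculation) are illustrative assumptions, not vendor figures.

RAW_4K_GBPS = 11.9  # 3840x2160, 60 fps, 8-bit 4:4:4

for link_gbps in (1, 10):
    ratio = RAW_4K_GBPS / link_gbps
    print(f"{link_gbps} GbE needs at least {ratio:.1f}:1 compression per 4K stream")
```

A ratio near 1.2:1 is the territory of very light, low-latency compression, while fitting 4K into 1 GbE calls for far heavier codecs, which is exactly where the quality and latency tradeoffs described above come into play.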

Latency is the amount of time it takes for video to travel through an AV-over-IP system from the encoder to the decoder. Latency is typically measured in milliseconds (thousandths of a second) or microseconds (millionths of a second). In some applications, such as streaming a movie to a single display, latency is not a critical factor because the viewer has no reference against which to perceive the delay. Conversely, at a sporting event, where a viewer may watch the live action alongside a video stream, higher latency would be perceptible.
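The link between look-ahead and latency is simple arithmetic: every future frame an encoder buffers adds one frame time of delay. The sketch below illustrates the relationship; the buffer depths are hypothetical examples rather than figures for any particular codec.

```python
# Rough arithmetic only: each future frame an encoder buffers adds one
# frame time of delay. Buffer depths below are hypothetical examples,
# not figures for any particular codec.

def lookahead_latency_ms(frames_buffered, fps=60):
    """Added encode delay from buffering future frames, in milliseconds."""
    return frames_buffered / fps * 1000

print(f"Sub-frame (line-based) codec: {lookahead_latency_ms(0.05):.1f} ms")
print(f"One-frame buffer:             {lookahead_latency_ms(1):.1f} ms")
print(f"30-frame look-ahead:          {lookahead_latency_ms(30):.1f} ms")
```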
 
Bandwidth is another major factor in designing and choosing the appropriate solution. The bitrate of the AV-over-IP system directly drives the bandwidth requirements of the network. With an unlimited budget and fiber optic cabling, this is not a big concern. In most installations, however, the existing network cabling, the cost of switching infrastructure, and the budget for encoders and decoders all constrain the design.
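As a rough illustration of how bitrate drives network design, the sketch below counts how many compressed 4K streams would fit through a single 10 GbE switch uplink. The per-stream bitrate and the utilization headroom are assumptions chosen for the example, not figures from the article.

```python
# Illustrative capacity check: compressed 4K streams through one 10 GbE
# switch uplink. The per-stream bitrate and the 80% utilization headroom
# are assumptions chosen for this sketch, not figures from the article.

LINK_GBPS = 10.0
USABLE_FRACTION = 0.8   # leave headroom for control traffic and bursts
STREAM_GBPS = 0.85      # e.g., a lightly compressed 4K stream

usable_gbps = LINK_GBPS * USABLE_FRACTION
print(f"Streams per uplink: {int(usable_gbps // STREAM_GBPS)}")  # -> 9
```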
 
It is also important to note that AV-over-IP may not be the right solution for every client application. AV distribution experts generally agree that AV-over-IP does not compete with other distribution technologies, such as HDBaseT, but rather complements them. It is the job of the CEDIA integrator to understand the client's application requirements and budget when deciding which distribution technique is most appropriate.
 
Want to learn more? Check out the AV-over-IP coursework in the new CEDIA Academy:

As technology rapidly evolves, the ability to deliver quality audio and video signals via IP has become more and more reliable, providing better and better experiences for clients. In this AV-over-IP course, you'll learn about the exciting new ways to distribute and deliver content throughout the modern home using these techniques.