In 2014, ESPN opened Digital Center 2 (DC2) in Bristol, CT, a $125 million, 190,000-square-foot broadcast facility. With Evertz fiber-optic routers, Arista switches, comms provided by Riedel, and 1,100 miles of fiber-optic cable, the resulting infrastructure could handle 60,000 simultaneous signals at 46 Tbps of throughput, and stood as one of the first large-scale implementations of an Ethernet AVB network solution.

This blog begins a three-part series exploring AVB/TSN’s role and impact at ESPN.

Why Ethernet Audio Video Bridging?

When ESPN began looking to upgrade its broadcast facilities, it faced serious limitations at its ten-year-old Digital Center 1 location: data demands expanding with advances in audio and video quality, and an increasingly out-of-date infrastructure. ESPN made plans to build a new digital center from scratch.

Plans included five studios, six production control rooms, and sixteen rooms dedicated to editing and processing the massive flow of high-definition video and audio. The teams focused on assembling a single production and IT infrastructure that could service the growing demands of their industry: broadcast outlets, digital platforms, video on demand (VOD), and the unpredictable requirements of the future.

AVB/TSN Solution

The design and engineering teams focused on a solution that would allow vendor flexibility for full interoperability, as well as a smooth transition from their analog system to full digital capability. They wanted a robust, comprehensive approach for the new broadcast center that capitalized on the advantages of a commercial off-the-shelf (COTS) IT infrastructure, while using a Top of Rack (ToR) network approach. Additionally, they needed to guarantee “analog-like” sound quality, as well as “analog cable-like” uptime.

ESPN turned to Ethernet Audio Video Bridging / Time-Sensitive Networking (AVB/TSN). This is the IEEE set of standards that uses bandwidth reservation protocols and clock synchronization to ensure delivery of time-sensitive data with low, bounded latency. The proven deterministic nature of AVB/TSN meant guaranteed performance for large numbers of streams over a standard switched Ethernet network.

Synchronization via Presentation Time

Endpoints on an Ethernet Audio Video Bridging network stay synchronized with each other through a common clock, which endpoints derive from generalized Precision Time Protocol (gPTP, IEEE 802.1AS) sync messages. With all nodes on the same clock, any source (talker) on the network can send out a signal while stamping it with a presentation time (a deterministic offset, 2 ms on this network), and every potential receiver (listener) knows that exact same presentation time. So if data arrives even slightly off-schedule, the system buffers it to accommodate those mismatches. This approach enables high-quality, “analog-like” delivery of audio and video anywhere on that network.
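The buffering behavior described above can be sketched in a few lines. This is an illustrative model, not ESPN's implementation: packets carry a presentation time (talker send time plus the fixed 2 ms offset), and a listener holds each packet until that synchronized instant, absorbing network jitter. The `Listener` class and sample values are hypothetical.

```python
import heapq

class Listener:
    """Toy AVB listener: buffers packets and releases each one only at
    its presentation time, so every endpoint on the synchronized network
    plays the same sample at the same instant."""

    def __init__(self):
        self.buffer = []  # min-heap ordered by presentation time

    def receive(self, packet, presentation_time_ns):
        # Packets may arrive early or slightly late; the buffer absorbs jitter.
        heapq.heappush(self.buffer, (presentation_time_ns, packet))

    def playout(self, now_ns):
        """Release every buffered packet whose presentation time has arrived."""
        due = []
        while self.buffer and self.buffer[0][0] <= now_ns:
            _, packet = heapq.heappop(self.buffer)
            due.append(packet)
        return due

# Talker side: presentation time = send time + 2 ms deterministic offset
OFFSET_NS = 2_000_000
listener = Listener()
listener.receive("sample-1", 0 + OFFSET_NS)
listener.receive("sample-2", 125_000 + OFFSET_NS)

assert listener.playout(now_ns=1_000_000) == []            # nothing due yet
assert listener.playout(now_ns=2_000_000) == ["sample-1"]  # released on time
```

Because every node derives the same clock, "now" is the same everywhere, which is what makes this release-at-presentation-time scheme work across the whole facility.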

Time-Critical Media Traffic

AVB/TSN allows reservation of up to 75% of link bandwidth for critical data traffic, enabling the guaranteed delivery of time-critical media data via two classes of traffic:

  • Class A traffic is guaranteed a worst-case delay of 2 ms over 7 hops
  • Class B traffic is guaranteed a worst-case delay of 50 ms over 7 hops
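The 75% cap implies a simple admission check at each link: a new stream is reserved only if total reservations stay under the cap. The sketch below is a simplification of that idea; the actual Stream Reservation Protocol negotiates per class and per hop, and the function name and figures here are illustrative.

```python
def admit_stream(link_bps, reserved_bps, request_bps, max_fraction=0.75):
    """Admit a new stream reservation only if total reserved bandwidth
    stays within the AVB cap (75% of link capacity by default).
    Simplified sketch of SRP-style admission control."""
    return reserved_bps + request_bps <= max_fraction * link_bps

LINK = 1_000_000_000  # 1 Gb/s link

# 700 Mb/s already reserved: a 50 Mb/s stream fits under the 750 Mb/s cap,
# but a 100 Mb/s stream would exceed it and is refused.
assert admit_stream(LINK, 700_000_000, 50_000_000) is True
assert admit_stream(LINK, 700_000_000, 100_000_000) is False
```

Refusing the over-budget stream up front is what lets the network promise the Class A and Class B latency bounds to every stream it does admit.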

AVB/TSN’s credit-based traffic shaping also allows the system to restrict bursts in traffic and avoid buffer overrun, keeping any misbehaving devices in line.
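The shaping idea can be shown with a toy model: credit accrues while a queue waits, drains while it transmits, and a frame may start only when credit is non-negative, so a burst of back-to-back frames gets spaced out. This is a simplified sketch of the credit-based shaper concept, not the IEEE 802.1Qav algorithm verbatim, and the slope values are made up for illustration.

```python
def credit_based_shaper(frames, idle_slope, send_slope, frame_time):
    """Toy credit-based shaper: returns (frame, departure_time) pairs.
    Credit rises at idle_slope while waiting and falls at send_slope
    (negative) while sending; a frame departs only when credit >= 0."""
    credit = 0.0
    t = 0.0
    departures = []
    for frame in frames:
        if credit < 0:
            # Wait until credit recovers to zero before the next frame.
            t += -credit / idle_slope
            credit = 0.0
        departures.append((frame, round(t, 6)))
        credit += send_slope * frame_time  # sending drains credit
        t += frame_time
    return departures

# Three frames arriving back-to-back are spread out by the shaper:
out = credit_based_shaper(["f1", "f2", "f3"],
                          idle_slope=0.25, send_slope=-0.75, frame_time=1.0)
# → [("f1", 0.0), ("f2", 4.0), ("f3", 8.0)]
```

Spacing the burst out like this is what prevents one chatty device from overrunning the buffers of switches and endpoints downstream.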

The resulting network provided reliable transport of professional media flows and automatic provisioning without any need for custom controllers, all due to the power and simplicity of AVB/TSN.

In Part 2, we explore how AVB/TSN could support ESPN Digital Center 2’s audio and video production.


Photo credit: NFL Draft 2010 ESPN Set by Marianne O’Leary via Creative Commons license 2.0.
Hayes, Michael. “ESPN DC2 Project Overview.” Audio Engineering Society Convention. Los Angeles Convention Center, Los Angeles, CA. October 2014. PowerPoint presentation.
Pannaman, Jonathan. “ESPN DC2 Design Philosophy.” October 2014. PowerPoint presentation.
Daley, Dan. (2014, June 10). “ESPN’s DC2 Scales AVB Large.” Retrieved from