During the USA-Germany World Cup match, Univision peaked at 750,000 concurrent live streams, breaking all of its own streaming records with no streaming issues. Univision gets an “A” grade for the World Cup streaming event. During the same game, ESPN peaked at 1.7M concurrent live streams, more than doubling Univision’s traffic. ESPN does require its users to take an extra step when viewing sporting events: verifying their pay-TV subscription. All was good for a while, then ESPN’s streaming service went kaput, crashing during the most watched live streaming event in history. As a result, Twitter lit up with angry tweets from ESPN subscribers. According to an ESPN spokesman, “we did investigate some limited issues due to unprecedented demand during the first half”. For this incident, we give ESPN a failing grade for the World Cup live streaming event.
ESPN’s streaming failures sound very similar to the problems HBO encountered delivering the last season of True Detective, and also the Oscars.

Why Did the ESPN Service Crash? I believe ESPN crashed because it relied too heavily on its own internal infrastructure, and not enough on a CDN.

Lesson: Any media company that wants to stream a large live event such as the World Cup, Oscars, Super Bowl, or Olympics must use Akamai as the primary CDN, plus a backup CDN such as Level 3 or Verizon EdgeCast. I personally would give Akamai 90% of the streaming traffic, and 10% to the secondary CDN. Big-event live streaming failure is unacceptable in today’s world. The problem is not inadequate infrastructure; the problem is inadequate planning.

Resolution: For the World Cup, ESPN should have used Akamai to ingest content at the Akamai publishing point in LatAm, or a location close by. Akamai could then have transcoded a high-bit-rate stream into multiple output formats and bit rates, enabling the event to be viewed on any device. In other words, keep it on Akamai infrastructure as much as possible.
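The 90/10 traffic split between a primary and backup CDN can be sketched as a simple weighted picker. This is a hypothetical illustration of the idea, not a real ESPN or Akamai configuration; the CDN names and weights are assumptions.

```python
import random

# Hypothetical weighted CDN routing table: 90% of viewers go to the
# primary CDN, 10% to the backup (names and weights are assumptions).
CDN_WEIGHTS = {"akamai": 0.9, "backup": 0.1}

def pick_cdn(weights=CDN_WEIGHTS, rng=random.random):
    """Return a CDN name with probability proportional to its weight."""
    roll = rng() * sum(weights.values())
    for cdn, weight in weights.items():
        roll -= weight
        if roll <= 0:
            return cdn
    return next(iter(weights))  # guard against float rounding

# Over many viewers, roughly 90% land on the primary CDN.
counts = {"akamai": 0, "backup": 0}
for _ in range(100_000):
    counts[pick_cdn()] += 1
```

In practice the split would be enforced by DNS or a multi-CDN load balancer rather than application code, but the weighting logic is the same.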
The Bandwidth Math behind 1.7M Viewers
According to ESPN, the 1.7M viewers were unprecedented, overwhelming its infrastructure. But if we use CDN math, 1.7M is really not that much. Let’s break down the 1.7M concurrent viewers and divide by the number of POPs. If Akamai streams to 1.7M viewers across 50 POPs, at bit rates between 500Kbps and 2Mbps, the bandwidth per POP is between 17Gbps and 68Gbps. My guess is that since most of the users were at work, the average would tend to be closer to the 500Kbps bit rate. That means 1.7M viewers comes to roughly 17Gbps per POP. By the way, 2 heavy-duty servers can handle 17Gbps of live streaming traffic, but it’s the aggregate bandwidth that kills you, and that’s why Akamai is needed. Let’s just hope that ESPN doesn’t drop the ball again on the next USA match.
- 1,700,000 concurrent viewers x 2Mbps = 3,400,000Mbps
- 3,400,000Mbps / 1000 = 3,400Gbps
- 3,400Gbps / 1000 = 3.4Tbps
- 1,700,000 concurrent viewers x 0.5Mbps (500Kbps) = 850,000Mbps
- 850,000Mbps / 1000 = 850Gbps
1.7M Concurrent Streams Per CDN POP
- 3.4Tbps / 50 Akamai POPs = 68Gbps per POP
- 850Gbps / 50 Akamai POPs = 17Gbps per POP