Do It Yourself CDN: Netflix, Comcast and Facebook


Over the past couple of years, video streaming, gaming, photo sharing and other forms of high-bandwidth media consumption have exploded across network platforms. More and more people are online, requesting more and more content, and at the crux of it all is the CDN. CDNs keep content moving as the stable backbone of the network, but an emerging trend in the market has some believing that traditional, commercial CDNs could become an expendable middleman in their own industry. Is there any truth to this?

The Trend: DIY CDNs

The trend is that more and more companies are building their own CDNs. Large companies like Netflix, Comcast and Facebook, which operate very different businesses, have all begun shifting towards building their own CDNs. All three are using their CDNs in different ways, which has a definite impact on how we look at the future of the CDN industry. But do Netflix, Comcast and Facebook represent the general marketplace?

Netflix Open Connect

Back in June of 2012, Netflix announced the rollout of its own CDN, called Open Connect. With Open Connect, Netflix no longer relies as heavily on third-party CDNs like Akamai, Limelight and Level 3, which at the time were the only companies with the capacity to support the level of bandwidth Netflix required. Several factors came into play when Netflix decided to build its own CDN.

First of all, from a logistics standpoint it made more sense. As its subscriber base grew, Netflix was receiving more and more complaints about diminishing streaming quality because its vendors could not support the traffic. Control of its user experience and content was in the hands of CDNs that could not match the growth Netflix required, so the logical next step was to build a CDN tailored to its own specific needs.

While this might have seemed like a big jump at the time, the move also made sense financially. As the streaming industry exploded into a huge market, the expense of outsourcing CDN needs became restrictive to Netflix's growth in some respects.

The volume of Netflix's business became contingent on the capacity its third-party vendors could handle, which significantly limited its business model and expansion opportunities in a market on the verge of an incredible boom. Many popular applications see this at the start of their development: more demand looks like prosperity, but when all CDN needs are outsourced, expenses grow so quickly that the business can slide into the red even as it expands.

While all the pros were there for Netflix, one of the biggest hurdles it had to clear was dealing with the ISPs. Commercial CDNs like Akamai have extensive relationships with ISPs, but Netflix now had to form those relationships directly, from scratch. And ISPs weren't exactly welcoming, requiring Netflix to sign controversial pay-for-speed deals in order to support its traffic. The precedent set by those terms makes it harder for smaller companies hoping to make the same transition, since they can't afford the same deals. But Netflix now peers directly with several large ISPs, which has cut out the middleman, reduced delivery costs and improved the user experience. It's a definite first step towards a new emerging business model.

What also stands out about the way Netflix and other companies built their own CDNs is that they are using open source caching software on bare-metal hardware. It doesn't get much more DIY than that. These massive companies with billions of dollars in revenue are running software that is universally accessible, and they are openly sharing the improvements they make to it, which only makes it easier for new emerging businesses to follow in their footsteps.

Comcast B2C CDN

On a similar note, last August Comcast launched its B2C CDN to support its own video streaming delivery. The CDN started off as an internally built caching platform and grew into a new CDN enterprise. Comcast is now the only cable operator using open source caching software, a huge step towards proving that anyone can build their own CDN with the open source tools available on the market.

Comcast was facing issues similar to Netflix's: increased demand and declining quality of experience for its video streaming and gaming services, which is why it decided to build its own CDN to meet its video media needs. It is now second in North American PoP count after Akamai, with 100 PoPs deployed and growing.

Comcast is transitioning to a new world of faster software releases, spearheaded by building its own infrastructure. And just like Netflix, it had a big choice on its hands: buy the infrastructure, or build it? The prevailing answer to that quandary is becoming more and more one-sided.

Facebook Photo Infrastructure

Over the past year, Facebook has also been testing the waters with its own CDN, migrating much of its photo infrastructure onto it. Currently, Facebook has over 1.7 billion user photos, over 160 terabytes of photo storage in use, over 3 billion photo images served every day, and almost 100,000 images served per second during peak traffic windows. Its photo infrastructure is integral to its business model, and just like Netflix and Comcast, the crux of that business was in the hands of a commercial CDN.

In 2012, just after Netflix's announcement, Facebook announced it had begun building its own "edge network," which is essentially a CDN, to help manage the traffic for its photo infrastructure. Facebook VP Frank Frankovsky explained that the company is expanding its own network, with plans to deconstruct the server, reorganize how servers function in their racks, and open up capacity in order to spearhead further innovation. Currently, Facebook still uses Akamai for the bulk of its CDN needs, but it is falling in line with the broader trend towards the DIY approach.

Building Basics

Clearly this trend of building your own CDN is significant, and given the magnitude of the companies adopting it, it may prove to have real staying power over the next several years. If you're deciding whether to build your own CDN, there are some basics that can help when considering the path to take.

When deciding how to build your CDN, it's good to look at others as a blueprint. Comcast, for instance, gives an in-depth look at how it built its CDN around four main tiers of focus:

  • Content Routing: This is all about getting the customer to the best cache for the content they request, at their given location. The basic principles are:
    • Distance: serve the closest available content to the user
    • Network Cost: minimize the cost the CDN incurs per connection
    • Network Link Quality: maximize the quality of the link for the customer
    • Availability of Content: make as much of the required content available as possible

Under content routing you also have a choice between DNS routing and HTTP routing, depending on your preferences and needs; a minimal sketch of the HTTP approach follows this list.

  • Health Protocol: This tracks the health of the network by pulling health-related data every 8 seconds to ensure that the system is functioning at normal levels.
  • Management and Monitoring System: Comcast built this in-house using the Perl/Mojolicious framework.
  • Reporting System: This service Comcast buys from Splunk, where the logs feed the billing system. They found it easier to outsource this piece than to build it for their own specific framework.
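To make the first two tiers concrete, here is a minimal, hypothetical sketch in Python of HTTP-based content routing combined with a simple health poll. This is not Comcast's implementation; the cache inventory, hostnames, and the `closest_healthy_cache` helper are all illustrative assumptions. It simply applies the principles above: poll cache health on a fixed interval and redirect each request to the closest healthy cache.

```python
# Hypothetical sketch of HTTP-based content routing (not Comcast's actual code).
# A router polls cache health on a fixed interval and 302-redirects clients to
# the closest cache that is currently healthy.
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative cache inventory: name -> base URL, distance metric, health flag
CACHES = {
    "cache-east": {"url": "http://cache-east.example.net", "distance": 10, "healthy": True},
    "cache-west": {"url": "http://cache-west.example.net", "distance": 40, "healthy": True},
}

HEALTH_POLL_SECONDS = 8  # matches the polling cadence described above


def poll_health():
    """Refresh each cache's health flag on a fixed interval (stubbed out here)."""
    while True:
        for cache in CACHES.values():
            # A real system would hit a health endpoint on each cache;
            # this sketch simply assumes every cache stays healthy.
            cache["healthy"] = True
        time.sleep(HEALTH_POLL_SECONDS)


def closest_healthy_cache():
    """Pick the lowest-distance cache that is currently healthy."""
    healthy = [c for c in CACHES.values() if c["healthy"]]
    return min(healthy, key=lambda c: c["distance"]) if healthy else None


class Router(BaseHTTPRequestHandler):
    def do_GET(self):
        cache = closest_healthy_cache()
        if cache is None:
            self.send_error(503, "No healthy cache available")
            return
        # HTTP routing: redirect the client to the chosen cache for this object.
        self.send_response(302)
        self.send_header("Location", cache["url"] + self.path)
        self.end_headers()


if __name__ == "__main__":
    threading.Thread(target=poll_health, daemon=True).start()
    HTTPServer(("0.0.0.0", 8080), Router).serve_forever()
```

A DNS-based router makes the same decision inside the resolver rather than with a redirect, trading per-request control for fewer round trips.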

Looking at this architecture may help you see the building blocks that matter when creating your own CDN from the ground up. But above all of these, one decision is probably the most vital when building a CDN: the cache.

The Cache

Arguably the most important part of building your own CDN is choosing the right caching software. Comcast decided to use Apache TS. In its research, it found that Nginx was fast but not compliant enough for its network. Nginx and Varnish have advanced over the past several years, but Comcast began building its CDN years prior, so it is invested in Apache TS and would find it difficult to switch at this stage of development.

For your own needs it may be good to evaluate which system has the best current working features, but any HTTP/1.1-compliant cache will work. Comcast did look at Varnish and considered it for a while, but found issues scaling it when writing to disk. It also found that Apache TS worked out of the box with its routers, and one of the main selling points was the community of people who work on the software.
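To make the "any HTTP/1.1-compliant cache" point concrete, here is a deliberately tiny, hypothetical caching reverse proxy in Python. It is nowhere near Apache TS, Nginx, or Varnish in capability; it only illustrates the core job those systems perform: fetch an object from the origin once, store it, and serve later requests from the cache. The `ORIGIN` hostname and in-memory cache are assumptions for the sketch.

```python
# Minimal, hypothetical caching reverse proxy -- an illustration of what cache
# software such as Apache TS, Nginx, or Varnish does at its core, not a
# substitute for any of them.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ORIGIN = "http://origin.example.net"  # assumed origin server
CACHE = {}  # path -> (status, content type, body); real caches persist to disk


class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        entry = CACHE.get(self.path)
        if entry is None:
            # Cache miss: fetch the object from the origin and store it.
            with urllib.request.urlopen(ORIGIN + self.path) as resp:
                body = resp.read()
                entry = (resp.status,
                         resp.getheader("Content-Type", "application/octet-stream"),
                         body)
            CACHE[self.path] = entry
        # Cache hit (or freshly filled entry): serve directly from memory.
        status, content_type, body = entry
        self.send_response(status)
        self.send_header("Content-Type", content_type)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8081), CachingProxy).serve_forever()
```

A production cache also honors Cache-Control headers, expires and evicts objects, and writes efficiently to disk, which is exactly the area where Comcast found Varnish hard to scale.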

Netflix, on the other hand, chose Nginx, specifically to achieve its goal of getting "more and more gigabits per second from a single box," as stated by Nginx developer Gleb Smirnoff, who worked closely with Open Connect engineers during development. Given its massive subscriber base, Netflix wanted to maximize the number of users each box could serve at the same time.

Since Open Connect was, and still is, developing better ways to accomplish this goal, open source software was the best option, since it constantly evolves with new trends and discoveries. Given the demands of online streaming, Netflix went with Nginx specifically because of its proven speed. It also appreciates that, unlike Apache TS, Nginx Plus comes with dedicated full-time engineers who help mitigate any issues with the service.

Varnish is also a popular option, but unlike Nginx and Apache TS it serves solely as an HTTP accelerator, not as a web server. Because of this, Varnish is usually paired with other systems, but it brings major advantages in speed as a reverse proxy, load balancing, DDoS mitigation and more.

Takeaway

All the parts that make up a CDN are becoming more accessible and manageable, which again strips power away from commercial CDNs. Yes, CDNs like Akamai are still unmatched in the industry, but this emerging trend is definitely something to keep an eye on in the future of the CDN market. As Netflix, Apple, and Facebook move more of their traffic to their own CDNs, Akamai has seen some slowing in revenue growth, but this is more of a small blip than anything else. Akamai is still unmatched in the number of edge servers, and it is well positioned to grow even if Netflix, Comcast, Facebook and Apple go DIY, because these four players are not representative of the general market.
