Akamai is Top of Mind
Since the day Akamai announced their earnings, they have been Top of Mind in the industry, dominating the PR battle against the competition, including CloudFlare. But not only is Akamai winning the PR game, they are also winning the innovation game, having introduced a bunch of new services in the last few months, like Managed Kona, which wraps Professional Services + 24/7 support into their WAF offering. Now let’s discuss some of the trends, shifts, and disruptions happening in the market.
Akamai’s Earnings: The dust has settled on Akamai’s stock price and it’s almost back to normal now. Around every earnings call, the market reaction plays out like a broken record, which is fun to watch. The story goes – Akamai announces earnings, their stock drops a decent amount, only to come back to where it was a month later. During the earnings call, Akamai stated that their two largest customers, likely Apple and Facebook, went from 12% of total revenue in Q2-2015 to 5% in Q2-2016.
In addition, Akamai’s top six customers, which represented 17.8% of total revenue in Q2-2015, dropped to 10.7% in Q2-2016. The market thinks this is bad news; we believe it’s the best news to come out of the Akamai camp in the last year. Akamai should put all their effort into reducing the 5% (two largest clients) and 10.7% (top six clients) to .005% and 3%. This is Business 101, Michael Porter style – the last thing you want is your fate in the hands of one or two customers that hold all the leverage. Even if Apple and Facebook have a million caching servers in place, they’ll always need a secondary CDN as backup in case their networks take a dump, which happens more often than people think. And since Akamai has the largest network, it’s them by default.
Buckets and Bits and Numbers Don’t Lie
The Street and many thought leaders in the industry are screaming from the rooftops that DIY is a game changer that will wreak havoc on existing business models because of what Facebook, Amazon, Netflix, Apple, Google, and Microsoft are doing. But just because Amazon, Facebook, and Google are going nuts with DIY CDN doesn’t mean everyone is doing it. Google, Facebook, and Amazon are DIY to their core: DIY servers, DIY routers, DIY storage, DIY everything. For clarity, let’s break the CDN customer segment into buckets and bits and see where DIY actually makes sense:
Buckets (Demographics)
- Bucket 1: Google, Apple, Amazon, Facebook, Netflix and Microsoft
- Bucket 2: Big Telcos: Verizon, AT&T, Comcast, Telefonica, etc
- Bucket 3: Riot Games, Blizzard (World of Warcraft), Valve Software (Steam), etc.
- Bucket 4: Automattic (WordPress), Vimeo, Wowza, Wix, etc.
- Bucket 5: Hosting companies
- Bucket 6: Disney, Viacom, Lions Gate, etc
- Bucket 7: Stripe, LinkedIn, Whatsapp, Instagram, Airbnb, Salesforce, etc.
Based on the above buckets, the companies in Bucket 1 should definitely have an internal CDN + 3rd party, given their size, global reach, and money in the bank. In Bucket 2, having a CDN may or may not be a good idea – too many factors to consider to give a definite answer. The big gaming companies in Bucket 3 should have a hybrid CDN in place, DIY + 3rd party CDNs, the way Valve Software does. No one from Bucket 4 through Bucket 7 is a good fit for DIY CDN.
The main premise for DIY is to save money. Thus, can these companies save money by building their own CDN while keeping performance in the range of a 3rd party CDN? Better yet, does it make business sense to go the DIY route?
The most important metric in determining whether a company should go the DIY route is bandwidth. If a company is delivering 100Gbps of content 24×7, then DIY might make sense. Leasing cabinets, buying servers, routers, and switches, and configuring software like caching and load balancing is only somewhat difficult – any gifted geek can set it up if they put their heart and mind to it. The most difficult and overlooked piece of the DIY puzzle is bandwidth, as in transit. That’s the hard part.
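To put that 100Gbps threshold in perspective, here is a quick back-of-the-envelope conversion of sustained throughput into monthly volume (a rough sketch; the 30-day month is our assumption):

```python
# Monthly volume at a sustained 100Gbps, assuming a 30-day month.
gbps = 100
seconds_per_month = 30 * 24 * 3600               # 2,592,000 seconds
petabytes = gbps / 8 * seconds_per_month / 1e6   # Gbps -> GB/s -> GB -> PB
print(f"{petabytes:,.1f} PB/month")              # ~32.4 PB/month
```

That’s roughly 32 PB a month – the kind of volume where transit negotiations start to matter.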
If a company is delivering less than 100Gbps, DIY is a no-go. And building one or two PoPs and stacking them with caching servers isn’t a CDN – it’s web hosting, like GoDaddy. Not only is working with carriers difficult, but transit costs are high. Let’s run through a CDN build exercise for insight. Company ABC is delivering 100Gbps of content 24/7. They decide to build a CDN for US delivery, which represents 75% of their CDN cost. Based on a market analysis, they estimate that 7 PoPs is the minimal footprint required to adequately cover the US map from West Coast to East Coast. Here are the metrics (the arithmetic is sketched in code after the list):
- Delivering 100Gbps 24/7 means running transit links at roughly 25% to 35% utilization, with the rest of the capacity lying around for bursts and failover. Let’s say 30%. Thus, 100Gbps ÷ 0.30 ≈ 330Gbps of capacity is needed in total
- 330Gbps divided by 7 PoPs ≈ 47Gbps, rounded up to 50Gbps per PoP, since transit is sold in 10Gbps increments
- 2 to 3 carriers are needed at each PoP for redundancy and performance purposes. Let’s use 3 carriers
- The optimal mix in the US would be Cogent (inexpensive) + Level 3 (global reach) + Last Mile (Comcast)
- Setup: one carrier at 10Gbps and two carriers at 20Gbps each = 50Gbps per PoP
- Transit deals are usually done on monthly commits + bursting; figure 10% of 50Gbps = 5Gbps as the monthly minimum commit per PoP
- Price per Mbps for the three carriers mentioned above (which price belongs to which carrier is omitted on purpose): Carrier 1 – $0.75/Mbps, Carrier 2 – $1.00/Mbps, Carrier 3 – $1.50/Mbps
- Let’s call the blended average of the three $1.00/Mbps
- 10% of 50Gbps = 5Gbps (5,000Mbps) × $1.00/Mbps = $5,000/mo per PoP as the minimum commit
- Since Company ABC is delivering 100Gbps (100,000Mbps) of content, 100,000Mbps × $1.00/Mbps = $100,000/mo in transit cost = $1.2M in annual transit cost
- Now let’s include hardware for each PoP: 3 servers + 2 routers + 2 switches (CDNs know how to consolidate these; enterprises don’t), which runs about $100k per PoP × 7 PoPs = $700k
- $1.2M in transit + $700k in hardware is about $2M for year one
- 5-year cost for DIY CDN = $1.2M × 5 years + $700k = $6.7M, which doesn’t include the engineering talent for installing, configuring, upgrading, maintaining, etc. Let’s use $1M per year for talent
- Total DIY cost: hardware + transit + talent = $11.7M for DIY CDN over 5 years
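For readers who want to check the math, here is a minimal sketch of the model above in Python. Every constant is one of the exercise’s assumptions (30% utilization, blended $1.00/Mbps, $100k of hardware per PoP, $1M/yr for talent), not a quote from any carrier or vendor:

```python
import math

# Back-of-envelope model of the DIY CDN exercise above.
delivery_gbps = 100              # sustained 24/7 delivery
utilization = 0.30               # run transit links at ~30% utilization for burst headroom
pops = 7                         # minimal US footprint, West Coast to East Coast
increment_gbps = 10              # transit is sold in 10Gbps increments
blended_price_per_mbps = 1.00    # blended average of $0.75, $1.00, $1.50
commit_fraction = 0.10           # 10% monthly minimum commit per PoP
hardware_per_pop = 100_000       # 3 servers + 2 routers + 2 switches
talent_per_year = 1_000_000      # install, config, upgrades, maintenance
years = 5

# Capacity: 100Gbps / 0.30 ≈ 333Gbps total, ~48Gbps per PoP, rounded up to 50Gbps
total_capacity_gbps = delivery_gbps / utilization
per_pop_gbps = math.ceil(total_capacity_gbps / pops / increment_gbps) * increment_gbps

# Minimum commit: 10% of 50Gbps = 5Gbps = 5,000Mbps -> $5,000/mo per PoP
commit_cost_per_pop = commit_fraction * per_pop_gbps * 1000 * blended_price_per_mbps

# Transit billed on actual delivery: 100,000Mbps x $1.00/Mbps -> $100k/mo -> $1.2M/yr
transit_per_year = delivery_gbps * 1000 * blended_price_per_mbps * 12
hardware_total = hardware_per_pop * pops
five_year_total = transit_per_year * years + hardware_total + talent_per_year * years

print(f"Per-PoP capacity:  {per_pop_gbps}Gbps")               # 50Gbps
print(f"Commit per PoP:    ${commit_cost_per_pop:,.0f}/mo")   # $5,000/mo
print(f"Annual transit:    ${transit_per_year / 1e6:.1f}M")   # $1.2M
print(f"5-year total cost: ${five_year_total / 1e6:.1f}M")    # $11.7M
```

Swap in your own transit pricing and utilization figures and the shape of the answer holds over a wide range: transit and talent dominate the 5-year bill, not the hardware.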
Of course it’s much more complicated than this, as transit and hardware costs vary, but it’s not too far off the mark. Conclusion: Is it cheaper to build a CDN to deliver 100Gbps of content 24/7 instead of relying on a 3rd party CDN? No. It’s more expensive. The purpose of the exercise is to demonstrate the nightmare and headache of dealing with bandwidth in DIY, which is often overlooked. In summary, it is not DIY that is the game changer, but the Trio (AWS + Azure + Google), because major platforms and providers are building their infrastructure on them, leveraging their CDNs in the process.