Network Performance Benchmark – Akamai vs. AWS CloudFront vs. Google CDN


PerfOps, a startup that has recently launched an analytics platform, has benchmarked the network performance of some of the biggest names in content delivery. The platform had been in development for two and a half years before officially launching two months ago.

For this particular performance test, the PerfOps team benchmarked Akamai, AWS and Google in the APAC region. No performance tool is perfect, but PerfOps aims to provide the fairest possible benchmark for all providers, regardless of size, including Akamai.

The test below is only a snapshot in time, and countless external factors can impact performance, the weather among them. To smooth out deviations, the tests were run over a 30-day period; the shorter the period, the more dramatic the fluctuations in the curves representing each provider.

Benchmark Performance Tests

The five vendors compared across the following benchmark performance tests are Akamai, AWS CloudFront, Google, Cloudflare and Fastly.

Benchmark Performance Test #1

PerfOps Benchmark Performance Test #1

The graph above shows performance data for the last 30 days in APAC with a one-day interval. The data is based on millions of Real User Monitoring (RUM) tests: PerfOps' JavaScript runs in the browsers of visitors to its partner websites and benchmarks all of the CDNs, allowing PerfOps to collect 200+ benchmarks per second, or 2B+ per month per provider.
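To make the RUM approach concrete, here is a minimal sketch of how an in-browser CDN benchmark could work. It is not PerfOps' actual script; the test-object URLs and the collector endpoint are hypothetical, and a real implementation would sample providers, weight countries and handle failures more carefully.

```typescript
// Minimal sketch of an in-browser RUM benchmark (not PerfOps' actual script).
// Assumes each provider serves a small test object at a hypothetical URL;
// timings are reported back to a hypothetical collector endpoint.

interface RumSample {
  provider: string;
  latencyMs: number;
  timestamp: number;
}

const TEST_OBJECTS: Record<string, string> = {
  // Hypothetical test-object URLs, one per provider.
  akamai: "https://example-akamai.test/1kb.bin",
  cloudfront: "https://example-cloudfront.test/1kb.bin",
  "google-cdn": "https://example-google.test/1kb.bin",
  cloudflare: "https://example-cloudflare.test/1kb.bin",
  fastly: "https://example-fastly.test/1kb.bin",
};

async function measureProvider(provider: string, url: string): Promise<RumSample> {
  // Cache-bust so every request actually leaves the browser.
  const target = `${url}?r=${Math.random()}`;
  const start = performance.now();
  await fetch(target, { mode: "no-cors", cache: "no-store" });
  const latencyMs = performance.now() - start;
  return { provider, latencyMs, timestamp: Date.now() };
}

async function runRumBenchmarks(): Promise<void> {
  const samples: RumSample[] = [];
  for (const [provider, url] of Object.entries(TEST_OBJECTS)) {
    try {
      samples.push(await measureProvider(provider, url));
    } catch {
      // Failed fetches are skipped here; a real collector would record them too.
    }
  }
  // sendBeacon delivers the samples even if the visitor navigates away.
  navigator.sendBeacon("https://collector.example/rum", JSON.stringify(samples));
}

runRumBenchmarks();
```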

Benchmark Performance Test #2

PerfOps Benchmark Performance Test #2

This graph shows performance data for the last 48 hours in APAC with a one-hour interval. As above, the data is based on millions of RUM tests.

Benchmark Performance Test #3

PerfOps Benchmark Performance Test #3

The manual latency performance test is based on benchmarks run manually from PerfOps' global infrastructure. This test was performed against Google CDN to validate the performance data in the graph above, and it confirms that Google is indeed performing at those levels. This is active data (as opposed to passive), meaning you can run your own real-time benchmarks with customized parameters.
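An active latency probe of this kind can be as simple as timing repeated requests from a server and summarizing the results. The sketch below assumes Node 18+ (for the global fetch) and uses a hypothetical Google CDN-backed test URL; the run count and endpoint stand in for the "customized parameters" mentioned above.

```typescript
// Sketch of an active (manual) latency benchmark run from a server rather
// than a visitor's browser. The test URL is hypothetical.

const TEST_URL = "https://example-google-cdn.test/1kb.bin"; // hypothetical
const RUNS = 10;

async function timeRequest(url: string): Promise<number> {
  const start = process.hrtime.bigint();
  const res = await fetch(`${url}?r=${Math.random()}`); // cache-busting query
  await res.arrayBuffer(); // wait for the full body, not just the headers
  return Number(process.hrtime.bigint() - start) / 1e6; // nanoseconds -> ms
}

async function main(): Promise<void> {
  const timings: number[] = [];
  for (let i = 0; i < RUNS; i++) {
    timings.push(await timeRequest(TEST_URL));
  }
  timings.sort((a, b) => a - b);
  const median = timings[Math.floor(timings.length / 2)];
  console.log(
    `min ${timings[0].toFixed(1)} ms, median ${median.toFixed(1)} ms, ` +
    `max ${timings[timings.length - 1].toFixed(1)} ms`
  );
}

main();
```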

Benchmark Performance Test #4

PerfOps Benchmark Performance Test #4

The Network Utilities test involves running a small real-time benchmark in Australia against Google CDN. It shows the route the request took from a server in Sydney to Google's facility in Australia, measured in milliseconds.

The results show only six hops (i.e. the number of routers the packet passes through en route to Google's facility): an impressive number, indicating that Google has good performance in Australia. We ran the same test for the other providers, and the packet traveled through as many as 15 hops in Australia before the response came back to the user.
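For readers who want to reproduce a rough version of this check themselves, the standard traceroute utility reports one numbered line per hop, so counting those lines gives the hop count. The sketch below assumes a Unix-like host with traceroute installed; the target hostname is only illustrative and is not how the PerfOps Network Utilities test is implemented.

```typescript
// Rough hop-count sketch using the system traceroute utility.
// Assumes a Unix-like host with traceroute installed; the target is illustrative.

import { execFile } from "node:child_process";

function countHops(host: string): Promise<number> {
  return new Promise((resolve, reject) => {
    execFile("traceroute", ["-m", "30", host], (err, stdout) => {
      if (err) return reject(err);
      // traceroute prints a header line, then one numbered line per hop.
      const hopLines = stdout.split("\n").filter((line) => /^\s*\d+\s/.test(line));
      resolve(hopLines.length);
    });
  });
}

countHops("storage.googleapis.com")
  .then((hops) => console.log(`hops: ${hops}`))
  .catch((err) => console.error(err));
```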

Benchmark Performance Test #5

PerfOps Benchmark Performance Test #5

The final benchmark performance test relates to RUM uptime for the five vendors in APAC over a 48-hour period with a one-hour interval (each provider gets an equal number of benchmarks in every country to ensure a fair analysis). The tests in this snapshot show that AWS CloudFront was experiencing some network issues, likely related to routing or peering problems: its uptime benchmark in APAC was 96.75%, noticeably lower than the other four vendors.
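A figure like 96.75% can be read as the share of RUM benchmark requests to a provider that succeeded over the window. The sketch below shows that calculation under an assumed result schema with made-up sample data; it is an illustration, not PerfOps' actual pipeline.

```typescript
// Toy illustration of deriving an uptime percentage from RUM results:
// the share of benchmark requests that succeeded over the window.
// The field names and sample data are made up.

interface BenchmarkResult {
  provider: string;
  timestamp: number; // ms since epoch
  ok: boolean;       // did the test object load successfully?
}

function uptimePercent(results: BenchmarkResult[], provider: string): number {
  const relevant = results.filter((r) => r.provider === provider);
  if (relevant.length === 0) return NaN;
  const successes = relevant.filter((r) => r.ok).length;
  return (successes / relevant.length) * 100;
}

// Made-up sample: 3 of 4 probes succeeded -> 75.00% uptime for this provider.
const sample: BenchmarkResult[] = [
  { provider: "cloudfront", timestamp: 1, ok: true },
  { provider: "cloudfront", timestamp: 2, ok: true },
  { provider: "cloudfront", timestamp: 3, ok: false },
  { provider: "cloudfront", timestamp: 4, ok: true },
];

console.log(`${uptimePercent(sample, "cloudfront").toFixed(2)}%`); // 75.00%
```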

These types of problems can be analyzed further in the raw logs, which identify specific issues down to the user level and show why those problems occurred.
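As a small example of that kind of drill-down, failed benchmarks for one provider could be grouped by country to see where a routing or peering issue was concentrated. The log schema below is an assumption, not PerfOps' actual log format.

```typescript
// Hypothetical raw-log drill-down: count failed benchmarks for one provider
// per country. The RawLogEntry fields are assumptions about the log schema.

interface RawLogEntry {
  provider: string;
  country: string;
  asn: string;
  ok: boolean;
  error?: string; // e.g. "timeout" or "dns" in this hypothetical schema
}

function failuresByCountry(logs: RawLogEntry[], provider: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const entry of logs) {
    if (entry.provider !== provider || entry.ok) continue;
    counts.set(entry.country, (counts.get(entry.country) ?? 0) + 1);
  }
  return counts;
}

// Usage: failuresByCountry(parsedLogs, "cloudfront") returns a map such as
// { "AU" => 120, "SG" => 45 }, pointing at where the problem was concentrated.
```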
