Cloudflare Steps into a Leadership Position in Serverless among CDNs


Cloudflare is increasingly stepping into a leadership position in serverless among CDNs. Before we look at why, let’s first take a step back and consider why serverless in the first place.

There are an increasing number of options for how best to build and manage applications, including serverless and containers. Containerization has become a particularly attractive option over the last few years: there is no need to launch an entire virtual machine (VM) for each app; instead, several isolated applications or services run on a single host and share the same OS kernel. The underlying reason for containers’ huge growth in popularity? Essentially, they make developers significantly more productive. As Chenxi Wang, founder of the Jane Bond Project cybersecurity strategy consulting firm, writes, containers let developers “deploy, replicate, move, and back up a workload even more quickly and easily than you can do so using virtual machines.”

However, working with containers can pose significant challenges in terms of hiring and training. One way to think of serverless is that it offers the benefits of containers without the need to think about them, or to find the specialized talent required to work with container technologies like Docker and Kubernetes. With serverless, developers are freer to concentrate on business strategy and logic instead of infrastructure. The name is slightly misleading: serverless computing still requires servers and their management, but that server management and capacity planning are abstracted away from the developer.

In serverless, a third party, usually a cloud provider, manages the servers, dynamically allocating machine resources and handling the decisions about purchasing, renting or provisioning the servers or VMs that back-end code runs on. Relying on hosted third-party services for back-end functionality in this way is known as Backend as a Service (BaaS). Serverless code can also run as functions in ephemeral containers (Function as a Service, or FaaS) through offerings such as AWS Lambda, which typically charge on a pay-as-you-go basis. Serverless code can still be used alongside code written in a traditional server style, such as microservices, even within the same application.

In addition to the efficiency it offers, serverless has also grown in popularity because of its potential for cost savings. Because billing is pay-as-you-go, you don’t pay for periods of underutilization: you are charged only for the time and memory allocated while your code runs, with no fees for idle time. This also significantly reduces operating system admin costs, such as installation, licensing, maintenance and support.

AWS – The Leader in Serverless

Amazon was quick out of the gate with serverless, introducing AWS Lambda in 2014 and becoming the first major cloud provider to offer such a service. Initially, Lambda supported only Node.js. Today, it supports Python, Java, C# and Go, and Node.js can be used to indirectly invoke code written in other languages. Lambda lets you run code for almost any kind of application or backend service with virtually no administrative overhead. You upload your code, and Lambda takes care of the configuration required to run and scale it with high availability. Lambda automatically scales your application by running code whenever a trigger fires. Your code runs in parallel, processing each trigger individually, so scaling tracks the size of the current workload.
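To make the model concrete, here is a minimal sketch of what a Node.js Lambda handler can look like. The event fields below assume an HTTP-style trigger and are illustrative rather than taken from any particular AWS example.

// Minimal Node.js Lambda handler (illustrative sketch).
// Lambda invokes this function once per trigger event, in parallel,
// so scaling tracks the incoming workload automatically.
exports.handler = async (event) => {
  // The shape of `event` depends on the trigger (API Gateway, S3, SQS, etc.).
  const name = (event.queryStringParameters && event.queryStringParameters.name) || 'world';

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};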

According to Amazon Web Services (AWS) CEO Andy Jassy, serverless usage at AWS grew by over 300% year-over-year in 2017. The latest bi-annual Cloud Native Computing Foundation (CNCF) survey found that the top hosted serverless platform, by a wide margin, was AWS Lambda at 70%, making it the undisputed public cloud leader in this space. Google Cloud Functions and Azure Functions trail well behind, with Google Cloud Functions at 25% (up from 13% last year) and Azure Functions at 20% (up from 12%).

The CNCF survey also found that the majority of companies using containers deploy to AWS, although that share has fallen from 69% to 63%, followed by on-premise servers (43%), Google Cloud Platform (35%), Microsoft Azure (29%, up from 16% in 2017), VMware (24%) and OpenStack (20%).

There has been a lot of hype in the industry about the level of dominance AWS currently holds and, as a result, about its likely continued dominance in the future. Leading Edge Forum’s Simon Wardley, for instance, tweeted, “AWS Lambda owns 70% of the active serverless platform user base – let me translate that for you. Amazon is currently positioned to own 70% of the future of ALL software.”

Others have dismissed this as hyperbole, including Lawrence Hecht, writing in The New Stack. Hecht instead predicts that, as serverless expands and deepens, enterprises will continue to utilize multiple clouds, “with AWS usually being part of the mix”.

For a detailed comparison of the major cloud providers’ approach to serverless, check out our earlier post.

Trends in Serverless

There is a growing serverless community engaged in mapping and building new trends. Serverless’s minimal DevOps requirements have won it a large number of advocates, particularly among developers, who are relieved not to have to worry about load balancing, security patching, scaling and so on, and can focus their attention elsewhere. There are several key trends to watch within the space.

Serverless Startups

Startups are finding ways to push back against AWS’ dominance in the field, in part by cashing in on AWS Lambda’s success: they build extensions on top of Lambda that improve and extend its core services faster than AWS can do so itself, focusing on specific service types where they aim to outshine Lambda.

Stackery, for instance, offers a “complete serverless toolkit” for building “production-ready applications”. It began by focusing on a serverless infrastructure monitoring tool, built specifically to solve some of AWS Lambda’s persistent challenges, particularly around Application Performance Management (APM) monitoring, where Lambda lacks transparency and storage capacity.

Co-founder and CEO Nate Taggart is unabashed about building a startup to address some of Lambda’s inefficiencies, telling The New Stack, “AWS Lambda is great, but it leaves a lot to be desired in terms of production worthiness for serverless”. Stackery has also focused on building self-healing architectural patterns and instrumentation to boost performance and observability, in response to problems that arise within Lambda’s distributed architecture and can cause bottlenecks and errors downstream.

Another serverless startup to take note of is Galactic Fog, headquartered in New Jersey, which takes a slightly different approach: it focuses on helping enterprises migrate to the cloud and streamlining application development through its own Gestalt platform. In essence, Gestalt is a container management and serverless implementation built to give companies a straightforward path to adopting cloud-native technologies while leaving room to adapt the implementation as their applications and infrastructure evolve.

Serverless Databases

Various serverless databases have also been released, which extend the execution model to the relational database management system (RDBMS) and other data stores (a brief sketch of writing to one of them from a serverless function follows the list):

Azure Data Lake and Azure Data Lake Analytics – a highly scalable data storage and analytics service, hosted in Azure, Microsoft’s public cloud.

Google Cloud Datastore – an eventually-consistent document store, which is a standalone version of the database component of Google App Engine.

Firebase – a hierarchical database, which is available via pay-as-you-go and fixed plans (also owned by Google).

FaunaDB – a globally distributed, transactional database using the tech behind Twitter as a foundation. It can run on multiple public cloud providers or on-premises with pay-as-you-go pricing.
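As a rough illustration of the serverless database idea, the sketch below writes a record to Google Cloud Datastore from inside a serverless function, using the @google-cloud/datastore Node.js client; the “orders” kind and its fields are invented for the example.

// Illustrative sketch: persisting data to Google Cloud Datastore from a
// serverless function. The 'orders' kind and its fields are made up.
const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

exports.handler = async () => {
  const key = datastore.key(['orders']); // Datastore allocates the ID on save
  await datastore.save({
    key,
    data: { item: 'widget', quantity: 2, createdAt: new Date() },
  });
  return { statusCode: 201, body: JSON.stringify({ id: key.id }) };
};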

Cloudflare Assumes a Leadership Position among CDNs in Serverless

Among the major CDNs, Cloudflare is increasingly assuming a leadership position in serverless. Earlier this week, the innovative CDN announced that Cloudflare Workers (its service enabling developers to write JavaScript that runs on Cloudflare’s edge) is now integrated into the Serverless Framework as a supported serverless cloud provider.

Until now, deploying Workers meant either editing code in Cloudflare’s browser-based IDE or building custom tooling on top of Cloudflare’s API. The new integration instead allows developers to: “define the entire structure and routing behavior of your Workers scripts in code and deploy them with ease using serverless deploy from your own development environment”. In addition, Cloudflare offers the ability to “Store configuration files in version control alongside your application code” and the chance to “feel more confident testing your application with serverless invoke, a new way to quickly send requests to endpoints of interest with specific arguments and headers”.

Cloudflare and the Serverless Framework

The Serverless Framework is open-source and available on GitHub. It lets developers build web, mobile and IoT applications in code and deploy them to serverless architectures on multiple platforms, including the three major cloud providers and others. Cloudflare’s new plugin enables developers to use the framework’s serverless command line tool with Cloudflare Workers: you define the structure of the Worker in serverless.yml and then deploy it to Cloudflare’s global network of 152 data centers.

Use Cases

In a recent blog post, Cloudflare offers various templates that demonstrate how easy it is to run applications on Workers and deploy them with the Serverless Framework, including this complete serverless.yml for an external-webhook Slackbot, running on Workers, that fetches the latest stock prices:

service:
  name: slack-bot
  config:
    accountId: CLOUDFLARE_ACCOUNT_ID
    zoneId: CLOUDFLARE_ZONE_ID
    workers:
      slackbot:
        routes:
          - slackbot.example.com

provider:
  name: cloudflare

plugins:
  - serverless-cloudflare-workers

functions:
  SlackBot:
    worker: slackbot
    script: bot
Serverless uses FaaS to separate concerns within an application. In Cloudflare Workers, a function is the application, thus the script field directly refers to the name of your Worker script locally on disk.
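The post does not include the bot script itself, but a Workers script is plain JavaScript responding to fetch events, so the “bot” file referenced by the script field above might look something like the sketch below; the stock-price endpoint and response fields are made up for illustration.

// Hypothetical sketch of the Worker script ("bot") referenced above.
// Workers use the service-worker style API: respond to fetch events at the edge.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  // Example only: the stock-price API URL and response shape are invented.
  const quote = await fetch('https://api.example.com/quote?symbol=ACME');
  const data = await quote.json();
  return new Response(`Latest price: ${data.price}`, {
    headers: { 'Content-Type': 'text/plain' },
  });
}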

To deploy the new Worker, run serverless deploy or serverless deploy -f SlackBot.

Another recent Cloudflare blog post offers an alternative demonstration: getting an API that generates “some Bob Ross Lorem Ipsum” running across its global network.

In a different post, Cloudflare invited guest author Paddy Sherry, serverless enthusiast and Lead Developer at Gambling.com Group, which builds performance marketing websites and tools using Cloudflare as its CDN, to write a piece on “Using Workers to Make Static Sites Dynamic”. Sherry shared why Workers has been a valuable asset for his company, which favors static websites for their speed:

“The reason we were so keen to experiment with Workers is that for anyone running static sites, 99% of the time, the product requirements can be met but there will always be that one occasion when some computation is needed instead of sending back a static response.

Until recently, the most suitable option would have been to add some JavaScript that fires after page load and alters the UI or fetches data from an endpoint. The drawback of this is that users see the page shifting after it loads, even if the script is loaded asynchronously. Flickering pages can be infuriating and there is nothing more irritating than trying to click a link but opening something else because the DOM changed midway through.

A common workaround is to hide the page content until all JavaScript has processed, but this leaves you exposed to a slow loading script with users seeing a white page until the browser has downloaded it. Even if all scripts are downloading quickly, there will be users with slower Internet speeds or located far away from a data centre that can respond to their request.

Enter Cloudflare Workers. Developers can handle these requests and respond dynamically before they even reach the server. There is no post load computation and Workers respond so fast in the background, the transition is unnoticeable”.
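A minimal sketch of the pattern Sherry describes might look like the following, assuming a hypothetical /api/greeting route on an otherwise static site; every other request is passed straight through to the origin.

// Sketch: answer one dynamic route at the edge, before the request
// reaches the origin; serve everything else from the static site.
// The route and values are illustrative.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const url = new URL(request.url);

  if (url.pathname === '/api/greeting') {
    // Computed at the edge: no post-load JavaScript, no page flicker.
    const hour = new Date().getUTCHours();
    const greeting = hour < 12 ? 'Good morning' : 'Good evening';
    return new Response(JSON.stringify({ greeting }), {
      headers: { 'Content-Type': 'application/json' },
    });
  }

  // Everything else falls through to the static site at the origin.
  return fetch(request);
}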

Cloudflare Global “Real World Serverless” Talks

The CDN is hosting a series of six serverless talks around the world, in San Francisco, Austin, London, Singapore, Sydney and Melbourne, in an effort to build community and bring “the best minds on serverless technology from Cloudflare together to lead a series of talks on practical use cases for Cloudflare Workers”. The company is also open to invitations to host serverless events in other cities. The first talk took place on September 11th in San Francisco, and the last is scheduled for October 17th in Melbourne.
