Serverless Computing: AWS Lambda vs. Azure Functions


Since the introduction of AWS Lambda in 2014, serverless computing has emerged as a way to decrease the cost and time needed for development on cloud services. Serverless computing allows developers to write code that is triggered by events; the cloud service automatically provisions the compute, memory, and storage resources for those functions and handles scaling, allocating resources to match event volume. The functions are pay-per-use, meaning they cost nothing if they are not called.

Although AWS Lambda was first-to-market with this business model, Microsoft’s Azure Functions, a similar compute-on-demand service, is beginning to gain traction. And while both services provide an efficient, inexpensive way to display user-specific data, scale images, generate analytics for IoT data, and execute other event-driven tasks, there are a number of differences between the two services that are worth examining.

AWS Lambda functions can be developed in Java, Node.js, C#, and Python. Developers specify the amount of memory and the maximum execution time for a function. Based on that configuration, Lambda provisions a new container and deploys code from a zip file for each new function request. For repeated invocations, containers may be reused to minimize latency, although that is not guaranteed. The number of function requests, along with the resources they consume, determines how the user is charged.
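As a minimal sketch of the model described above, a Python Lambda handler receives an event and a context object and returns a result. The event fields here (`"name"`) are hypothetical; real event shapes depend on the trigger (S3, API Gateway, and so on):

```python
# Minimal AWS Lambda handler in Python.
# Lambda invokes this function once per event; the "name" field below
# is a made-up example, not part of any standard event schema.
def lambda_handler(event, context):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": "Hello, " + name + "!"}
```

Lambda calls the handler once per event, so the function itself contains no server or scaling logic; that is exactly what the platform provisions on demand.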

Like AWS Lambda, Azure Functions can be developed in a variety of languages, supporting Node.js, Python, PHP, F#, and C#. However, the container architecture for Azure Functions differs from AWS Lambda’s. With Azure Functions, resources are provisioned as needed, but the files run on Azure’s WebJobs rather than being deployed from a zip file. Another difference is that Azure Functions is built on Azure App Service and can be deployed either via an App Service plan, where users pay per app service, or via the Dynamic Service plan, which runs independently and lets users pay per function (as they would with AWS Lambda). For functions that take a long time to execute, the App Service plan is likely a better fit; for intermittent, quick functions, users should opt for the Dynamic Service plan.

The difference in pricing between AWS Lambda and Azure Functions’ Dynamic Service plan is minimal, both in total cost and in how it is calculated. For both services, pricing is based on the number of function requests executed and the amount of resources those functions consume. Both provide users with a free grant of 1 million requests and 400,000 GB-seconds of compute time. Beyond that, Azure Functions charges $0.20 per 1 million requests and $0.000016 per GB-second used, with memory measured to the nearest 128MB up to the maximum size of 1,536MB. Lambda’s cost is almost identical at $0.20 per 1 million requests and $0.00001667 per GB-second used, with duration rounded up to the nearest 100ms.
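To make the billing formula concrete, here is a rough Python sketch of a monthly Lambda cost estimate at the rates quoted above. The function name and the example workload figures are illustrative, not from either provider's documentation:

```python
import math

def lambda_monthly_cost(requests, avg_duration_ms, memory_mb):
    """Rough estimate of monthly AWS Lambda cost at the published rates:
    $0.20 per 1M requests and $0.00001667 per GB-second, with duration
    rounded up to the nearest 100ms and a free tier of 1M requests
    and 400,000 GB-seconds."""
    billed_ms = math.ceil(avg_duration_ms / 100) * 100
    gb_seconds = requests * (billed_ms / 1000.0) * (memory_mb / 1024.0)
    billable_requests = max(0, requests - 1_000_000)
    billable_gb_seconds = max(0.0, gb_seconds - 400_000)
    return (billable_requests / 1_000_000) * 0.20 \
        + billable_gb_seconds * 0.00001667
```

For example, 3 million requests per month averaging 250ms (billed as 300ms) at 512MB consume 450,000 GB-seconds, leaving 2 million requests and 50,000 GB-seconds billable after the free grant. Azure's formula is nearly the same, except that memory rather than duration is rounded (to the nearest 128MB) and the compute rate is $0.000016.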

A few limits also constrain function execution. Lambda functions are subject to a time limit of five minutes and a soft limit of 100 concurrent executions, while Azure imposes no limit on execution time or concurrency. However, Azure limits users to 20 functions per project, whereas Lambda places no limit on the number of functions.

The biggest difference between the two services, however, is that Azure Functions is open source, so users can deploy it on local servers or other cloud services. Although Lambda is by far the more established service, Azure’s flexibility makes it an attractive up-and-comer. As competitors continue to emerge and evolve, there’s certainly room for it, or for another competitor such as Google Cloud Functions or IBM’s OpenWhisk, to unseat Lambda as the go-to service for serverless computing.