Serverless computing lets developers focus on writing code without managing servers. AWS Lambda is a leading example, offering simplicity, scalability, and cost efficiency for hosting application functionality. Lambda functions scale automatically with demand, reducing operational overhead and accelerating development. You pay only for the compute time you use, measured in milliseconds, which makes Lambda ideal for variable workloads. In addition, Lambda functions are distributed across multiple AWS Availability Zones, ensuring fault tolerance and high availability.
The CPU power allocated to your Lambda function is proportional to the memory option you select: AWS Lambda allocates CPU linearly with the memory configuration. For example, a function with 256 MB of memory has half the CPU power of a function with 512 MB. You can select any option between 128 MB and 10,240 MB, and each one provides a different compute speed, which impacts your app's performance.
Note: You can find AWS Lambda price per ms for each memory option here.
Depending on the option you select, your application will perform differently, resulting in complex cost behaviour. If your code runs faster on a larger Lambda memory option, it costs more per millisecond but requires fewer milliseconds of compute time. As a result, the best configuration is not directly visible prior to testing.
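To make the tradeoff concrete, here is a small sketch comparing the per-invocation cost of two memory options. The base price per millisecond and the measured durations are illustrative assumptions, not real measurements; always check the current AWS Lambda pricing page for your region.

```python
# Illustrative cost comparison between two Lambda memory configurations.
# The base price (~$0.0000000021 per ms at 128 MB) and the durations
# below are assumptions for the sake of the example.

BASE_MEMORY_MB = 128
BASE_PRICE_PER_MS = 0.0000000021  # USD per ms at 128 MB (illustrative)

def price_per_ms(memory_mb: float) -> float:
    """Price per millisecond scales linearly with the memory option."""
    return BASE_PRICE_PER_MS * (memory_mb / BASE_MEMORY_MB)

def invocation_cost(memory_mb: float, duration_ms: float) -> float:
    """Cost of one invocation: duration times the per-ms price."""
    return price_per_ms(memory_mb) * duration_ms

# Hypothetical measurements for a CPU-bound function:
cost_small = invocation_cost(128, 1000)   # slow, but cheap per ms
cost_large = invocation_cost(1024, 100)   # 8x price per ms, 10x faster

print(f"128 MB:  ${cost_small:.10f} per invocation")
print(f"1024 MB: ${cost_large:.10f} per invocation")
```

In this hypothetical case the larger option is both faster and cheaper overall, but a memory-bound function that barely speeds up would show the opposite result.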
Image taken from https://github.com/alexcasalboni/aws-lambda-power-tuning
As you can see in the case study shown in the image above, the best cost-performance configuration is achieved with a Lambda function at 1536 MB of memory. Compared to the initial 128 MB setup, the function executes faster and at a lower cost.
This cost-performance optimization should be repeated for each new piece of code you want to run as a Lambda function, to find its best configuration. Doing this manually is a long and tiresome task, but luckily the AWS Lambda Power Tuning repo automates the operation. The project deploys an AWS Step Functions state machine that you point at your optimization target Lambda function via its ARN; the state machine invokes it across the memory configurations you specify and finds the best one. This makes the tool language-agnostic and easy to use.
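Conceptually, the state machine runs a sweep like the one sketched below: measure the average duration at each candidate memory option, compute the resulting cost, and keep the cheapest configuration (the "cost" strategy). The duration figures and base price here are hypothetical stand-ins; the real tool invokes your actual function to collect them.

```python
# Minimal sketch of the sweep the Power Tuning state machine automates.
# Durations and the base price are hypothetical, illustrative values.

BASE_PRICE_PER_MS = 0.0000000021  # USD per ms at 128 MB (illustrative)

def avg_cost(memory_mb: int, avg_duration_ms: float) -> float:
    """Average cost per invocation at a given memory option."""
    return BASE_PRICE_PER_MS * (memory_mb / 128) * avg_duration_ms

# power value (MB) -> hypothetical average duration over N invocations
measurements = {
    128: 11000.0,
    256: 5300.0,
    512: 2600.0,
    1024: 1300.0,
    1536: 800.0,
    3008: 750.0,
}

# "cost" strategy: pick the configuration with the lowest average cost
best_power = min(measurements, key=lambda p: avg_cost(p, measurements[p]))
print(f"Best option: {best_power} MB "
      f"(${avg_cost(best_power, measurements[best_power]):.10f}/invocation)")
```

Note how 3008 MB barely improves the duration over 1536 MB while costing almost twice as much per millisecond, which is why the sweep settles on the middle configuration.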
Image taken from https://github.com/alexcasalboni/aws-lambda-power-tuning
Note: The repository provides an option for deployment with the AWS SAM CLI, which you can find here, but we are going to use Terraform instead.
Make sure to clone the AWS Lambda Power Tuning repository.
Navigate to the terraform folder.
Modify the variables.tf file with your target AWS account ID and region:
variable "account_id" {
  default = "123456789101"
}

variable "aws_region" {
  default = "eu-west-1"
}
Deploy the state machine via the IaC:
terraform init
terraform apply
After you deploy the Terraform IaC, you can navigate to the Step Functions console and find the new state machine. Configure an execution by targeting the Lambda function you want to optimize and the configuration options you want to test, using the following JSON:
{
  "lambdaARN": "your-lambda-function-arn",
  "powerValues": [128, 256, 512, 1024, 1536, 2048, 3008],
  "num": 50,
  "payload": {},
  "parallelInvocation": true,
  "strategy": "cost"
}
Note: You can find all available fields for the configuration JSON here.
Click "Start Execution" to begin. The output will display the optimal memory configuration and the associated average cost per execution.
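If you want to consume the result programmatically rather than read it in the console, you can parse the execution output. The sample below is an illustrative assumption of the output shape (a results object with power, cost, and duration fields); treat the exact field names as an assumption and check your actual execution output.

```python
import json

# Illustrative sample of a Power Tuning execution output -- the shape
# is an assumption; verify it against your own execution's output.
sample_output = json.dumps({
    "results": {
        "power": 1536,
        "cost": 2.016e-08,
        "duration": 800.0,
    }
})

results = json.loads(sample_output)["results"]
print(f"Best memory option: {results['power']} MB")
print(f"Average cost per invocation: ${results['cost']}")
print(f"Average duration: {results['duration']} ms")
```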
Note: Credits go to Alex Casalboni (@alexcasalboni). Big Thanks for providing this awesome tool to the cloud community for free.
Optimizing AWS Lambda functions for cost and performance is crucial for efficient serverless applications. By carefully selecting memory configurations and leveraging tools like AWS Lambda Power Tuning, you can achieve significant cost savings and performance improvements. Always test different configurations to find the optimal setup for your specific workload. Happy optimizing!