AWS is a cloud computing platform offering a wide range of services. One of the most popular is AWS Lambda, a serverless solution that lets us run applications in response to event-driven triggers. In this article, I describe how to develop and test AWS Lambda functions triggered by S3 events, built with the Chalice framework.

Chalice introduction

Chalice is a framework for writing serverless apps in Python. It lets us quickly create and deploy applications that use AWS Lambda. The Chalice framework provides the following capabilities:

  • A command line tool for creating, deploying, and managing your app.
  • A decorator-based API for integrating with Amazon API Gateway, Amazon S3, Amazon SNS, Amazon SQS, and other AWS services.
  • Automatic IAM policy generation.

Chalice CLI

To connect Chalice code with AWS infrastructure, the Chalice team prepared a CLI tool that allows us to deploy the application to AWS. The most important commands are:

  • chalice deploy — Deploy the app to AWS.
  • chalice delete — Delete the already deployed infrastructure.
  • chalice local — Run the app locally.
  • chalice new-project — Bootstrap a new Chalice project.

Basic setup

In order to start using Chalice, we need to install it first:

$ pip install chalice

Then run a command to start a new Chalice project:

$ chalice new-project helloworld

For the sake of this article, I created a simple Chalice app with a Lambda running on the S3 ObjectCreated event:

Local testing

There are several options to test your Lambdas in the local environment. The most straightforward is the chalice local CLI command, which spawns a web server that will allow you to test Lambdas responsible for running the REST APIs. Sadly, it might not be enough when it comes to testing trigger-based Lambdas.

Localstack support

To test the trigger-based Lambdas in a local environment, you might consider using the Localstack app. Localstack is a local cloud stack that allows us to develop and test our cloud solutions without initializing them in the AWS infrastructure. It gives us a way to test the business logic of the Lambdas without actually having to pay for AWS services.

Unfortunately, Chalice does not support Localstack out of the box. Thankfully, we can use a Python package called chalice-local, which lets us deploy a Chalice application straight to Localstack.

Before we start, we need to set up the Localstack Docker container. To do that, we will use the docker-compose tool. To run the Lambda function triggered by the S3 event, we will need the following Localstack configuration:
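The compose file itself was not reproduced in the text; a minimal docker-compose.yml sketch for this setup might look like the following (the image tag, port mapping, and executor setting are assumptions based on common Localstack setups):

```yaml
version: "3.8"
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"             # Localstack edge port
    environment:
      SERVICES: s3,lambda,iam   # only the resources we need
      LAMBDA_EXECUTOR: docker
    volumes:
      # Let Localstack spawn Lambda containers via the host Docker daemon.
      - /var/run/docker.sock:/var/run/docker.sock
```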

The most important thing to note is that we need to select the resources we want to use. For that purpose, we set the SERVICES environment variable in the Localstack container definition:

SERVICES: s3,lambda,iam

Next, we have to create the S3 bucket we will use. We can do it either through the docker-entrypoint-initaws.d Docker volume or by executing an AWS CLI command after starting the containers. For this tutorial, we will create the bucket through the AWS CLI. To do that, we first need to run the Localstack container:

$ docker-compose -f docker-compose.yml up --build

Executing the above command will build the Localstack’s Docker image and start all requested resources for us. When everything is done and running, we can finally create a bucket. First, let’s get into the Localstack container’s shell:

$ docker-compose -f docker-compose.yml exec localstack bash

And then, inside the Localstack’s shell, run the following command to create the S3 bucket, which we will use to trigger the Lambda:

$ awslocal s3 mb s3://helloworld-bucket

Now we have Localstack and the bucket, so it's time to deploy our Chalice app. To do that, we can use the aforementioned chalice-local library. Open another terminal and install it with pip:

$ pip install chalice-local

After that, let’s deploy the app to the Localstack with the below command:

$ chalice-local deploy

After running the command, you should see output similar to the following (the AWS region may differ):

Creating deployment package.
Reusing existing deployment package.
Updating policy for IAM role: helloworld-dev
Updating lambda function: helloworld-dev-handler
Configuring S3 events in bucket helloworld-bucket to function helloworld-dev-handler
Resources deployed:
  - Lambda ARN: arn:aws:lambda:eu-west-1:000000000000:function:helloworld-dev-handler

When everything is up and running, go back to the Localstack’s shell and run the following commands to trigger the Lambda:

$ touch test.txt
$ awslocal s3 cp test.txt s3://helloworld-bucket

After executing the above commands, you should see the following logs in Localstack's console:

{"hello":"world"}
> START RequestId: 27ff0253-db58-1b88-c550-2947ae487ed3 Version: $LATEST
> END RequestId: 27ff0253-db58-1b88-c550-2947ae487ed3
> REPORT RequestId: 27ff0253-db58-1b88-c550-2947ae487ed3 Init Duration: 134.97 ms Duration: 1.97 ms Billed Duration: 2 ms Memory Size: 1536 MB Max Memory Used: 26 MB

If so, everything is up and running correctly. Now, we can start testing our Lambdas locally.

Unit tests with Chalice

Unit tests are essential when it comes to Continuous Integration and automated testing. Luckily for us, Chalice provides a test client that we can use to write unit tests for our Lambdas. The test client can generate AWS events and invoke the Lambdas directly in tests. Chalice's test client should be used as a context manager, just like Flask's test client. For instance, a test for the S3-triggered handler presented above could look like:

In order to run the test, you will need a test runner like pytest.

Conclusion

AWS Lambda is a powerful tool that can help create amazing solutions. Sadly, testing trigger-based Lambdas without the right tools can be a demanding task. Fortunately, Chalice and Localstack make it possible to do so quite effortlessly, without pushing anything to the actual infrastructure and paying the bills.


Kamil Kucharski is a dedicated Backend Developer at Makimo, constantly exploring the dynamic terrain of DevOps, AWS, and Software Architecture. A fearless hero in the tech world, he confidently navigates his way through complex challenges, often sharing his insightful discoveries through articles on Makimo’s blog. Guided by his passion for Python and Clojure, he continually seeks the right tool for every unique challenge. Outside of his professional journey, Kamil channels his energy into mastering the art of Kendo, embodying the fearlessness and heroism he expresses at work.