Ways to cache your Serverless applications published 12/11/2019 | 5 min read

Hello everyone, welcome to a new devspedia story. Today I'll talk about different ways to cache your serverless application, assuming an AWS-based architecture.




Hint: You can quick-start your serverless project by using the Serverless Framework.

Serverless - The Serverless Application Framework powered by AWS Lambda, API Gateway, and more
Build web, mobile and IoT applications using AWS Lambda and API Gateway, Azure Functions, Google Cloud Functions, and more.



Core system components we'll be discussing

  1. The client-side app.
  2. Route 53.
  3. CloudFront.
  4. API Gateway.
  5. Lambda.

Here's how these components are stacked

As you can see in the flow above, there are 4 places where we can utilize caching:

  1. In the client side.
  2. In CloudFront.
  3. In API Gateway.
  4. Inside your Lambda function.

Developers prefer to cache on the client side where possible: it's both cost-efficient and offers the lowest possible latency.

Let's discuss them one by one.



Client side caching

By caching on the client side, we get the lowest possible latency, since requests never reach the next component – Route 53 – which means skipping the latency and execution of the rest of the stack entirely.

That happens by using caching techniques such as Cache-Control headers, ETags, Web Storage, etc.
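As a minimal sketch of the idea, here's a tiny TTL-based cache like one a client app might keep in Web Storage. The names (`cachedStore`, `CACHE_TTL_MS`) and the one-minute TTL are illustrative assumptions, not a real API; a browser app could back this with localStorage instead of a Map:

```javascript
// Illustrative client-side cache with a TTL, standing in for Web Storage
// plus Cache-Control-style expiry. Names and TTL value are assumptions.
const CACHE_TTL_MS = 60 * 1000; // entries expire after one minute

const cachedStore = new Map(); // in a browser this could be localStorage

function setCached(key, value, now = Date.now()) {
  cachedStore.set(key, { value, expiresAt: now + CACHE_TTL_MS });
}

function getCached(key, now = Date.now()) {
  const entry = cachedStore.get(key);
  if (!entry || entry.expiresAt <= now) {
    cachedStore.delete(key); // stale or missing: evict and force a real request
    return undefined;
  }
  return entry.value; // fresh: no request leaves the client at all
}
```

When `getCached` returns a value, no request is made, so Route 53, CloudFront, API Gateway, and Lambda are all skipped.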



Cache on CloudFront

By caching in CloudFront, we skip actually fetching the file from the origin (for example, S3). This makes response times a lot faster, because the cached version is served from edge locations distributed around the world, meaning the client always gets a cached copy from the nearest location.

How Caching Works with CloudFront Edge Caches - Amazon CloudFront
Describes what a cache hit ratio is and how caching works with CloudFront.


Cache on API Gateway


API Gateway allows caching on its side: when you enable caching for a stage, it provisions a dedicated cache for your API. There's a limit on how long cached entries can live – the TTL can be set up to a maximum of 1 hour (3,600 seconds).

Enable API Caching to Enhance Responsiveness - Amazon API Gateway
Learn how to enable Amazon API Gateway caching to enhance your API's performance.

API Gateway caching is powerful: you can vary the cache key by request headers, URL query string parameters, path parameters, and more.
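Caching can be enabled from the console, or – as a sketch – from the AWS CLI against an already-deployed stage. The API ID, stage name, and resource path below are placeholders for your own deployment:

```shell
# Sketch: enable the stage cache on a deployed REST API (placeholder IDs)
aws apigateway update-stage \
  --rest-api-id abc123 \
  --stage-name prod \
  --patch-operations \
    op=replace,path=/cacheClusterEnabled,value=true \
    op=replace,path=/cacheClusterSize,value=0.5

# Set the cache TTL (in seconds) for GET /users; "~1" escapes "/" in the path
aws apigateway update-stage \
  --rest-api-id abc123 \
  --stage-name prod \
  --patch-operations \
    op=replace,path=/~1users/GET/caching/ttlInSeconds,value=300
```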



Cache in your Lambda


There are multiple ways to use caching once a request has finally reached your Lambda function.

The first common case is adding an extra component such as ElastiCache, a fully managed service for Redis or Memcached, which your Lambda can query to check whether there's a cached version of a piece of data.

Amazon ElastiCache- In-memory data store and cache
Amazon ElastiCache offers fully managed Redis and Memcached. Seamlessly deploy, operate, and scale popular open source compatible in-memory data stores.

This is a common practice, and many developers already use this in production.
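The usual pattern here is read-through ("get or set") caching. Below is a hedged sketch of it: `client` stands for any object with async `get`/`set` methods (for example, a node-redis client, whose `set` accepts an `{ EX: seconds }` expiry option); `getOrSet` and `loadFn` are illustrative names, not a library API:

```javascript
// Sketch of read-through caching a Lambda could use against ElastiCache.
// On a hit, the expensive load (e.g. a database query) is skipped entirely.
async function getOrSet(client, key, ttlSeconds, loadFn) {
  const cached = await client.get(key);
  if (cached !== null && cached !== undefined) {
    return JSON.parse(cached); // cache hit
  }
  const fresh = await loadFn(); // cache miss: load from the source of truth
  await client.set(key, JSON.stringify(fresh), { EX: ttlSeconds });
  return fresh;
}
```

On the second call with the same key (before the TTL expires), `loadFn` is never invoked.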

However, there's another way to utilize caching inside your Lambda: code shared across invocations. Check the example below:

  
// Code outside the scope of the exported handler is shared across
// invocations that reuse the same warm Lambda container
const cache = {};

exports.handler = async function (event, context) {
    if (cache.user) {
        // Cache hit: reuse the user stored by a previous invocation
        return cache.user.cart;
    }
    // Cache miss: read the user (e.g. from the event or a database) and store it
    cache.user = event.user;
    return cache.user.cart;
};

Cache user cart inside the Lambda

In the above example, we've declared a shared const cache outside the handler function; this variable persists across all Lambda invocations that reuse the same container.

This works as long as the Lambda keeps being requested and stays warm. Since we mentioned the warm and cold states of a Lambda, I'd like to share this excellent article that describes them in detail:

Cold Starts in AWS Lambda
Selection of languages, instance sizes, dependencies, VPC, and more

That's all. Thanks for reading devspedia, I love you, and I'll see you next time :)



You may also like reading: