AWS Lambda, a level up (part 1): Performance

7 minute read

In my last article I explained how to query a GraphQL API provided by AppSync from AWS Lambda, using Cognito User Pools for authentication. A quick summary of the steps we followed:

  • Creating a new App Client for our Pool to allow server-side authentication (ADMIN_NO_SRP_AUTH).
  • Obtaining a JWT for our user in the pool.
  • Initialising the AppSync client with the obtained token and our API configuration.
  • Querying and/or mutating the data.

As mentioned in that article, the main goal was achieved but there was significant room for improvement. So I’d like to share a few things I learned on the way to a better solution, since the underlying approaches can be applied to any project that uses Lambda functions and the Serverless framework:

  • Performance. Could we decrease the execution time of our functions by reusing data across calls? In this article, we’ll pick up the example started in the previous one and see how a simple change in the way we structure our code can save some valuable time, skipping the authentication process when a previous invocation has already been granted a JWT. We’ll do it in TypeScript, but you should find equivalents in any other language.
  • Automation. In that article, we took for granted the existence of a user with access to our Cognito Pool. But this raises some questions: how can we completely automate our Serverless deployments? Are CloudFormation templates enough? We’ll leave this subject for the next episode of the series.

Execution context in AWS Lambda

As stated by AWS, an execution context is created when a Lambda function is first invoked. The context initialises the dependencies and data needed for our code to run, and it is kept alive for a while (AWS gives no guarantees about how long) so that already loaded resources can be reused. These processes are informally known as “cold starts” and “warm-ups”.

Normally, Lambda gets rid of data that is declared or instantiated inside the main function handler, since its values are assumed to be of little use to the next call.

However, we can structure our code around “util” static classes that provide the logic and store the data we want to reuse across invocations. You won’t need any major change in your code because:

  • Classes, like any other dependency, are normally declared outside the main function handler.
  • By definition, static variables keep their data, accessible throughout the execution of the function and, once the context is warm, across different calls.

Important, leave this note in a visible place: when following this or a similar solution, we can’t trust that Lambda will keep the data, even in a warmed runtime environment, so you’ll always need to back your data declaration with a default value or initialiser.
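
As a minimal illustration of the pattern (the WarmCache class below is hypothetical, just to show the shape of the idea):

```typescript
// A hypothetical "util" class, declared at module level (outside the handler),
// whose static field can survive between invocations while the context is warm.
class WarmCache {
  // Always back the declaration with a default value: a cold start resets it.
  static value: string | null = null;
}

export const handler = async (): Promise<void> => {
  if (WarmCache.value === null) {
    // Only executed on cold starts (or when Lambda has recycled the context).
    WarmCache.value = 'expensive-to-compute value';
  }
  // Warm invocations reuse WarmCache.value here.
};
```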

Reusing JWTs in our app

Our basic Lambda function would look something like this: 
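
(A minimal sketch, assuming the aws-sdk v2 Cognito client and placeholder environment variables for the pool, app client and user credentials.)

```typescript
import { CognitoIdentityServiceProvider } from 'aws-sdk';

const cognito = new CognitoIdentityServiceProvider();

export const handler = async (): Promise<void> => {
  // Authenticate against the Cognito User Pool on EVERY invocation.
  const auth = await cognito
    .adminInitiateAuth({
      AuthFlow: 'ADMIN_NO_SRP_AUTH',
      UserPoolId: process.env.USER_POOL_ID!,
      ClientId: process.env.CLIENT_ID!,
      AuthParameters: {
        USERNAME: process.env.API_USERNAME!,
        PASSWORD: process.env.API_PASSWORD!,
      },
    })
    .promise();

  const token = auth.AuthenticationResult!.IdToken!;

  // Initialise the AppSync client with the token and run our queries/mutations...
};
```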

Our function obtains a token through a Cognito User Pool user to subsequently use it in a task that requires authentication, such as AppSync. The problem is obvious: although we make use of a lighter protocol (ADMIN_NO_SRP_AUTH), we repeat the same operation over and over again, even when the token is still valid and could be reused for a while once it has been granted (in our case, 30 minutes).

A couple of simple changes will solve our problem:

  • Move the logic to retrieve the token to a separate class OurTokenFactory.
  • Store the value in a static variable.
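
A minimal sketch of OurTokenFactory, assuming the same aws-sdk Cognito client and placeholder environment variables as above:

```typescript
import { CognitoIdentityServiceProvider } from 'aws-sdk';

const cognito = new CognitoIdentityServiceProvider();

export class OurTokenFactory {
  // Backed by a default value: on a cold start, the token simply isn't there yet.
  private static token: string | null = null;

  static async getToken(): Promise<string> {
    let token = OurTokenFactory.token;
    if (!token) {
      // Nothing stored (cold start): authenticate and keep the token for later calls.
      token = await OurTokenFactory.requestToken();
      OurTokenFactory.token = token;
    }
    return token;
  }

  private static async requestToken(): Promise<string> {
    const auth = await cognito
      .adminInitiateAuth({
        AuthFlow: 'ADMIN_NO_SRP_AUTH',
        UserPoolId: process.env.USER_POOL_ID!,
        ClientId: process.env.CLIENT_ID!,
        AuthParameters: {
          USERNAME: process.env.API_USERNAME!,
          PASSWORD: process.env.API_PASSWORD!,
        },
      })
      .promise();

    return auth.AuthenticationResult!.IdToken!;
  }
}
```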

Cool! With this approach, after the first use of getToken(), successive calls will make use of the already stored value behind the scenes, saving the auth request.

Our backup, in case the token is not found, consists of a new request. But not only that: our class should also be responsible for providing correct data. So we’ll need one last tweak to make sure the returned token has not expired (as could happen if we manage to keep the execution context warm for more than 30 minutes):
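
(One way to do the check, sketched below, is to decode the token’s exp claim by hand; a library such as jwt-decode would do the job just as well. The rest of the class stays as before.)

```typescript
import { CognitoIdentityServiceProvider } from 'aws-sdk';

const cognito = new CognitoIdentityServiceProvider();

export class OurTokenFactory {
  private static token: string | null = null;

  static async getToken(): Promise<string> {
    let token = OurTokenFactory.token;
    // Request a new token when we have none, or when the stored one has expired.
    if (!token || OurTokenFactory.isExpired(token)) {
      token = await OurTokenFactory.requestToken();
      OurTokenFactory.token = token;
    }
    return token;
  }

  // Decode the JWT payload and compare its exp claim (seconds since the epoch)
  // against the current time.
  private static isExpired(token: string): boolean {
    const payload = JSON.parse(
      Buffer.from(token.split('.')[1], 'base64').toString('utf8')
    );
    return payload.exp * 1000 <= Date.now();
  }

  // Same auth request as before.
  private static async requestToken(): Promise<string> {
    const auth = await cognito
      .adminInitiateAuth({
        AuthFlow: 'ADMIN_NO_SRP_AUTH',
        UserPoolId: process.env.USER_POOL_ID!,
        ClientId: process.env.CLIENT_ID!,
        AuthParameters: {
          USERNAME: process.env.API_USERNAME!,
          PASSWORD: process.env.API_PASSWORD!,
        },
      })
      .promise();

    return auth.AuthenticationResult!.IdToken!;
  }
}
```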

The handler can then use our class in a standard way, leaving the code clean and tidy:
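
(Again a minimal sketch; the import path simply depends on where the class lives in your project.)

```typescript
import { OurTokenFactory } from './OurTokenFactory'; // adjust the path to your project

export const handler = async (): Promise<void> => {
  // Whether this triggers a new auth request or reuses the cached token is the
  // factory's business, not the handler's.
  const token = await OurTokenFactory.getToken();

  // Initialise the AppSync client with the token and query/mutate as before...
};
```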

Setting up your local environment

Before you start writing and testing your code, make sure you install the serverless-offline plugin, which emulates an AWS Lambda environment on your local machine. Otherwise, you’ll find the context is regenerated on every call!

Conclusion

Serverless is a really powerful model, but it presents new challenges. Getting a better understanding of its key points will help you implement solutions that fit its idiosyncrasies. In the next article, I’ll run through a couple of simple tips that will improve the automation of your deployments.

Written by Jesús Larrubia (Full Stack Developer). Read more in Insights by Jesús or check out their socials: Twitter, Instagram.