How to use your mocked DynamoDB with AppSync and Lambda


Among the latest and hottest features released by the AWS Amplify CLI is the option of mocking a few components of your Amplify stack locally.

With amplify mock, you’ll be able to run some AWS services in your local environment without needing a development stack to deploy your changes to while you are implementing and testing new code.

In particular, when mocking an AppSync API, Amplify CLI will automatically spin up an instance of DynamoDB Local (if this is the chosen storage system) to keep the data created locally. Amplify will create the folder amplify/mock-data/dynamodb to allow persisting and accessing the data even if you stop/start the service.

Thus, performing CRUD operations on the DB through AppSync is transparent to the developer: we don’t have to worry about anything apart from making use of our queries and mutations.

However, as your application grows you’ll probably find yourself building custom queries and mutations which rely on Lambda resolvers to tackle more complex scenarios. Good news! amplify mock supports Lambda, so you’ll also be able to invoke your functions locally.

If our Lambda resolver has to operate on data located in our DynamoDB, we’ll end up with a situation like this:

A section of our mocked Amplify stack

The bad news… there are no guidelines yet about how the local DynamoDB is instantiated by the Amplify CLI, how to access it directly (when not via AppSync), or how it differs from the DB deployed to our remote stack.

Using DynamoDB local

According to the DynamoDB Local documentation, port 8000 is used by default to launch the instance on your computer. However, that is not the case with the Amplify CLI.

It took a bit of investigation (checking the ports reserved by the process created by the mock service) to find out that the DB was actually listening on port 62224. Still, running aws dynamodb list-tables --endpoint-url http://localhost:62224 returned an empty array instead of the list of tables created by our @model types in schema.graphql.

Then George had the great idea of wondering how Amplify itself connects to the DB. Checking the codebase, there we found it!

We were missing the values for region, accessKeyId and secretAccessKey. With these parameters, you’ll be able to connect to your DB via the SDK or a GUI like dynamodb-admin to easily visualise the data and metadata from a browser.
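Putting those pieces together, the connection settings look something like this (a minimal sketch; the port and the fake credentials are the values we found in the Amplify CLI codebase at the time of writing, so treat them as an assumption that may change):

```javascript
// Connection settings for Amplify's local DynamoDB instance.
// Undocumented: taken from the Amplify CLI source, may change in future releases.
const localDynamoDbConfig = {
  endpoint: 'http://localhost:62224',
  region: 'us-fake-1',
  accessKeyId: 'fake',
  secretAccessKey: 'fake',
};

// With the AWS SDK for JavaScript you would then create a client like:
//   const AWS = require('aws-sdk');
//   const docClient = new AWS.DynamoDB.DocumentClient(localDynamoDbConfig);

module.exports = { localDynamoDbConfig };
```

The same values also work in the connection form of GUIs such as dynamodb-admin.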

Note: since this hasn’t been documented by AWS, it shouldn’t be considered a final solution and the values may change in the future, so I’d recommend keeping up to date with the latest announcements on the topic.

Local vs remote stack

Another small difference between using DynamoDB locally and in the remote stack is the table naming pattern.

For a given type in the GraphQL schema:
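For instance, a hypothetical Blog type (matching the example names used below):

```graphql
type Blog @model {
  id: ID!
  name: String!
}
```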

The corresponding DynamoDB table will be named:

  • Locally: `${type_name}Table` - BlogTable

  • Remotely: `${type_name}-${aws_appsync_id}-${stack_name}` - Blog-xxxx-staging

Your code will need to account for these variations if you want it to work in all scenarios:

  • Make sure your Lambda function has access to the api category. Amplify will add the needed Outputs to the CloudFormation stack so that the AppSync API Id and the AppSync GraphQL endpoint are exposed to your function as environment variables.

  • Add some local environment variables to use when mocking your stack; they will stand in for the values that are automatically added to the deployed Lambda function:

  • Then, you can solve the local/remote variations with some logic:
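The branching logic could be sketched like this (an illustration, not our exact code: MOCK_DYNAMODB_ENDPOINT is a custom variable we assume you set only when mocking, and API_MYAPP_GRAPHQLAPIIDOUTPUT stands in for the variable Amplify generates from your own API’s name):

```javascript
// Resolve the table name and client config depending on whether the
// function is running locally (amplify mock) or in the deployed stack.
// MOCK_DYNAMODB_ENDPOINT: custom, set only for local mocking (assumption).
// API_MYAPP_GRAPHQLAPIIDOUTPUT, ENV, REGION: injected by Amplify in the
// deployed function (replace MYAPP with your API's name).
function resolveTable(typeName, env) {
  if (env.MOCK_DYNAMODB_ENDPOINT) {
    // Local: ${type_name}Table, plus the undocumented local credentials.
    return {
      tableName: `${typeName}Table`,
      clientConfig: {
        endpoint: env.MOCK_DYNAMODB_ENDPOINT,
        region: 'us-fake-1',
        accessKeyId: 'fake',
        secretAccessKey: 'fake',
      },
    };
  }
  // Remote: ${type_name}-${aws_appsync_id}-${stack_name}.
  return {
    tableName: `${typeName}-${env.API_MYAPP_GRAPHQLAPIIDOUTPUT}-${env.ENV}`,
    clientConfig: { region: env.REGION },
  };
}

module.exports = { resolveTable };
```

The returned clientConfig can be passed straight to a DynamoDB DocumentClient constructor, and tableName to the TableName parameter of your queries.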

We created a custom DynamoDB wrapper class to carry out this bit behind the scenes. Now your code can flow with freedom, with no need to know where it is being run ;)
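A minimal sketch of the idea behind such a wrapper (hypothetical, not our actual implementation; the client factory is injected so you can pass a real AWS.DynamoDB.DocumentClient constructor in production or a stub in tests, and the environment variable names follow the same assumptions as above):

```javascript
// Hides the local/remote differences from the caller (sketch).
class ModelTable {
  constructor(typeName, createClient, env = process.env) {
    // MOCK_DYNAMODB_ENDPOINT is a custom, mock-only variable (assumption).
    this.isMock = Boolean(env.MOCK_DYNAMODB_ENDPOINT);
    this.tableName = this.isMock
      ? `${typeName}Table`
      : `${typeName}-${env.API_MYAPP_GRAPHQLAPIIDOUTPUT}-${env.ENV}`;
    this.client = createClient(
      this.isMock
        ? {
            endpoint: env.MOCK_DYNAMODB_ENDPOINT,
            region: 'us-fake-1',
            accessKeyId: 'fake',
            secretAccessKey: 'fake',
          }
        : { region: env.REGION }
    );
  }

  // Callers never pass a TableName: the wrapper fills it in.
  get(key) {
    return this.client.get({ TableName: this.tableName, Key: key }).promise();
  }

  put(item) {
    return this.client.put({ TableName: this.tableName, Item: item }).promise();
  }
}

module.exports = { ModelTable };
```

In production you would pass `(config) => new AWS.DynamoDB.DocumentClient(config)` as the factory; the rest of your code just calls get and put.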

Enjoy your mocked resources!