Building a GraphQL API with serverless framework, AWS and Apollo Server 2

Okay, let’s build an API with the help of the Serverless Framework, GraphQL and Apollo Server 2, backed by a DynamoDB database on AWS. We will build a table where users are recorded, and Lambda functions to create those users, to request information about them, and to update them. All of that through a GraphQL layer, with apollo-server-lambda making our job easier.

You can find the final code here.

Before anything happens, we have to sign up for AWS, install the Serverless Framework and set up our AWS credentials. I will let you sign up to AWS, that is simple enough. I assume that you have Node.js and npm installed, so let’s install serverless:

$ npm i -g serverless

You can check this video to set up your credentials. I recommend you follow the quick start if you’re new to the Serverless Framework, here. First let’s create a folder and install some packages:

$ mkdir api-graphql
$ cd api-graphql/
$ npm i aws-sdk graphql apollo-server-lambda@rc

Open your favorite IDE and let’s create two files: serverless.yml and handler.js:


Create the serverless.yml file

Copy this code into the serverless.yml:


service: api-graphql

provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  environment:
    USER_TABLE: users-table-${self:provider.stage}
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:${opt:region, self:provider.region}:*:table/${self:provider.environment.USER_TABLE}"

resources:
  Resources:
    UsersDynamoDbTable:
      Type: 'AWS::DynamoDB::Table'
      Properties:
        AttributeDefinitions:
          - AttributeName: ID
            AttributeType: S
        KeySchema:
          - AttributeName: ID
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        TableName: ${self:provider.environment.USER_TABLE}

functions:
  graphql:
    handler: handler.graphql
    events:
      - http:
          path: graphql
          method: post
          cors: true
      - http:
          path: graphql
          method: get
          cors: true

OK, let’s take five minutes to explain what we have here. The serverless.yml file is the place where we set up our back-end environment, in this case the AWS environment. It uses YAML syntax, and we can use CloudFormation syntax (which also accepts YAML) to set up AWS services, like DynamoDB.

The provider object describes the back-end environment. Variables passed under environment are constants that we can read in our JS files with process.env.YOUR_VARIABLE. The iamRoleStatements section is where we give permissions to our AWS profile; here we Allow some DynamoDB actions on a specific resource, our table. IAM stands for ‘Identity and Access Management’. We can create and configure different profiles in the AWS console and be very specific about which profile is allowed to do specific tasks, which Lambda function is allowed to access specific resources, and so on.

The resources object sets up the DynamoDB table in this example. You can find how the syntax works here. NoSQL databases don’t work like SQL databases (which makes sense!). Here we define only the attribute ID (its type is String, defined by the S) of the table named users-table-dev, and it is going to be the primary key, defined by HASH. But we are free to add other attributes later to a user item, which is an instance of our user table, and items in the table can have different attributes from one another. As you can see there is no rule, because the structure is not defined in advance. We are far from, say, the Symfony framework in PHP, where we define our table attributes and generate PHP classes with an ORM doing the link. That can be great, but it can be dangerously messy too. You’ll find a lot to read about NoSQL databases, but for starters you can read this nice introduction to DynamoDB.
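To make the schemaless nature concrete, here is a minimal sketch with plain objects and made-up values: two items destined for the same users table can carry completely different attribute sets, as long as each has the ID key.

```javascript
// Two hypothetical items for the same DynamoDB table: only the key
// attribute (ID) declared in serverless.yml is mandatory and shared.
const userA = { ID: '1', email: 'a@example.com' };
const userB = { ID: '2', email: 'b@example.com', country: 'France', newsletter: true };

// No schema enforces a common shape; attributes simply differ per item.
const sharedKeys = Object.keys(userA).filter(key => key in userB);
console.log(sharedKeys); // [ 'ID', 'email' ]
```

DynamoDB would happily store both items; only code that reads them needs to care about which attributes are present.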

The last part is the functions object, where we define our Lambda functions. We use only one entry point, our Apollo server. The name of the function is ‘graphql’. The handler, the local path where the function lives, is in the handler.js file where we export the graphql function, hence handler.graphql. The http event is how we trigger the function, and we can add some details, like the URL path to use, the request method, and so on.

Create the handler.js file

Now let’s take a look at our handler.js file:


const { ApolloServer, gql } = require('apollo-server-lambda');
const { userTypeDef, userResolvers } = require('./models/user');

const typeDefs = gql`${userTypeDef}`;
const resolvers = userResolvers;

const server = new ApolloServer({
    typeDefs,
    resolvers,
    context: ({ event, context }) => ({
        headers: event.headers,
        functionName: context.functionName,
    }),
});

exports.graphql = server.createHandler({
    cors: {
        origin: '*',
        credentials: true,
    },
});
We just set up our Apollo server and linked it to the ‘graphql’ endpoint that we export. Every request to “https://your_amazon_aws_url/graphql” will trigger the Apollo server. But as you can see, there is nothing about the user schema here and no back-end logic; we just imported some things from user.js. Let’s create a few other files and discuss them.

Create the back-end logic

// ./functions/promisify.js

module.exports = foo => new Promise((resolve, reject) => {
    foo((error, result) => {
        if (error) {
            reject(error);
        } else {
            resolve(result);
        }
    });
});

// ./models/user.js

const AWS = require('aws-sdk');
const promisify = require('../functions/promisify');
const crypto = require('crypto');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

// Schema of user

exports.userTypeDef = `
    type User {
        ID: String
        email: String
        country: String
    }
    type Query {
        user(ID: String!): User
    }
    type Mutation {
        createUser(email: String): Boolean
        updateUser(ID: String, country: String): User
    }
`;

exports.userResolvers = {
    Query: {
        user: (_, { ID }) => getUser(ID),
    },
    Mutation: {
        createUser: (_, { email }) => createUser(email),
        updateUser: (_, { ID, country }) => updateUser(ID, country),
    },
};

// Lambda functions of user

const createUser = email => promisify(callback =>
    dynamoDb.put({
        TableName: process.env.USER_TABLE,
        Item: {
            ID: crypto.createHash('md5').update(email).digest('hex'),
            email: email,
        },
        // Refuse the put if an item with this ID already exists
        ConditionExpression: 'attribute_not_exists(#u)',
        ExpressionAttributeNames: { '#u': 'ID' },
        ReturnValues: 'ALL_OLD',
    }, callback))
    .then(() => true)
    .catch(() => false);

const getUser = ID => promisify(callback =>
    dynamoDb.get({
        TableName: process.env.USER_TABLE,
        Key: { ID },
    }, callback))
    .then(result => {
        if (!result.Item) { return null; } // no such user
        return result.Item;
    })
    .catch(error => console.error(error));

const updateUser = (ID, country) => promisify(callback =>
    dynamoDb.update({
        TableName: process.env.USER_TABLE,
        Key: { ID },
        UpdateExpression: 'SET #foo = :bar',
        ExpressionAttributeNames: { '#foo': 'country' },
        ExpressionAttributeValues: { ':bar': country },
        ReturnValues: 'ALL_NEW'
    }, callback))
    .then(result => result.Attributes)
    .catch(error => console.error(error));

promisify.js: Since Node 8, the util module ships a promisify function that turns callback-style functions into promises, which is very cool. Here we create and export our own small version instead, so we can see exactly what it does.
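To see the helper in action outside of AWS, here is a small sketch where fakeGet (made up for this example) stands in for the callback-style aws-sdk methods we wrap:

```javascript
// Same helper as in functions/promisify.js.
const promisify = foo => new Promise((resolve, reject) => {
    foo((error, result) => {
        if (error) {
            reject(error);
        } else {
            resolve(result);
        }
    });
});

// A made-up callback-style function, shaped like dynamoDb.get(params, callback).
const fakeGet = (params, callback) => callback(null, { Item: { ID: params.Key.ID } });

// Wrapping it gives us a promise, so we can chain .then/.catch as in user.js.
promisify(callback => fakeGet({ Key: { ID: 'abc' } }, callback))
    .then(result => console.log(result.Item.ID)); // prints "abc"
```

The `callback => fn(params, callback)` shape is exactly how the resolvers in user.js feed the aws-sdk calls into the helper.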

user.js: Here we finally find the code that we are interested in. We could put everything into the handler.js file, but since your project is not going to be as simple as a tutorial, it’s good to see how we can organize ourselves (and you can read this to learn more about structuring an Apollo project). We find the schema and the resolvers of the user at the top. With apollo-server our job is nicely simplified. The resolvers return promises in which we use the aws-sdk library to manipulate our DynamoDB table.

A little detail here: we can’t ask DynamoDB to automatically generate a unique ID for our table, due to the nature of a NoSQL database. There are several solutions out there; here we simply hash the email with Node’s crypto library to get our ID, and we check whether the ID already exists in our table. If it does, an error is thrown; if not, everything’s alright. This method is quite basic and may not hold up if you expand to different regions. Be warned.

Now let the serverless framework do its magic:

$ sls deploy

Test the API

Once our project is deployed, serverless should print the endpoints of our functions. Here we get two identical URLs, for POST and GET requests, in the form https://******
We can test our API directly with the curl command, or by going to our AWS account, in the API Gateway service. There we can see our endpoints, see the functions attached to them, and test them. It’s very convenient, so that’s what we will do.
By the way, the AWS console is something you really have to understand if you want to use AWS services. There is a lot to see, lots of services; you can get lost checking everything, but it’s worth it.

OK, I assume that you are on your API Gateway page. If nothing appears, check that you are in the right region, here us-east-1. Click on dev-api-graphql, then POST, and Test. In the headers text area, put Content-Type: application/json. In the Request Body, paste:

{
	"query": "mutation CreateUser($arg: String!){ createUser(email: $arg)}",
	"operationName": "CreateUser",
	"variables": {
		"arg": ""
	}
}
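Before pasting a body like this into the test console, it’s worth checking that it parses; a quick JSON.parse is enough (the email value here is a made-up placeholder):

```javascript
// A request body like the one above; JSON.parse throws on any syntax slip,
// e.g. an = where a : should be.
const body = `{
    "query": "mutation CreateUser($arg: String!){ createUser(email: $arg)}",
    "operationName": "CreateUser",
    "variables": { "arg": "someone@example.com" }
}`;

try {
    JSON.parse(body);
    console.log('valid JSON');
} catch (error) {
    console.log('invalid JSON: ' + error.message);
}
```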

I strongly encourage you to validate your JSON with a linter. It can save you a lot of time if, like me, you type = instead of : and spend the last two hours without noticing it… Anyway, click Test, and if the result looks like this:

{
  "data": {
    "createUser": true
  }
}
Well done, you created your first user! If you click Test again, the result should be false this time, because we can’t create two users with the same email. We can check our DynamoDB table in the AWS console to see our item with its two attributes, ID and email. Let’s try to get the email of the user:

{
	"query": "query gUser($arg: String!){ user(ID: $arg){ email }}",
	"operationName": "gUser",
	"variables": {
		"arg": "86250407fc87f3d297e3076b08133cfd"
	}
}

The response is:

{
  "data": {
    "user": {
      "email": ""
    }
  }
}

Perfect, it’s exactly what we wanted! OK, now let’s update the user:

{
	"query": "mutation UpdateUser($id: String!, $country: String! ){ updateUser(ID: $id, country: $country){ ID email country }}",
	"operationName": "UpdateUser",
	"variables": {
		"id": "86250407fc87f3d297e3076b08133cfd",
		"country": "France"
	}
}

And the response is:

{
  "data": {
    "updateUser": {
      "ID": "86250407fc87f3d297e3076b08133cfd",
      "email": "",
      "country": "France"
    }
  }
}

And our item now has a country attribute.

OK, I’ll let you figure out how to delete this user; you should have the logic down by now!

This way of building an API is not the best. Why? Because there is no communication between our GraphQL schemas, where we define our types, and the functions where we interact with our tables. You always have to be careful about what you are doing with the data and what GraphQL can do with it. Fortunately, there is something called AWS AppSync that is going to make our job easier, and I plan to write about it soon!
