Creating a Slack Bot with Serverless Framework and AWS Lambda


In case you didn't already know, Serverless works with AWS's API Gateway and Lambda functions, which does away with the need to manage infrastructure yourself. There are certainly some limits, which I won't get into here. But for tasks that take under 5 minutes, and for things like building a Slack bot, it's the perfect play project.

This blog post is using Node.js and Serverless 1.0.0-beta2.2 and we'll look at achieving:

  • Body Mappings for Slack commands (application/x-www-form-urlencoded)
  • Unit testing locally
  • Having Lambda functions in a folder structure that isn't the root folder (there is a trick to this)
  • Writing in ES6/ES2015 and transpiling with Babel

Eventually this bot will be hooked up to a Raspberry Pi 3 doing some IoT stuff but that's going to be another post. If anyone has other ways of achieving the aforementioned please leave a comment, I'd love to find other ways from the community experts.

This bot is currently a work in progress but feel free to jump straight to the code on GitHub: https://github.com/serinth/pongbot

Body Mappings

Serverless by default will give you an entire body mapping just for application/json. At this time I don't know of a way to automatically include a body mapping for application/x-www-form-urlencoded, which is what Slack sends. I had to manually go into the console and add the following mapping template for the command endpoint. Here is the body mapping code, which you can cut and paste:

{
    "data": {
        #foreach( $token in $input.path('$').split('&') )
            #set( $keyVal = $token.split('=') )
            #set( $keyValSize = $keyVal.size() )
            #if( $keyValSize >= 1 )
                #set( $key = $util.urlDecode($keyVal[0]) )
                #if( $keyValSize >= 2 )
                    #set( $val = $util.urlDecode($keyVal[1]) )
                #else
                    #set( $val = '' )
                #end
                "$key": "$val"#if($foreach.hasNext),#end
            #end
        #end
    }
}

This will expose the data node with all the key/value pairs that Slack sends us. We'll take a look at what that looks like in the unit tests below.

Here's what the body mappings should look like in API Gateway on the Integration Endpoint:

[Screenshot: body mapping for application/x-www-form-urlencoded on the API Gateway integration endpoint]

So now we can access all the form data via event.data in our function handler. It already arrives as an object, so we don't need to JSON.parse() it either.
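As a minimal sketch of what that looks like in practice (the handler name and reply text here are hypothetical, not part of pongbot), a handler consuming event.data might be:

```javascript
// Hypothetical handler sketch -- not part of pongbot. With the body
// mapping in place, event.data is already a plain object of Slack's
// form fields, so no JSON.parse() is required.
const echo = (event, context, callback) => {
  const { user_name, command, text } = event.data;
  // Reply directly via the callback; 'ephemeral' means only the
  // invoking user sees the response.
  callback(null, {
    response_type: 'ephemeral',
    text: `${command} from ${user_name}: ${text}`
  });
};
```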

Unit Testing On Serverless

Since Lambda functions are just functions, we can use Mocha, Chai, Sinon and Proxyquire.
Now that the body mapping gives us the form data as an object, we can create a sample event like the one Slack would send:

I've put all the Slack events into test/slackEvents/.
The pongbot has one command at the moment: a user issues /challenge @anotheruser and the bot posts some text to the channel.

Here's a sample Slack event after it goes through the body parser in API Gateway:
commandEvent.js

export default () => (  
  {
    data:{ 
      token: 'SAMPLETOKEN',
      team_id: 'ID',
      team_domain: 'pongbot',
      channel_id: 'C249MSRMF',
      channel_name: 'general',
      user_id: 'UxxxW',
      user_name: 'ttruong',
      command: '/challenge',
      text: '@gwittchen',
      response_url: 'https://hooks.slack.com/commands/1234/5678'
    }
  }
);

If you're interested in how the challenge command is implemented you can take a look at /src/challenge_cmd.js, but it's not particularly important for this post.
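For context, here's a rough, hypothetical reconstruction of what that command does. I've inlined stand-ins for the real ./util/http and ../config.json imports so the sketch is self-contained; the actual implementation in the repo may differ:

```javascript
// Hypothetical reconstruction of /src/challenge_cmd.js -- the real code
// in the repo may differ. Stand-ins replace the real imports.
const config = { token: 'SAMPLETOKEN' }; // stand-in for ../config.json
const httpPost = (path, data) => Promise.resolve({ path, data }); // stand-in for ./util/http

const challenge = (event, context, callback) => {
  const { token, user_name, text, response_url } = event.data;
  // Reject requests that don't carry our Slack verification token.
  if (token !== config.token) {
    return callback(new Error('Invalid Slack token'));
  }
  // Pass only the path portion of response_url to the http util.
  const path = response_url.replace('https://hooks.slack.com', '');
  httpPost(path, {
    response_type: 'in_channel',
    text: `<@${user_name}> has challenged <${text}>!`
  }).then(res => callback(null, res))
    .catch(callback);
};
```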

Okay, cool. So now we need something to test with this event.
I've done this in /src/challenge_cmd.spec.js. Yes, it lives in the same folder as the implementation, but as we'll see later, the final build omits *.spec.js files.

Here's what the spec looks like:

import chai, { expect } from 'chai';  
import sinon from 'sinon';  
import sinonChai from 'sinon-chai';  
import proxyquire from 'proxyquire';  
import commandEventStub from '../test/slackEvents/commandEvent';

chai.use(sinonChai);

const contextStub = sinon.stub();  
const callbackStub = sinon.stub();  
const httpSpy = sinon.spy();  
const configStub = {  
    token: commandEventStub().data.token
};

const challenge_cmd = proxyquire('./challenge_cmd', {  
  '../config.json': configStub,
  './util/http': { default: (path, data) => { httpSpy(path, data); return Promise.resolve('value'); } }
});

describe('challenge', () => {  
  it('should httpPost to Slack command response url /commands/1234/5678 with in channel response: <@ttruong> has challenged <@gwittchen>!', () => {
    challenge_cmd.challenge(commandEventStub(), contextStub, callbackStub);
    expect(httpSpy).to.have.been.calledWith('/commands/1234/5678',{response_type: 'in_channel', text: '<@ttruong> has challenged <@gwittchen>!'});
  });
});

We've attached a Sinon spy to the HTTP POST call via Proxyquire so that no actual HTTP request is made.

We're expecting {response_type: 'in_channel', text: '<@ttruong> has challenged <@gwittchen>!'}, which is one of Slack's expected response shapes. A response_type of in_channel posts to the channel instead of replying only to the invoking user.

[Screenshot: the /challenge command response posted in the channel]

Now to run everything, I've modified the test command in package.json:

  "scripts": {
    "test": "./node_modules/mocha/bin/mocha src/*.spec.js --compilers js:babel-core/register"
  },

Simply run with npm run test. It uses Babel, since our tests are in ES6, and runs every *.spec.js file in the /src folder.

Lambda Functions in /lib

Now this was kind of a pain in my side when it shouldn't have been, but here are the few gotchas I ran into. First of all, I'm running Windows 10, and that totally messed up the deployment when I ran serverless deploy or serverless function deploy. This is a known bug and I believe the fix has been merged into master now. That was the first problem; I decided to deploy from my Linux machine instead.

The second problem was that I abstracted the HTTP util out into a /util folder so the functions could share it, and this module was not found by AWS Lambda. So let me rewind a bit and list how I've got things set up:

  • ES6/ES2015 code lives in /src/, and I want all final deliverables to live in /lib, transpiled by Babel.
  • I've got a /src/util folder holding shared items that the functions re-use.
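That shared module is just a thin promise wrapper around an HTTP POST. A sketch of what /src/util/http.js might look like (the hostname and header details here are my assumptions, not lifted from the repo):

```javascript
// Hypothetical sketch of /src/util/http.js -- a thin promise wrapper
// around Node's https module for posting JSON back to Slack.
import https from 'https';

const httpPost = (path, data) => new Promise((resolve, reject) => {
  const payload = JSON.stringify(data);
  const req = https.request({
    hostname: 'hooks.slack.com', // assumed host for Slack response URLs
    path,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(payload)
    }
  }, res => {
    let body = '';
    res.on('data', chunk => { body += chunk; });
    res.on('end', () => resolve(body));
  });
  req.on('error', reject);
  req.write(payload);
  req.end();
});

export default httpPost;
```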

So why couldn't the util module be found? Apparently node_modules can't just sit at the root level of the repo. One person solved it by putting another package.json in the /lib folder and running another npm install there. I didn't really want to do that. I'll get to the solution in a second; there's one more gotcha first.

Serverless does not change your function handler if you've re-deployed or renamed it. If you go to the Lambda function in the AWS console, it will stay as whatever you originally set. When my function lived in the root folder as challenge_cmd.js with the function name challenge, the handler's execution path was challenge_cmd.challenge. When I changed the serverless.yml file, the console did not update to lib/challenge_cmd.challenge, which is what I wanted. Just keep that in mind.
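For reference, the relevant piece of serverless.yml ends up looking something like this (the function and handler names are from this project, but treat the exact event syntax as approximate, since the 1.0 beta was still a moving target):

```yaml
functions:
  challenge:
    handler: lib/challenge_cmd.challenge  # note the lib/ prefix after the build step
    events:
      - http:
          path: challenge
          method: post
```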

So the final build command in package.json as promised:

  "scripts": {
    "build": "rm -rf lib; node ./node_modules/babel-cli/bin/babel.js src --out-dir lib --ignore *.spec.js && cp -r node_modules lib",
    "test": "./node_modules/mocha/bin/mocha src/*.spec.js --compilers js:babel-core/register"
  },

Remove the lib folder if it exists, then Babel-compile everything in /src except *.spec.js, outputting to /lib. Finally, copy node_modules into the /lib folder.

I think AWS Lambda actually adds additional libraries. When I went to the Lambda function in the console and downloaded the code to see what actually got pushed up, I found two node_modules folders: one at the root level with fewer libraries in it, and one in /lib:

[Screenshot: the two different node_modules folders in the downloaded Lambda package]

But hey now my util/http function can be found and everything is peachy.

I'm going to continue with Serverless for this bot and keep an eye on how it progresses. It has been a very nice experience to issue one command and have API Gateway and Lambda all stood up. I'm really looking forward to future developments.
