
Creating Lambda Functions for Our API Gateway API with the AWS CLI

Hands-On Lab







In this lab, we will learn how to create AWS Lambda functions using the AWS CLI. We will emulate a 'local' development environment with an EC2 instance that is provided for you, create Lambda function deployment packages that we use to create functions from Amazon S3 and from our 'local' EC2 machine, and learn how to test our Lambda functions from the CLI by creating JSON test events.



In this lab, we are working for the company PrometheonPartners. We have been tasked to configure and deploy an AWS Lambda function with our PrometheonMusic table using the AWS CLI. To do this, we emulate a local machine, using the EC2 instance provided. We create Lambda function deployment packages that are used to create functions from AWS S3 and our emulated local EC2 machine.

Create .zip Files

Using the credentials provided by the lab, sign in to the emulated local machine. Once logged in, print the working directory with pwd. After confirming we are in /home/cloud_user, run ls to list the files in the folder:

[cloud_user@IP ~]$ pwd
[cloud_user@IP ~]$ ls
create.js delete.js get.js list.js update.js

The first .zip file we'll create is for list.js. To create it, enter the following code:

[cloud_user@IP ~]$ zip ./list.zip ./list.js

List out the information again with ls. This time list.zip appears. Now we want to do the same for update.js and delete.js:

[cloud_user@IP ~]$ zip ./update.zip ./update.js
[cloud_user@IP ~]$ zip ./delete.zip ./delete.js

List out again to check that all three .zip files are there.

Create a Bucket

With our files created, it is time to create a bucket to hold them. We will be using purple-dog as our bucket name, but S3 bucket names are globally unique, so you will likely need to come up with a different name for yours. Make it something you can remember throughout the lab.

To create the bucket, do the following:

[cloud_user@IP ~]$ aws s3api create-bucket --bucket purple-dog

If it does not work, then try a different bucket name.
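Because bucket names must be unique across every AWS account, one common trick is to append a generated suffix. A minimal sketch (the prefix is just an example, not a name the lab requires):

```shell
# A timestamp suffix makes a name collision with another account unlikely:
BUCKET="prometheon-lab-$(date +%s)"
echo "$BUCKET"
```

You could then pass `"$BUCKET"` to `aws s3api create-bucket --bucket "$BUCKET"` instead of typing the name by hand.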

Once the bucket is created, we copy the .zip packages we will deploy from S3 into it, list.zip and update.zip (delete.zip stays local for the second creation method, later). Again, purple-dog is our example bucket name:

[cloud_user@IP ~]$ aws s3 cp ./list.zip s3://purple-dog/
[cloud_user@IP ~]$ aws s3 cp ./update.zip s3://purple-dog/

To make sure all those items are in the S3 bucket, complete the following:

[cloud_user@IP ~]$ aws s3 ls purple-dog

Create a Lambda Function

With those packages created and stored, we can turn them into Lambda functions. To do so, we first need some information from the DynamoDBFullLambdaAccess role that we created in an earlier lab:

[cloud_user@IP ~]$ aws iam list-roles

Copy the "Arn" value of the DynamoDBFullLambdaAccess role, without the quotes, and save it in a note. From here on, this value will be referred to as ARNVALUE$. Using the following command, we will create our first function:

[cloud_user@IP ~]$ aws lambda create-function --function-name list \
> --runtime nodejs6.10 \
> --role ARNVALUE$ \
> --handler list.list \
> --code S3Bucket=purple-dog,S3Key=list.zip

You may get an error saying "You must specify a region". If you do, do the following:

[cloud_user@IP ~]$ aws configure

Hit Enter until you get to Default region name, as we want to keep all the other defaults. Enter us-east-1 there, then hit Enter on the remaining prompt to keep its default as well.

AWS Access Key ID [None]:
AWS Secret Access Key [None]:
Default region name [None]: us-east-1
Default output format [None]:

Once we've set that up, enter the following command, either by hitting the up arrow to get to the previously made command, or copying in the information from before:

[cloud_user@IP ~]$ aws lambda create-function --function-name list --runtime nodejs6.10 --role ARNVALUE$ --handler list.list --code S3Bucket=purple-dog,S3Key=list.zip

Metadata will appear.

Repeat the command, but this time for update:

[cloud_user@IP ~]$ aws lambda create-function --function-name update --runtime nodejs6.10 --role ARNVALUE$ --handler update.update --code S3Bucket=purple-dog,S3Key=update.zip

A Second Way to Create a Lambda Function

This is the second way of creating a Lambda function. Note that you still need your ARNVALUE$ to create it. For this Lambda function, we'll be using our delete.zip file. This time we are pulling the package from our local machine instead of from S3:

[cloud_user@IP ~]$ aws lambda create-function --function-name delete \
> --runtime nodejs6.10 \
> --role ARNVALUE$ \
> --handler delete.delete \
> --zip-file fileb://delete.zip

We have successfully created the function.

Run a Lambda Function

Now, let's test our functions, starting with the list function. We will not pass anything specific in the payload, and listOutfile.txt is the file the output is sent to:

[cloud_user@IP ~]$ aws lambda invoke --function-name list \
> --payload '{}' \
> listOutfile.txt

To see what the list function did, perform a cat on listOutfile.txt:

[cloud_user@IP ~]$ cat listOutfile.txt

The first 10 items from the table appear. Copy the first item in its entirety, from the opening { through the closing } (it starts with Price), and place it in a note for later use.
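If you prefer not to copy the item by hand, it can be extracted with a short python3 one-liner. This is a sketch only: the file written below is a fabricated stand-in for the real response, on the assumption that the function returns a JSON document whose "body" string holds an "Items" array:

```shell
# Fabricated stand-in for the real listOutfile.txt (shape assumed):
cat > listOutfile.sample.txt <<'EOF'
{"statusCode": 200, "body": "{\"Items\":[{\"Price\":9.99,\"Artist\":\"Sample Artist\",\"SongTitle\":\"Sample Song\"}]}"}
EOF

# Parse the outer JSON, then the "body" string, and print the first item:
python3 -c 'import json; d = json.load(open("listOutfile.sample.txt")); print(json.loads(d["body"])["Items"][0])'
```

Against the real listOutfile.txt you would point the one-liner at that file instead of the sample.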

Testing Files

To get started, we want to create a new file to perform our tests with. Use the touch command to create the updateTest.json file:

[cloud_user@IP ~]$ touch updateTest.json

Check with ls to make sure the file was created:

[cloud_user@IP ~]$ ls
create.js list.js updateTest.json
delete.js get.js listOutfile.txt update.js

updateTest.json appears in our list of files.

Edit the updateTest.json File

Using your preferred editor (referred to here as EDITOR$), open the updateTest.json file:

[cloud_user@IP ~]$ EDITOR$ updateTest.json

Paste in the information we copied:

{"body": "COPIED ITEM$"}

Change the Price to 12.89 and the CriticRating to 9.66. To search for this item later, we need to know its Artist and SongTitle keys. Copy those and save them for later use. Do NOT change them.

Save and leave the file.
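The finished file can also be written non-interactively with a heredoc. The item values below are hypothetical stand-ins for the ones you copied, and the snippet writes to a scratch file so it does not clobber your real updateTest.json. Note that the "body" value is itself a JSON string, so its inner quotes are escaped:

```shell
# Sketch of updateTest.json with hypothetical item values:
cat > updateTest.sample.json <<'EOF'
{"body": "{\"Artist\":\"Sample Artist\",\"SongTitle\":\"Sample Song\",\"Price\":12.89,\"CriticRating\":9.66}"}
EOF

# Quick validity check: json.tool fails loudly on malformed JSON.
python3 -m json.tool updateTest.sample.json
```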

With the file updated, we can check to make sure that the update function is working correctly. Run the following:

[cloud_user@IP ~]$ aws lambda invoke --function-name update \
> --payload file://updateTest.json \
> updateOutfile.txt

A status code of 200 lets us know that everything went correctly. To see the function's output, cat the outfile:

[cloud_user@IP ~]$ cat updateOutfile.txt

Now let's see what the table returns to us, this time querying DynamoDB directly with get-item. Make sure there is a space after --key, and replace the Artist and SongTitle with the ones for the item you just updated:

[cloud_user@IP ~]$ aws dynamodb get-item --table-name PrometheonMusic --key '{"Artist": {"S":"ARTIST YOU UPDATED"}, "SongTitle":{"S": "SONG TITLE YOU UPDATED"}}'

Once we hit enter, the item we updated appears.
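The --key argument uses DynamoDB's typed JSON format: each attribute value is wrapped in a type descriptor, and {"S": ...} marks it as a string. A quick sanity check of that shape, with sample values in place of yours:

```shell
# A typed DynamoDB key: both attributes are declared as strings via {"S": ...}.
KEY='{"Artist": {"S": "Sample Artist"}, "SongTitle": {"S": "Sample Song"}}'

# Confirm it parses as JSON and contains exactly the two key attributes:
python3 -c 'import json, sys; k = json.loads(sys.argv[1]); print(sorted(k))' "$KEY"
```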

Perform Delete

We've used our get, list, and update functions, so now it is time to use delete. First, we need a payload file for our delete Lambda function. Using touch, create deleteTest.json, then open it in your editor:

[cloud_user@IP ~]$ touch deleteTest.json
[cloud_user@IP ~]$ EDITOR$ deleteTest.json

Now, using the Artist and SongTitle information we copied, we can designate the item we want to delete. Enter the following (specifying your artist and song):

{"body": "{\"Artist\":\"YOUR ARTIST\",\"SongTitle\":\"YOUR SONG\"}"

Save this file.

[cloud_user@IP ~]$ aws lambda invoke --function-name delete \
> --payload file://deleteTest.json \
> deleteOutfile.txt

We may get an error here. Did you catch the mistake we made earlier? If not, go back to your editor:

[cloud_user@IP ~]$ EDITOR$ deleteTest.json

If you didn't catch it before: the outer object is never closed. Add the final } to the end of the line, save the change, and then run the invoke command again:
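With the fix applied, the file parses as valid JSON. A sketch of the corrected contents, written to a scratch file with hypothetical values in place of yours so nothing is overwritten:

```shell
# Corrected payload: inner quotes escaped, and the outer object closed with }.
cat > deleteTest.sample.json <<'EOF'
{"body": "{\"Artist\":\"Sample Artist\",\"SongTitle\":\"Sample Song\"}"}
EOF

# Validity check: this now succeeds, where the unclosed version would fail.
python3 -m json.tool deleteTest.sample.json
```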

[cloud_user@IP ~]$ aws lambda invoke --function-name delete \
> --payload file://deleteTest.json \
> deleteOutfile.txt

To make sure the item was deleted as intended, cat out the deleteOutfile.txt file:

[cloud_user@IP ~]$ cat deleteOutfile.txt

The 200 status code tells us that the item was deleted successfully.


Congratulations! Upon completing this lab, you now know how to create Lambda functions from the command line in two different ways, along with how to invoke and test those functions.