
Working with AWS SQS FIFO Queues

Hands-On Lab







In this hands-on lab, you will learn how to create and interact with first-in-first-out (FIFO) SQS queues. This lab will provide an important distinction from SQS standard queues, noting the key characteristics and functionality differences between the two. By the end of the lab, you should feel comfortable creating and interacting with SQS FIFO queues. You will also learn and become familiar with various API actions used to interact with the SQS service via the AWS CLI and AWS SDKs.



The biggest benefit of a FIFO queue is that messages are guaranteed to be processed in the same order they were received.


We will be using an EC2 instance to act as both a producer and a consumer of an SQS FIFO queue. By using it along with the AWS SDK for Python (Boto3), we will gain an understanding of how to send and consume messages from a queue.

Introduction to the Live AWS Environment with SQS FIFO Queues

If you haven't already done so, sign in to the EC2 instance using the credentials provided on the lab page. If you're on a Mac, use the Terminal app. If you're using Windows, you will need an SSH client such as PuTTY.

We need to open three terminals and sign in to the EC2 instance in each of them, because we're going to run a few different scripts simultaneously as we work through the lab.

Enter the same command in each terminal:

$ ssh cloud_user@<PUBLIC_IP_ADDRESS>

Make sure to replace <PUBLIC_IP_ADDRESS> with the public IP address of your EC2 instance listed on the lab page.

When the prompt asks if you're sure you want to continue connecting, enter yes.

Then, paste in the password provided on the lab page.

Repeat this process for three different connections to this EC2 instance.

Working with SQS FIFO Queues

Diving into the Files We'll Use

We have a few different files we'll be using to interact with the SQS service. They live in the home directory we land in when we sign in to the EC2 instance, so they're accessible from any of the terminals we've opened.

Note: We're eventually going to run different scripts in each of the terminals we have open. Once we get to that point, make sure you remember which one is running which.

Let's take a look at some of these scripts. List them out by entering:

$ ls
The Script

In any of the terminals, open the script in Vim by entering:


This file uses Boto3 to create a new FIFO queue in SQS. It uses the SQS client created by Boto3 to call the create_queue method and create a new queue called mynewq.fifo. The .fifo suffix is required in FIFO queue names, and we're also required to pass a FifoQueue attribute of true in the Attributes section.

Additionally, we have to set up a way to deduplicate content coming into the queue. In this case, we enabled ContentBasedDeduplication and set it to true, which tells SQS to use a hash of the message body to detect duplicates.

We've also set DelaySeconds to 5, which means we delay items that go into the queue for five seconds before they're available to be picked up by consumers. This will appear when we're monitoring our queue a little later.

Finally, we have MessageRetentionPeriod, which, like the DelaySeconds attribute, is specified in seconds. It determines how long messages stay in the queue if they're never picked up and deleted.

At the bottom, we're printing out the response attribute of QueueUrl, which we'll be using to interact with the queue after we've created it.
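Based on that description, the creation script presumably looks something like the sketch below. The names and the retention value are illustrative, and the client is passed in as an argument so the function can also be exercised with a stub:

```python
# Sketch of the queue-creation logic described above (hypothetical
# structure; the lab's actual script may differ in its details).

def create_fifo_queue(sqs, name="mynewq.fifo"):
    """Create a FIFO queue and return its URL."""
    response = sqs.create_queue(
        QueueName=name,  # the .fifo suffix is mandatory for FIFO queues
        Attributes={
            "FifoQueue": "true",                  # required for FIFO queues
            "ContentBasedDeduplication": "true",  # dedupe on a hash of the body
            "DelaySeconds": "5",                  # hold new messages for 5 seconds
            "MessageRetentionPeriod": "345600",   # 4 days, in seconds (illustrative)
        },
    )
    return response["QueueUrl"]

# In the lab, sqs would be boto3.client("sqs"), and the script would
# finish with: print(create_fifo_queue(sqs))
```

Note that SQS attribute values are passed as strings, even for the numeric settings.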

Close out of this file by typing:

:q!
In the same terminal, run the script by entering:

$ python3.5

When we hit Enter, it should run the entire chunk of code we just saw and produce our new QueueUrl for our recently created queue. Copy the URL.

The Script

Let's use Vim again. Staying in the same terminal, enter:

$ vim

This is a fairly simple file that allows us to set a value for the QUEUE_URL and eventually import it in other files.

Press a to enter insert mode, and replace REPLACEME with our new queue URL.

To save the changes and overwrite the file, hit Escape and enter:

:wq!
Now, read the file contents out to the terminal with this:

$ cat

We'll see it now contains our new FIFO queue URL.
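After the edit, the file presumably contains little more than a single constant, something like this (the URL shown is illustrative; yours will be the one printed by the creation script):

```python
# Shared queue URL, imported by the other scripts in the lab.
# The URL below is a placeholder; use the one your script printed.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/mynewq.fifo"
```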

If we want, we can clear the screen by entering:

$ clear

List out the files again:

$ ls

The data.json File

Let's take a look at our data.json file. In the same terminal, enter:

$ cat data.json

This file contains donation data to be processed and later stored in some way. In this case, we're simply going to put the information into the SQS queue, log it to the console, and then delete it from the queue. During the in-between step, though, we could do things like record it in a DynamoDB table or run some sort of analytics. Since each record includes an email address, we could also contact the donor and thank them for their donation. Each of these actions could be taken independently, and we'll take a look at how we could do them.
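Judging by the output we'll see from the consumer later, each record presumably looks something like the following (the second entry and the exact field names are illustrative; the lab's actual file may differ):

```json
[
  {"job": "NewDonor", "name": "Ford Castro", "email": "", "donation": 31.81},
  {"job": "NewDonor", "name": "Jane Doe", "email": "", "donation": 12.50}
]
```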

Feel free to clear the screen again.

The Script

Before we start throwing things into our queue, note that we can use this script to clear the queue out completely.

Let's see what this file looks like. In the same terminal, enter:

$ cat

This uses the purge_queue method of the SQS client created from Boto3, passing the queue URL we just set, to remove every message from the queue.

If we wanted to run the example multiple times, or we got stuck and needed to start over, purge_queue would allow us to reset the example to a completely empty queue that has no messages inside it.
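A minimal sketch of that purge logic, assuming the structure described above (the real script would call it with boto3.client("sqs") and the imported queue URL):

```python
# Sketch of the purge script's core (hypothetical structure).

def purge(sqs, queue_url):
    # Removes every message in one call. Note that AWS allows only one
    # purge per queue every 60 seconds.
    sqs.purge_queue(QueueUrl=queue_url)
```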

Feel free to purge the current queue at any time. Try it now by entering:

$ python3.5

This won't show any output — it should just empty the current queue.

Clear out of this if you'd like with:

$ clear

Monitoring Our Message Status

Now, we're going to set up a monitor of our current queue.

In another empty terminal, enter:

$ python3.5

This checks the status of our queue, displaying the approximate number of:

  • Messages inside our queue
  • Visible messages
  • Delayed messages

We'll keep this terminal open to keep track of the queue status. If you'd like, make it small enough to just fit the short list of statuses and set it off to the side. We'll mainly work with the remaining two terminals and leave this one alone to monitor the status as we go along.
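The monitor presumably polls get_queue_attributes for the three approximate counts. A sketch with hypothetical names (the lab's script likely loops and reprints these every second or so):

```python
# Sketch of the queue monitor's core (hypothetical names).

ATTRS = [
    "ApproximateNumberOfMessages",            # visible and ready to receive
    "ApproximateNumberOfMessagesNotVisible",  # received but not yet deleted
    "ApproximateNumberOfMessagesDelayed",     # still in the DelaySeconds window
]

def queue_counts(sqs, queue_url):
    response = sqs.get_queue_attributes(QueueUrl=queue_url, AttributeNames=ATTRS)
    return {name: int(response["Attributes"][name]) for name in ATTRS}
```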

Let's move on to our producer.

Starting Up Our Producer

To begin, in one of the other two terminals, enter:

$ python3.5

As this file runs, it's adding messages to our SQS queue. We can see in our message status terminal that there is an approximate number of messages inside the queue, as well as an approximate number of delayed messages (which hovers around 5 for a while).

Let's take a closer look.

In the third terminal, use Vim to open our file by entering:

$ vim

In this file, we're doing similar things to what we did earlier in creating our SQS client, as well as interacting with the data.json file in the current directory to take the data and put it into a FIFO queue.

In the message status terminal, we should see there aren't any delayed messages anymore. But why was it initially five?

In the file, we'll see we're sleeping for one second (indicated by time.sleep(1)) between each iteration of adding a message to the queue with the sqs.send_message method, so we send roughly one message per second. And, if we remember back to our creation of this FIFO queue, each message is delayed for five seconds before it becomes available in the queue. With one message produced per second and each held back for five seconds, about five messages are sitting in the delay stage at any moment, which is why we see approximately five delayed messages. When a message's five seconds run out, it's added to the queue and reflected in the ApproximateNumberOfMessages.
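Putting that together, the producer's core loop presumably looks something like this sketch. The structure and group name are hypothetical; note that MessageGroupId is required when sending to a FIFO queue, and because we enabled ContentBasedDeduplication we don't need to supply a MessageDeduplicationId:

```python
import json
import time

# Sketch of the producer loop (hypothetical structure).

def produce(sqs, queue_url, records, pause=1.0):
    for record in records:
        sqs.send_message(
            QueueUrl=queue_url,
            MessageBody=json.dumps(record),
            MessageGroupId="donations",  # FIFO ordering is per message group
        )
        time.sleep(pause)  # roughly one message per second, as in the lab
```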

As we can see at the end, the delayed count eventually drops to zero, because no new messages are being produced and every message has landed in the queue. Now, we have exactly 50 messages in this queue.

Close out of the file by entering:

:q!
In the fifo_producer terminal, we should see it's finished running, which means all 50 JSON elements inside our data.json file have been added to the queue, in order, as we expected.

Now, let's run our FIFO consumer.

Starting Up Our Consumer

In the same terminal where we just took a deeper look at the file, which should be empty now, enter:

$ python3.5

As this fifo_consumer runs, we'll see it's receiving a message and then deleting the message after it's been received. Basically, we're receiving the messages created by fifo_producer and extracting metadata from them. Here's an example of the data we're seeing:

JOB TYPE: NewDonor
BODY: {"email": "", "donation": 31.81, "name": "Ford Castro"}

Once it's done running, let's open the fifo_consumer inside Vim by entering:

$ vim

Inside this fifo_consumer, we'll see we're using the sqs.receive_message method and the QueueUrl we defined earlier, as well as making sure we're getting all the message attributes and only getting one message at a time (by setting MaxNumberOfMessages equal to 1).

The VisibilityTimeout is set equal to 5, which means when we receive a message, we set a visibility timeout on that message of five seconds. So, for the next five seconds, no other consumer should be able to see that message inside our queue.

The WaitTimeSeconds is set to 10, which means if we run the consumer right now and there's no information inside the queue, it will wait up to 10 seconds for information to appear in the queue before it finally times out and no longer waits to process the information.

We've also inserted an artificial delay in here where we're waiting two seconds (using time.sleep()) to slow down the process so we can see it as it moves through each step.

Below that, we'll see the sqs.delete_message method, where we're using the QueueUrl and the ReceiptHandle value we got from the response when we were receiving a message. It's essentially proof to say we got the message, and we want to delete it now.
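Pulling those pieces together, the consumer's receive/delete cycle presumably looks something like this sketch (hypothetical structure, with the processing delay made a parameter):

```python
import time

# Sketch of the consumer's receive/delete cycle (hypothetical structure).

def consume_one(sqs, queue_url, work_seconds=2):
    response = sqs.receive_message(
        QueueUrl=queue_url,
        MessageAttributeNames=["All"],  # fetch all message attributes
        MaxNumberOfMessages=1,          # one message per call
        VisibilityTimeout=5,            # hide it from other consumers for 5 s
        WaitTimeSeconds=10,             # long-poll up to 10 s if the queue is empty
    )
    for message in response.get("Messages", []):
        print("BODY:", message["Body"])
        time.sleep(work_seconds)  # artificial processing delay
        sqs.delete_message(
            QueueUrl=queue_url,
            ReceiptHandle=message["ReceiptHandle"],  # proof we received it
        )
```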

The ReceiptHandle is only valid for as long as the VisibilityTimeout remains, so let's add an artificial time.sleep() here.

Above the sqs.delete_message method, we should see:

# If our task takes too long we can't delete it
# time.sleep(5)

Press a to enter insert mode. To enable the artificial time.sleep(), delete the # before time.sleep(5).

With this uncommented, in addition to the time.sleep(2) from earlier, the script will take at least seven seconds to execute for each of these messages. Since we set the VisibilityTimeout to only five seconds, those seven seconds will outlast the timeout, which is going to cause a problem.

To save the script and overwrite the file, hit Escape and enter:

:wq!
Clear the screen if you'd like:

$ clear

Let's try to use the file again. Enter:

$ python3.5

It might take a few seconds to start running now that we've added the five-second wait. Once it finishes, we're going to get an error: when we try to call the delete_message operation, we see our ReceiptHandle is invalid because it has expired. This is a reminder that when we process the contents of the queue, we need to set a large enough VisibilityTimeout that we can still delete each message afterward, if that's our intention. We can fix this by going back into the file and either reducing the wait time or increasing the VisibilityTimeout. Let's do the latter.

Open the file with Vim:

$ vim

Change the VisibilityTimeout from 5 to 10. Save and overwrite the file by hitting Escape and entering:

:wq!
Now, let's try running it again.

But first, start from scratch by entering:

$ python3.5

Clear both the fifo_producer and soon-to-be fifo_consumer terminals to get a clean slate:

$ clear

Starting It All Up Together

Let's start the consumer before the producer. In one of the blank terminals, enter:

$ python3.5

It's running, but there isn't anything to process. It's going to wait for up to 10 seconds, which we set as the potential wait time if there's no information available in the queue. After 10 seconds, it's going to time out, and we'll get an error.

Let's try something a little different. Type the following into the same terminal, but don't hit Enter:

$ python3.5

In the other empty terminal, type the following, but don't hit Enter:

$ python3.5

Now, go back to the consumer terminal and hit Enter, and then immediately hit Enter in the producer terminal.

The producer will start adding information to the queue, but each message is delayed for five seconds before it becomes visible. Once messages are added, we'll see the FIFO consumer start consuming them. So we can simultaneously add messages to the SQS queue, consume them from the queue, and process them however we'd like.


We've officially completed this lab, sending and consuming messages from a queue.

You did it! Congratulations on completing this lab.