
Working with AWS SQS Standard Queues

Hands-On Lab

 


Length: 00:30:00

Difficulty: Intermediate

In this AWS Live environment, you will learn how to create and interact with SQS standard queues. You will send messages to an SQS queue that you create, and also learn how to run multiple SQS consumers that process queue data at the same time! By the end of this AWS learning activity, you should feel comfortable interacting with the SQS service using the Boto3 SDK for Python, and understand how to send messages to standard queues, set queue attributes, and consume messages from the queues we create.



Introduction

Welcome aboard! In this hands-on lab, we're going to create and interact with a standard SQS queue. At one point during the lab, we'll need five terminal windows open, so be ready. It may be helpful to change the background colors of each terminal to help keep them separated as each one runs different tasks.

Log In

Let's get those terminals fired up right off the bat. Use the credentials provided on the Linux Academy lab page, and open five different terminal windows to log in to the public IP address of the machine that's been provided.

Create a Standard SQS Queue

Let's first create a queue using the Python script sitting in our home directory. In one of the terminal windows, enter:

[cloud_user@host]$ python3.5 create_queue.py
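For reference, here's a minimal sketch of what a script like create_queue.py might contain, using the Boto3 SDK. The queue name mynewq is an assumption based on the URL shown below; the lab's actual script may differ.

import boto3

# Create an SQS client using the credentials configured on this machine
sqs = boto3.client('sqs')

# Standard is the default queue type, so no extra attributes are required
response = sqs.create_queue(QueueName='mynewq')

# Print the queue URL so we can paste it into sqs_url.py
print(response['QueueUrl'])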

When the command finishes, we'll see some output we'll need later: the URL of our queue, which should look something like https://queue.amazonaws.com/xxxxxxxxxxxx/mynewq. Let's copy that URL and then edit the sqs_url.py file. Update the file so it reads:

QUEUE_URL = 'https://queue.amazonaws.com/xxxxxxxxxxxx/mynewq'

Make sure our edit stuck with:

[cloud_user@host]$ cat sqs_url.py

Monitor the Queue

In another of the windows, we want to start up a script that will keep track of what's going on in our queue. We're going to run that script with this:

[cloud_user@host]$ python3.5 queue_status.py

That's going to be checking on messages in our queue, so we'll just leave it running in that window.
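Under the hood, a monitoring script like this typically polls the queue's attributes in a loop. Here's a rough sketch of what queue_status.py might look like; the attribute names are real SQS attributes, but the loop and polling interval are assumptions:

import time
import boto3
from sqs_url import QUEUE_URL  # the URL we saved earlier

sqs = boto3.client('sqs')

while True:
    attributes = sqs.get_queue_attributes(
        QueueUrl=QUEUE_URL,
        AttributeNames=[
            'ApproximateNumberOfMessages',
            'ApproximateNumberOfMessagesNotVisible',
            'ApproximateNumberOfMessagesDelayed',
        ],
    )['Attributes']
    for name, value in sorted(attributes.items()):
        print(name, value)
    time.sleep(5)  # the real script's polling interval may differ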

Send Data

In a third terminal, we're going to run another script that will start pushing data to the queue.

[cloud_user@host]$ python3.5 slow_producer.py

Information will start flying through this window, and we're just going to leave it running. After a few seconds, our messages window will start to change: the numbers for the different kinds of messages will begin to climb. Once it has sent everything, the slow_producer.py script will exit, leaving us with 50 messages total. Now, let's run fast_producer.py:

[cloud_user@host]$ python3.5 fast_producer.py

The only difference between the two scripts is how long they wait between sending messages: the slow one waits 10 seconds, while the fast one waits only one second. Watching the messages window while fast_producer.py runs, we'll see far fewer messages in the ApproximateNumberOfMessagesDelayed row than we did with slow_producer.py.
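That wait could be implemented as a pause between sends, but since the messages window shows an ApproximateNumberOfMessagesDelayed count, the producers likely set the DelaySeconds parameter on each message. Here's a rough sketch of what slow_producer.py might do; the message count and body format are assumptions, and fast_producer.py would presumably be identical with a delay of 1:

import boto3
from sqs_url import QUEUE_URL

sqs = boto3.client('sqs')
DELAY_SECONDS = 10  # fast_producer.py would use 1

for i in range(50):  # the lab's counters suggest 50 messages per producer
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody='message number {}'.format(i),
        # Each message stays in the "delayed" state this long before
        # becoming available to consumers
        DelaySeconds=DELAY_SECONDS,
    )
    print('sent message', i)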

Consume the Data

In a fourth terminal, let's run another script, one that will receive messages from the queue and extract some metadata from them.

[cloud_user@host]$ python3.5 fast_consumer.py

In our messages window, we'll see the ApproximateNumberOfMessages count, which started at 100, begin to drop, while data flies by in the fourth terminal. In the fifth terminal, let's run yet another script:

[cloud_user@host]$ python3.5 slow_consumer.py

This does pretty much the same thing as the fast version of the script, just more slowly. All the while, our ApproximateNumberOfMessages count keeps dropping. So we've got not one but two scripts running that are both snagging messages out of the queue, extracting some data from them, and then deleting them.
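Both consumers likely follow the same receive, process, delete pattern. Here's a minimal sketch of what a script like fast_consumer.py might look like; the polling parameters and printed fields are assumptions:

import boto3
from sqs_url import QUEUE_URL

sqs = boto3.client('sqs')

while True:
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=1,
        WaitTimeSeconds=10,  # long polling; the real settings may differ
    )
    for message in response.get('Messages', []):
        # Extract some metadata: the message ID and body are obvious candidates
        print(message['MessageId'], message['Body'])
        # Delete the message so it isn't redelivered after its
        # visibility timeout expires
        sqs.delete_message(
            QueueUrl=QUEUE_URL,
            ReceiptHandle=message['ReceiptHandle'],
        )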

Clean Up

We've got another exercise coming up in a couple of minutes, so let's clear things out using a purging script. Run this in any window to wipe the slate clean:

[cloud_user@host]$ python3.5 purge_queue.py
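The purge script probably just wraps the SQS purge call. A minimal sketch of what purge_queue.py might contain:

import boto3
from sqs_url import QUEUE_URL

sqs = boto3.client('sqs')

# Deletes every message in the queue; note that SQS only allows
# one purge per queue every 60 seconds
sqs.purge_queue(QueueUrl=QUEUE_URL)
print('queue purged')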

Do It All at Once

Let's pull out all the stops, and run them all at the same time! Terminal 1 is the messages window. Terminal 2 is going to be running slow_producer.py. Terminal 3 is going to be running fast_producer.py. Terminal 4 is going to be running slow_consumer.py. And Terminal 5 is going to be running fast_consumer.py.

Fire up the two slow scripts in their respective terminal windows, first the producer and then the consumer. Now, run fast_consumer.py in its terminal window and watch the messages window: the number of messages should start going down. If we then fire up fast_producer.py, we'll see the numbers go up again. This will continue until the producer scripts run out of messages to send and the consumer scripts have processed all of the messages. Eventually, all three rows in our messages window will show zero.

Conclusion

We've officially completed this lab, sending and consuming messages to and from a standard SQS queue.

You did it! Congratulations on completing this lab.