
DynamoDB Data Loading and Write Operations

Hands-On Lab

 


Length

00:30:00

Difficulty

Intermediate


Introduction

Welcome to this AWS Hands-On Lab in which you will learn how to use DynamoDB, specifically in the area of data loading and write operations. By completing this activity, you will gain experience using the AWS SDK (Boto3) to interact with DynamoDB via relevant API actions. You will also become familiar with important concepts such as primary keys (partition and sort keys), as well as attribute types, write operations, conditional writes, and atomic counters. All these concepts will give you a more solid and practical understanding of the DynamoDB service.

Getting Started

Let's get started by logging in to the AWS console to gather some information about a DynamoDB table that we will be interacting with. The necessary links and credentials are provided on the hands-on lab page under the Credentials section.

  1. Click on the Open AWS Console button to open the AWS console login page.
  2. Copy and paste the username and password listed under AWS ACCOUNT in the Credentials section of the hands-on lab page. The Account ID or alias field should be filled in for you already.
  3. Now that we are logged in to the AWS console, we need to connect to our server's public IP address using SSH.

    • Open a Terminal window on your local machine

    • Connect to your server by entering the following command, replacing <PUBLIC IP> with the IP you are provided with under Public IP address of public instance in the Credentials section of the hands-on lab page:

      ssh cloud_user@<PUBLIC IP>
    • At the prompt "Are you sure you want to continue connecting (yes/no)?", enter yes

    • Use the password for the cloud_user provided under Cloud Server and Public Instance in the Credentials section of the hands-on lab page.

Now that we are logged in to the AWS console and connected to our server using SSH, we are ready to get started!


Load the DynamoDB table with data

Let's start by navigating to the DynamoDB section in the AWS Console. Type DynamoDB into the AWS services search bar near the top of the page and click on DynamoDB in the list, or find the link directly by expanding the All Services section under the search bar.

Click on Tables.

Click on the name of our table, MusicAlley, to bring up more information about the table.

Under the Items tab, we see there are no items currently in the table. Here we can also see that the Partition key (Artist) and the Sort key (SongTitle) have already been created for us.
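Under the hood, DynamoDB's low-level API represents every attribute as a value tagged with a type descriptor: 'S' for string, 'N' for number (always sent as a string). As a rough sketch, an item in this table might look like the following; the price value shown is a hypothetical example, not data from the lab:

```python
# One MusicAlley item in DynamoDB's low-level, type-tagged format.
# "S" marks a string value, "N" a number (numbers travel as strings).
item = {
    "Artist": {"S": "Anthony Haslett"},  # partition key
    "SongTitle": {"S": "Ivory Maroon"},  # sort key
    "price": {"N": "1.99"},              # hypothetical numeric attribute
}
```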

Our goal is to upload the information in the songs.json file from our EC2 instance to this table. In order to accomplish this goal, we must first edit the upload.py file, also available on the instance, so that it will work correctly.

  1. Open the file in a text editor:

    sudo vim upload.py

    You may have to re-enter the cloud_user password, which is the same password used to initially SSH into the instance, available in the Credentials section of the hands-on lab page.

  2. Correct the table name:

    • Change:

      TableName='aTable?'
    • To:

      TableName='MusicAlley'
  3. Correct the Artist data type:

    • Change:

      'Artist':{'A':song['Artist']},
    • To:

      'Artist':{'S':song['Artist']},
  4. Correct the price data type:

    • Change:

      'price':{'P':song['price']},
    • To:

      'price':{'N':song['price']},
  5. Correct the address data type:

    • Change:

      'address':{'A':song['address']},
    • To:

      'address':{'S':song['address']},
  6. Save the file by pressing the Escape key, typing :wq!, and pressing the Enter key.

  7. Now, run the upload.py script by running the command:

    python3.6 upload.py

Refreshing the Amazon console page will now show that the values in our songs.json file have been uploaded to our MusicAlley table.
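Putting the corrections together, the fixed upload.py plausibly looks something like the sketch below. The exact field set in songs.json, the helper names, and the use of put_item (rather than batched writes) are all assumptions, not the lab's actual file:

```python
import json

def to_item(song):
    # Map a plain song record (from songs.json) to DynamoDB's typed format.
    # The field names mirror the ones corrected in upload.py; the exact
    # structure of songs.json is an assumption.
    return {
        "Artist": {"S": song["Artist"]},        # partition key: string
        "SongTitle": {"S": song["SongTitle"]},  # sort key: string
        "price": {"N": str(song["price"])},     # numbers are sent as strings
        "address": {"S": song["address"]},
    }

def upload(path="songs.json", table="MusicAlley"):
    import boto3  # imported here so to_item() can be used without AWS access
    client = boto3.client("dynamodb")
    with open(path) as f:
        for song in json.load(f):
            client.put_item(TableName=table, Item=to_item(song))

if __name__ == "__main__":
    upload()
```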

Now, we need to fix the atomic_counter.py file.

  1. Open the script in a text editor:

    sudo vim atomic_counter.py
  2. Correct the table name:

    • Change:

      TableName='whatisthis_silly_tablename?',
    • To:

      TableName='MusicAlley',
  3. Correct the Artist data type:

    • Change:

      'Artist':{'N': "Anthony Haslett"},
    • To:

      'Artist':{'S': "Anthony Haslett"}
  4. Correct the SongTitle data type:

    • Change:

      'SongTitle':{'N':"Ivory Maroon"}
    • To:

      'SongTitle':{'S':"Ivory Maroon"}
  5. Set an increment value:

    • Change:

      ':inc': {'N': ''}
    • To:

      ':inc': {'N': '3'}
  6. Save the file by pressing the Escape key, typing :wq!, and pressing the Enter key.

  7. In the Amazon Console, in the Items tab, select Query from the dropdown at the top-left of the grey search box. Set the Partition key, Artist, to the String Anthony Haslett, and the Sort key, SongTitle, to the String Ivory Maroon. Click the Start Search button to show the result and its current price.
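The console search in the step above corresponds to the Query API action. A minimal sketch of the equivalent request parameters follows; the expression placeholder names are ours, not taken from the lab files:

```python
def build_query(artist="Anthony Haslett", title="Ivory Maroon"):
    # A Query must pin the partition key exactly; here the sort key is
    # pinned too, so at most one item is returned.
    return {
        "TableName": "MusicAlley",
        "KeyConditionExpression": "Artist = :a AND SongTitle = :t",
        "ExpressionAttributeValues": {
            ":a": {"S": artist},
            ":t": {"S": title},
        },
    }

def run_query():
    import boto3  # imported here so build_query() stays usable offline
    return boto3.client("dynamodb").query(**build_query())
```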

Now, let's run the atomic_counter.py script:

python3.6 atomic_counter.py

Press the Start Search button again to show the updated price.
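An atomic counter works by sending an UpdateItem request whose update expression adds :inc to the stored number in a single atomic step, so concurrent increments never overwrite each other. A hedged sketch of what atomic_counter.py plausibly does; the exact update expression in the lab file is an assumption:

```python
def build_update(inc="3"):
    # ADD on a number attribute increments it atomically on the server.
    return {
        "TableName": "MusicAlley",
        "Key": {
            "Artist": {"S": "Anthony Haslett"},
            "SongTitle": {"S": "Ivory Maroon"},
        },
        "UpdateExpression": "ADD price :inc",
        "ExpressionAttributeValues": {":inc": {"N": inc}},
    }

def run():
    import boto3  # imported here so build_update() stays usable offline
    boto3.client("dynamodb").update_item(**build_update())

if __name__ == "__main__":
    run()
```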

Now, let's edit the conditional_write.py file:

  1. Open the script in a text editor:

    sudo vim conditional_write.py
  2. Correct the table name:

    • Change:

      TableName='someTableNameIthink',
    • To:

      TableName='MusicAlley',

We need to see what the current value is so we know what to change it to. In the Amazon Console, query the MusicAlley table using the Partition key (Artist) value of Anthony Haslett and the Sort key (SongTitle) value of Blue Magenta. Note the value shown in the price column.

  1. Set the current value (currval) to the price shown in our search:

    • Change:

      ':currval': {'N': '1.2345'}
    • To:

      ':currval': {'N': '18.37'}
  2. Save the file by pressing the Escape key, typing :wq!, and pressing the Enter key.

  3. Run the conditional_write.py script to execute our changes:

    python3.6 conditional_write.py

We can run our table search again to show the updated price and verify our conditional_write.py script ran successfully.
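A conditional write only succeeds if a condition expression holds against the item's current state; otherwise DynamoDB rejects it with ConditionalCheckFailedException. A sketch of what conditional_write.py plausibly does, assuming it sets a new price only when the stored price still matches :currval (the new value and expression names are our assumptions):

```python
def build_conditional_update(currval, newval):
    # The update is applied only if the stored price still equals
    # :currval, so a concurrent writer's change cannot be silently lost.
    return {
        "TableName": "MusicAlley",
        "Key": {
            "Artist": {"S": "Anthony Haslett"},
            "SongTitle": {"S": "Blue Magenta"},
        },
        "UpdateExpression": "SET price = :newval",
        "ConditionExpression": "price = :currval",
        "ExpressionAttributeValues": {
            ":currval": {"N": currval},
            ":newval": {"N": newval},
        },
    }

def run():
    import boto3  # imported here so build_conditional_update() stays usable offline
    client = boto3.client("dynamodb")
    try:
        client.update_item(**build_conditional_update("18.37", "20.00"))
    except client.exceptions.ConditionalCheckFailedException:
        # The stored price no longer matches: re-read the item and retry.
        print("conditional check failed")

if __name__ == "__main__":
    run()
```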

Conclusion

By completing this activity, you have gained experience using the AWS SDK (Boto3) to interact with DynamoDB via relevant API actions. You have also become familiar with important concepts such as primary keys (partition and sort keys), as well as attribute types, write operations, conditional writes, and atomic counters. Great job!