Integrating Cloud Functions and Cloud SQL

Hands-On Lab

Joseph Lowery

Google Cloud Training Architect II in Content

Length: 00:30:00

Difficulty: Beginner

Microservices like Google Cloud Functions make it possible for serverless architectures to scale effortlessly. Where Cloud Functions really shine is in interacting with other Google Cloud services, such as Cloud SQL. In this hands-on lab, we’ll set up a Cloud SQL instance and database and then build a Cloud Function that, once triggered, queries a massive dataset and outputs the requested results to the log.

What are Hands-On Labs?

Hands-On Labs are scenario-based learning environments where learners can practice without consequences: no compromised systems, no money wasted on expensive downloads. Practice real-world skills without the real-world risk, no assembly required.

Introduction

In this hands-on lab, we’ll set up a Cloud SQL instance and database and then build a Cloud Function that, once triggered, queries a massive dataset and outputs the requested results to the log.

How to Log in to Google Lab Accounts

On the lab page, right-click Open GCP Console and select the option to open it in a new private browser window (this option will read differently depending on your browser — e.g., in Chrome, it says "Open Link in Incognito Window"). Then, sign in to Google Cloud Platform using the credentials provided on the lab page.

On the Welcome to your new account screen, review the text, and click Accept. In the "Welcome L.A.!" pop-up that appears once you're signed in, check the box to agree to the terms of service, choose your country of residence, and click Agree and Continue.

Now, on to the lab!

Enable APIs.

  1. From the main Google Cloud console navigation, choose APIs & Services > Library.
  2. Search for Cloud SQL, and enable the service if necessary.
  3. Return to the API Library page, and search for Cloud Functions, enabling it if necessary.
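
If you prefer the command line, the same services can be enabled from the Cloud Shell in one step (these are the standard service names for the Cloud SQL Admin and Cloud Functions APIs):

    gcloud services enable sqladmin.googleapis.com cloudfunctions.googleapis.com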

Create a Cloud Storage bucket.

  1. From the main navigation, go to Storage > Browser.
  2. Choose Create bucket.
  3. In the Name field, enter a globally unique name (e.g., "la-sql-" with a series of numbers at the end, like "8675309").
  4. From the Default storage class options, choose Regional.
  5. Leave the remaining values as their defaults, and click Create.
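
Alternatively, the bucket can be created from the Cloud Shell. This is a sketch, assuming the us-central1 region; substitute your lab's default region and your own bucket name:

    gsutil mb -c regional -l us-central1 gs://<BUCKET_NAME>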

Clone a GitHub repo, and copy files to the bucket.

  1. Activate the Cloud Shell by clicking the icon in the top row.

  2. From the Cloud Shell, issue the following command to clone the repository for this course:

    git clone https://github.com/linuxacademy/content-gc-functions-deepdive
  3. Change directory:

    cd content-gc-functions-deepdive/cloud-functions-sql-lab
  4. Copy the necessary files to the Cloud Storage bucket:

    gsutil cp Met* gs://<BUCKET_NAME>
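
To confirm the copy succeeded, list the bucket's contents; you should see MetObjects_Table.sql and MetObjects.csv:

    gsutil ls gs://<BUCKET_NAME>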

Create a database instance.

  1. From the main console navigation, choose SQL.
  2. Click Create instance.
  3. Click Choose MySQL.
  4. Set the instance ID to la-met.
  5. Set the password to root.
  6. Leave the region and zone options as their defaults.
  7. Click Create.
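
For reference, the same instance can be created from the Cloud Shell. The machine tier below is an illustrative assumption, not a value prescribed by the lab:

    gcloud sql instances create la-met --database-version=MYSQL_5_7 --tier=db-n1-standard-1 --root-password=root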

Create a database, and import the schema.

  1. From the Cloud SQL dashboard, click the la-met instance.
  2. Copy the Instance connection name into a text file, as we'll need it a little later.
  3. Select the Databases tab.
  4. Click Create database.
  5. Name the database la_met_museum.
  6. Leave the other settings as their defaults, and click Create.
  7. Click Import.
  8. Locate the bucket containing the uploaded files by clicking Browse.
  9. Choose MetObjects_Table.sql, and click Select.
  10. Make sure the Format of import is set to SQL.
  11. From the Database list, choose la_met_museum.
  12. Click Import.
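
The database creation and schema import in the steps above also have gcloud equivalents, if you'd rather stay in the Cloud Shell:

    gcloud sql databases create la_met_museum --instance=la-met
    gcloud sql import sql la-met gs://<BUCKET_NAME>/MetObjects_Table.sql --database=la_met_museum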

Import data.

  1. Click Import.
  2. Locate the bucket containing the uploaded files by clicking Browse.
  3. Choose MetObjects.csv, and click Select.
  4. Make sure the Format of import is set to CSV.
  5. From the Database list, choose la_met_museum.
  6. In the Table field, enter MetObjects.
  7. Click Import.
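
The CSV import has an equivalent gcloud form as well; note that it targets the MetObjects table created by the schema import:

    gcloud sql import csv la-met gs://<BUCKET_NAME>/MetObjects.csv --database=la_met_museum --table=MetObjects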

Create a Cloud Function.

  1. In the Cloud Shell, open the Cloud Shell Editor by clicking the pencil icon.
  2. Expand the content-gc-functions-deepdive > cloud-functions-sql-lab folder.
  3. Open the cloud-function-sql-main.py file, select all, and copy.
  4. Navigate to the Cloud Functions dashboard.
  5. Click Create function.
  6. Apply the following settings:
    • Name: la-sql-function-1
    • Trigger: HTTP
    • Source Code: Inline editor
    • Runtime: Python 3.7
  7. In the main.py field, paste the copied code.
  8. Replace the [YOUR_INSTANCE_CONNECTION_NAME] placeholder with your Cloud SQL instance's connection name (the one you saved earlier).
  9. From the Cloud Shell Editor, open requirements.txt, and copy its contents.
  10. On the Create function page in the console, paste the copied code into the requirements.txt field.
  11. In the Function to execute field, enter get_sql_data.
  12. Click Create.
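
For context, here is a minimal sketch of what a handler like get_sql_data might look like. This is not the repository's actual code: it assumes SQLAlchemy and PyMySQL are listed in requirements.txt, and the AccessionYear column name is purely illustrative. Cloud Functions connects to Cloud SQL over a unix socket mounted under /cloudsql/:

    # Illustrative sketch only -- not the repo's main.py.
    # Assumes SQLAlchemy + PyMySQL; names other than the MetObjects
    # table and la_met_museum database are hypothetical.
    import sqlalchemy

    # Cloud Functions reaches Cloud SQL via a unix socket under /cloudsql/.
    db = sqlalchemy.create_engine(
        "mysql+pymysql://root:root@/la_met_museum"
        "?unix_socket=/cloudsql/[YOUR_INSTANCE_CONNECTION_NAME]"
    )

    def get_sql_data(request):
        """HTTP entry point; expects JSON such as {"year": "1920"}."""
        payload = request.get_json(silent=True) or {}
        year = payload.get("year", "1900")
        with db.connect() as conn:
            rows = conn.execute(
                sqlalchemy.text(
                    "SELECT Title FROM MetObjects "
                    "WHERE AccessionYear = :y LIMIT 10"
                ),
                {"y": year},
            )
            for row in rows:
                print(row[0])  # print() output lands in the function's logs
        return "OK"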

Test the Cloud Function.

  1. After the Cloud Function is created, click its name.

  2. Choose the Trigger tab, and click the link.

  3. Verify it says "OK" in the browser.

  4. Choose View Logs from the Cloud Functions navigation.

  5. Verify log results.

  6. Back on the Cloud Function page, choose the Testing tab.

  7. In the Triggering event field, enter:

    {"year":"1920"}
  8. Click Test the function.

  9. Review results in the Logs section.

  10. Click to open one of the log entries, and navigate to the URL listed to see the result.
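
You can also exercise the trigger from the Cloud Shell instead of the Testing tab. Substitute the URL shown on your function's Trigger tab:

    curl -X POST -H "Content-Type: application/json" -d '{"year":"1920"}' <TRIGGER_URL>
    gcloud functions logs read la-sql-function-1 --limit 20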

Conclusion

Congratulations on completing this lab!