
Azure Storage Deep Dive

Course

Intro Video


James Lee

Training Architect

Length

16:00:00

Difficulty

Intermediate

Videos

40

Hands-on Labs

15

Quizzes/Exams

1

Course Details

This course is designed to provide a comprehensive overview of the core Azure Storage services, including:

Azure Blob storage, Azure Data Lake Storage Gen2, Azure Files, Azure File Sync, Azure Queue storage, Azure Table storage, and Azure Disks

Throughout this course, you will gain confidence in:

Selecting the appropriate service for various solutions and scenarios; configuring, implementing, and administering these services; and developing solutions which leverage these services

This course provides a range of content, including lessons, hands-on labs, quiz questions, and flashcards, all designed to ensure you are capable of leveraging the full power of the Azure Storage services.

If you would like to follow along, or study using the Interactive Diagram, the link is provided in the Using the Interactive Guide lesson.

Syllabus

Welcome to the Course

Course Introduction

00:03:27

Lesson Description:

Welcome to this Azure Storage Deep Dive course. This course is designed to help administrators, architects, and developers understand the key concepts and skills relating to Azure Storage. Throughout this course, we'll be discussing: Azure Blob storage, Azure Data Lake Storage (Gen2), Azure Files, Azure File Sync, Azure Queue storage, Azure Table storage, and Azure Disks. Most modern cloud solutions leverage some form of storage, whether it's for structured data, unstructured data, or even messages between loosely coupled services. To support your ability to build powerful solutions in Azure which leverage Azure Storage, this course provides you with skills and experience configuring, administering, and developing with these services. Starting with an introduction to storage fundamentals, we progressively build knowledge and skills with the Azure Storage services. Course Support: Reach out to me directly with any questions or concerns; my passion is helping you be successful with Azure. If your question relates to an error or issue you've encountered within the Linux Academy website, we have a support team available to help. See the Course Support and Feedback lesson for the many ways to access support.

Course Support and Feedback

00:02:39

Lesson Description:

At Linux Academy, we are very passionate about providing everything needed to be successful on your learning journey. In this lesson, we provide a quick overview of the many tools available to access support, as well as to provide feedback. If you experience any issues with the content, please contact me directly with the details. Course Support: Linux Academy Support: support@linuxacademy.com; James Lee: james.lee@linuxacademy.com. Course Feedback: Enjoying the content? Please leave a thumbs-up! Have concerns or suggestions? Please contact me directly, or leave your comments with a thumbs-down and I will reach out to you to address your issue!

About the Training Architect

00:01:03

Lesson Description:

G'day, everyone! Thanks for joining me. My name is James Lee, and I'll be your training instructor for this deep dive course on Azure Storage. I'm excited to be with you. Author social media (feel free to connect!): Twitter: @jamesdplee; LinkedIn: James Lee

Using the Interactive Guide

00:05:38

Lesson Description:

The Interactive Guide is used throughout the Azure Storage Deep Dive course to help illustrate important concepts and call out key points. Use this interactive diagram while following along with lessons, or as an independent study/reference guide for later use. Link: https://www.lucidchart.com/documents/view/48a5eceb-054a-4398-8a5b-2adf05763ed7

Storage Fundamentals

Data Types

00:11:46

Lesson Description:

Within this fundamentals lesson, we will discuss the following data types: structured, semi-structured, unstructured, and messages. Understanding these data types helps us select the most appropriate Azure Storage service for our solution.
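As an illustrative sketch only (this mapping is a simplification, and real solutions often combine several services), the relationship between these data types and the Azure Storage services covered in this course can be captured in a small lookup:

```python
# Hypothetical, simplified mapping of data types to the Azure Storage
# services discussed in this course (one common choice per type).
DATA_TYPE_TO_SERVICE = {
    "structured": "Azure Table storage",       # keyed entities with defined properties
    "semi-structured": "Azure Table storage",  # schema can vary between entities
    "unstructured": "Azure Blob storage",      # images, videos, logs, backups
    "messages": "Azure Queue storage",         # messages between loosely coupled components
}

def suggest_service(data_type: str) -> str:
    """Return a (simplified) suggested Azure Storage service for a data type."""
    return DATA_TYPE_TO_SERVICE[data_type.lower()]

print(suggest_service("Unstructured"))  # Azure Blob storage
```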

Storage Solutions

00:09:27

Lesson Description:

In this fundamentals lesson, we discuss several of the most common storage solutions available. This includes: block-level storage, file-level storage, object-level storage, queues, and databases. Progressing through these fundamental concepts, we can start to see that there are many different types of storage solutions for the different types of data we've discussed.

Azure Storage Overview

Azure Storage Overview

00:08:01

Lesson Description:

Within this lesson, we'll discuss the Azure Storage collection of services. You'll gain an understanding of: what the Azure Storage service is, the primary services within the Azure Storage service collection, and the high-level purpose of each of the Azure Storage services. This high-level overview helps us to understand the purpose of the different services we'll be discussing throughout this course.

Azure Storage Accounts

00:13:37

Lesson Description:

Storage accounts are at the core of everything we do with the various Azure Storage services. Within this lesson, we'll discuss: the purpose of a storage account, account kind, performance tiers, replication, and access tiers. There are some differences in how these influence the different services, which we'll discuss throughout this course.

Azure Storage Management and Tools

00:09:20

Lesson Description:

Microsoft provides a range of tools to interact with the Azure Storage services across a range of operating systems. Within this lesson, we'll discuss: the Azure Resource Manager (ARM) Application Programming Interface (API); tools for scripting and management (PowerShell and the Azure CLI); tools for management and administration (the Azure Portal and Storage Explorer); and tools for developing solutions which leverage Azure Storage services (Software Development Kits, etc.).

Azure Storage Security

00:12:55

Lesson Description:

Security is more important than ever, with news articles reporting cybercrime a daily occurrence. We all play a role, whether solution architect, developer, or administrator, in ensuring that a solution is secure. This is a multi-layered, end-to-end responsibility. Within this lesson, we will discuss some important security concepts, including: defense in depth, the shared security model, Azure Storage access control, and data security. Throughout this course, we will expand on these principles and features, looking at the common security features available to all Azure Storage services, as well as those specific to individual services.

Hands-on Labs are real live environments that put you in a real scenario to practice what you have learned, without any extra charge or account to manage.

00:30:00

Blob Storage

Blob Storage Overview

00:14:39

Lesson Description:

Welcome to the Blob storage section of this course. Within this overview lesson, we'll take a look at: the purpose of Blob storage, the creation of Blob storage, and common use cases for Blob storage. This overview lesson helps set the foundation for the remaining lessons in this section of the course.

Blob Storage Access Tiers

00:13:01

Lesson Description:

Blob storage access tiers help to provide economical access to storage, where you only pay for the performance you need, based on the age of your data. Access tiers support Blob storage only (Azure Data Lake Storage Gen2 support is in preview). In this lesson, we'll discuss: the purpose of access tiers, common lifecycles of data (active, inactive, archive), and key features of the archive tier. For up-to-date information on Blob storage pricing, refer to the Microsoft Blob storage pricing page.
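The data lifecycles above map onto the tiers roughly as follows; a minimal sketch (the minimum-retention values are indicative only, so check the current Microsoft pricing documentation before relying on them):

```python
# Rough summary of the Blob storage access tiers discussed in this lesson.
# Values are indicative assumptions, not authoritative figures.
ACCESS_TIERS = {
    "hot":     {"intended_for": "frequently accessed (active) data",     "min_retention_days": 0,   "online": True},
    "cool":    {"intended_for": "infrequently accessed (inactive) data", "min_retention_days": 30,  "online": True},
    "archive": {"intended_for": "rarely accessed (archived) data",       "min_retention_days": 180, "online": False},
}

def requires_rehydration(tier: str) -> bool:
    """Archive blobs are stored offline and must be rehydrated before reads."""
    return not ACCESS_TIERS[tier]["online"]

print(requires_rehydration("archive"))  # True
```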

Storage Account Replication

00:07:57

Lesson Description:

Note: This lesson applies to all services within the Azure Storage product suite. Storage account replication helps us to protect the back-end data of our storage service. Within this lesson, we'll discuss: a recap of the different replication options, scenarios the replication options protect against, read access for replicas, and storage account failover. Please be aware that not all Azure Storage services support read access for replicas at this stage.
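The replication options can be summarized in a small lookup; a sketch with copy counts as documented by Microsoft at the time of writing (treat the exact values as assumptions to verify against current documentation):

```python
# Indicative summary of storage account replication options.
REPLICATION = {
    "LRS":    {"copies": 3, "scope": "single datacenter",                "secondary_read": False},
    "ZRS":    {"copies": 3, "scope": "availability zones in one region", "secondary_read": False},
    "GRS":    {"copies": 6, "scope": "paired secondary region",          "secondary_read": False},
    "RA-GRS": {"copies": 6, "scope": "paired secondary region",          "secondary_read": True},
}

def survives_region_outage(option: str) -> bool:
    """Only geo-redundant options keep copies outside the primary region."""
    return REPLICATION[option]["scope"] == "paired secondary region"

print(survives_region_outage("RA-GRS"))  # True
```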

Blob Storage Connectivity: Part I

00:08:52

Lesson Description:

In this two-part lesson, we'll discuss the private and public connectivity of the Azure Blob storage service. Within this first part, we will discuss: public connectivity, custom domain names, and static websites.

Blob Storage Connectivity: Part II

00:14:45

Lesson Description:

In the second lesson of this two-part series, we will be discussing private connectivity for Azure Blob storage. We will discuss: Service Endpoints, Private Link, and the storage account firewall. These features and services help us to ensure that our Azure Blob storage service is secure and/or private.

Blob Storage Security: Part I

00:18:17

Lesson Description:

Please be aware: As this is the first service where we discuss security, this is a detailed lesson. We will use this lesson as a foundation for future security lessons, given the various shared/common security features available across all storage services. Within this detailed two-part lesson, we will take a look at the various security features available for Azure Blob storage. In this first part, we will discuss: container access levels, role-based access control (with Azure AD identities), account keys, Shared Access Signatures, identity-based authorization (User Delegated SAS, and Data Actions), and Access Policies. Note: It is also possible to assign an identity with data-level permissions (this is demonstrated in the Queues development lesson). Commands used within this lesson: The following command creates a user-delegated SAS, associated with the currently authenticated user.

az storage blob generate-sas \
    --account-name latajlstore1 \
    --container-name test1 \
    --name logfile.txt \
    --permissions r \
    --expiry 2020-01-20 \
    --auth-mode login \
    --as-user \
    --full-uri
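The SAS produced by a command like the one above is simply a URL query string of key-value pairs (e.g. sp for permissions, se for expiry, sig for the signature). A minimal offline sketch, using Python's standard library and a made-up token, of pulling those fields apart:

```python
from urllib.parse import parse_qs

# A made-up (non-functional) SAS token, for illustration only.
sample_sas = "sp=r&se=2020-01-20T00:00:00Z&sv=2019-02-02&sr=b&sig=FAKESIGNATURE"

# parse_qs returns lists of values; take the first of each
fields = {k: v[0] for k, v in parse_qs(sample_sas).items()}

print("Permissions:", fields["sp"])  # r (read-only)
print("Expiry:", fields["se"])       # 2020-01-20T00:00:00Z
print("Signed version:", fields["sv"])
```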

Blob Storage Security: Part II

00:04:14

Lesson Description:

This is part two of a two-part lesson on security for Blob storage. Within this lesson, we will discuss: Secure Transfer (encryption of data in transit), and Storage Service Encryption (SSE; encryption of data at rest).

Developing with Blob Storage: Part I

00:09:01

Lesson Description:

Please note: As this is the first lesson where we are developing with an Azure Storage service, additional information is provided, and so this lesson is split into two parts. Within part one, we will discuss: developing with software development kits (SDKs), developing with the REST API, and useful documentation centers for Azure development. In part two, we will get started developing a solution. Useful links used in this lesson: Azure SDKs and Tools; Azure REST API.

Developing with Blob Storage: Part II

00:14:45

Lesson Description:

In this second part of the two-part series on developing with Blob storage, we will walk through a solution demonstration. With this demonstration, you will learn: how to use the Python SDK to develop with Blob storage, how to use the relevant classes and methods to interact with the Blob storage service, and how to authenticate and connect to Blob storage. Within this example, we will consider a scenario with two components: the first component is the service tier, and the second is the client. The service tier will use the account key, while the client will use a SAS generated by the service tier. Note: This is a demonstration only; a proper best-practice solution would generally look quite different, including error handling and much more. Demonstration code: blob_demo.py

# General packages, as required
import uuid
from datetime import datetime, timedelta



# For working with the blob service, at the storage account level
from azure.storage.blob import BlobServiceClient

# For working with containers, within a storage account
from azure.storage.blob import ContainerClient

# Use the SAS generator function, and SAS permissions class for containers
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

connect_str = "<ADD YOUR FULL CONNECTION STRING HERE>"
user_container_name = "userimages-id-" + str(uuid.uuid4())[0:4]

# Create the `BlobServiceClient` object which will be used to create a container client
blob_service_client = BlobServiceClient.from_connection_string(connect_str)
print("**** Client connected to Storage Account: " + blob_service_client.account_name + " ****")

# Create a new container
container_client = blob_service_client.create_container(user_container_name)
print("\nCreated container: " + user_container_name)



# Create a SAS for the container, to be used
print("\nGenerating SAS...")
container_sas = generate_container_sas(blob_service_client.account_name, 
    container_client.container_name,
    account_key=blob_service_client.credential.account_key,
    permission=ContainerSasPermissions(read=True,write=True,list=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
    start=datetime.utcnow() - timedelta(hours=1)
)

# Print the SAS
container_url = "https://" + blob_service_client.account_name + ".blob.core.windows.net/" + container_client.container_name
full_sas_url = container_url + "?" + container_sas 
print("Container SAS for mobile client: ")
print(">> SAS: " + container_sas)
print(">> FULL URL (for web browser): " + full_sas_url + "&comp=list&restype=container")


## MOBILE CLIENT

# For working with blobs, within a container
from azure.storage.blob import BlobClient
import os, pathlib

# Set the image path for upload
current_path = pathlib.Path(__file__).parent.absolute()
image_file_name = "james_lee.jpg"
image_upload_path = os.path.join(current_path, image_file_name)

# Create the blob client for upload
blob_client = BlobClient.from_blob_url(container_url + "/" + image_file_name + "?" + container_sas)
print("Blob Client connected to: " + container_url + "/" + image_file_name + "?" + container_sas)
print("\nUploading blob:\n\t" + image_file_name)

# Upload the created file
with open(image_upload_path, "rb") as data:
    blob_client.upload_blob(data)


Azure Data Lake Storage (ADLS)

Azure Data Lake Storage Overview

00:08:56

Lesson Description:

Welcome to the Azure Data Lake Storage Gen2 (ADLS) section of this course. Within this lesson, we will discuss: what a data lake storage solution is used for; common scenarios, including big data, analytics, and data lakes; ADLS key features; and the creation of ADLS. Helpful links: For more information on Data Lake storage and big data solutions, see Microsoft Azure Exam DP-200 - Implementing an Azure Data Solution.

Working with Azure Data Lake Storage

00:06:24

Lesson Description:

Within this lesson, we will discuss: the hierarchical namespace, common security features, POSIX security and access control lists (ACLs), the Windows Azure Storage Blob driver (WASB), and the Azure Blob File System driver (ABFS).
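The ABFS driver addresses data using a URI of the form abfs[s]://&lt;filesystem&gt;@&lt;account&gt;.dfs.core.windows.net/&lt;path&gt;. A small sketch of building such a URI (the account name below is reused from this course's demos; the helper itself is hypothetical):

```python
# Sketch: building an ABFS URI for the Azure Blob File System driver.
# Format: abfs[s]://<filesystem>@<account>.dfs.core.windows.net/<path>
def abfs_uri(account: str, filesystem: str, path: str, secure: bool = True) -> str:
    scheme = "abfss" if secure else "abfs"  # abfss uses TLS
    return f"{scheme}://{filesystem}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

print(abfs_uri("latajlstore1", "data", "/raw/logs/2020"))
# abfss://data@latajlstore1.dfs.core.windows.net/raw/logs/2020
```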


Azure Files

Azure Files Overview

00:06:15

Lesson Description:

Welcome to the Azure Files section. Within this lesson, we'll discuss a high-level overview of the Azure Files service. Specifically, we'll talk about: the purpose of Azure Files, protocols and connectivity for Azure Files, and scenarios for using Azure Files.

Azure Files Security

00:11:37

Lesson Description:

Within this lesson, we'll discuss the security features available with Azure Files. Note: Whilst there are several security features common to the Storage services in general, there are also a number of differences for each service. In this lesson we will cover: network accessibility (REST and SMB), network security (public and private), access control (common, and identity-based), and encryption (at rest, and in transit). Links mentioned in this lesson: Azure AD Domain Services lesson.

Azure Files Implementation

00:12:37

Lesson Description:

Within this lesson, we will take a look at some hands-on implementation tasks for working with Azure Files. We will discuss: connecting to Azure Files (storage account key), connecting to Azure Files (Azure AD DS identity), snapshots, monitoring, and resource locks.

Developing with Azure Files and the REST API

00:19:15

Lesson Description:

Within this lesson, we will walk through a demonstration Python script for interacting with Azure Files using the Files REST API. We will talk about: key documentation for API operations, shared key authorization, and interacting with Azure Files. Useful links for this lesson: ARM API for Storage Services; ARM API Authorization with Shared Key; Azure Files REST API. Example code: files_api_demo.py

import requests
import datetime
import hmac
import hashlib
import base64
import xml.dom.minidom

sa_name = "<Your storage account name goes here>"
sa_key = "<Your storage account key goes here>"
share_name = "company"
api_version = "2016-05-31"
request_time = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')

# Let's use a dictionary to capture the information we need
signature_string_values = {
    'VERB': 'GET',
    'Content-Encoding': '',
    'Content-Language': '', 
    'Content-Length': '',
    'Content-MD5': '',
    'Content-Type': '',  
    'Date': '',
    'If-Modified-Since': '',
    'If-Match': '',  
    'If-None-Match': '',
    'If-Unmodified-Since': '',
    'Range': '',
    'CanonicalizedHeaders': '',
    'CanonicalizedResource': ''
}

# When we just want to list details for the File share, we only need:
signature_string_values['VERB'] = 'GET'
signature_string_values['CanonicalizedHeaders'] = 'x-ms-date:' + request_time + '\nx-ms-version:' + api_version + '\n'
signature_string_values['CanonicalizedResource'] = '/' + sa_name + '/\ncomp:list'

# Let's create the string to be signed, noting the specific format as per
# the documentation https://docs.microsoft.com/en-us/rest/api/storageservices/authorize-with-shared-key
string_to_sign = (signature_string_values['VERB'] + '\n'
    + signature_string_values['Content-Encoding'] + '\n'
    + signature_string_values['Content-Language'] + '\n'
    + signature_string_values['Content-Length'] + '\n'
    + signature_string_values['Content-MD5'] + '\n'
    + signature_string_values['Content-Type'] + '\n'
    + signature_string_values['Date'] + '\n'
    + signature_string_values['If-Modified-Since'] + '\n'
    + signature_string_values['If-Match'] + '\n'
    + signature_string_values['If-None-Match'] + '\n'
    + signature_string_values['If-Unmodified-Since'] + '\n'
    + signature_string_values['Range'] + '\n'
    + signature_string_values['CanonicalizedHeaders']
    + signature_string_values['CanonicalizedResource'])

print("\n\n\n===== START =====")
print("\nWe are using the following string to sign: '" + string_to_sign + "'")

# We need to decode our storage account key
key = base64.b64decode(sa_key.encode('utf-8'))

# Now let's sign the string
signature = base64.b64encode(hmac.new(key, msg=string_to_sign.encode('utf-8'), digestmod = hashlib.sha256).digest()).decode()
print("\nThe signed string is: '" + signature + "'")

# Setup our headers for the request
headers = {
    'Authorization' : ('SharedKey ' + sa_name + ':' + signature),
    'x-ms-date' : request_time,
    'x-ms-version' : api_version
}

url = ('https://' + sa_name + '.file.core.windows.net/?comp=list')


############################

# Let's perform the request!
xml_res = requests.get(url, headers=headers)

print("\n\n\n======= RESPONSE =======\n")
print(xml_res.content.decode('utf-8'))

print("\nLet's clean the response:\n")
dom = xml.dom.minidom.parseString(xml_res.content.decode('utf-8'))

print(dom.toprettyxml())


Azure File Sync

Azure File Sync Overview

00:06:36

Lesson Description:

Welcome to the Azure File Sync section of this course. Within this lesson, we'll discuss: what Azure File Sync is, how Azure File Sync differs from Azure Files, and an overview of the Azure File Sync architecture.

Azure File Sync Security

00:06:54

Lesson Description:

Within this lesson, we will discuss: the flow of data between the user and the Azure Files share, connectivity requirements for the Azure File Sync agent, storage account firewall exception requirements, access control, and encryption of data.

Azure File Sync Implementation

00:09:05

Lesson Description:

Within this lesson, we'll walk through a basic implementation of the Azure File Sync service. Through this lesson, we'll cover: Azure File Sync architecture, installation of the sync agent, configuration of a sync group, registration of a server, and configuration of a server endpoint.

Azure Queue Storage

Azure Queue Storage Overview

00:07:25

Lesson Description:

Welcome to the Azure Queue storage section of this course. Within this lesson, we'll discuss: what Azure Queue storage is, Azure Queue storage's role in modern cloud applications, and an example scenario where this solution can be used.

Azure Queue Storage Security

00:11:26

Lesson Description:

Within this lesson, we focus primarily on access control and cover common security features at a high level. Specifically, we'll discuss: common access control features, identity-based access control, network security considerations, and data encryption. Useful links for this lesson: Creating a Service SAS for queues; Create custom roles for Azure role-based access control; Azure resource provider operations.

Developing with Azure Queue Storage

00:14:27

Lesson Description:

Throughout this lesson, we'll walk through a basic example solution using the Python SDK to interact with the Azure Queue storage service. This demonstration will include: registration of an application with Azure Active Directory (AD), assigning data-layer permissions to the application (service principal) to interact with queues, performing 'sender' and 'processor' operations, and authenticating using our service principal and client secret. Example code: queues_demo.py:

#### SENDER ####

import uuid

# For working at the queue level, within a storage account
from azure.storage.queue import QueueClient

processing_queue_name = "processing-queue-id-" + str(uuid.uuid4())[0:4]
connect_str = "<Your storage account connection string (without authentication) goes here>"

# Authentication library
from azure.identity import ClientSecretCredential

# Information for authenticating using a Service Principal (the identity of our application)
tenant_id = "<Your Azure AD tenant ID goes here>"
client_id = "<Your registered application ID goes here>"
client_secret = "<Your registered application secret goes here>"

# Get the application credentials
app_credentials = ClientSecretCredential(tenant_id, client_id, client_secret)

# Create a queue client, using the application Azure AD credentials
queue_client = QueueClient.from_connection_string(connect_str, processing_queue_name, credential=app_credentials)
print("Client connected to Queue")

# Create a new queue
queue_client.create_queue()
print("Created queue: " + processing_queue_name)



# Send messages to the queue
print("\nLet's add some messages...")
queue_client.send_message(u"Message 1")
queue_client.send_message(u"Message 2")
saved_message = queue_client.send_message(u"Message 3")



#### PROCESSING CLIENT ####

print("\nLet's PEEK at the messages in the queue...")

# Peek at messages in the queue
peeked_messages = queue_client.peek_messages(max_messages=5)

for peeked_message in peeked_messages:
    # List the message
    print("Message: " + peeked_message.content)


Azure Table Storage

Azure Table Storage Overview

00:08:21

Lesson Description:

Welcome to the Azure Table storage section of this course. Within this lesson, we'll discuss: what Azure Table storage is, Azure Table storage's role in modern cloud applications (using structured NoSQL data), the resource hierarchy and important design considerations (partitioning), and an example scenario where this solution could be used. Please be aware: The Cosmos DB team now manages the Azure Table service, and so features are subject to change, and documentation can at times be complicated.
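On the partitioning design consideration: every Table storage entity is uniquely identified by its (PartitionKey, RowKey) pair, and entities sharing a PartitionKey are stored and served together. A small offline sketch (entity values reused from this section's demo script):

```python
from collections import defaultdict

# Sample entities, as in this section's tables_demo.py
entities = [
    {"PartitionKey": "Azure",  "RowKey": "280", "name": "AZ-300"},
    {"PartitionKey": "Azure",  "RowKey": "381", "name": "AZ-301"},
    {"PartitionKey": "DevOps", "RowKey": "305", "name": "CKAD"},
]

# Entities queried together should share a PartitionKey
by_partition = defaultdict(list)
for e in entities:
    by_partition[e["PartitionKey"]].append(e["RowKey"])

print(dict(by_partition))  # {'Azure': ['280', '381'], 'DevOps': ['305']}

# A "point query" addresses exactly one entity via the full key pair
index = {(e["PartitionKey"], e["RowKey"]): e for e in entities}
print(index[("Azure", "280")]["name"])  # AZ-300
```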

Azure Table Storage Security

00:04:45

Lesson Description:

Within this lesson, we'll walk through the security features that are available for Table storage, as well as some important differences/limitations from what we've discussed so far. We will discuss: access control, limitations for identity-based access control, additional features for Shared Access Signatures, network security (common functionality), and data encryption (common functionality).

Developing with Azure Table Storage

00:10:56

Lesson Description:

Within this lesson, we will work through an example development scenario. We will cover: Python development using the Cosmos DB Table SDK, important code similarities when using the Cosmos DB Table SDK (minimal change to move to Cosmos DB), the Entity object for storing data to be inserted into the Table service, modifications to the schema, and batch transactions. Demonstration code: tables_demo.py

import time

new_table_name = "coursecatalog"
connect_str = "<Insert your full connection string here>"

# using cosmos db table api
from azure.cosmosdb.table.tableservice import TableService

# Cosmos DB supports multiple models; we can use the Entity class to simplify the addition of table data (entities)
from azure.cosmosdb.table.models import Entity 

# Connect to the Table service
table_service = TableService(connection_string=connect_str)
print("\nConnected to Table service")

# Create a new Table
table_service.create_table(new_table_name)
print("\nCreated table: " + new_table_name)

# Give some time for the table to be created
time.sleep(5)

# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'Azure'
course_details.RowKey = '280'
course_details.name = 'AZ-300'
course_details.author = 'James Lee'
course_details.description = 'Certification course for MS AZ-300'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)

# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'Azure'
course_details.RowKey = '381'
course_details.name = 'AZ-301'
course_details.author = 'James Lee'
course_details.description = 'Certification course for MS AZ-301'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)


# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'Azure'
course_details.RowKey = '441'
course_details.name = 'Azure Storage Deep Dive'
course_details.author = 'James Lee'
course_details.description = 'Deep dive in to the Azure Storage services'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)

# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'Azure'
course_details.RowKey = '378'
course_details.name = 'DP-200'
course_details.author = 'Brian Roehm'
course_details.description = 'Microsoft Azure Exam DP-200 - Implementing an Azure Data Solution'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)

# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'Azure'
course_details.RowKey = '367'
course_details.name = 'AZ-500'
course_details.author = 'Shawn Johnson'
course_details.description = 'AZ-500: Microsoft Azure Security Technologies'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)

# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'DevOps'
course_details.RowKey = '367'
course_details.name = 'DCA'
course_details.author = 'Will Boyd'
course_details.description = 'Docker Certified Associate (DCA)'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)

# Add some data (ENTITIES) to the table - we use objects to represent our data
print("\nCreating entity...")

course_details = Entity()
course_details.PartitionKey = 'DevOps'
course_details.RowKey = '305'
course_details.name = 'CKAD'
course_details.author = 'Will Boyd'
course_details.description = 'Certified Kubernetes Application Developer (CKAD)'

# Insert data
print("\nAdding entity to table, '" + new_table_name + "'...")
table_service.insert_entity(new_table_name, course_details)

print("\n===== DONE =====")

Demonstration code: tables_batch.py

import time

table_name = "coursecatalog"
connect_str = "DefaultEndpointsProtocol=https;AccountName=latajlstore1;AccountKey=kU+w12g3WdyPiN8d/QiQvpeFYgIt0yxuSY8uJ6p1cMZR/uARJLrackSiuTqOiVCeOLjWCt0OLyZf+UtISzbeqw==;EndpointSuffix=core.windows.net"

from azure.cosmosdb.table.tableservice import TableService
from azure.cosmosdb.table.models import Entity 

# Let's import the batch library
from azure.cosmosdb.table.tablebatch import TableBatch

# Connect to the Table service
table_service = TableService(connection_string=connect_str)
print("\nConnected to Table service")

# Let's perform a batch transaction, and add two course expiry dates
batch = TableBatch()
task001 = {'PartitionKey': 'Azure', 'RowKey': '280',
    'expiryDate': '1 July 2020'}
task002 = {'PartitionKey': 'Azure', 'RowKey': '381',
    'expiryDate': '1 July 2020'}

# Perform batch transaction using MERGE (could be update, insert, etc)
batch.merge_entity(task001)
batch.merge_entity(task002)
table_service.commit_batch(table_name, batch)

print("\nBatch transaction complete")
print("=======================")


Azure Disks

Azure Disks Overview

00:08:07

Lesson Description:

Welcome to the Azure Disks section of this course. Within this lesson, we'll discuss: what Azure Disks are used for, Managed Disks vs. Unmanaged Disks, and disk types (OS, data, temporary, ephemeral). Please be aware: This lesson focuses on Azure Managed Disks, and only mentions Azure Unmanaged Disks from a historical and contextual perspective.

Azure Disk Types and Performance

00:09:14

Lesson Description:

Within this lesson, we'll discuss some important performance concepts, including disk caching and performance characteristics. We will discuss: disk caching (read/write, read-only, none), the BlobCache architecture (host SSD/memory), performance tiers (Ultra, Premium SSD, Standard SSD, Standard HDD), and performance provisioning (what influences the actual disk performance a VM receives).

Azure Disk Encryption

00:05:57

Lesson Description:

Within this lesson, we'll discuss two important features for encrypting our disk data, both at the Microsoft datacentres and on the volume itself. We will discuss: Azure Disk Encryption (ADE) for Windows and Linux, Storage Service Encryption (SSE) for encryption at rest, configuration of a Key Vault for ADE, and configuration of ADE for Windows using BitLocker.

Azure Disk Management and Snapshots

00:10:40

Lesson Description:

Within this lesson, we'll work through some common administration tasks when working with Managed Disks. We will discuss: resizing disks (size/performance), the impact of size and performance tier on estimated performance, snapshots (full and incremental), creation of Managed Disks, and monitoring and protecting Managed Disks.


Final Thoughts

Choosing the Right Service

00:07:08

Lesson Description:

Within this lesson, we'll cover some common questions that are asked regarding the services we've discussed within this course. We will discuss: Azure Blobs vs. Azure Files vs. Azure Disks; Storage Queues vs. Service Bus; and Storage Tables vs. Cosmos DB. Helpful links from this lesson: Deciding when to use Azure Blobs, Azure Files, or Azure Disks; Storage queues and Service Bus queues - compared and contrasted.

What's Next?

00:02:15

Lesson Description:

Congratulations on completing this course! Great work. You've covered a lot of information about the core Azure Storage services. If you're interested in learning more about storage services, the following links may help. Related and helpful links: Microsoft Azure Exam DP-200 - Implementing an Azure Data Solution; Microsoft Azure Exam DP-201 - Designing an Azure Data Solution; Microsoft Certified Azure Developer - Exam AZ-203 Prep; Vote for additional content you'd like to see.

Quiz: Azure Storage Deep Dive

00:30:00
