
Working with AWS VPC Flow Logs for Network Monitoring

Hands-On Lab

Miles Baker, AWS Training Architect II

Length: 00:45:00

Difficulty: Advanced


Introduction

In this hands-on lab, we will set up and use VPC Flow Logs published to Amazon CloudWatch, create custom metrics and alerts based on the CloudWatch logs to understand trends and receive notifications for potential security issues, and use Amazon Athena to query and analyze VPC Flow Logs stored in S3.

Solution

Log in to the live AWS environment using the credentials provided.

Once inside the AWS account, make sure you are using us-east-1 (N. Virginia) as the selected region.

Create S3 Bucket for VPC Flow Logs and VPC Flow Log to S3

Create S3 Bucket for VPC Flow Logs

  1. Navigate to S3.
  2. Copy the cloud_user account ID (the number shown after cloud_user in the menu bar at the top right of the screen) to your favorite text editor, and delete the dashes in the ID. We'll need this number later.
  3. Click Create bucket.
  4. Enter a unique bucket name (e.g., "vpc-flow-logs-{ACCOUNT ID}-{YOUR INITIALS}", all lowercase with no spaces). If that name already exists, try a different one.
  5. Click Next.
  6. Check the box for Default encryption.
  7. Click Next on the Configure options screen.
  8. Click Next on the Set permissions screen.
  9. Click Create bucket.
  10. Select the checkbox next to the name of the bucket you just created.
  11. Click Copy Bucket ARN from the popup window. Paste the ARN in your favorite text editor. We will use the S3 ARN in an upcoming task.
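
If you prefer to script this step, the following is a minimal boto3 (Python) sketch of the same actions. The bucket name is a placeholder; substitute your own account ID and initials.

    import boto3

    # Placeholder name -- use your own account ID and initials.
    bucket_name = "vpc-flow-logs-123456789012-mb"

    s3 = boto3.client("s3", region_name="us-east-1")

    # In us-east-1, create_bucket needs no LocationConstraint.
    s3.create_bucket(Bucket=bucket_name)

    # Enable default encryption (SSE-S3), matching the console checkbox.
    s3.put_bucket_encryption(
        Bucket=bucket_name,
        ServerSideEncryptionConfiguration={
            "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
        },
    )

    # S3 bucket ARNs always follow this format; this is the value used in the next task.
    print(f"arn:aws:s3:::{bucket_name}")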

Create VPC Flow Log to S3

  1. In a new browser tab, navigate to VPC.
  2. Click Your VPCs in the left-hand menu.
  3. Select the LinuxAcademy VPC.
  4. Select the Flow Logs tab.
  5. Click Create flow log, and set the following values:
    • Filter: All
    • Destination: Send to an S3 bucket
    • S3 bucket ARN: Paste the S3 bucket ARN you copied earlier
  6. Click Create and then Close.
  7. Verify the flow log shows an Active status.
  8. Select the hyperlink in the Destination name field.
  9. In the S3 bucket browser tab that opens, click the Permissions tab.
  10. Click Bucket Policy.
  11. Notice the bucket path in the policy includes AWSLogs. Note: It can take 5-15 minutes before logs start to show up, so let's move on while we wait for that to happen.
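
The flow log created above can also be set up programmatically. Here is a hedged boto3 sketch that assumes the VPC is tagged with Name=LinuxAcademy and reuses the bucket ARN from the previous task.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Look up the lab VPC by its Name tag (assumed to be "LinuxAcademy").
    vpcs = ec2.describe_vpcs(Filters=[{"Name": "tag:Name", "Values": ["LinuxAcademy"]}])
    vpc_id = vpcs["Vpcs"][0]["VpcId"]

    # Capture all traffic (accepted and rejected) and deliver it to the S3 bucket.
    response = ec2.create_flow_logs(
        ResourceIds=[vpc_id],
        ResourceType="VPC",
        TrafficType="ALL",
        LogDestinationType="s3",
        LogDestination="arn:aws:s3:::vpc-flow-logs-123456789012-mb",  # your bucket ARN
    )
    print(response["FlowLogIds"])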

Create CloudWatch Log Group and VPC Flow Log to CloudWatch

Create CloudWatch Log Group

  1. In a new browser tab, navigate to CloudWatch.
  2. Click Logs in the left-hand menu.
  3. Click Actions > Create log group.
  4. Enter "VPCFlowLogs" for the name.
  5. Click Create log group.

Create VPC Flow Log to CloudWatch

  1. Back in the VPC browser tab, click Your VPCs in the left-hand menu.
  2. Select the LinuxAcademy VPC.
  3. Select the Flow Logs tab.
  4. Click Create flow log, and set the following values:
    • Filter: All
    • Destination: Send to CloudWatch Logs
    • Destination log group: VPCFlowLogs
    • IAM role: Select the one with DeliverVPCFlowLogsRole in the name
  5. Click Create and then Close.
  6. Verify the flow log shows an Active status.
  7. Select the hyperlink in the Destination name field. Note: It can take 5-15 minutes before logs start to show up, so let's move on while we wait for that to happen.
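
For reference, both steps in this section (the log group and the CloudWatch-destined flow log) can be expressed in a few boto3 calls. The VPC ID and IAM role ARN below are placeholders.

    import boto3

    logs = boto3.client("logs", region_name="us-east-1")
    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Create the destination log group used by the flow log.
    logs.create_log_group(logGroupName="VPCFlowLogs")

    # Placeholder ARN -- use the DeliverVPCFlowLogsRole provided in your account.
    role_arn = "arn:aws:iam::123456789012:role/DeliverVPCFlowLogsRole"

    ec2.create_flow_logs(
        ResourceIds=["vpc-0123456789abcdef0"],  # the LinuxAcademy VPC ID
        ResourceType="VPC",
        TrafficType="ALL",
        LogDestinationType="cloud-watch-logs",
        LogGroupName="VPCFlowLogs",
        DeliverLogsPermissionArn=role_arn,
    )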

Generate Traffic

  1. Open a terminal session, and log in to the EC2 instance via SSH using the credentials provided.
  2. Exit the terminal.
  3. Navigate to EC2 in a new browser tab.
  4. Select the instance with the name Web Server.
  5. Next to Security groups, click the view inbound rules link.
  6. Observe that SSH is enabled for all source addresses.
  7. Click Actions > Networking > Change Security Groups.
  8. Uncheck the HTTP and SSH Access security group.
  9. Select the HTTP Access security group, and verify no other security groups are selected.
  10. Click Assign Security Groups.
  11. Next to Security groups, click the view inbound rules link.
  12. Observe that SSH is not enabled.
  13. Attempt to log in to the EC2 instance via SSH using the credentials provided. We expect this attempt to time out since we just selected a security group with no SSH access.
  14. Cancel the SSH command after 15 seconds.
  15. Return to EC2 in the AWS console.
  16. Select the instance with the name Web Server.
  17. Click Actions > Networking > Change Security Groups.
  18. Uncheck the HTTP Access security group.
  19. Select the HTTP and SSH Access security group, and verify no other security groups are selected.
  20. Click Assign Security Groups.
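
The security group change in this section replaces the instance's entire group list, which is also how the API models it: modify_instance_attribute takes the full set of group IDs. A hedged boto3 sketch, assuming the groups are named exactly as the console labels show:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Find the Web Server instance by its Name tag (assumed tag value).
    reservations = ec2.describe_instances(
        Filters=[{"Name": "tag:Name", "Values": ["Web Server"]}]
    )["Reservations"]
    instance_id = reservations[0]["Instances"][0]["InstanceId"]

    def swap_security_group(group_name):
        """Replace the instance's security groups with the single named group."""
        group = ec2.describe_security_groups(
            Filters=[{"Name": "group-name", "Values": [group_name]}]
        )["SecurityGroups"][0]
        ec2.modify_instance_attribute(InstanceId=instance_id, Groups=[group["GroupId"]])

    swap_security_group("HTTP Access")           # drop SSH access
    # ... attempt the SSH login here; it should time out and generate REJECT records ...
    swap_security_group("HTTP and SSH Access")   # restore SSH access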

Create CloudWatch Filters and Alerts

Create CloudWatch Log Metric Filter

  1. Navigate to CloudWatch.

  2. Click Logs in the left-hand menu.

  3. In the VPCFlowLogs row, click on the 0 filters link.

  4. Click Add Metric Filter.

  5. Enter the following in the Filter Pattern field to track failed SSH attempts on port 22:

    [version, account, eni, source, destination, srcport, destport="22", protocol="6", packets, bytes, windowstart, windowend, action="REJECT", flowlogstatus]
  6. In the Select Log Data to Test dropdown, select Custom Log Data.

  7. Enter the following in the text box:

    2 086112738802 eni-0d5d75b41f9befe9e 61.177.172.128 172.31.83.158 39611 22 6 1 40 1563108188 1563108227 REJECT OK
    2 086112738802 eni-0d5d75b41f9befe9e 182.68.238.8 172.31.83.158 42227 22 6 1 44 1563109030 1563109067 REJECT OK
    2 086112738802 eni-0d5d75b41f9befe9e 42.171.23.181 172.31.83.158 52417 22 6 24 4065 1563191069 1563191121 ACCEPT OK
    2 086112738802 eni-0d5d75b41f9befe9e 61.177.172.128 172.31.83.158 39611 80 6 1 40 1563108188 1563108227 REJECT OK
  8. Click Test Pattern.

  9. Click on the Show test results link. You should see 2 of the 4 records matching.

  10. Click Assign Metric, and set the following values:

    • Filter Name: destination-port-22-rejects
    • Metric Name: SSH failures
  11. Click Create Filter.
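
The same metric filter can be created with boto3. The filter pattern below is the one from step 5; the metric namespace is an assumption, since the console lets you choose any namespace.

    import boto3

    logs = boto3.client("logs", region_name="us-east-1")

    # Space-delimited pattern matching rejected TCP (protocol 6) traffic to port 22.
    filter_pattern = (
        '[version, account, eni, source, destination, srcport, destport="22", '
        'protocol="6", packets, bytes, windowstart, windowend, action="REJECT", '
        'flowlogstatus]'
    )

    logs.put_metric_filter(
        logGroupName="VPCFlowLogs",
        filterName="destination-port-22-rejects",
        filterPattern=filter_pattern,
        metricTransformations=[
            {
                "metricNamespace": "LogMetrics",  # assumed namespace; pick your own
                "metricName": "SSH failures",
                "metricValue": "1",
            }
        ],
    )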

Create Alarm Based on the Metric Filter

  1. Click on the Create Alarm link in the new destination-port-22-rejects metric filter box.
  2. Set Whenever SSH failures is... to Greater/Equal.
  3. Set Define the threshold value to 1.
  4. Click Next.
  5. On the Notifications page, set the following values:
    • Select an SNS topic: Create new topic
    • Create a new topic...: PROD-ALERT-{YOUR INITIALS}
    • Email endpoints that will receive the notification...: Enter your email address
  6. Click Create topic.
  7. Click Next.
  8. Set the Alarm name as SSH Reject.
  9. Click Next.
  10. Click Create alarm.
  11. You will receive an email notification asking you to confirm your subscription. Click the Confirm Subscription link in the email.
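
For completeness, here is a hedged boto3 sketch of the topic, subscription, and alarm created above. The period and statistic are assumptions that approximate the console defaults; the email address and topic initials are placeholders.

    import boto3

    sns = boto3.client("sns", region_name="us-east-1")
    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Create the notification topic and subscribe your email (confirm via the email link).
    topic_arn = sns.create_topic(Name="PROD-ALERT-MB")["TopicArn"]
    sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="you@example.com")

    # Alarm whenever the SSH failures metric is greater than or equal to 1.
    cloudwatch.put_metric_alarm(
        AlarmName="SSH Reject",
        Namespace="LogMetrics",          # must match the metric filter's namespace
        MetricName="SSH failures",
        Statistic="Sum",
        Period=300,
        EvaluationPeriods=1,
        Threshold=1,
        ComparisonOperator="GreaterThanOrEqualToThreshold",
        AlarmActions=[topic_arn],
    )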

Generate Traffic for Alerts

  1. In the terminal, log in to the EC2 instance via SSH using the credentials provided.
  2. Exit the terminal.
  3. Navigate to EC2 in a new browser tab.
  4. Select the instance with the name Web Server.
  5. Next to Security groups, click the view inbound rules link.
  6. Observe that SSH is enabled for all source addresses.
  7. Click Actions > Networking > Change Security Groups.
  8. Uncheck the HTTP and SSH Access security group.
  9. Select the HTTP Access security group, and verify no other security groups are selected.
  10. Click Assign Security Groups.
  11. Next to Security groups, click the view inbound rules link.
  12. Observe that SSH is not enabled.
  13. Attempt to log in to the EC2 instance via SSH using the credentials provided. We expect this attempt to time out since we just selected a security group with no SSH access.
  14. Cancel the SSH command after 15 seconds.
  15. Return to EC2 in the AWS console.
  16. Select the instance with the name Web Server.
  17. Click Actions > Networking > Change Security Groups.
  18. Uncheck the HTTP Access security group.
  19. Select the HTTP and SSH Access security group, and verify no other security groups are selected.
  20. Click Assign Security Groups.
  21. After approximately 5-15 minutes, you should receive an email notification for the failed SSH login attempt.

Use CloudWatch Insights

  1. Navigate to CloudWatch.
  2. Click Insights in the left-hand menu.
  3. In the Select a log group search window, select VPCFlowLogs.
  4. Click Sample queries > VPC flow log queries > Top 20 source IP addresses with highest number of rejected requests.
  5. Observe the query has changed.
  6. Click Run query. After a few moments, we'll see some data start to populate.
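
The same Insights query can be run outside the console. A boto3 sketch, using a query string roughly equivalent to the sample selected above and a one-hour window:

    import time
    import boto3

    logs = boto3.client("logs", region_name="us-east-1")

    # Top source IP addresses with the highest number of rejected requests.
    query = """
    filter action = "REJECT"
    | stats count(*) as numRejections by srcAddr
    | sort numRejections desc
    | limit 20
    """

    end = int(time.time())
    start = end - 3600  # last hour

    query_id = logs.start_query(
        logGroupName="VPCFlowLogs", startTime=start, endTime=end, queryString=query
    )["queryId"]

    # Poll until the query finishes, then print the result rows.
    while True:
        results = logs.get_query_results(queryId=query_id)
        if results["status"] in ("Complete", "Failed", "Cancelled"):
            break
        time.sleep(2)
    print(results["results"])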

Configure VPC Flow Logs Table and Partition in Athena

Record Reference Information to Be Used in Athena Queries

  1. Navigate to S3 in a new browser tab.
  2. Open your log bucket.
  3. Copy your log bucket name to a text editor for later use.
  4. Open the AWSLogs folder.
  5. Open the {ACCOUNT_ID} folder.
  6. Copy the account ID to a text editor for later use if you didn’t save it earlier.
  7. Open the vpcflowlogs folder.
  8. Open the us-east-1 folder.
  9. Open the {YEAR} folder.
  10. Open the {MONTH} folder.
  11. Write down the {YEAR}/{MONTH}/{DAY} using the latest {DAY} shown.
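
If you would rather not click through the folders, a short boto3 sketch can list the prefixes and pick the latest one. The bucket name and account ID are placeholders.

    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")

    # Placeholder values -- substitute your bucket name and account ID.
    bucket = "vpc-flow-logs-123456789012-mb"
    prefix = "AWSLogs/123456789012/vpcflowlogs/us-east-1/"

    def latest_subfolder(pfx):
        """Return the last (newest) common prefix directly under pfx."""
        resp = s3.list_objects_v2(Bucket=bucket, Prefix=pfx, Delimiter="/")
        return sorted(p["Prefix"] for p in resp["CommonPrefixes"])[-1]

    year_prefix = latest_subfolder(prefix)
    month_prefix = latest_subfolder(year_prefix)
    day_prefix = latest_subfolder(month_prefix)
    print(day_prefix)  # ends in .../{YEAR}/{MONTH}/{DAY}/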

Create the Athena Table

  1. Navigate to Athena.

  2. If you come to a page with a Get Started button, click on it. Otherwise, skip this step.

  3. If a Tutorial popup window shows up, then click on the X in the upper right of the screen to close it. You can return to the tutorial later by selecting it in the upper right-hand menu.

  4. Paste the following DDL code in the new query window:

    CREATE EXTERNAL TABLE IF NOT EXISTS default.vpc_flow_logs (
      version int,
      account string,
      interfaceid string,
      sourceaddress string,
      destinationaddress string,
      sourceport int,
      destinationport int,
      protocol int,
      numpackets int,
      numbytes bigint,
      starttime int,
      endtime int,
      action string,
      logstatus string
    )  
    PARTITIONED BY (dt string)
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ' '
    LOCATION 's3://{your_log_bucket}/AWSLogs/{account_id}/vpcflowlogs/us-east-1/'
    TBLPROPERTIES ("skip.header.line.count"="1");
  5. Update {your_log_bucket} and {account_id} in the query window with the values from this hands-on lab.

  6. Click Run query. You should see a Query successful. message once this has finished executing.

Create Partitions to Be Able to Read the Data

  1. Paste the following code in a new query window:

    ALTER TABLE default.vpc_flow_logs
    ADD PARTITION (dt='{Year}-{Month}-{Day}')
    location 's3://{your_log_bucket}/AWSLogs/{account_id}/vpcflowlogs/us-east-1/{Year}/{Month}/{Day}';
  2. Update the following elements in the query window with the values from this hands-on lab:

    • {your_log_bucket}
    • {account_id}
    • {Year}
    • {Month}
    • {Day}
  3. Click Run query. You should receive a Query successful. message.

Analyze VPC Flow Logs Data in Athena

  1. Run the following query in a new query window:

    SELECT day_of_week(from_iso8601_timestamp(dt)) AS day,
      dt,
      interfaceid,
      sourceaddress,
      destinationport,
      action,
      protocol
    FROM vpc_flow_logs
    WHERE action = 'REJECT' AND protocol = 6
    ORDER BY sourceaddress
    LIMIT 100;
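
The same query can also be submitted programmatically. A hedged boto3 sketch; the OutputLocation is a placeholder path where Athena writes its result files:

    import time
    import boto3

    athena = boto3.client("athena", region_name="us-east-1")

    sql = """
    SELECT day_of_week(from_iso8601_timestamp(dt)) AS day,
           dt, interfaceid, sourceaddress, destinationport, action, protocol
    FROM vpc_flow_logs
    WHERE action = 'REJECT' AND protocol = 6
    ORDER BY sourceaddress
    LIMIT 100
    """

    execution_id = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "default"},
        ResultConfiguration={
            "OutputLocation": "s3://vpc-flow-logs-123456789012-mb/athena-results/"
        },
    )["QueryExecutionId"]

    # Wait for the query to finish, then fetch the first page of results.
    while True:
        status = athena.get_query_execution(QueryExecutionId=execution_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    rows = athena.get_query_results(QueryExecutionId=execution_id)["ResultSet"]["Rows"]
    print(rows)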

Conclusion

Congratulations on successfully completing this hands-on lab!