Working with AWS CloudWatch Logs for Incident Response

Introduction

As part of security best practices, centralizing logs from all your resources is a critical task. In this live AWS environment, you will work with CloudWatch Logs to prepare an environment for security incident response.

By the end of this hands-on lab, we will have configured CloudWatch Logs to receive data from a number of sources, including EC2, CloudTrail, Route 53, and VPC Flow Logs. We will also look at analyzing CloudWatch Logs using the Elasticsearch service.

Log in to the AWS Console using the credentials provided on the hands-on lab page.

Create Elasticsearch Cluster

  1. Open the Elasticsearch Service from the AWS Console Services list
    • Click Create new domain
    • Name: linuxacademy
    • Click Next
    • Click Next
    • Select Public access in the Network configuration section
    • Set the domain access policy to: Allow open access to the domain
    • Click I accept the risk and then click OK
    • Click Next and then click Confirm

The domain will take between 10 and 30 minutes to create, so let's continue to the next task while it provisions.
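If you prefer scripting your setup, roughly the same domain can be created from the CLI. This is a minimal sketch, not the lab's required path: the Elasticsearch version, instance type, volume size, region (us-east-1), and account ID are assumptions you should adjust to match the console defaults.

    aws es create-elasticsearch-domain \
      --domain-name linuxacademy \
      --elasticsearch-version 6.3 \
      --elasticsearch-cluster-config InstanceType=t2.small.elasticsearch,InstanceCount=1 \
      --ebs-options EBSEnabled=true,VolumeType=gp2,VolumeSize=10 \
      --access-policies '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"AWS":"*"},"Action":"es:*","Resource":"arn:aws:es:us-east-1:<ACCOUNT_ID>:domain/linuxacademy/*"}]}'

The open access policy above mirrors the Allow open access to the domain console option, which is why the console makes you acknowledge the risk before creating the domain.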

Install CloudWatch Agent on EC2

Install and configure the agent:

  1. SSH into the EC2 instance using the credentials provided

    ssh cloud_user@<PUBLIC_IP_ADDRESS>
  • Download the CloudWatch agent zip file

    wget https://s3.amazonaws.com/amazoncloudwatch-agent/linux/amd64/latest/AmazonCloudWatchAgent.zip
  • Unzip the agent

    unzip AmazonCloudWatchAgent.zip
  • Install the agent

    sudo ./install.sh
  • Run the configuration wizard

    sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-config-wizard
    • For the Log file path question, provide /var/log/secure
    • Add an Additional log file path of /var/log/messages
    • We only want to monitor these two log files, so enter 2 (no) for Do you want to specify any additional log files to monitor?
    • For Do you want to store the config in the SSM parameter store?, enter 2 (no)
    • The wizard writes its output to /opt/aws/amazon-cloudwatch-agent/bin/config.json, which we reference when starting the agent
  • Start the agent

    sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-ctl -a fetch-config -m ec2 -c file:/opt/aws/amazon-cloudwatch-agent/bin/config.json -s
  • We can verify that the new log groups were created by navigating to the CloudWatch service in the AWS Console.

  • Click Logs. From here, we can see that two new log groups have been created.

  • Select the checkbox next to the secure log group and click Create Metric Filter.

    • Filter Pattern: session opened
    • Click Assign Metric
    • Filter Name: session-opened
    • Metric Name: SSH Session Opened
    • Click Create Filter
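The metric filter can also be created from the CLI with aws logs put-metric-filter. The log group name below is an assumption — use the exact name of the secure log group shown in your console — and LogMetrics is simply the console's default namespace. Note the inner double quotes, which make the pattern match the exact phrase rather than each word independently:

    aws logs put-metric-filter \
      --log-group-name secure \
      --filter-name session-opened \
      --filter-pattern '"session opened"' \
      --metric-transformations 'metricName=SSH Session Opened,metricNamespace=LogMetrics,metricValue=1'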

Configure CloudTrail to Log to CloudWatch Logs

  1. Navigate to the CloudTrail service
    • Create a new trail by clicking Create Trail
    • Trail name: mytrail
    • Click Add S3 bucket
    • S3 bucket: Provide a globally unique name
    • Click Create
    • Navigate back to the newly created trail by clicking its name in the list
    • Scroll down to CloudWatch Logs
    • Click Configure
    • Use default log group name
    • Click Continue
    • In the IAM Role field, select the pre-created IAM role from the drop-down
    • Click Allow
    • Navigate to the CloudWatch service and click Logs to view the CloudTrail data for our new log group
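Equivalent CLI calls are sketched below for reference. The bucket name, account ID, log group name, and role name are placeholders — CloudTrail/DefaultLogGroup is only the console's suggested default:

    aws cloudtrail create-trail --name mytrail --s3-bucket-name <UNIQUE_BUCKET_NAME>
    aws cloudtrail update-trail --name mytrail \
      --cloud-watch-logs-log-group-arn arn:aws:logs:us-east-1:<ACCOUNT_ID>:log-group:CloudTrail/DefaultLogGroup:* \
      --cloud-watch-logs-role-arn arn:aws:iam::<ACCOUNT_ID>:role/<CLOUDTRAIL_TO_CLOUDWATCH_ROLE>
    aws cloudtrail start-logging --name mytrail

One caveat: when you create the S3 bucket yourself rather than through the console, it needs a bucket policy that allows CloudTrail to write to it; the console sets this up for you.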

Configure Route 53 to Log DNS Queries to CloudWatch Logs

  1. Navigate to the Route 53 service

    • Click Hosted zones in the left sidebar

    • Click the radio button next to the hosted zone name

    • Click Configure query logging

    • Create a log group

    • New log group name: /aws/route53/<HOSTED-ZONE-NAME>

    • Click Create log group

    • Create a new resource policy by clicking the button next to Create a new resource policy in US East (N. Virginia)

    • Resource policy name: route53-query-logging-policy

    • Log groups that the resource policy applies to: /aws/route53/*

    • Click Create policy and test permissions

    • From the CLI, generate DNS traffic using the host command

      host hello.<HOSTED-ZONE-NAME>

    • An example would be:

      host hello.cmcloudlab138.info
  • View DNS logs in CloudWatch logs
    • Navigate to the CloudWatch service in the AWS Console
    • Click Logs
    • Select the log group we created (named like /aws/route53/cmcloudlab138.info)
    • Refresh the page by clicking the Refresh icon in the top-right of the window
    • You should see a new log stream appear that is named with seemingly random numbers and letters. Click on this log stream name.
    • Expand the event to display the log message
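Behind the scenes, the console performs two API calls here: one granting the Route 53 service permission to write to your log groups, and one associating the hosted zone with the log group. A CLI sketch of the same steps, with the account ID and hosted zone ID as placeholders:

    # Allow the Route 53 service to write to /aws/route53/* log groups
    aws logs put-resource-policy \
      --policy-name route53-query-logging-policy \
      --policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"route53.amazonaws.com"},"Action":["logs:CreateLogStream","logs:PutLogEvents"],"Resource":"arn:aws:logs:us-east-1:<ACCOUNT_ID>:log-group:/aws/route53/*"}]}'

    # Associate the hosted zone with the log group
    aws route53 create-query-logging-config \
      --hosted-zone-id <HOSTED_ZONE_ID> \
      --cloud-watch-logs-log-group-arn arn:aws:logs:us-east-1:<ACCOUNT_ID>:log-group:/aws/route53/<HOSTED-ZONE-NAME>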

Configure VPC Flow Logs to Stream to CloudWatch Logs

  1. Navigate to the VPC service in the AWS Console and select Your VPCs
    • Select the box next to the LinuxAcademy VPC
    • Select the Flow Logs tab
    • Click Create Flow Log
    • Role: Select the "CloudTrailCloudWatchLogs" role
    • Destination Log Group: VPCFlowLogs
    • Click Create Flow Log
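For reference, the same flow log can be created from the CLI; the VPC ID and account ID below are placeholders from your account:

    aws ec2 create-flow-logs \
      --resource-type VPC \
      --resource-ids <VPC_ID> \
      --traffic-type ALL \
      --log-group-name VPCFlowLogs \
      --deliver-logs-permission-arn arn:aws:iam::<ACCOUNT_ID>:role/CloudTrailCloudWatchLogs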

From your terminal (where you are still logged in to the EC2 instance), generate some traffic to create a few log entries:

curl http://linuxacademy.com

In your browser window, click on the VPCFlowLogs log group name and then click the log stream name in the list. From this page, click through a few of the events to familiarize yourself with the log format.
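You can also pull flow log events straight from the CLI. For example, to list a few recent entries for rejected traffic (assuming the VPCFlowLogs group name used above):

    aws logs filter-log-events \
      --log-group-name VPCFlowLogs \
      --filter-pattern REJECT \
      --limit 10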

Analyze Log Data with Elasticsearch

  1. Navigate to the CloudWatch service in the AWS Console
    • Click Logs
    • Select the box next to VPCFlowLogs
    • Click Actions and then click Stream to Amazon Elasticsearch Service
    • Amazon ES cluster: Select the Elasticsearch cluster we created earlier, linuxacademy
    • Lambda IAM Execution Role: Choose the "CloudTrailCloudWatchLogs" role
    • Click Next
    • Log Format: Amazon VPC Flow Logs
    • Click Next two times and then click Start Streaming
    • Navigate to the Elasticsearch service
    • Click the linuxacademy domain name
    • Click the link for Kibana
    • In the top-right, click Set up index patterns
    • Note: It may take a few minutes for Elasticsearch to start processing data from CloudWatch. Give it a few minutes and then click Check for new data.
    • Index pattern: cwl*
    • Click Next step
    • Time Filter field name: @timestamp
    • Click Create index pattern
    • Click Discover in the left sidebar
    • In the search bar at the top of the page, search for action:REJECT
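If Kibana is slow to load, you can also query the Elasticsearch endpoint directly from your terminal. The endpoint URL is shown on the domain's overview page, and cwl-* matches the index names the streaming setup creates by default (treat the index pattern as an assumption if yours differs):

    curl -s "https://<ES_ENDPOINT>/cwl-*/_search?q=action:REJECT&pretty"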

Conclusion

Congratulations, you've completed this hands-on lab!