Centralized Log management using AWS services

August 26, 2019 (updated May 18, 2020) | AWS, Cloud, Cloud Assessment

Written by Govindhakrishnan, Cloud Engineer, and contributed by Raju Banerjee, Cloud Architect at Powerupcloud Technologies.


Why Log Management?
Every day, the components in your software stack store operational data in the form of logs. Log data contains valuable information about the performance, stability, and security of your infrastructure.

The challenge of logging is in collecting this information, extracting it, and using it to make actionable decisions.

Log management is about more than just collecting log files. It includes centralizing logs, parsing events into individual data points, storing them properly for future use, and analyzing the results.

This provides a more detailed look into system performance, as well as the ability to immediately detect problems or anomalies. Ultimately, it provides you with the insights needed to improve your operations.

Importance of Log Collection:

When you have hundreds or thousands of services running on multiple servers in multiple locations, manually searching through log files is time-consuming, and remembering every log path is a burden.

A centralized log management system will aggregate and store logs, allowing you to access, search, and manage event logs from a single location.

Many tools in the market can handle this kind of process, but since our entire infrastructure runs on AWS, we chose to leverage native cloud features.

We designed our log management solution using recently launched AWS features, which makes it more cost-effective and easier to manage.

AWS services we have used in our design:

  1. S3
  2. SSM
  3. CloudWatch

We will describe each service in detail further in this blog.

In this blog, we describe a Windows log management strategy using a Windows PowerShell script.

Let’s start with our solution:

Before starting, let’s be ready with the below prerequisites:

1. Create an EC2 role and attach the below policies:

  1. EC2 role for SSM.
  2. S3 bucket policy scoped to the particular bucket, as per best practices.
  3. CloudWatch role.
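The S3 portion of the instance role should be scoped to the backup bucket only. A minimal sketch of such a policy, assuming a placeholder bucket name `log-backup-bucket` (the managed policy `AmazonSSMManagedInstanceCore` covers the SSM side):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBackupBucketAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::log-backup-bucket",
        "arn:aws:s3:::log-backup-bucket/*"
      ]
    }
  ]
}
```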

2. The below application needs to be installed on the servers, for moving the backups from the server to S3.

To compress the log files before moving them to the S3 bucket, we can create a zip file using any zip utility. In our solution, we used 7-Zip.

While installing, select the default installation path: C:\Program Files\

3. Create one S3 bucket to hold the backup script, the input file, and the stored logs.

NOTE: We can install these packages on the server using the SSM Run Command if SSM is already in place, or by logging into the server manually.
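As a sketch of the Run Command route, the parameters below push a silent 7-Zip install through the AWS-RunPowerShellScript document. The instance ID, installer version, and download URL are placeholder assumptions, not values from the post:

```json
{
  "DocumentName": "AWS-RunPowerShellScript",
  "InstanceIds": ["i-0123456789abcdef0"],
  "Parameters": {
    "commands": [
      "Start-BitsTransfer -Source 'https://www.7-zip.org/a/7z1900-x64.msi' -Destination 'C:\\Temp\\7z.msi'",
      "msiexec /i C:\\Temp\\7z.msi /qn"
    ]
  }
}
```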

Let’s get started with our implementation phase:

First, we will prepare a CSV input file listing the server details, log path, retention period, and backup location. The script uses these details to collect the log files from the given servers and move them to a single location, as per the backup location mentioned in the input file.

CSV file content description:

● Instance ID: < mention the instance ID of the servers you want to back up >

● Log path: < mention the log path to take backup >

● Retention period: < mention the number of days the files are held on the server >

● S3 bucket: < mention the bucket name and path; after the bucket name, mention the folder names you want >

● Local backup: < mention the path on the local server where you want to store the backup logs before they are moved to S3 >
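Putting these fields together, the input file might look like the following. The header names and values are illustrative assumptions based on the field descriptions above, not the exact file used in the post; the short Python sketch only shows how such a file parses:

```python
import csv
import io

# Illustrative CSV content; header names are an assumption.
sample = """InstanceId,LogPath,RetentionDays,S3Bucket,LocalBackup
i-0123456789abcdef0,C:\\inetpub\\logs\\LogFiles,7,s3://log-backup-bucket/iis-logs,D:\\LogBackup
"""

# Each row gives the script one server's log path, retention, and targets.
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row["InstanceId"], row["LogPath"], row["RetentionDays"])
```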

Log Management Implementation

We can implement this log management using SSM in two ways: the first is to execute the log backup manually from the AWS SSM console, and the second is to schedule the backup job like a cron service.

1. AWS Systems Manager Run Command from the AWS SSM console:

We configure the inputs (command document and command parameters) and select the target to execute the task. If the task needs to run only once, the AWS SSM console is the way to go.

2. AWS CloudWatch Events scheduler with SSM Agent: an AWS-managed cron service to back up the logs directly to the S3 bucket. Choose Cron expression and specify a cron expression that defines when the task is to be triggered.

Now, we will deep-dive into our first solution: using the AWS Systems Manager console to move the logs to a centralized location, i.e., the S3 bucket created earlier.

AWS Systems Manager Run Command:
AWS Systems Manager Run Command lets you remotely and securely manage the configuration of your managed instances. A managed instance is an Amazon EC2 instance or on-premises machine in your hybrid environment that has been configured for Systems Manager. Run Command enables you to automate common administrative tasks. You can use Run Command from the AWS console, the AWS Command Line Interface, AWS Tools for Windows PowerShell, or the AWS SDKs. Run Command is offered at no additional cost.

Select AWS-RunRemoteScript under Command document.
Using the AWS-RunRemoteScript document, we will execute the script on SSM-managed servers.

Command Parameters:

Please find the image below for the SSM Run Command parameters configuration.

Explanation of command parameters

1. Source Type

(Required) Specify the source type. There are two source types available here: GitHub and S3. Since we download the script file from S3, select S3.

2. Source Info

(Required) Specify the information required to access the resource from the source. If the source type is GitHub, then you can specify any of the following: ‘owner’, ‘repository’, ‘path’, ‘getOptions’, ‘tokenInfo’. If the source type is S3, then you can specify ‘path’.

3. Command Line

(Required) Specify the command line to be executed. The following formats of commands can be run: ‘pythonMainFile.py argument1 argument2’, ‘ansible-playbook -i “localhost,” -c local example.yml’

4. Working Directory

(Optional) The path where the content will be downloaded and executed from on your instance.

5. Execution Timeout

(Optional) The time in seconds for a command to complete before it is considered to have failed. The default is 3600 (1 hour). The maximum is 28800 (8 hours).
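The same Run Command invocation can be driven from the AWS SDK instead of the console. A minimal sketch, assuming boto3; the helper only builds the `send_command` arguments covering the five parameters above (bucket, key, and instance ID are placeholders), and the commented-out call shows how they would be used:

```python
def build_run_remote_script(instance_ids, bucket, key, command_line):
    """Build kwargs for ssm.send_command with the AWS-RunRemoteScript document."""
    return {
        "DocumentName": "AWS-RunRemoteScript",
        "InstanceIds": instance_ids,
        "Parameters": {
            "sourceType": ["S3"],
            # Source Info: the S3 'path' of the script to download.
            "sourceInfo": ['{"path": "https://s3.amazonaws.com/%s/%s"}' % (bucket, key)],
            "commandLine": [command_line],
            "workingDirectory": ["C:\\Temp"],  # optional; placeholder path
            "executionTimeout": ["3600"],      # optional; default 1 hour
        },
    }

# Placeholder instance ID, bucket, and key:
params = build_run_remote_script(
    ["i-0123456789abcdef0"],
    "log-backup-bucket",
    "scripts/Logmanagement.ps1",
    "./Logmanagement.ps1",
)
# import boto3
# boto3.client("ssm").send_command(**params)  # requires AWS credentials
```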

Logmanagement.ps1 — PowerShell backup script (check the script at the end of the blog)

Specify the target by specifying a tag or by manually selecting instances.

Write command output to an Amazon S3 bucket

Send the complete execution output to the S3 bucket.

SNS notifications

Configure Systems Manager to send notifications about command statuses using Amazon Simple Notification Service (SNS).
Specify the IAM role and SNS topic, and choose the type of events you want to be notified about, such as Success or Failed.

Finally, run this task. We will see the output in the images below.

After execution, we can check the command status.
Here we can find the task status for each target.

Post-execution, check the output in the S3 bucket:

Let’s explore our second solution, where you can schedule the backup job using the AWS CloudWatch Events scheduler with the SSM Agent, triggered on a schedule. This AWS-managed cron service backs up the logs directly to the S3 bucket: choose Cron expression and specify a cron expression that defines when the task is to be triggered.

To create a rule that triggers on a regular schedule

  1. Open the CloudWatch console.
  2. In the navigation pane, choose Events, then Create rule.
  3. For Event source, choose Schedule.
  4. Choose Fixed rate of and specify how often the task is to run, or choose Cron expression and specify a cron expression that defines when the task is to be triggered. See the AWS documentation for more information about cron expression syntax.

5. For Targets, choose Add Target and choose the AWS service that is to act when an event of the selected type is detected.

6. In the Configure parameter section, enter the information specific to this target type.

7. For many target types, CloudWatch Events needs permissions to send events to the target. In these cases, CloudWatch Events can create the IAM role needed for your event to run:

● To create an IAM role automatically, choose Create a new role for this specific resource.

● Choose Create a new role if you are using the service for the first time; since we already had a role defined, we chose the option Use existing role.

8. Choose Configure details. For Rule definition, type a name and description for the rule.


9. Choose Create rule.
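The console steps above can also be scripted. A minimal sketch, assuming boto3; the helper only builds the `put_rule` arguments (the rule name and the 2 AM daily schedule are placeholder assumptions), and the commented-out calls show usage:

```python
def build_schedule_rule(rule_name, cron_expression):
    """Build kwargs for events.put_rule: an AWS-managed cron trigger."""
    return {
        "Name": rule_name,
        # cron(Minutes Hours Day-of-month Month Day-of-week Year)
        "ScheduleExpression": "cron(%s)" % cron_expression,
        "State": "ENABLED",
    }

# Placeholder schedule: run the backup every day at 02:00 UTC.
rule = build_schedule_rule("daily-log-backup", "0 2 * * ? *")
# import boto3
# events = boto3.client("events")
# events.put_rule(**rule)                               # requires AWS credentials
# events.put_targets(Rule=rule["Name"], Targets=[...])  # SSM Run Command target
```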

Verify the logs in the defined S3 bucket, which we provided in the input file:

Logmanagement.ps1 — PowerShell backup script
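The actual Logmanagement.ps1 is not reproduced here. As an illustration only, the Python sketch below mirrors the logic the post describes: read the CSV input, compress each log path into the local backup directory, and prune source files older than the retention period. All names and column headers are assumptions; the real script uses PowerShell with 7-Zip, and the S3 upload is left as a comment because it needs credentials:

```python
import csv
import os
import time
import zipfile

def backup_logs(csv_path):
    """For each row of the input file, zip the log path into the local
    backup directory and delete source files older than RetentionDays."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            os.makedirs(row["LocalBackup"], exist_ok=True)
            archive = os.path.join(
                row["LocalBackup"], "logs-%s.zip" % time.strftime("%Y%m%d"))
            # Compress the day's logs (real script shells out to 7-Zip).
            with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
                for name in os.listdir(row["LogPath"]):
                    zf.write(os.path.join(row["LogPath"], name), name)
            # boto3.client("s3").upload_file(archive, bucket, key)  # upload step
            # Retention: remove source logs older than RetentionDays.
            cutoff = time.time() - int(row["RetentionDays"]) * 86400
            for name in os.listdir(row["LogPath"]):
                path = os.path.join(row["LogPath"], name)
                if os.path.getmtime(path) < cutoff:
                    os.remove(path)
            yield archive
```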
