Export AWS CloudWatch logs to S3 using lambda functions in Node.js

AWS CloudWatch Logs enables you to centralize the logs from all of your systems, applications, and AWS services that you use, in a single, highly scalable service. You can then easily view them, search them for specific error codes or patterns, filter them based on specific fields, or archive them securely for future analysis.

By default, logs are kept indefinitely and never expire. You can adjust the retention policy for each log group, keep indefinite retention, or choose a retention period between one day and 10 years. You can use CloudWatch Logs to store your log data in highly durable storage. The CloudWatch Logs agent makes it easy.
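
For example, if you want to set a retention period from code instead of the console, a minimal sketch with the Node.js AWS SDK looks like the following (the region and the log group name are placeholders you would replace):

const AWS = require('aws-sdk')
const cloudwatchlogs = new AWS.CloudWatchLogs({ region: 'selected_region' })

// Keep events in this log group for 30 days; CloudWatch Logs deletes older events automatically.
cloudwatchlogs.putRetentionPolicy({
  logGroupName: '/aws/lambda/my-function', // placeholder log group name
  retentionInDays: 30,                     // must be one of the values CloudWatch Logs supports (1, 7, 30, 90, 365, ...)
}, (err) => {
  if (err) console.error(err)
  else console.log('Retention policy updated')
})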

There are two ways to archive the logs:
1. Manual process
2. Automated process

In today’s post, we are going to walk through both of these processes.


Prerequisites:

Before starting, follow the steps below to give CloudWatch Logs permission on the S3 bucket.

1: Log in to your AWS account.

2: Create an Amazon S3 bucket in the same region as the CloudWatch logs.

3: Create an IAM user with full access to Amazon S3 and CloudWatch Logs. To learn more about how to create an AWS S3 bucket and an IAM user, read here.

4: Set Permissions on an Amazon S3 Bucket.
a. By default, all Amazon S3 buckets and objects are private. Only the resource owner, the AWS account that created the bucket, can access the bucket and any objects that it contains. However, the resource owner can choose to grant access permissions to other resources and users by writing an access policy.

{
    "Version": "2012-10-17",
    "Statement": [
      {
          "Action": "s3:GetBucketAcl",
          "Effect": "Allow",
          "Resource": "arn:aws:s3:::bucket_name",
          "Principal": { "Service": "logs.selected_region.amazonaws.com" }
      },
      {
          "Action": "s3:PutObject" ,
          "Effect": "Allow",
          "Resource": "arn:aws:s3:::bucket_name/random_string/*",
          "Condition": { "StringEquals": { "s3:x-amz-acl": "bucket-owner-full-control" } },
          "Principal": { "Service": "logs.selected_region.amazonaws.com" }
      }
    ]
}

By setting the above policy under S3 bucket -> Permissions -> Bucket policy -> Bucket Policy Editor, the bucket owner allows CloudWatch Logs to export log data to the Amazon S3 bucket.
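
If you would rather attach this policy from code than through the console, a sketch using the aws-sdk S3 client is below; it serializes the same policy document shown above, with bucket_name, selected_region, and random_string as the placeholders to replace.

const AWS = require('aws-sdk')
const s3 = new AWS.S3({ region: 'selected_region' })

// The same policy document shown above, serialized as a JSON string.
const policy = JSON.stringify({
  Version: '2012-10-17',
  Statement: [
    {
      Action: 's3:GetBucketAcl',
      Effect: 'Allow',
      Resource: 'arn:aws:s3:::bucket_name',
      Principal: { Service: 'logs.selected_region.amazonaws.com' },
    },
    {
      Action: 's3:PutObject',
      Effect: 'Allow',
      Resource: 'arn:aws:s3:::bucket_name/random_string/*',
      Condition: { StringEquals: { 's3:x-amz-acl': 'bucket-owner-full-control' } },
      Principal: { Service: 'logs.selected_region.amazonaws.com' },
    },
  ],
})

// Attach the policy to the bucket that will receive the exported logs.
s3.putBucketPolicy({ Bucket: 'bucket_name', Policy: policy }, (err) => {
  if (err) console.error(err)
  else console.log('Bucket policy attached')
})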


Manual Process:

Step 1: Go to CloudWatch -> Log groups -> select the log group that you want to export -> Actions -> Export data to Amazon S3.


Step 2: Choose the time range and the S3 bucket name. For the S3 bucket prefix, enter the randomly generated string that you specified in the bucket policy. Click Export, and you will see the logs inside the selected S3 bucket.
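
If you want to confirm the export from code rather than by browsing the console, a small sketch that lists the exported objects under the prefix could look like this (bucket_name and random_string are the same placeholders used in the bucket policy):

const AWS = require('aws-sdk')
const s3 = new AWS.S3({ region: 'selected_region' })

// List the objects CloudWatch Logs wrote under the export prefix.
s3.listObjectsV2({ Bucket: 'bucket_name', Prefix: 'random_string/' }, (err, data) => {
  if (err) console.error(err)
  else data.Contents.forEach((obj) => console.log(obj.Key, obj.Size))
})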


Automated Process:

Step 1: Go to AWS Lambda -> Functions.
AWS Lambda is a serverless compute service that runs your code in response to events and automatically manages the underlying compute resources for you. You can use AWS Lambda to extend other AWS services with custom logic or create your own back-end services that operate at AWS scale, performance, and security.

Step 2: Choose Create function -> Author from scratch.
The code you run on AWS Lambda is called a “Lambda function.” After you create your Lambda function it is always ready to run as soon as it is triggered, similar to a formula in a spreadsheet. Each function includes your code as well as some associated configuration information, including the function name and resource requirements. Lambda functions are “stateless,” with no affinity to the underlying infrastructure, so that Lambda can rapidly launch as many copies of the function as needed to scale to the rate of incoming events.

Step 3: Give the function a name. Choose the runtime as Node.js 14.x and, under permissions, select “Create a new role with basic Lambda permissions”.

Step 4: Once the Lambda function is created, go to Code and paste the following code.

const AWS = require('aws-sdk')

const cloudconfig = {
  apiVersion: '2014-03-28',
  region: 'selected_region', // replace with your region
}

const cloudwatchlogs = new AWS.CloudWatchLogs(cloudconfig)

exports.handler = async (event, context) => {
  const params = {
    destination: 'bucket_name', // replace with your bucket name
    from: new Date().getTime() - 86400000, // start of range: 24 hours ago, in milliseconds
    logGroupName: 'log-name', // replace with your log group name
    to: new Date().getTime(), // end of range: now
    destinationPrefix: 'random_string', // replace with the random string used in the S3 bucket policy
  };

  try {
    const data = await cloudwatchlogs.createExportTask(params).promise()
    console.log(data)
    return {
      statusCode: 200,
      body: data,
    };
  } catch (err) {
    console.error(err)
    return {
      statusCode: 501,
      body: err,
    };
  }
}

In the above code, we create a new CloudWatchLogs client instance and call createExportTask. Note that the function’s execution role also needs the logs:CreateExportTask permission; the basic role created in Step 3 only allows the function to write its own logs.

Parameter description:

i. destination: The name of the S3 bucket for the exported log data. The bucket must be in the same AWS region as the log group.

ii. destinationPrefix: The prefix used as the start of the key for every object exported. If you don’t specify a value, the default is exportedlogs.

iii. from: The start time of the range for the request, expressed as the number of milliseconds after Jan 1, 1970, 00:00:00 UTC. Events with a timestamp earlier than this time are not exported.

iv. logGroupName: The name of the log group.

v. logStreamNamePrefix: Export only log streams that match the provided prefix. If you don’t specify a value, no prefix filter is applied. It’s an optional parameter.

vi. taskName: The name of the export task. It’s an optional parameter.

vii. to: The end time of the range for the request, expressed as the number of milliseconds after Jan 1, 1970, 00:00:00 UTC. Events with a timestamp later than this time are not exported.
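
Putting these together, a fuller params object that also uses the optional parameters might look like the sketch below (every value here is a placeholder you would replace):

const params = {
  destination: 'bucket_name',               // S3 bucket in the same region as the log group
  destinationPrefix: 'random_string',       // prefix from the bucket policy; defaults to exportedlogs
  logGroupName: 'log-name',                 // log group to export
  logStreamNamePrefix: 'my-stream-prefix',  // optional: export only log streams matching this prefix
  taskName: 'daily-log-export',             // optional: a name for the export task
  from: new Date().getTime() - 86400000,    // start of range: 24 hours ago, in ms since epoch
  to: new Date().getTime(),                 // end of range: now
};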

If you test the above function, it starts the export task and returns a taskId in the response.
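
The export runs asynchronously, so the objects may take a while to appear in S3. One way to check on the task, reusing the cloudwatchlogs client from the function above, is describeExportTasks (a sketch; the taskId comes from the createExportTask response):

// Check the status of an export task started by createExportTask.
const checkExportTask = async (taskId) => {
  const result = await cloudwatchlogs.describeExportTasks({ taskId }).promise()
  console.log(result.exportTasks[0].status) // e.g. { code: 'RUNNING' } or { code: 'COMPLETED' }
}

Note that CloudWatch Logs allows only one active (pending or running) export task per account at a time, so starting a second export while one is still running will fail.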

Step 5: Click “Add trigger” and choose “EventBridge”.

To run the above function automatically, we need to add a trigger event. Amazon EventBridge is a serverless event bus that makes it easier to build event-driven applications.


Choose a rule name and enter a description. The schedule expression acts like a cron job, triggering the event whenever the expression matches. We are going to set a rate of one day (rate(1 day)), which invokes the Lambda function once every day and matches the 24-hour window used in the code above.
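
If you prefer to create the schedule from code instead of the console, a rough sketch using the aws-sdk EventBridge client is shown below; the rule name and the function ARN are placeholders you would replace.

const AWS = require('aws-sdk')
const eventbridge = new AWS.EventBridge({ region: 'selected_region' })

const createSchedule = async () => {
  // A rule that fires once a day.
  await eventbridge.putRule({
    Name: 'daily-log-export-rule',     // placeholder rule name
    ScheduleExpression: 'rate(1 day)',
    State: 'ENABLED',
  }).promise()

  // Point the rule at the Lambda function created above.
  await eventbridge.putTargets({
    Rule: 'daily-log-export-rule',
    Targets: [{
      Id: 'export-logs-lambda',
      Arn: 'arn:aws:lambda:selected_region:123456789012:function:export-logs', // placeholder function ARN
    }],
  }).promise()
}

Unlike the console flow, creating the rule this way does not automatically grant EventBridge permission to invoke the function; that requires a separate lambda.addPermission call.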


Thanks for reading.
