Amazon S3 as log source
For Amazon S3, Centralized Logging with OpenSearch will ingest logs in a specified S3 location continuously or perform one-time ingestion. You can also filter logs based on S3 prefix or parse logs with custom Log Config.
This article guides you through creating a log pipeline that ingests logs from an S3 bucket.
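Conceptually, an S3 prefix filter selects only the object keys that begin with the given string. A minimal sketch (bucket layout and key names are hypothetical):

```python
# Illustrative only: a prefix filter keeps keys that start with the prefix.
keys = [
    "app-logs/2024/01/app.log",
    "app-logs/2024/02/app.log",
    "other/2024/01/other.log",
]

prefix = "app-logs/"
selected = [k for k in keys if k.startswith(prefix)]
print(selected)  # only the two keys under app-logs/ would be ingested
```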
Create a log analytics pipeline (OpenSearch Engine)
Prerequisites
Create log analytics pipeline
- Sign in to the Centralized Logging with OpenSearch Console.
- In the left sidebar, under Log Analytics Pipelines, choose Application Log.
- Choose Create a pipeline.
- Choose Amazon S3 as Log Source, choose Amazon OpenSearch, and choose Next.
- Choose the Amazon S3 bucket where your logs are stored. Optionally, enter a Prefix filter.
- Choose the Ingestion mode based on your needs: select On-going to ingest logs continuously, or One-time to ingest logs only once.
- Specify the Compression format if your log files are compressed, and choose Next.
You have created a log source for the log analytics pipeline. Now you are ready to make further configurations for the log analytics pipeline with Amazon S3 as log source.
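The Compression format setting above tells the pipeline how to decode compressed objects. As a minimal sketch, assuming gzip-compressed log files (the sample log line below is hypothetical):

```python
import gzip
import io

# Illustrative: decompressing a gzip log object before parsing, as a
# pipeline would when Compression format is set to gzip.
raw = '127.0.0.1 - - [10/Oct/2024:13:55:36 +0000] "GET / HTTP/1.1" 200 512\n'
compressed = gzip.compress(raw.encode("utf-8"))

with gzip.open(io.BytesIO(compressed), "rt", encoding="utf-8") as f:
    for line in f:
        print(line.strip())
```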
- Select a log config. If you do not find the desired log config in the drop-down list, choose Create New. Refer to Log Config for more information.
- Choose Next.
- Specify the Index name in lowercase.
- In the Specify OpenSearch domain section, select an imported domain for Amazon OpenSearch domain.
- In the Log Lifecycle section, enter the number of days to manage the Amazon OpenSearch Service index lifecycle. Centralized Logging with OpenSearch automatically creates the associated Index State Management (ISM) policy for this pipeline.
- In the Log processor settings section, choose the Log processor type, configure the Lambda concurrency if needed, and then choose Next.
- Enable Alarms if needed and select an existing SNS topic. If you choose Create a new SNS topic, provide a name and an email address for the new SNS topic.
- Add tags if needed.
- Choose Create.
- Wait for the application pipeline to reach the "Active" state.
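The ISM policy is generated for you by the solution, so you do not need to write one. For orientation only, a hand-written OpenSearch ISM policy that deletes indexes after 30 days looks roughly like the following (the actual policy the solution creates may differ):

```json
{
  "policy": {
    "description": "Illustrative only: delete indexes older than 30 days",
    "default_state": "hot",
    "states": [
      {
        "name": "hot",
        "actions": [],
        "transitions": [
          { "state_name": "delete", "conditions": { "min_index_age": "30d" } }
        ]
      },
      {
        "name": "delete",
        "actions": [ { "delete": {} } ],
        "transitions": []
      }
    ]
  }
}
```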
Create a log analytics pipeline (Light Engine)
Create a log analytics pipeline
- Sign in to the Centralized Logging with OpenSearch Console.
- In the left sidebar, under Log Analytics Pipelines, choose Application Log.
- Choose Create a pipeline.
- Choose Amazon S3 as Log Source, choose Light Engine, and choose Next.
- Choose the Amazon S3 bucket where your logs are stored. Optionally, enter a Prefix filter.
- Choose the Ingestion mode based on your needs. To ingest logs continuously, select On-going.
You have created a log source for the log analytics pipeline. Now you are ready to make further configurations for the log analytics pipeline with Amazon S3 as log source.
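In the steps below you will select an S3 bucket for partitioned logs. As a conceptual sketch, partitioned log storage typically uses Hive-style time-based key paths; the table name and path layout here are illustrative, not the solution's exact layout:

```python
from datetime import datetime, timezone

def partition_key(table: str, ts: datetime) -> str:
    """Build an illustrative Hive-style partition path for a log record."""
    return (
        f"{table}/year={ts.year:04d}/month={ts.month:02d}/"
        f"day={ts.day:02d}/hour={ts.hour:02d}/"
    )

ts = datetime(2024, 10, 10, 13, tzinfo=timezone.utc)
print(partition_key("app_log_table", ts))
# app_log_table/year=2024/month=10/day=10/hour=13/
```

Partitioning by time keeps each query scanning only the slices of data it needs.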
- Select a log config. If you do not find the desired log config in the drop-down list, choose Create New. Refer to Log Config for more information.
- Choose Next.
- In the Specify Light Engine Configuration section, if you want to ingest the associated templated Grafana dashboards, select Yes for the sample dashboard.
- Choose an existing Grafana, or, if you need to import a new one, go to Grafana for configuration.
- Select an S3 bucket to store partitioned logs and define a name for the log table. A predefined table name is provided, but you can modify it according to your business needs.
- The log processing frequency is set to 5 minutes by default, with a minimum processing frequency of 1 minute.
- In the Log Lifecycle section, enter the log merge time and log archive time. Default values are provided, but you can adjust them based on your business requirements.
- Select Next.
- Enable Alarms if needed and select an existing SNS topic. If you choose Create a new SNS topic, provide a name and an email address for the new SNS topic.
- Add tags if needed.
- Select Create.
- Wait for the application pipeline to reach the "Active" state.
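The log merge time and log archive time set earlier divide a log partition's life into stages by age. A conceptual sketch of that classification (the thresholds and state names here are hypothetical, not the solution's internals):

```python
def lifecycle_state(age_days: int, merge_after: int = 7, archive_after: int = 30) -> str:
    """Classify a log partition by age in days.

    merge_after and archive_after are illustrative stand-ins for the
    log merge time and log archive time settings.
    """
    if age_days >= archive_after:
        return "archive"
    if age_days >= merge_after:
        return "merge"
    return "active"

print(lifecycle_state(2))   # active
print(lifecycle_state(10))  # merge
print(lifecycle_state(45))  # archive
```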