Configure the AWS CloudWatch Cloud Collector
Set up the AWS CloudWatch Cloud Collector to continuously ingest AWS service events from CloudTrail, CloudWatch Logs, and Lambda logs.
The configuration workflow includes the following tasks:

1. Configure the AWS CloudWatch Cloud Collector on the Exabeam Security Operations Platform.
2. Create a Firehose stream on the Amazon Data Firehose console.
3. Create a log group and set up subscription filters on the CloudWatch console.
Configure the AWS CloudWatch Cloud Collector
Before you configure the AWS CloudWatch Cloud Collector, ensure that you complete the prerequisites.
Log in to the Exabeam Security Operations Platform with your registered credentials as an administrator.
Navigate to Collectors > Cloud Collectors.
Click New Collector.
Click AWS CloudWatch.
Enter the following information for the cloud collector.
NAME – Specify a name for the Cloud Collector instance.
Format – Displays JSON, the format in which logs are collected from Amazon Data Firehose.
(Optional) SITE – Select an existing site, or click manage your sites to create a new site with a unique ID. Adding a site name helps ensure efficient management of environments with overlapping IP addresses.
By entering a site name, you associate the logs with a specific independent site. A sitename metadata field is automatically added to all events ingested through this collector. For more information about site management, see Define a Unique Site Name.
(Optional) TIMEZONE – Select a time zone applicable to you for accurate detections and event monitoring.
By entering a time zone, you override the default log time zone. A timezone metadata field is automatically added to all events ingested through this collector.
Click Install.
A confirmation message informs you that the new Cloud Collector is created. Record the authentication token for use when you create the Amazon Data Firehose stream. Record the token immediately after configuration, because it cannot be retrieved later.
The confirmation message window also displays the /firehose endpoint to which logs are sent. Record this URL for use when you create the Amazon Data Firehose stream.
Create a Firehose Stream
To stream all the CloudWatch logs to Amazon Data Firehose, create a Firehose stream on the Amazon Data Firehose console using the following steps.
On the Amazon Data Firehose console, click Create Firehose stream.
Select the Source as Direct PUT and the Destination as HTTP Endpoint.
Specify a name for the Firehose stream.
In the Destination settings section, paste the HTTP endpoint URL that you obtained after configuring the AWS CloudWatch Cloud Collector.
In the Access key box, paste the token that you obtained while configuring the collector.
Retain the default settings for Content encoding. By default, CloudWatch sends logs to Data Firehose in GZIP compressed format.
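As an aside, the GZIP content encoding can be illustrated from any shell. The payload below is a made-up stand-in for a CloudWatch Logs subscription message, not real collector output; it only shows why the receiving endpoint must decompress what Firehose delivers:

```shell
# CloudWatch Logs compresses subscription data with GZIP before handing it
# to Firehose; the receiving HTTP endpoint decompresses it on arrival.
# Sample payload (field names mirror the CloudWatch Logs subscription format):
payload='{"messageType":"DATA_MESSAGE","logGroup":"/demo/group","logEvents":[{"message":"hello"}]}'
printf '%s' "$payload" | gzip -c > record.gz   # compressed, as delivered
gunzip -c record.gz                            # decompressed by the endpoint
```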
In Retry duration, specify how long Firehose retries sending data to the selected endpoint if delivery fails. If the error persists after this interval, the Destination Error Logs section in Firehose shows the error details.
In the Buffer hints section, specify the buffer size for the collected data, and specify the buffer interval in seconds, which determines how long Firehose waits to collect data from CloudWatch before delivering it.
In the Backup settings section, specify whether to send only failed data or all data to the S3 bucket. It is recommended that you select Failed data only, so that only data that failed to reach the HTTP endpoint is sent from Firehose to the S3 bucket.
In S3 backup bucket, browse to the S3 bucket in which you want to back up the data.
Retain the other default settings and click Create Firehose stream.
For more information, see Create a Firehose stream in the AWS documentation.
The Firehose stream is created. Click the stream that you created to view its details.
Proceed to create a CloudWatch Log group.
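If you prefer the AWS CLI to the console, the stream above can also be created with `aws firehose create-delivery-stream`. This is a sketch, not a definitive configuration: the endpoint URL, token, role, bucket, and the `exabeam-cloudwatch` endpoint name are placeholder assumptions you must replace with your own values, and the retry and buffer settings are examples.

```shell
# Sketch: create a Direct PUT Firehose stream with an HTTP endpoint
# destination. Replace every <PLACEHOLDER> with your own values.
aws firehose create-delivery-stream \
  --delivery-stream-name <FIREHOSE_DELIVERY_STREAM_NAME> \
  --delivery-stream-type DirectPut \
  --http-endpoint-destination-configuration '{
    "EndpointConfiguration": {
      "Url": "<EXABEAM_FIREHOSE_ENDPOINT_URL>",
      "Name": "exabeam-cloudwatch",
      "AccessKey": "<COLLECTOR_TOKEN>"
    },
    "RequestConfiguration": {"ContentEncoding": "GZIP"},
    "RetryOptions": {"DurationInSeconds": 300},
    "BufferingHints": {"SizeInMBs": 1, "IntervalInSeconds": 60},
    "S3BackupMode": "FailedDataOnly",
    "S3Configuration": {
      "RoleARN": "arn:aws:iam::<ACCOUNT_NO>:role/<FIREHOSE_S3_ROLE>",
      "BucketARN": "arn:aws:s3:::<BACKUP_BUCKET_NAME>"
    }
  }'
```

The `S3Configuration` role is the role Firehose assumes to write backups to S3; it is separate from the CWLtoFirehoseRole created later for CloudWatch Logs.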
Configure CloudWatch Logs to Send Data to Firehose
To configure CloudWatch Logs to send data to Amazon Data Firehose, use the following steps.
Create an IAM role named CWLtoFirehoseRole that grants CloudWatch Logs permission to stream data into your Amazon Data Firehose delivery stream.
On the AWS console, navigate to the IAM service and click Policies.
Create a new policy.
Select the policy editor as JSON and paste the following policy details.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Statement1",
      "Effect": "Allow",
      "Action": [
        "firehose:PutRecord",
        "firehose:PutRecordBatch"
      ],
      "Resource": [
        "arn:aws:firehose:<REGION>:<ACCOUNT_NO>:deliverystream/<FIREHOSE_DELIVERY_STREAM_NAME>"
      ]
    }
  ]
}
```
Replace <REGION> with your AWS Region, <ACCOUNT_NO> with your 12-digit account number without dashes, and <FIREHOSE_DELIVERY_STREAM_NAME> with the name of your Firehose delivery stream.
Specify a name for your policy, such as PermissionsForCWL, and click Create policy.
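The same policy can be created from the AWS CLI. The sketch below assumes you saved the JSON policy document above as a local file named firehose-put-policy.json; that file name is an assumption, not a required convention.

```shell
# Sketch: create the Firehose permissions policy from a local JSON file.
aws iam create-policy \
  --policy-name PermissionsForCWL \
  --policy-document file://firehose-put-policy.json
```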
To create a role using a custom trust policy, on the AWS Management Console, navigate to Roles.
Select Create role.
Select the Custom trust policy role type and paste the following policy details. For more information, see Creating an IAM role and Creating IAM policies in the AWS documentation.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Statement1",
      "Effect": "Allow",
      "Principal": {
        "Service": "logs.<REGION>.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```
In the Add permissions policies section, select the name of the policy that you created, such as PermissionsForCWL.
Review the policy details, and in the Role name section, specify a name: CWLtoFirehoseRole.
Click Create role.
Use this role while creating a Subscription Filter.
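The role creation can likewise be sketched with the AWS CLI. This assumes the custom trust policy above is saved locally as cwl-trust-policy.json (an assumed file name) and that the PermissionsForCWL policy already exists in your account:

```shell
# Sketch: create the role with the custom trust policy, then attach the
# permissions policy created earlier. Replace <ACCOUNT_NO> with your own.
aws iam create-role \
  --role-name CWLtoFirehoseRole \
  --assume-role-policy-document file://cwl-trust-policy.json

aws iam attach-role-policy \
  --role-name CWLtoFirehoseRole \
  --policy-arn arn:aws:iam::<ACCOUNT_NO>:policy/PermissionsForCWL
```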
Create a Log Group and Subscription Filter
To set up the CloudWatch log group, use the following steps.
On the Amazon CloudWatch console, navigate to Log groups and click Create log group.
Note
You can use existing log groups and add a subscription filter to each of them. Although you can create multiple log groups with a subscription filter for each, it is recommended to create only one Data Firehose subscription filter per log group. Using multiple subscription filters may lead to data duplication.
In the Log group details section, specify a name for the log group, and enter the required details.
Click Create.
The log group that you created is listed in the Log groups section. You can monitor the logs generated by Lambda or CloudWatch in the log group. For more information, see Working with log groups and log streams in the AWS documentation.
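For completeness, a log group can also be created from the AWS CLI; the name below is a placeholder of your choosing.

```shell
# Sketch: create a CloudWatch Logs log group from the CLI.
aws logs create-log-group --log-group-name <LOG_GROUP_NAME>
```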
Set up Subscription Filters
After you create the log group, you must set up subscription filters. Use the following steps to create the subscription filter.
After creating a log group, in the Log group details section, navigate to the Subscription filters tab.
Click Create Amazon Data Firehose subscription filter.
In the Choose destination section, select the Amazon Data Firehose stream that you created.
In the Grant permission section, you can select the appropriate IAM role. For example:
CWLtoFirehoseRole
In the Configure log format and filters section, select the Log format as Other to stream all logs from CloudWatch to Data Firehose regardless of their format. To narrow down the log volume, you can select other log formats based on your requirements.
Specify a name for the subscription filter.
Click Test pattern to verify that the logs are streaming, and then click Start streaming.
The subscription filter is created.
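The subscription filter steps above can be sketched with a single AWS CLI call. The filter name is an example; the empty `--filter-pattern` matches all log events, which corresponds to selecting the Other log format in the console:

```shell
# Sketch: subscribe a log group to the Firehose stream. Replace every
# <PLACEHOLDER> with your own values.
aws logs put-subscription-filter \
  --log-group-name <LOG_GROUP_NAME> \
  --filter-name exabeam-firehose-filter \
  --filter-pattern "" \
  --destination-arn arn:aws:firehose:<REGION>:<ACCOUNT_NO>:deliverystream/<FIREHOSE_DELIVERY_STREAM_NAME> \
  --role-arn arn:aws:iam::<ACCOUNT_NO>:role/CWLtoFirehoseRole
```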
Use this workflow to stream CloudWatch Logs to the AWS CloudWatch Cloud Collector via Amazon Data Firehose using the AWS Management Console.