Generic Webhook Cloud Collector
Migrate the Webhook Cloud Collector
The Webhook Cloud Collector was first introduced for early access on the Exabeam Cloud Connectors platform.
With Exabeam Cloud Collectors, the Webhook Cloud Collector is now available for the Exabeam Security Operations Platform. The new cloud collector enables you to ingest logs into the Exabeam Security Operations Platform and use the Exabeam Search to find specific events in those logs. Migration to the new app is recommended.
If you previously used the early access Webhook Cloud Connector, and want to take advantage of the new Cloud Collectors service, you must first migrate the SaaS cloud connector. Before you migrate, consider the following:
| | Early Access Webhook Cloud Connector | Webhook Cloud Collector |
|---|---|---|
| Access and security | Basic authentication | Token-based authentication |
| Management | No user interface; provisioning via Exabeam Support | Self-service onboarding and troubleshooting |
| Performance | Up to 500 GB | Unlimited EPS |
| Vendor support | | For vendors and formats that are not yet supported, it is recommended to remain on the Cloud Connectors platform. Products that use an HEC format are not yet supported. |
| License requirements | Fusion license | No additional license is required; the Cloud Collectors app is included with your existing license. |
Note
Both the SaaS Cloud Connectors and the new Cloud Collectors environments can run in parallel.
When you are ready to migrate, complete the Prerequisites to Configure the Webhook Cloud Collector and follow the steps to Configure the Webhook Cloud Collector.
Prerequisites to Configure the Webhook Cloud Collector
Before you configure the Webhook Cloud Collector to ingest application events, complete the following prerequisites.
Ensure that your platform:
Supports data forwarding to a webhook. A webhook pushes the vendor's data to the Exabeam HTTP endpoint, whereas the API-based Cloud Collectors pull data from the external vendor API. For more information, refer to the documentation specific to your cloud platform or contact the support team.
Allows you to configure a standard OAuth 2.0 token. The Splunk HEC token is not supported.
Supports one of the following formats for batching multiple events in a single HTTP request: a JSON single object, a JSON array with one or more objects (compressed or uncompressed data), or raw data in any format separated by newlines (compressed or uncompressed data).
Note
Each batch request is limited to 32 MB and two minutes. For optimal performance, batch as many messages as possible within a single HTTP POST request, up to the 32 MB limit. Use the Auto Parser Generator to verify the parsing status and to develop new parsers, because preconfigured content support is not available.
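For example, a gzipped JSON array batch of the kind described above can be assembled with standard shell tools. This is a minimal sketch; the event contents and file names are illustrative, not a required schema.

```shell
# Assemble three illustrative events into a single JSON array batch.
printf '[%s,%s,%s]' \
  '{"user":"alice","action":"login"}' \
  '{"user":"alice","action":"logout"}' \
  '{"user":"bob","action":"login"}' > batch.json

# Compress the batch; the collector accepts compressed and uncompressed data.
gzip -f batch.json

# batch.json.gz now holds one HTTP request body, well under the 32 MB limit.
```

Batching many events per request this way reduces HTTP overhead compared with posting one event at a time.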
For more information about the Exabeam regions supported for deployment and the relevant instance URL for which you can deploy the Webhook Cloud Collector, see Supported Regions.
Configure the Webhook Cloud Collector
The Webhook Cloud Collector uses a token-based authentication model: you configure a logging library or an HTTP client with a token to send data to Exabeam in a specific format. To set up the cloud collector:
Complete the Prerequisites to Configure the Webhook Cloud Collector.
Log in to the Exabeam Security Operations Platform with your registered credentials as an administrator.
Navigate to Collectors > Cloud Collectors.
Click New Collector.
Click Webhook.
Specify a name for the Cloud Collector instance.
Select the format for receiving data, either JSON or RAW.
JSON – Use the JSON format to ingest a cloud log source that forwards logs in JSON format: a JSON single object, or a JSON array with one or more objects (compressed or uncompressed data).
Example 1: If the collector receives data in the JSON Array format as follows:
[{"field1": "abc", "field2": "cde"},{"field1": "mne", "field2": "tst"}]
The collector extracts the events in the following format:
Event#1: {"field1": "abc", "field2": "cde"}
Event#2: {"field1": "mne", "field2": "tst"}
Example 2: If the collector receives data in the JSON single object format as follows:
{"field1": "abc", "field2": "cde"}
The collector extracts the events in the following format:
{"field1": "abc", "field2": "cde"}
Example 3: If the collector receives data in the JSON single object format as follows:
{ "field1": "abc", "field2": "cde" }
The collector extracts the events in the following format:
{ "field1": "abc", "field2": "cde" }
RAW – Use the RAW format to ingest a cloud log source that forwards raw logs delimited by newlines. For example, the collector receives data in the following format:
raw-event-1
raw-event-2
The collector extracts data in the following format:
Event#1: raw-event-1
Event#2: raw-event-2
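The newline-delimited RAW payload shown above can be produced like this (a sketch; the event contents and file name are placeholders):

```shell
# Write two raw events, one per line, as the RAW format expects.
printf 'raw-event-1\nraw-event-2\n' > raw-batch.txt

# Optionally compress the file; the collector accepts both forms.
# -k keeps the original file alongside raw-batch.txt.gz (gzip 1.6+).
gzip -kf raw-batch.txt
```

The collector splits this payload on newlines, so each line becomes one event.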
(Optional) SITE – Select an existing site, or click manage your sites to create a new site with a unique ID. Adding a site name helps you ensure efficient management of environments with overlapping IP addresses.
By entering a site name, you associate the logs with a specific independent site. A sitename metadata field is automatically added to all events ingested through this collector. For more information about site management, see Define a Unique Site Name.
(Optional) TIMEZONE – Select a time zone applicable to you for accurate detections and event monitoring.
By entering a time zone, you override the default log time zone. A timezone metadata field is automatically added to all events ingested through this collector.
Click Install.
A message displays the authentication token and the URL to which logs are sent.
For all webhook-based cloud collectors, there is a five-minute latency before the logs are tagged with the updated site name.
Copy the authentication token and URL.
Record both for later use when you configure the Webhook on your external vendor.
To view the cloud collector summary, click Go to Overview. If you want to add more cloud collector instances, click Add more collectors.
The Overview tab displays the Webhook Cloud Collector instance that you installed.
Proceed to configure the Webhook for your external vendor to forward logs to Exabeam Security Operations Platform.
For instructions on how to configure log forwarding, refer to the specific documentation for your vendor.
To verify the Webhook instance configuration, run the following commands.
RAW – Use the following command to verify sending a gzipped file to the Webhook:
echo "\"hello world\"" > sample.txt && gzip sample.txt
curl --location --request POST 'https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/raw' --header 'Content-Encoding: gzip' --header 'Content-Type: application/gzip' --header "Authorization: Bearer <TOKEN>" --data-binary "@./sample.txt.gz"
Where <REGION> is the region of your cloud collectors instance (for example, us-east, as in https://api2.us-east.exabeam.cloud) and <TOKEN> is your API token.
JSON – Use the following command to verify sending a gzipped JSON file to the Webhook:
echo "[{\"message\":\"Test Message 1\"}, {\"message\":\"Test Message 2\"}, {\"message\":\"Test Message 3\"}]" > sample.json && gzip sample.json
curl --location --request POST 'https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/json' --header 'Content-Encoding: gzip' --header 'Content-Type: application/gzip' --header "Authorization: Bearer <TOKEN>" --data-binary "@./sample.json.gz"
Where <REGION> is the region of your cloud collectors instance (for example, us-east, as in https://api2.us-east.exabeam.cloud) and <TOKEN> is your API token.
For both RAW and JSON, you can also test using an uncompressed file.
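An uncompressed JSON test could look like the following sketch. The payload is illustrative; <REGION> and <TOKEN> are the placeholders from the steps above, and the curl command is shown commented out because it requires your actual instance URL and token.

```shell
# Build a small uncompressed JSON test payload (a single-element JSON array).
printf '[{"message":"Uncompressed test message"}]' > uncompressed.json

# POST it without the Content-Encoding: gzip header, for example:
# curl --location --request POST 'https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/json' \
#   --header 'Content-Type: application/json' \
#   --header "Authorization: Bearer <TOKEN>" \
#   --data-binary '@./uncompressed.json'
```

The only difference from the gzipped variant is that the file is sent as-is and the Content-Encoding header is omitted.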
A response of HTTP/2 200 indicates that the configuration is successful.
Review any errors after running the test commands.
Resolve any identified issues based on the return code and then retry the command.
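As a sketch of that return-code triage, the status from the test command can be captured and mapped to a likely cause. The helper below is hypothetical, and the mappings follow standard HTTP semantics rather than an official Exabeam error reference; the 413 case reflects the 32 MB batch limit noted in the prerequisites.

```shell
# Hypothetical helper: interpret the HTTP status code returned by the webhook.
diagnose_status() {
  case "$1" in
    200) echo "OK: events accepted" ;;
    401|403) echo "Authentication failed: check the bearer token" ;;
    413) echo "Payload too large: keep batches under the 32 MB limit" ;;
    *) echo "Unexpected status $1: review the error details and retry" ;;
  esac
}

# Capture the code from a test POST and interpret it, for example:
# STATUS=$(curl -s -o /dev/null -w '%{http_code}' ...test command... )
# diagnose_status "$STATUS"
diagnose_status 200   # prints "OK: events accepted"
```

Using curl's `-w '%{http_code}'` write-out keeps the script logic separate from curl's verbose output.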
If you still aren't receiving data, check the status of the cloud collector in the Cloud Collectors app.
The status should be Running and not Error or Stopped. If the collector is in an error state, observe the volume metrics graph and the error messages section for the cloud collector. Then view the error details and follow the recommended mitigation steps.
Troubleshoot the Generic Webhook Cloud Collector
The following topics describe common issues with the Generic Webhook Cloud Collector and how to troubleshoot them.
The Generic Webhook Cloud Collector is not receiving data
If the cloud collector is not receiving data, the last log you received appears empty or outdated. To troubleshoot this issue:
Verify that the external vendor meets the requirements to support the Webhook Cloud Collector; refer to the Prerequisites section.
Create a payload by using the following command, as the Webhook forwards logs in GZIP format.
echo "Webhook sample log" > sample.txt && gzip sample.txt
Run the following curl command to push data and check the return code.
curl <UI URL> -H "Authorization: Bearer <Token>" -d @path/sample.txt.gz -v
Based on the return code, resolve the identified issue.
If you still aren't receiving data, check the status of the cloud collector in the Cloud Collectors app. The status should be Running and not Error or Stopped. If the collector is in an error state, observe the volume metrics graph and the error messages section for the cloud collector. Then view the error details and follow the recommended mitigation steps.