
Cloud Collectors Administration Guide

Generic Webhook Cloud Collector

Migrate the Webhook Cloud Collector

The Webhook Cloud Collector was first introduced for early access on the Exabeam Cloud Connectors platform.

With Exabeam Cloud Collectors, the Webhook Cloud Collector is now available for the Exabeam Security Operations Platform. The new cloud collector enables you to ingest logs into the Exabeam Security Operations Platform and use the Exabeam Search to find specific events in those logs. Migration to the new app is recommended.

If you previously used the early access Webhook Cloud Connector, and want to take advantage of the new Cloud Collectors service, you must first migrate the SaaS cloud connector. Before you migrate, consider the following:

Access and security

  • Early Access Webhook Cloud Connector: Basic authentication

  • Webhook Cloud Collector: Token-based authentication

Management

  • Early Access Webhook Cloud Connector: No user interface; provisioning via Exabeam Support

  • Webhook Cloud Collector: Self-service onboarding and troubleshooting

Performance

  • Early Access Webhook Cloud Connector: Up to 500 Gb

  • Webhook Cloud Collector: Unlimited EPS

Vendor support

  • Early Access Webhook Cloud Connector: Zscaler ZIA, Palo Alto Networks Cortex Data Lake, and additional vendors using the HEC, JSON, or Raw (multiline) formats

  • Webhook Cloud Collector: For vendors and formats that are not yet supported, it is recommended to remain on the Cloud Connectors platform. Products that use an HEC format are not yet supported.

License requirements

  • Early Access Webhook Cloud Connector: Fusion license

  • Webhook Cloud Collector: No additional license is required. The Cloud Collectors app is included with your existing license.

Note

Both the SaaS Cloud Connectors and the new Cloud Collectors environments can run in parallel.

When you are ready to migrate, complete the Prerequisites to Configure the Webhook Cloud Collector and follow the steps to Configure the Webhook Cloud Collector.

Prerequisites to Configure the Webhook Cloud Collector

Before you configure the Webhook Cloud Collector to ingest application events, complete the following prerequisites.

  • Ensure that your platform:

    • Supports data forwarding to a webhook. A webhook pushes the vendor's data to the Exabeam HTTP endpoint, whereas API-based Cloud Collectors pull data from the external vendor's API. For more information, refer to the documentation specific to your cloud platform or contact the support team.

    • Allows configuring a standard OAuth 2.0 token. Splunk HEC tokens are not supported.

    • Supports one of the following formats for batching multiple events in a single HTTP request (examples of both follow this list):

      • JSON single object, or JSON Array with a single object or multiple objects (compressed or uncompressed data)

      • Raw (any format) separated by newlines (compressed or uncompressed data)

      Note

      Each batch request is restricted to 32 MB and 2 minutes. For optimal performance, batch as many messages as possible within a single HTTP POST request, up to the 32 MB request limit. Use the Auto Parser Generator to verify the parsing status and to develop new parsers, because preconfigured content support is not available.
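
The following shell sketch illustrates the two batching options. The file names and sample events are illustrative placeholders, not values required by the collector.

  # JSON Array batch: multiple events in a single array, gzipped before sending.
  echo '[{"message":"event 1"},{"message":"event 2"}]' > batch.json && gzip batch.json

  # Raw batch: one event per line, any format, gzipped before sending.
  printf 'raw-event-1\nraw-event-2\n' > batch.txt && gzip batch.txt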

For more information about the Exabeam regions supported for deployment and the relevant instance URL for which you can deploy the Webhook Cloud Collector, see Supported Regions.

Configure the Webhook Cloud Collector

The Webhook Cloud Collector uses a token-based authentication model: you configure a logging library or an HTTP client with a token so that it sends data to Exabeam in a supported format.
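
As a quick orientation, the following shell sketch shows that model with placeholder values. The environment variable names, file name, and sample event are illustrative assumptions; the endpoint URL pattern and headers match the verification commands shown in step 14, and the actual token and URL come from your own installation (steps 10 and 11).

  # Placeholders: substitute the token and URL displayed when you install the collector.
  export WEBHOOK_TOKEN="<TOKEN>"
  export WEBHOOK_URL="https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/json"

  # Build a small gzipped JSON batch and post it with the bearer token.
  echo '[{"message":"sample event"}]' > events.json && gzip events.json
  curl --location --request POST "$WEBHOOK_URL" \
    --header 'Content-Encoding: gzip' \
    --header 'Content-Type: application/gzip' \
    --header "Authorization: Bearer $WEBHOOK_TOKEN" \
    --data-binary "@./events.json.gz"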

  1. Complete the Prerequisites to Configure the Webhook Cloud Collector.

  2. Log in to the Exabeam Security Operations Platform with your registered credentials as an administrator.

  3. Navigate to Collectors > Cloud Collectors.

  4. Click New Collector.

  5. Click Webhook.

  6. Specify a name for the Cloud Collector instance.

  7. Select the format for receiving data, either JSON or RAW.

    • JSON – Use JSON format to ingest a cloud log source that can forward logs in JSON format: JSON single object, or JSON Array with a single or multiple objects (compressed and uncompressed data).

      Example 1: If the collector receives data in the JSON Array format as follows:

      [{"field1": "abc", "field2" : "cde"},{"field1": "mne", "field2" : "tst"}]

      The collector extracts the events in the following format:

      Event#1:

      {"field1": "abc", "field2" : "cde"}

      Event#2:

      {"field1": "mne", "field2" : "tst"}
      

      Example 2: If the collector receives data in the JSON single object format as follows:

      {"field1": "abc", "field2" : "cde"}

      The collector extracts the events in the following format:

      {"field1": "abc", "field2" : "cde"}

      Example 3: If the collector receives data in the JSON single object format as follows:

      {
      "field1": "abc",
      "field2" : "cde"
      }

      The collector extracts the events in the following format:

      {
      "field1": "abc",
      "field2" : "cde"
      }
    • RAW – Use RAW format to ingest a cloud log source that can forward raw logs delimited by a newline. For example, the collector receives data in the following format:

      raw-event-1
      raw-event-2
      

      The collector extracts data in the following format:

      Event#1:
      
      raw-event-1 
      
      Event#2:
      
      raw-event-2
      
  8. (Optional) SITE – Select an existing site, or click manage your sites to create a new site with a unique ID. Adding a site name helps you ensure efficient management of environments with overlapping IP addresses.

    By entering a site name, you associate the logs with a specific independent site. A sitename metadata field is automatically added to all events ingested through this collector. For more information about Site Management, see Define a Unique Site Name.

  9. (Optional) TIMEZONE – Select a time zone applicable to you for accurate detections and event monitoring.

    By entering a time zone, you override the default log time zone. A timezone metadata field is automatically added to all events ingested through this collector.

  10. Click Install.

    A message displays the authentication token and the URL to which logs are sent.

    For all Webhook-based cloud collectors, there is a five-minute latency before logs are tagged with the updated site name.

  11. Copy the authentication token and URL.

    Record both for later use when you configure the Webhook on your external vendor.

  12. To view the cloud collector summary, click Go to Overview. If you want to add more cloud collector instances, click Add more collectors.

    The Overview tab displays the Webhook Cloud Collector instance that you installed.

  13. Proceed to configure the Webhook for your external vendor to forward logs to Exabeam Security Operations Platform.

    For instructions on how to configure log forwarding, refer to the specific documentation for your vendor.

  14. To verify the Webhook instance configuration, run the following commands.

    • RAW – Use the following command to verify sending a gzipped file to the Webhook:

      echo "\"hello world\"" > sample.txt | gzip sample.txt
      
      curl --verbose --location --request POST 'https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/raw' --header 'Content-Encoding: gzip' --header 'Content-Type: application/gzip' --header "Authorization: Bearer <TOKEN>" --data-binary "@./sample.txt.gz"

      Where <REGION> is the region of your cloud collectors instance (for example https://api2.us-east.exabeam.cloud) and <TOKEN> is your API token.

    • JSON – Use the following command to verify sending a gzipped JSON file to the Webhook:

      echo "[{\"message\":\"Test Message 1\"}, {\"message\":\"Test Message 2\"}, {\"message\":\"Test Message 3\"}]" > sample.json && gzip sample.json
      
      curl --verbose --location --request POST 'https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/json' --header 'Content-Encoding: gzip' --header 'Content-Type: application/gzip' --header "Authorization: Bearer <TOKEN>" --data-binary "@./sample.json.gz"

      Where <REGION> is the region of your cloud collectors instance (for example https://api2.us-east.exabeam.cloud) and <TOKEN> is your API token.

    For both RAW and JSON, you can also test using an uncompressed file (a sketch of an uncompressed test appears after these steps).

    A response of < HTTP/2 200 indicates that the configuration is successful.

  15. Review any errors after running the test commands.

    1. Resolve any identified issues based on the return code and then retry the command.

    2. If you still aren't receiving data, check the status of the cloud collector in the Cloud Collectors app.

      The status should be Running and not Error or Stopped.

    3. If the collector is in an error state, observe the volume metrics graph and error messages section for the cloud collector. Then view error details and follow the recommended mitigation steps.
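
As noted in step 14, you can also verify the configuration with an uncompressed file. The following is a minimal sketch of the JSON variant; the Content-Type value for uncompressed data is an assumption, because step 14 documents only the gzipped headers.

  # Hypothetical uncompressed variant of the step 14 JSON test.
  # The Content-Type value is an assumption; step 14 documents only the gzipped headers.
  echo "[{\"message\":\"Test Message 1\"}]" > sample.json

  curl --verbose --location --request POST 'https://api2.<REGION>.exabeam.cloud/cloud-collectors/v1/logs/json' \
    --header 'Content-Type: application/json' \
    --header "Authorization: Bearer <TOKEN>" \
    --data-binary "@./sample.json"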

Troubleshoot the Generic Webhook Cloud Collector

The following topics describe common issues with the Generic Webhook Cloud Collector and how to troubleshoot them.

The Generic Webhook Cloud Collector is not receiving data

If the cloud collector is not receiving data, the last received log appears empty or outdated. To troubleshoot this issue:

  1. To verify if the external vendor meets the requirements to support the Webhook Cloud Collector, refer to the Prerequisites section.

  2. Because the Webhook forwards logs in GZIP format, create a gzipped payload by using the following command.

    echo "Webhook sample log" > sample.txt | gzip sample.txt
  3. Run the following curl command to push data and check the return code.

    curl <UI URL> -H "Authorization: Bearer <Token>" -d @path/sample.txt.gz -v
    
  4. Based on the return code, resolve the identified issue (a sketch for capturing the return code in a script follows these steps).

  5. If you still aren't receiving data, check the status of the cloud collector in the Cloud Collectors app. The status should be Running and not Error or Stopped. If the collector is in an error state, observe the volume metrics graph and error messages section for the cloud collector. Then view error details and follow the recommended mitigation steps.
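
If you want to script the return-code check, curl can print just the HTTP status code. The following sketch is illustrative and reuses the <UI URL> and <TOKEN> placeholders from the steps above.

  # Capture only the HTTP status code so a script can branch on it.
  # <UI URL> and <TOKEN> are the same placeholders used in the steps above.
  status=$(curl --silent --output /dev/null --write-out '%{http_code}' \
    --request POST '<UI URL>' \
    --header "Authorization: Bearer <TOKEN>" \
    --data-binary "@./sample.txt.gz")
  echo "Webhook returned HTTP $status"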