
Cloud Collectors Administration Guide

Configure the Azure Log Analytics Cloud Collector

Set up the Azure Log Analytics Cloud Collector to continuously ingest security events from your Azure Log Analytics workspace.

  1. Before you configure the Azure Log Analytics Cloud Collector, ensure that you complete the prerequisites.

  2. Log in to the Exabeam Security Operations Platform with your registered credentials as an administrator.

  3. Navigate to Collectors > Cloud Collectors.

  4. Click New Collector.

  5. Click Azure Log Analytics.

  6. Enter the following information for the cloud collector:

    • NAME – Specify a name for the Cloud Collector instance.

    • Account – Click New Account to add a new Microsoft service account or select an existing account. You can use the same account information across multiple Microsoft cloud collectors. For more information, see Add Accounts for Microsoft Cloud Collectors.

    • WORKSPACE ID – Enter the Azure Log Analytics workspace ID that you obtained while completing the prerequisites.

    • QUERY – Enter a KQL query. For example: search "AZMSOperationalLogs"; search "StorageQueueLogs" or "StorageBlobLogs" or "StorageTableLogs" or "StorageFileLogs"; or search "Syslog" or "Perf" or "Heartbeat".

      The cloud collector pulls logs based on your query. You can use the default query search *. If you leave the field blank, the default query search * is applied. Additional example queries appear after these configuration steps.

      Note

      Ensure that your query is a valid KQL query that can run on the Log Analytics workspace and that it does not contain a time filter, for example, search * | where TimeGenerated > ago(1h).

  7. (Optional) SITE – Select an existing site, or to create a new site with a unique ID, click manage your sites. Adding a site name helps you ensure efficient management of environments with overlapping IP addresses.

    By entering a site name, you associate the logs with a specific independent site. A sitename metadata field is automatically added to all events ingested through this collector. For more information about site management, see Define a Unique Site Name.

  8. (Optional) TIMEZONE – Select a time zone applicable to you for accurate detections and event monitoring.

    By entering a time zone, you override the default log time zone. A timezone metadata field is automatically added to all events ingested through this collector.

  9. To confirm that the Exabeam Security Operations Platform communicates with the service, click Test Connection.

  10. Click Install.


    A confirmation message informs you that the new Cloud Collector is created.
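
For reference, the following queries illustrate the format that the QUERY field expects: each query scopes collection to one or more tables and contains no time filter. The table names are taken from the examples in step 6; substitute the tables that exist in your own workspace. The filtered variant on the last line is a sketch only and assumes that the StorageBlobLogs table exposes an OperationName column.

    // Collect from a set of tables
    search "Syslog" or "Perf" or "Heartbeat"

    // Collect everything in the workspace (the default)
    search *

    // Sketch: scope one table and filter on a non-time column
    search "StorageBlobLogs" | where OperationName == "GetBlob"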

Troubleshoot Common Issues with the Azure Log Analytics Cloud Collector

After you configure the Azure Log Analytics Cloud Collector, if the volume of data returned by the query from the Log Analytics workspace exceeds 100 MB, the Cloud Collector displays the error message 'Response Size is too large for batch'. This error can occur when data comes from a CloudTrail S3 data source through Sentinel to the Log Analytics workspace.


To troubleshoot this error, consider the following recommendations.

  • Edit the configuration of the cloud collector instance and modify the query to filter the data so that the response size stays within the Azure API response size limit of 100 MB. A sample filtered query appears after this list.

    You may experience data ingestion latency of up to 15 minutes. For more information about delays in pulling logs from the source, see Data collection endpoints and Log data ingestion time in Azure Monitor in the Azure documentation.

  • Create a separate cloud collector instance for each type of data source. Microsoft Azure provides specific queries for the separate tables in the Log Analytics workspace. For example, the query search "AZMSOperationalLogs" collects data only from the AZMSOperationalLogs table.
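
To illustrate both recommendations, the following sketches assume that CloudTrail data arrives in the AWSCloudTrail table created by the Microsoft Sentinel AWS connector and that the table exposes an EventSource column; substitute the table and column names that apply to your workspace. The first query reduces the response size by filtering on a non-time column, and the second shows a separate collector instance that collects only one table.

    // Instance 1 – sketch: filter CloudTrail events to stay within the 100 MB response limit
    search "AWSCloudTrail" | where EventSource == "s3.amazonaws.com"

    // Instance 2 – collect only the AZMSOperationalLogs table
    search "AZMSOperationalLogs"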