
Cloud Collectors Administration Guide

Configure the Azure Blob Storage Cloud Collector

Set up the Azure Blob Storage Cloud Collector to continuously ingest logs from Azure data sources such as threat detections, security alerts, and Microsoft Defender ATP logs.

  1. Before you configure the Azure Blob Storage Cloud Collector, ensure that you complete the prerequisites.

  2. Log in to the New-Scale Security Operations Platform with your registered credentials as an administrator.

  3. Navigate to Collectors > Cloud Collectors.

  4. Click New Collector.

  5. Click Azure Storage Logs.

  6. NAME – Specify a name for the Cloud Collector instance.

  7. AUTHENTICATION METHOD – Select the authentication method: Shared Access Signatures or Role-Based Access Control, and then complete the fields for the method that you select. A verification sketch follows the field descriptions.

    SHARED ACCESS SIGNATURES

    • BLOB STORAGE SAS TOKEN – Enter the BLOB SAS token that you obtained while completing prerequisites.

    • QUEUE ENDPOINT – Enter the Queue Endpoint name that you obtained while completing prerequisites.

    • QUEUE SAS TOKEN – Enter the Queue SAS Token value that you obtained while completing prerequisites.

    • QUEUE NAME – Enter the Queue Name that you obtained while completing prerequisites.

    ROLE-BASED ACCESS CONTROL

    • TENANT ID – Enter the Tenant ID value that you obtained while completing prerequisites.

    • CLIENT ID – Enter the Client ID value that you obtained while completing prerequisites.

    • CLIENT SECRET – Enter the Client Secret value that you obtained while completing prerequisites.

    • QUEUE ENDPOINT – Enter the Queue Endpoint name that you obtained while completing prerequisites.
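
    To sanity-check these values before you enter them, you can open a connection to the queue outside the platform. The following is a minimal sketch using the azure-storage-queue and azure-identity Python packages; the endpoint, queue name, and credential values are placeholders standing in for the values you gathered in the prerequisites.

    from azure.storage.queue import QueueClient
    from azure.identity import ClientSecretCredential

    QUEUE_ENDPOINT = "https://<storage-account>.queue.core.windows.net"  # placeholder
    QUEUE_NAME = "<queue-name>"                                          # placeholder

    # Shared Access Signatures: pass the Queue SAS Token as the credential.
    sas_client = QueueClient(
        account_url=QUEUE_ENDPOINT,
        queue_name=QUEUE_NAME,
        credential="<queue-sas-token>",
    )

    # Role-Based Access Control: authenticate with the Tenant ID, Client ID,
    # and Client Secret that you obtained while completing the prerequisites.
    rbac_client = QueueClient(
        account_url=QUEUE_ENDPOINT,
        queue_name=QUEUE_NAME,
        credential=ClientSecretCredential(
            tenant_id="<tenant-id>",
            client_id="<client-id>",
            client_secret="<client-secret>",
        ),
    )

    # Peeking a message without removing it confirms the queue is reachable.
    print(sas_client.peek_messages(max_messages=1))

    Either client confirms only that the credentials can reach the queue; the platform's Test Connection button remains the authoritative check.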

  8. SYNC STRATEGY – Select Once or Continuous based on how you want the collector to collect and synchronize data. Select Once if blobs added to storage remain unchanged after upload. Select Continuous if blobs are named for a duration, for example PT1H.json or PT15M.json. Such blobs are updated by the sending service over an extended period and must be synchronized continuously. With this option, the application auto-detects how long each blob needs to be monitored for changes and collects only the added delta to prevent duplicates.

    Note

    Only services that add a new line for each event are compatible with this collector, as in the example below. Services such as Azure virtual network flow logs, which batch multiple events into a single JSON file, are not compatible with this collector.
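
    For example, a compatible blob named PT1H.json contains one JSON object per line, with new lines appended over the hour (hypothetical events shown):

    {"time": "2025-01-01T00:05:00Z", "category": "SecurityAlert", "message": "first event"}
    {"time": "2025-01-01T00:12:00Z", "category": "SecurityAlert", "message": "second event"}

    With Continuous selected, the collector monitors the blob for the detected duration (one hour for PT1H.json) and ingests only the newly appended lines.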

  9. FILE PROCESSING – By default, the cloud collector processes Events per Line; however, you can change the file processing to process events embedded in a JSON Array. If you choose JSON Array, you can optionally specify the JSON Path To Array (Period Delimited), as shown in the examples and the sketch that follow.

    For example, k3.k3_2 is the JSON path to the embedded JSON array below:

    {
        "k1": "v1",
        "k2": [
            "v2_1",
            "v2_2"
        ],
        "k3": {
            "k3_1": "v3_1",
            "k3_2": [
                {
                    "obj1Key": "obj1Val"
                },
                {
                    "obj2Key": "obj2Val"
                }
            ]
        }
    }

    If you do not specify a path for the JSON Array, the cloud collector assumes the events are the array's elements.

    [
        {
            "obj1Key": "obj1Val"
        },
        {
            "obj2Key": "obj2Val"
        }
    ]
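
    As an illustration, the period-delimited path is a sequence of keys walked from the root of each file. The following minimal sketch (a hypothetical Python helper, not part of the product) shows how the path k3.k3_2 resolves to the embedded array in the first example above:

    import json

    def events_from_json_array(text, json_path=None):
        """Return the event objects from a JSON Array file.

        If json_path is given (period delimited, for example "k3.k3_2"),
        walk that key sequence from the root; otherwise treat the whole
        document as the array of events.
        """
        node = json.loads(text)
        if json_path:
            for key in json_path.split("."):
                node = node[key]
        return node

    doc = '{"k3": {"k3_1": "v3_1", "k3_2": [{"obj1Key": "obj1Val"}, {"obj2Key": "obj2Val"}]}}'
    print(events_from_json_array(doc, "k3.k3_2"))
    # [{'obj1Key': 'obj1Val'}, {'obj2Key': 'obj2Val'}]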
  10. (Optional) SITE – Select an existing site, or click manage your sites to create a new site with a unique ID. Adding a site name helps ensure efficient management of environments with overlapping IP addresses.

    By entering a site name, you associate the logs with a specific independent site. A sitename metadata field is automatically added to all events ingested through this collector. For more information about Site Management, see Define a Unique Site Name.

  11. (Optional) TIMEZONE – Select a time zone applicable to you for accurate detections and event monitoring.

    By entering a time zone, you override the default log time zone. A timezone metadata field is automatically added to all events ingested through this collector.

  12. To confirm that the New-Scale Security Operations Platform can communicate with the service, click Test Connection.

  13. Click Install.


    A confirmation message informs you that the new Cloud Collector is created.