Prerequisites to Configure the Snowflake Cloud Collector
To integrate the Snowflake Cloud Collector with Snowflake for log or event retrieval across datasets, it is recommended to create a separate warehouse. Start with the smallest warehouse size so you can assess the associated costs. The Snowflake Cloud Collector connects to Snowflake using a JDBC driver and supports the Basic and Key Pair authentication methods. Each Snowflake database maintains distinct login history and query history endpoints for every account.
Before configuring the Snowflake Cloud Collector, complete the following prerequisites:
Obtain the full account name for your account by contacting Snowflake Support. The full account name may include the region and cloud platform where your account is hosted. You can also find the account name in your Snowflake URL. For example, if the Snowflake URL is https://xy12345.us-east-1.snowflakecomputing.com, use xy12345.us-east-1.snowflakecomputing.com as the account name when configuring the cloud collector.
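As a minimal sketch, the account name can be derived from the Snowflake URL by stripping the scheme. The URL below is the example from this section; confirm the exact account-name format with Snowflake Support:

```shell
# Derive the account name from the example Snowflake URL by stripping the scheme.
SNOWFLAKE_URL="https://xy12345.us-east-1.snowflakecomputing.com"
ACCOUNT_NAME="${SNOWFLAKE_URL#https://}"
echo "$ACCOUNT_NAME"
```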
In the Snowflake account, create a user dedicated to the Exabeam integration and give the user a unique username. Grant the user read permissions on all datasets that the cloud collector must pull.
If you use the Basic authentication method, create a password for the user.
If you use the Key Pair authentication method, create a private key and a public key by performing the following steps:
To generate the private key, run this command in a terminal: `openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt`
To generate the public key from the private key, run this command in a terminal: `openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub`
To assign the public key to a Snowflake user, run this command in the Snowflake portal: `alter user <username> set rsa_public_key='<public key>';`
For example: `alter user test_user set rsa_public_key='MIIBIjANBgkqh...';`
Note
For more information, see Key Pair Authentication in the Snowflake documentation.
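The key-generation steps above can be combined into a short shell sketch. The username `test_user` is the example from this section; because Snowflake's `ALTER USER` command expects the public key without the PEM header and footer lines, the sketch also prepares that single-line value:

```shell
# Generate an unencrypted PKCS#8 private key for the integration user.
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt

# Derive the matching public key from the private key.
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

# ALTER USER expects the key body without the PEM header/footer lines,
# so strip them and join the remaining lines into one string.
PUBLIC_KEY=$(grep -v '^-----' rsa_key.pub | tr -d '\n')

# Print the statement to run in the Snowflake portal (test_user is the example user).
echo "alter user test_user set rsa_public_key='${PUBLIC_KEY}';"
```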
Understand the Dataset
Snowflake’s predefined audit tables, the login history table and the query history table, contain logs related to internal Snowflake activity. The audit tables have predefined names and table structures, and the cloud collector pulls them by default. Additionally, the Exabeam Cloud Collector for Snowflake can pull logs from any other table or view that follows the predefined format.
If you want to use the Snowflake Cloud Collector to pull additional datasets, ensure that each dataset has the following format:
Each table or view must have exactly two columns.
Each table or view must have a timestamp column with one of these datatypes: TIMESTAMP_LTZ, TIMESTAMP_NTZ, or TIMESTAMP_TZ. The timestamp column must contain the timestamp from the original event; if the ingestion time and the event occurrence time are very close, the column can contain the ingestion time instead. The column can have any name.
Each table or view must have a VARIANT or VARCHAR column that contains the event in its original form, as JSON or as a string.
Every dataset that you want to send to Exabeam for analysis must follow this format.
Snowflake Audit: Login History
The Snowflake Cloud Collector pulls data from the LOGIN_HISTORY table for each database discovered by the collector. For example: SNOWFLAKE_SAMPLE_DB.LOGIN_HISTORY
Snowflake Audit: Query History
The Exabeam Cloud Collector for Snowflake creates an endpoint that pulls data from the QUERY_HISTORY table for each database discovered by the collector. For example: SNOWFLAKE_SAMPLE_DB.QUERY_HISTORY
Custom Table(s)
The Exabeam Cloud Collector for Snowflake ingests data from a table or a view in the following format, in which each table or view contains exactly two columns:
Timestamp column (TIMESTAMP_LTZ, TIMESTAMP_NTZ, or TIMESTAMP_TZ)
Textual data column (VARIANT or VARCHAR)
To create a view with a specific timestamp column and a specific textual column from an existing table, run the following command:
```sql
CREATE OR REPLACE VIEW MY_VIEW AS
  SELECT ts_col AS timestamp_col,
         num_col || ',' || string_col || ',' || date_col AS textual_col
  FROM tab;
```
Note
Enabling ingestion from a table or a view without specifying a timestamp and textual data column results in sync failure.