- Exabeam Site Collector
- Exabeam Site Collector Network Ports
- Exabeam Site Collector Specifications
- Install Exabeam Site Collector
- Upgrade Exabeam Site Collector
- Advanced Exabeam Site Collector Customizations
- Supported Exabeam Site Collector Changes
- Configure Transport Layer Security (TLS) Syslog Ingestion
- Direct Kafka Input to Exabeam Site Collector
- Add a Secondary Syslog Destination
- Remove a Syslog Destination
- Filter Incoming Syslog Events in Exabeam Site Collector
- Filtering Outbound Logs in Exabeam Site Collector
- Metadata Collected by Site Collector and Supported Agents
- Add OpenVPN After Exabeam Site Collector Installation
- Troubleshoot for Exabeam Site Collector
- Scenario 1: Collector or its status does not appear in the console and no logs reach destination
- Scenario 2: Collector is healthy but no logs are transmitted or received
- Scenario 3: Exabeam Advanced Analytics unable to pull LDAP data
- Scenario 4: Kafka Google Cloud Storage (GCS) collectors have not appeared on Data Lake
- Scenario 5: Logs are not uploaded to GCS and do not appear on Data Lake
- Scenario 6: Unable to accept incoming syslog, active directory context, Splunk logs, or Incident Responder integrations
- Scenario 7: Cannot send after transport endpoint shutdown
- Scenario 8: Too many arguments in command /tools/config.parser.sh
- Other scenarios
- Capture Site Collector Diagnostics Using Exabeam Support Package
- Install and Upgrade Exabeam Site Collector for On-premises and Legacy Deployments
- Prerequisites
- Install Site Collector for Exabeam Data Lake On-premises Deployments
- Install Site Collector for Exabeam Advanced Analytics On-premises Deployments
- Upgrade Site Collector for Exabeam Data Lake On-premises Deployments
- Upgrade Site Collector for Exabeam Advanced Analytics On-premises Deployments
- Uninstall Exabeam Site Collector
- Migrate to the New-Scale Site Collectors Service
- A. Glossary of Terms
Exabeam Site Collector Specifications
The number of site collectors needed in your deployment depends on where the data is located, the data volumes, and your resilience requirements. Exabeam Site Collector hosts must meet one of the following specifications to support the expected data volumes. Site collectors deployed in parallel behind a load balancer should be sized so that data collection is not affected if one of them is taken out of service. Review the specifications in the table below that apply to your environment:
Maximum Events Per Second (EPS) | Minimum CPU and Memory¹ | Maximum Agents/Collectors | Operating System Volume | Storage Volume² |
---|---|---|---|---|
1 - 1,000 | 4 CPU, 8 GiB RAM | 100 | 100 GB | 600 GB |
1,000 - 5,000 | 4 CPU, 8 GiB RAM | 100 | 100 GB | 3 TB |
5,000 - 20,000 | 8 CPU, 16 GiB RAM | 200 | 100 GB | 12 TB |
20,000 - 30,000 | 16 CPU, 32 GiB RAM | 500 | 100 GB | 18 TB |
¹ GiB (gibibyte) is the unit used on Google Cloud Platform (GCP); 1 GiB is equal to 1.07374 GB.
² Storage volume is based on an average message size of 1,500 bytes and 24 hours of data retention (default).
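To see how the storage volumes in the table relate to the footnote assumptions, the following back-of-the-envelope sketch (illustrative only, not an Exabeam tool; the function name and parameters are ours) estimates the raw data retained on /data from an EPS rate, the 1,500-byte average message size, and the default 24-hour retention window. The provisioned volumes in the table are noticeably larger than this raw estimate, leaving headroom for overhead and growth.

```python
# Back-of-the-envelope sizing sketch (illustrative only, not an Exabeam tool).
# Estimates the raw volume retained on /data for a given EPS rate using the
# footnote assumptions above: 1,500-byte average messages, 24-hour retention.

def estimated_raw_volume_gb(eps: int,
                            avg_message_bytes: int = 1500,
                            retention_hours: int = 24) -> float:
    """Raw retained bytes = EPS x average message size x retention window."""
    raw_bytes = eps * avg_message_bytes * retention_hours * 3600
    return raw_bytes / 1e9  # decimal gigabytes

if __name__ == "__main__":
    for eps in (1_000, 5_000, 20_000, 30_000):
        print(f"{eps:>6} EPS -> ~{estimated_raw_volume_gb(eps):,.0f} GB raw per 24 h")
```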
Additionally, please ensure the following storage requirements and permissions are met:
- CentOS 7.x or RedHat 7.x/8.x
- / must have a minimum of 100 GB for site collector operations
- /tmp must have full root permissions
- Ensure / and /opt are configured for disk usage
- /data is storage for Kafka data (sizing is based on the Site Collector Specifications above) with 300 GB or higher per EPS
- Default local retention is 24 hours or the available free disk space in the /data allocation
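As a quick sanity check before installation, a short sketch like the one below (our own illustration, not part of the Exabeam installer) can confirm that the mount points listed above have enough free space. The thresholds are placeholders; replace the /data value with the storage volume from the sizing table.

```python
# Illustrative pre-installation check, not part of the Exabeam installer:
# confirms the mount points listed above have enough free space.
import shutil

# Placeholder thresholds in decimal GB; replace the /data value with your sized volume.
REQUIRED_FREE_GB = {
    "/": 100,      # operating system volume
    "/data": 600,  # Kafka data storage (see sizing table)
}

def check_mounts(requirements: dict) -> bool:
    all_ok = True
    for mount, needed_gb in requirements.items():
        free_gb = shutil.disk_usage(mount).free / 1e9
        ok = free_gb >= needed_gb
        all_ok = all_ok and ok
        print(f"{mount:<8} free={free_gb:,.0f} GB  required={needed_gb} GB  "
              f"{'OK' if ok else 'TOO SMALL'}")
    return all_ok

if __name__ == "__main__":
    raise SystemExit(0 if check_mounts(REQUIRED_FREE_GB) else 1)
```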
Important
For capacity specifications that are not shown, please contact your Exabeam technical representative for assistance in calculating retention and EPS rates.
Where possible, we recommend deploying at least two site collectors behind a load balancer for high availability. You can deploy as many site collectors as required for your log processing. One site collector must have OpenVPN if your ingestion needs to support LDAP polling, database logs, eStreamer logs, or fetching by Advanced Analytics or Incident Responder accessing local endpoints.