- Exabeam Site Collector
- Network Ports
- Install the Exabeam Site Collector
- Filtering Incoming Syslog Events in Exabeam Site Collector
- Filtering Outbound Logs in Exabeam Site Collector
- How to Direct Kafka Input to Exabeam Site Collector
- Supported Exabeam Site Collector Changes
- Troubleshoot the Exabeam Site Collector
- Capture Site Collector Diagnostics Using Exabeam Support Package
- Scenario 1: No logs are transmitted nor received
- Scenario 2: Kafka Google Cloud Storage (GCS) collectors have not appeared on Data Lake UI
- Scenario 3: If logs are not uploaded to GCS where logs are not on Data Lake
- Scenario 4: Unable to accept incoming syslog, active directory context, Splunk logs, or Incident Responder integrations
- Scenario 5: Unable to pull LDAP from SaaS
- Scenario 6: Cannot send after transport endpoint shutdown
- Scenario 8: Too many arguments in command /tools/config.parser.sh
- Other scenarios
- How to Migrate to New Exabeam SaaS Site Collector
- How to Uninstall Exabeam Site Collector
- Exabeam Site Collector Services
Install the Exabeam Site Collector
Site collectors let you upload log data from your data centers or VPCs to Exabeam. Site collectors in the Exabeam SaaS cloud are designed to support most data centers with a single site collector, along with on-premises deployments and unmanaged Exabeam Advanced Analytics nodes. The following instructions will help you:
Install a SaaS site collector with Data Lake in the environment
Install a SaaS site collector for Advanced Analytics-only deployments
Install a site collector for on-premises with Data Lake in the environment
Install a site collector for on-premises with Advanced Analytics (site collector in an unmanaged node)
Network proxies are not supported where an on-premises endpoint is the log destination.
Important
If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.
Warning
For CentOS deployments -- As CentOS 8.x will be reaching its End-of-Life (December 31, 2021), we strongly recommend deploying site collectors on CentOS 7.x.
Install Site Collector for Exabeam SaaS Data Lake
Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to Exabeam's SaaS Data Lake.
Prerequisites
Ensure your environment meets these prerequisites before running an SC installation:
Opened firewall routes to allow network traffic with your SaaS instance
If you have a proxy:
Ensure that the proxy does not require SC traffic to be authenticated
Configure the route for bi-directional https traffic in one of two ways:
On-premises to SaaS for data flow
SaaS to On-premises via OpenVPN for data such as LDAP polling
If you have deployed SELinux on the SC host, run it in permissive mode
The /tmp partition on the SC host is executable for root
Ensure there is enough space for SC installation
The firewalld service is running
NTP client must be active
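The host-level prerequisites above can be pre-checked from a shell. The sketch below is an assumption, not an official Exabeam tool: the `check` helper and the specific probe commands (`getenforce`, `systemctl is-active`, `timedatectl`, `findmnt`) are illustrative choices and may need adjusting per OS version.

```shell
# Hypothetical pre-flight helper: run each probe and report pass/fail
# instead of stopping at the first problem.
check() {
  name=$1; shift
  if "$@" >/dev/null 2>&1; then
    echo "OK: $name"
  else
    echo "WARN: $name (verify manually)"
  fi
}

check "SELinux permissive or disabled" sh -c 'command -v getenforce >/dev/null && [ "$(getenforce)" != "Enforcing" ]'
check "firewalld running" systemctl is-active --quiet firewalld
check "NTP client active" sh -c 'timedatectl show -p NTPSynchronized --value 2>/dev/null | grep -q yes'
check "/tmp not mounted noexec" sh -c 'command -v findmnt >/dev/null && ! findmnt -no OPTIONS /tmp | grep -q noexec'
```

A WARN line does not necessarily mean the prerequisite is unmet, only that the probe could not confirm it; verify those items by hand.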
Important
If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.
Fresh Site Collector Installation
Log into your Data Lake UI.
Navigate to Settings > SaaS Management > SaaS Site Collectors.
Download the SaaS authentication package (*-auth-package.tar.gz) and the Exabeam Site Collector installation package using the links embedded in Step 1 of Installing the Site Collector. The authentication package contains all required configurations and authentication data needed to access your SaaS tenant.
Place the files in the /tmp directory.
Unpack the downloaded files.
tar -xzf <filename>.tar.gz
Go to the Exabeam_Site_Collector directory.
Make the files executable.
chmod +x site-collector-installer.sh
Based on your deployment environment, please execute one of the following installation commands:
Installing SC behind the proxy with OpenVPN
sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
Installing SC behind the proxy without OpenVPN
sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
Installing SC without proxy but with OpenVPN
sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn
Installing SC without proxy and without OpenVPN
sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz
Note
OpenVPN must be used for:
1. Passing LDAP poll data
2. Using a DBlog collector in your deployment
3. Using eStreamer in your deployment
4. Fetching any on-premises SIEM / sources by Advanced Analytics
5. Connecting to on-premises endpoints by Incident Responder Actions
Limitation:
Only one OpenVPN connection can be active at a time. If you need it installed on more than one SC (for an active/standby option), manually disable the service on the standby SC after installation.
For CentOS/RHEL 7.x and earlier, use commands:
sudo systemctl stop openvpn@<instanceID>
sudo systemctl disable openvpn@<instanceID>
For CentOS/RHEL 8.x and later, use commands:
sudo systemctl stop openvpn-client@<instanceID>
sudo systemctl disable openvpn-client@<instanceID>
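The four install variants above differ only in which flags are present. A hypothetical wrapper (not part of the installer package; the variable names are assumptions) can assemble the documented command from a few settings:

```shell
# Hypothetical wrapper around the documented flags: --openvpn is added
# when USE_OPENVPN=yes, and --proxy/--proxy-port when PROXY_HOST is set.
build_install_cmd() {
  cmd="sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/${INSTANCE_ID}-auth-package.tgz"
  if [ "${USE_OPENVPN:-no}" = "yes" ]; then
    cmd="$cmd --openvpn"
  fi
  if [ -n "${PROXY_HOST:-}" ]; then
    cmd="$cmd --proxy=${PROXY_HOST} --proxy-port=${PROXY_PORT}"
  fi
  echo "$cmd"
}

INSTANCE_ID=acme
USE_OPENVPN=yes
PROXY_HOST=proxy.example.com
PROXY_PORT=3128
build_install_cmd
# → sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/acme-auth-package.tgz --openvpn --proxy=proxy.example.com --proxy-port=3128
```

Printing the command before running it makes it easy to confirm the flag combination matches your environment.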
Post-Installation Verification
To verify that the site collector source has been installed, log into the Data Lake UI and navigate to Settings > Collector Management > Collectors to see the list of configured collectors.
Note
It is normal for the SC Data Forwarder service to be shown as Stopped while another service is shown as Running. If there is ongoing ingestion, one of these services shows a non-zero message count in the graph; use that as the indicator to verify.
If your collector does not appear in the list, verify that the following services are running and there is throughput:
Run the following command to check all Exabeam Site Collector Services:
sudo /opt/exabeam/tools/sc-services-check.sh
Alternatively, you can check services individually on the SC.
Check that logstash is running.
sudo systemctl status logstash
Check that zookeeper is running.
sudo systemctl status zookeeper
Check that kafka is running.
sudo systemctl status kafka
Check SaaS data upload flow on the SC.
Check that the gcs manager is running.
sudo systemctl status exabeam-kafka-gcs1-log-manager
Check that the gcs collector is running.
sudo systemctl status exabeam-kafka-gcs1-collector
Check that the openvpn service is running.
For CentOS/RHEL 7.x and older, use the command:
sudo systemctl status openvpn@<instanceID>
For CentOS/RHEL 8.x, use the command:
sudo systemctl status openvpn-client@<instanceID>
Check that the management service (rsc-forwarder) is running.
sudo systemctl status exabeam-rsc-forwarder
Check that the local monitoring service (watchdog) is running.
sudo systemctl status exabeam-rsc-watchdog
Send a test message via syslog and, after several minutes, confirm in the Data Lake UI that it arrived at the destination.
echo "test message" | nc localhost 514
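A slightly richer smoke test than a bare string is easier to find in the UI afterwards. The sketch below is illustrative, not required by Exabeam: the priority value, hostname tag, and `sc-smoke` program name are assumptions you can change freely.

```shell
# Build a minimally RFC 3164-shaped line (PRI 14 = facility user,
# severity info) so the test event is easy to search for later.
# "sc-host" and "sc-smoke" are placeholder names, not required values.
make_test_msg() {
  echo "<14>$(date '+%b %d %H:%M:%S') sc-host sc-smoke: site collector test $1"
}

make_test_msg 001
# Send it to the local syslog listener if nc is available:
if command -v nc >/dev/null 2>&1; then
  make_test_msg 001 | nc -w 1 localhost 514 || true
fi
```

Searching the Data Lake UI for the `sc-smoke` tag (or whatever tag you choose) then confirms end-to-end delivery.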
Install Site Collector for Exabeam SaaS Advanced Analytics-only Deployment
Exabeam Site Collector in Unmanaged Mode
In unmanaged mode, you should implement your own monitoring solution, as the SC will not appear anywhere in the Advanced Analytics UI. The only available information is the status of SC ingestion into SaaS on the Status Page.
Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to Exabeam's SaaS Advanced Analytics and no Exabeam Data Lake is deployed.
Prerequisites
Ensure your environment meets these prerequisites before running an SC installation:
Opened firewall routes to allow network traffic with your SaaS instance
If you have a proxy:
Ensure that the proxy does not require SC traffic to be authenticated
Configure the route for bi-directional https traffic in one of two ways:
On-premises to SaaS for data flow
SaaS to On-premises via OpenVPN for data such as LDAP polling
If you have deployed SELinux on the SC host, run it in permissive mode
The /tmp partition on the SC host is executable for root
Ensure there is enough space for SC installation
The firewalld service is running
NTP client must be active
Important
If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.
Fresh Site Collector Installation
Download SaaS Site Collector installation files from the Exabeam Community.
Download your authentication file package using the following URL template, based on your <instanceID>:
https://<instanceID>.aa.exabeam.com/api/setup/saas/authPackage
Place the files in the /tmp directory.
Unpack the downloaded files.
tar -xzf <filename>.tar.gz
Go to the Exabeam_Site_Collector directory.
Make the files executable.
chmod +x site-collector-installer.sh
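The auth-package URL template in the steps above can be fetched non-interactively. This is a sketch: the `<instanceID>` placeholder must be replaced with your real instance ID, and the commented `curl` options are one reasonable choice, not a mandated invocation.

```shell
# Sketch: build the documented auth-package URL for your instance.
# Replace the <instanceID> placeholder with your real instance ID.
INSTANCE_ID="<instanceID>"
AUTH_URL="https://${INSTANCE_ID}.aa.exabeam.com/api/setup/saas/authPackage"
echo "$AUTH_URL"
# One way to download it into /tmp (flags are an assumption):
# curl -fL -o "/tmp/${INSTANCE_ID}-auth-package.tgz" "$AUTH_URL"
```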
Based on your deployment environment, please execute one of the following installation commands:
Installing SC behind the proxy with OpenVPN
sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
Installing SC behind the proxy without OpenVPN
sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
Installing SC without proxy with OpenVPN
sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn
Installing SC without proxy and without OpenVPN
sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz
Note
OpenVPN must be used for:
1. Passing LDAP poll data
2. Using a DBlog collector in your deployment
3. Using eStreamer in your deployment
4. Fetching any on-premises SIEM / sources by Advanced Analytics
5. Connecting to on-premises endpoints by Incident Responder Actions
Limitation:
Only one OpenVPN connection can be active at a time. If you need it installed on more than one SC (for an active/standby option), manually disable the service on the standby SC after installation.
For CentOS/RHEL 7.x and older, use commands:
sudo systemctl stop openvpn@<instanceID>
sudo systemctl disable openvpn@<instanceID>
For CentOS/RHEL 8.x and later, use commands:
sudo systemctl stop openvpn-client@<instanceID>
sudo systemctl disable openvpn-client@<instanceID>
Data Throughput Verification
This SC installation is at an unmanaged node. SC operational checks must be made via the SaaS Status Page for your instance. The Status Page is intended to show errors only and should not be used to verify throughput immediately after installation.
Additional verification must be run at the SC host. Verify that the following services are running and there is throughput:
Run the following command to check all Exabeam Site Collector Services:
sudo /opt/exabeam/tools/sc-services-check.sh
Alternatively, you can check services individually on the SC.
Check that logstash is running.
sudo systemctl status logstash
Check that zookeeper is running.
sudo systemctl status zookeeper
Check that kafka is running.
sudo systemctl status kafka
Check SaaS data upload flow on the SC.
Check that the gcs manager is running.
sudo systemctl status exabeam-kafka-gcs1-log-manager
Check that the gcs collector is running.
sudo systemctl status exabeam-kafka-gcs1-collector
Check that the openvpn service is running.
For CentOS/RHEL 7.x and older, use the command:
sudo systemctl status openvpn@<instanceID>
For CentOS/RHEL 8.x, use the command:
sudo systemctl status openvpn-client@<instanceID>
Check that the management service (rsc-forwarder) is running.
sudo systemctl status exabeam-rsc-forwarder
Check that the local monitoring service (watchdog) is running.
sudo systemctl status exabeam-rsc-watchdog
Send a test message via syslog and, after several minutes, confirm in the Data Lake UI that it arrived at the destination.
echo "test message" | nc localhost 514
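The per-service checks above can be rolled into one pass. The loop below is a sketch: unit names are taken from the steps above and may differ in your deployment (notably the OpenVPN unit, which varies by OS version and `<instanceID>`).

```shell
# Print a one-line state per SC unit; 'systemctl is-active' reports
# active/inactive/failed. Falls back to "unknown" where systemctl is
# unavailable or gives no answer.
report() {
  for svc in "$@"; do
    state=$(systemctl is-active "$svc" 2>/dev/null || true)
    if [ -z "$state" ]; then state=unknown; fi
    echo "$svc: $state"
  done
}

report logstash zookeeper kafka \
       exabeam-kafka-gcs1-log-manager exabeam-kafka-gcs1-collector \
       exabeam-rsc-forwarder exabeam-rsc-watchdog
```

Anything other than `active` for these units is worth investigating with `sudo systemctl status <unit>` as shown in the individual steps.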
Install Site Collector for Exabeam Data Lake On-premises Deployment
For Data Lake in Appliance or Virtual Deployments
Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to Exabeam Data Lake destination deployed on an appliance or virtual platform (excluding Exabeam SaaS).
Prerequisites
Ensure your environment meets these prerequisites before running an SC installation:
Ensure you do not have a network proxy between your SC and Data Lake
If you have deployed SELinux on the SC host, run it in permissive mode
The /tmp partition on the SC host is executable for root
Ensure there is enough space for SC installation
The firewalld service is running
NTP client must be active
Important
If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.
Fresh Site Collector Installation
Log into your Data Lake UI.
Navigate to Settings > Collector Management > Collectors.
Download the authentication package (sc-auth-package.tar.gz) and the Exabeam Site Collector installation package using the links embedded in Step 1 of Installing the Site Collector. The authentication package contains all required configurations and authentication data.
Place the files in the /tmp directory.
Unpack the downloaded files.
tar -xzf <filename>.tar.gz
Go to the Exabeam_Site_Collector directory.
Make the files executable.
chmod +x site-collector-installer.sh
Run the following installation command:
sudo ./site-collector-installer.sh -v --dl-on-prem --config=/tmp/sc-auth-package.tgz
Post-Installation Verification
To verify that the site collector source has been installed, log into the Data Lake UI and navigate to Settings > Collector Management > Collectors to see the list of configured collectors.
Note
It is normal for the SC Data Forwarder service to be shown as Stopped while another service is shown as Running. If there is ongoing ingestion, one of these services shows a non-zero message count in the graph; use that as the indicator to verify.
If your collector does not appear in the list, verify that the following services are running and there is throughput:
Run the following command to check all Exabeam Site Collector Services:
sudo /opt/exabeam/tools/sc-services-check.sh
Alternatively, you can check services individually on the SC.
Check that logstash is running.
sudo systemctl status logstash
Check that zookeeper is running.
sudo systemctl status zookeeper
Check that kafka is running.
sudo systemctl status kafka
Check data upload flow on the SC.
Check that the lms manager is running.
sudo systemctl status exabeam-kafka-lms1-log-manager
Check that the lms collector is running.
sudo systemctl status exabeam-kafka-lms1-collector
Check that the local monitoring service (watchdog) is running.
sudo systemctl status exabeam-rsc-watchdog
Send a test message via syslog and, after several minutes, confirm in the Data Lake UI that it arrived at the destination.
echo "test message" | nc localhost 514
Install Site Collector for Exabeam Advanced Analytics On-premises Deployment
For Advanced Analytics in Appliance or Virtual Deployments, in Unmanaged Mode
Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to Exabeam Advanced Analytics destination deployed on an appliance or virtual platform (excluding Exabeam SaaS).
Prerequisites
Ensure your environment meets these prerequisites before running an SC installation:
Opened firewall routes to allow network traffic with your on-premises Advanced Analytics
Please ensure you do not have a proxy between your SC and Advanced Analytics
If you have deployed SELinux on the SC host, run it in permissive mode
The /tmp partition on the SC host is executable for root
Ensure there is enough space for SC installation
The firewalld service is running
NTP client must be active
Important
If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.
Fresh Site Collector Installation
Download SaaS Site Collector installation files from the Exabeam Community.
Place the files in the /tmp directory.
Unpack the downloaded files.
tar -xzf <filename>.tar.gz
Go to the Exabeam_Site_Collector directory.
Make the files executable.
chmod +x site-collector-installer.sh
Based on your expected load, execute one of the following installation commands:
Installing SC without EPS limit
sudo ./site-collector-installer.sh -v --aa-on-prem --aa-listener=<listener_ip>:514
Installing SC with EPS limit
sudo ./site-collector-installer.sh -v --aa-on-prem --aa-listener=<listener_ip>:514 --eps-limit=2048
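Choosing a value for --eps-limit is easier with a rough measurement of the current event rate. The helper below is a hypothetical sizing aid (not part of the installer), and the sample numbers are illustrative:

```shell
# Hypothetical sizing helper: integer events-per-second from a counted
# sample, e.g. events counted in a packet capture or a syslog file
# over a known interval. 61440 events over 30 seconds is 2048 EPS.
eps_estimate() {
  echo $(( $1 / $2 ))
}

eps_estimate 61440 30   # → 2048
```

Comparing the estimate against your planned --eps-limit shows whether the limit would throttle normal traffic.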
Post-Installation Verification
This SC installation is at an unmanaged node. SC operational checks must be run at the SC host. Verify that the following services are running and there is throughput:
Run the following command to check all Exabeam Site Collector Services:
sudo /opt/exabeam/tools/sc-services-check.sh
Alternatively, you can check services individually on the SC.
Check that logstash is running.
sudo systemctl status logstash
Check that zookeeper is running.
sudo systemctl status zookeeper
Check that kafka is running.
sudo systemctl status kafka
Check data upload flow on the SC.
Check that the uba manager is running.
sudo systemctl status exabeam-kafka-uba1-log-manager
Check that the uba collector is running.
sudo systemctl status exabeam-kafka-uba1-collector
Check that the local monitoring service (watchdog) is running.
sudo systemctl status exabeam-rsc-watchdog
Send a test message via syslog and, after several minutes, confirm in the Advanced Analytics UI that it arrived at the destination.
echo "test message" | nc localhost 514