Exabeam Site Collector

Install the Exabeam Site Collector

Site collectors let you upload log data from your data centers or VPCs to Exabeam. Site collectors in the Exabeam SaaS cloud are designed to support most data centers with a single site collector, and they also support on-premises deployments and unmanaged Exabeam Advanced Analytics nodes. The following instructions will help you install a site collector and verify that it is working.

Network proxies are not supported where an on-premises endpoint is the log destination.

Important

If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.

Warning

For CentOS deployments: because CentOS 8.x reaches its End-of-Life on December 31, 2021, we strongly recommend deploying site collectors on CentOS 7.x.
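
You can confirm the host OS release before you start; a quick check (the field names shown are standard in /etc/os-release, though output varies slightly by distribution):

    grep -E '^(NAME|VERSION_ID)=' /etc/os-release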

Install Site Collector for Exabeam SaaS Data Lake

Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to Exabeam's SaaS Data Lake.

Prerequisites

Ensure your environment meets the following prerequisites before running an SC installation (a quick spot-check sketch follows the list):

  • Opened firewall routes to allow network traffic with your SaaS instance

  • If you have a proxy,

    • Ensure that the proxy does not require SC traffic to be authenticated

    • Configure the route for bi-directional https traffic in one of two ways:

      • On-premises to SaaS for data flow

      • SaaS to On-premises via OpenVPN for data such as LDAP polling

  • If you have deployed SELinux on the SC host, run it in permissive mode

  • The /tmp partition on the SC host is executable for root

  • Ensure there is enough space for SC installation

  • The firewalld service is running

  • NTP client must be active
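
The following commands are one way to spot-check most of these prerequisites from the SC host before installing. This is only a sketch; it assumes systemd, the SELinux userspace tools, and timedatectl are available on the host:

    getenforce                      # Permissive or Disabled is expected if SELinux is deployed
    findmnt -no OPTIONS /tmp        # options must not include noexec; no output means /tmp is not a separate mount
    df -h /tmp /opt                 # confirm free space for the installation
    systemctl is-active firewalld   # should print active
    timedatectl | grep -i ntp       # the NTP client should be enabled/active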

Important

If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.

Fresh Site Collector Installation

  1. Log into your Data Lake UI.

  2. Navigate to Settings > SaaS Management > SaaS Site Collectors.

    saas management site collectors selection
  3. Download the SaaS authentication (*-auth-package.tar.gz) and Exabeam Site Collector installation packages using the links embedded in Step 1 of Installing the Site Collector. The authentication package contains all required configurations and authentication data needed to access your SaaS tenant.

    installing site collector step 1
  4. Place the files in the /tmp directory.

  5. Unpack the downloaded files.

    tar -xzf <filename>.tar.gz
  6. Go to the Exabeam_Site_Collector directory.

  7. Make the files executable.

    chmod +x site-collector-installer.sh
    
  8. Based on your deployment environment, please execute one of the following installation commands:

    1. Installing SC behind the proxy with OpenVPN

      sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
    2. Installing SC behind the proxy without OpenVPN

      sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
    3. Installing SC without proxy but with OpenVPN

      sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn
    4. Installing SC without proxy and without OpenVPN

      sudo ./site-collector-installer.sh -v --dl-saas --config=/tmp/<instanceID>-auth-package.tgz

    Note

    OpenVPN must be used for:

    1. Passing LDAP poll data

    2. Using a DBlog collector in your deployment

    3. Using eStreamer in your deployment

    4. Fetching data from any on-premises SIEM or other sources by Advanced Analytics

    5. Connecting to on-premises endpoints by Incident Responder Actions

    Limitation:

    Only one OpenVPN connection can be active at a time. If you have installed OpenVPN on more than one SC (for an active/standby setup), manually stop and disable the service on the standby SC after installation.

    • For CentOS/RHEL 7.x and earlier, use commands:

      sudo systemctl stop openvpn@<instanceID>
      sudo systemctl disable openvpn@<instanceID>
    • For CentOS/RHEL 8.x and later, use commands:

      sudo systemctl stop openvpn-client@<instanceID>
      sudo systemctl disable openvpn-client@<instanceID>
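
    If you run more than one SC for active/standby, you can confirm which host currently has the tunnel up before disabling the other; the service name depends on your OS version and <instanceID>:

      sudo systemctl is-active openvpn@<instanceID>           # CentOS/RHEL 7.x and earlier
      sudo systemctl is-active openvpn-client@<instanceID>    # CentOS/RHEL 8.x and later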

Post-Installation Verification

To verify that the site collector source has been installed, log into the Data Lake UI and navigate to Settings > Collector Management > Collectors to see the list of configured collectors.

site collector management UI

Note

It is normal for the SC Data Forwarder service to show as Stopped while another service shows as Running. If there is ongoing ingestion, one of these services will show a non-zero message count in the graph; use that as the indicator of throughput.

If your collector does not appear in the list, verify that the following services are running and there is throughput:

  1. Run the following command to check all Exabeam Site Collector Services:

    sudo /opt/exabeam/tools/sc-services-check.sh

    Alternatively, you can check services individually on the SC.

    1. Check that logstash is running.

      sudo systemctl status logstash
    2. Check that zookeeper is running.

      sudo systemctl status zookeeper
    3. Check that kafka is running.

      sudo systemctl status kafka
  2. Check SaaS data upload flow on the SC.

    1. Check that gcs manager is running.

      sudo systemctl status exabeam-kafka-gcs1-log-manager
    2. Check that gcs collector is running.

      sudo systemctl status exabeam-kafka-gcs1-collector
  3. Check that openvpn service is running.

    1. For CentOS/RHEL 7.x and older, use the command:

      sudo systemctl status openvpn@<instanceID>
    2. For CentOS/RHEL 8.x, use the command:

      sudo systemctl status openvpn-client@<instanceID>
  4. Check that the management service (rsc-forwarder) is running.

    sudo systemctl status exabeam-rsc-forwarder
  5. Check that the local monitoring service (watchdog) is running.

    sudo systemctl status exabeam-rsc-watchdog
  6. Send a test message via syslog and, after several minutes, confirm that it arrived at the destination in the Data Lake UI.

    echo "test message" | nc localhost 514

Install Site Collector for Exabeam SaaS Advanced Analytics-only Deployment

Exabeam Site Collector in Unmanaged Mode

In unmanaged mode, you should implement your own monitoring solution because the SC will not appear anywhere in the Advanced Analytics UI. The only available information is the status of SC ingestion into SaaS on the Status Page.
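
As a starting point for your own monitoring, you could run the bundled service-check script (referenced in the verification steps below) from cron and review its output. A minimal sketch, assuming root's crontab and a log path of your choosing:

    # check all Exabeam Site Collector services every 5 minutes and append the results to a log
    */5 * * * * /opt/exabeam/tools/sc-services-check.sh >> /var/log/sc-services-check.log 2>&1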

Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to Exabeam's SaaS Advanced Analytics and there is no Exabeam Data Lake deployed.

Prerequisites

Ensure your environment meets the following prerequisites before running an SC installation:

  • Opened firewall routes to allow network traffic with your SaaS instance

  • If you have a proxy,

    • Ensure that the proxy does not require SC traffic to be authenticated

    • Configure the route for bi-directional https traffic in one of two ways:

      • On-premises to SaaS for data flow

      • SaaS to On-premises via OpenVPN for data such as LDAP polling

  • If you have deployed SELinux on the SC host, run it in permissive mode

  • The /tmp partition on the SC host is executable for root

  • Ensure there is enough space for SC installation

  • The firewalld service is running

  • NTP client must be active

Important

If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.

Fresh Site Collector Installation

  1. Download SaaS Site Collector installation files from the Exabeam Community.

  2. Download your authentication file package using the following URL template, based on your <instanceID> (a download sketch follows this procedure).

    https://<instanceID>.aa.exabeam.com/api/setup/saas/authPackage
  3. Place the files in the /tmp directory.

  4. Unpack the downloaded files.

    tar -xzf <filename>.tar.gz
  5. Go to the Exabeam_Site_Collector directory.

  6. Make the files executable.

    chmod +x site-collector-installer.sh
  7. Based on your deployment environment, please execute one of the following installation commands:

    1. Installing SC behind the proxy with OpenVPN

      sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
    2. Installing SC behind the proxy without OpenVPN

      sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz --proxy=<proxy_host_ip|proxy_hostname> --proxy-port=<proxy_port>
    3. Installing SC without proxy with OpenVPN

      sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz --openvpn
    4. Installing SC without proxy and without OpenVPN

      sudo ./site-collector-installer.sh -v --aa-saas --config=/tmp/<instanceID>-auth-package.tgz

    Note

    OpenVPN must be used for:

    1. Passing LDAP poll data

    2. Using a DBlog collector in your deployment

    3. Using eStreamer in your deployment

    4. Fetching data from any on-premises SIEM or other sources by Advanced Analytics

    5. Connecting to on-premises endpoints by Incident Responder Actions

    Limitation:

    Only one OpenVPN connection can be active at a time. If you have installed OpenVPN on more than one SC (for an active/standby setup), manually stop and disable the service on the standby SC after installation.

    • For CentOS/RHEL 7.x and older, use commands:

      sudo systemctl stop openvpn@<instanceID>
      sudo systemctl disable openvpn@<instanceID>
    • For CentOS/RHEL 8.x and later, use commands:

      sudo systemctl stop openvpn-client@<instanceID>
      sudo systemctl disable openvpn-client@<instanceID>
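
A sketch of the authentication package download from step 2, assuming the endpoint is reachable from the SC host and that any authentication your tenant requires (for example, a session token) is supplied:

    curl -o /tmp/<instanceID>-auth-package.tgz https://<instanceID>.aa.exabeam.com/api/setup/saas/authPackage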

Data Throughput Verification

This SC installation is on an unmanaged node. SC operational checks must be made via the SaaS Status Page for your instance. The Status Page is intended to show errors only and should not be used to verify throughput immediately after installation.

Additional verification must be run at the SC host. Verify that the following services are running and there is throughput:

  1. Run the following command to check all Exabeam Site Collector Services:

    sudo /opt/exabeam/tools/sc-services-check.sh

    Alternatively, you can check services individually on the SC.

    1. Check that logstash is running.

      sudo systemctl status logstash
    2. Check that zookeeper is running.

      sudo systemctl status zookeeper
    3. Check that kafka is running.

      sudo systemctl status kafka
  2. Check SaaS data upload flow on the SC.

    1. Check that gcs manager is running.

      sudo systemctl status exabeam-kafka-gcs1-log-manager
    2. Check that gcs collector is running.

      sudo systemctl status exabeam-kafka-gcs1-collector
  3. Check that openvpn service is running.

    1. For CentOS/RHEL 7.x and older, use the command:

      sudo systemctl status openvpn@<instanceID>
    2. For CentOS/RHEL 8.x, use the command:

      sudo systemctl status openvpn-client@<instanceID>
  4. Check that the management service (rsc-forwarder) is running.

    sudo systemctl status exabeam-rsc-forwarder
  5. Check that the local monitoring service (watchdog) is running.

    sudo systemctl status exabeam-rsc-watchdog
  6. Send a test message via syslog and, after several minutes, confirm that it arrived at the destination in the Data Lake UI.

    echo "test message" | nc localhost 514

Install Site Collector for Exabeam Data Lake On-premises Deployment

For Data Lake in Appliance or Virtual Deployments

Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to an Exabeam Data Lake destination deployed on an appliance or virtual platform (excluding Exabeam SaaS).

Prerequisites

Ensure your environment meets the following prerequisites before running an SC installation:

  • Ensure you do not have a network proxy between your SC and Data Lake

  • If you have deployed SELinux on the SC host, run it in permissive mode

  • The /tmp partition on the SC host is executable for root

  • Ensure there is enough space for SC installation

  • The firewalld service is running

  • NTP client must be active

Important

If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.

Fresh Site Collector Installation

  1. Log into your Data Lake UI.

  2. Navigate to Settings > Collector Management > Collectors.

    DL-Settings-CollectorManagement-Collectors.png
  3. Download the SaaS authentication (sc-auth-package.tar.gz) and Exabeam Site Collector installation packages using the links embedded in Step 1 of Installing the Site Collector. The authentication package contains all required configurations and authentication data needed to access your tenant.

  4. Place the files in the /tmp directory.

  5. Unpack the downloaded files.

    tar -xzf <filename>.tar.gz
  6. Go to the Exabeam_Site_Collector directory.

  7. Make the files executable.

    chmod +x site-collector-installer.sh
    
  8. Run the following installation command:

    sudo ./site-collector-installer.sh -v --dl-on-prem --config=/tmp/sc-auth-package.tgz
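
If syslog sources will send directly to this SC, confirm the relevant port is open in firewalld. This is a sketch that assumes TCP port 514 (as used in the test step below) and that the installer has not already added the rule:

    sudo firewall-cmd --list-ports                                                    # see which ports are currently open
    sudo firewall-cmd --permanent --add-port=514/tcp && sudo firewall-cmd --reload    # open port 514/tcp if needed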

Post-Installation Verification

To verify that the site collector source has been installed, log into the Data Lake UI and navigate to Settings > Collector Management > Collectors to see the list of configured collectors.

site collector management UI

Note

It is normal for the SC Data Forwarder service to show as Stopped while another service shows as Running. If there is ongoing ingestion, one of these services will show a non-zero message count in the graph; use that as the indicator of throughput.

If your collector does not appear in the list, verify that the following services are running and there is throughput:

  1. Run the following command to check all Exabeam Site Collector Services:

    sudo /opt/exabeam/tools/sc-services-check.sh

    Alternatively, you can check services individually on the SC.

    1. Check that logstash is running.

      sudo systemctl status logstash
    2. Check that zookeeper is running.

      sudo systemctl status zookeeper
    3. Check that kafka is running.

      sudo systemctl status kafka
  2. Check data upload flow on the SC.

    1. Check that lms manager is running.

      sudo systemctl status exabeam-kafka-lms1-log-manager
    2. Check that lms collector is running.

      sudo systemctl status exabeam-kafka-lms1-collector
  3. Check that the local monitoring service (watchdog) is running.

    sudo systemctl status exabeam-rsc-watchdog
  4. Send a test message via syslog and, after several minutes, confirm that it arrived at the destination in the Data Lake UI.

    echo "test message" | nc localhost 514

Install Site Collector for Exabeam Advanced Analytics On-premises Deployment

For Advanced Analytics in Appliance or Virtual Deployments, in Unmanaged Mode

Follow these instructions for a fresh Exabeam Site Collector (SC) installation if your logs are to be sent to an Exabeam Advanced Analytics destination deployed on an appliance or virtual platform (excluding Exabeam SaaS).

Prerequisites

Ensure your environment meets the following prerequisites before running an SC installation:

  • Opened firewall routes to allow network traffic with your on-premises Advanced Analytics

  • Please ensure you do not have a proxy between your SC and Advanced Analytics

  • If you have deployed SELinux on the SC host, run it in permissive mode

  • The /tmp partition on the SC host is executable for root

  • Ensure there is enough space for SC installation

  • The firewalld service is running

  • NTP client must be active

Important

If there is a syslog source in your deployment, Exabeam strongly recommends having a load balancer with two site collectors behind it to mitigate any potential data loss.

Fresh Site Collector Installation

  1. Download SaaS Site Collector installation files from the Exabeam Community.

  2. Place the files in the /tmp directory.

  3. Unpack the downloaded files.

    tar -xzf <filename>.tar.gz
  4. Go to the Exabeam_Site_Collector directory.

  5. Make the files executable.

    chmod +x site-collector-installer.sh
  6. Based on your expected load, execute one of the following installation commands:

    1. Installing SC without EPS limit

      sudo ./site-collector-installer.sh -v --aa-on-prem --aa-listener=<listener_ip>:514
    2. Installing SC with EPS limit

      sudo ./site-collector-installer.sh -v --aa-on-prem --aa-listener=<listener_ip>:514 --eps-limit=2048
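
If the installer or the later throughput checks fail, you can first confirm that the Advanced Analytics listener is reachable from the SC host, using the same <listener_ip> and port as the --aa-listener flag (add -u if your listener expects UDP):

    nc -vz <listener_ip> 514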

Post-Installation Verification

This SC installation is on an unmanaged node. SC operational checks must be run at the SC host. Verify that the following services are running and there is throughput:

  1. Run the following command to check all Exabeam Site Collector Services:

    sudo /opt/exabeam/tools/sc-services-check.sh
  2. Alternatively, you can check services individually on the SC.

    1. Check that logstash is running.

      sudo systemctl status logstash
    2. Check that zookeeper is running.

      sudo systemctl status zookeeper
    3. Check that kafka is running.

      sudo systemctl status kafka
  3. Check data upload flow on the SC.

    1. Check that uba manager is running.

      sudo systemctl status exabeam-kafka-uba1-log-manager
    2. Check that uba collector is running.

      sudo systemctl status exabeam-kafka-uba1-collector
  4. Check that the local monitoring service (watchdog) is running.

    sudo systemctl status exabeam-rsc-watchdog
  5. Send a test message via syslog and, after several minutes, confirm that it arrived at the destination in the Advanced Analytics UI.

    echo "test message" | nc localhost 514