
Site Collector Administration Guide

Set Up Kafka Collector

The Kafka collector is a set of Site Collector flows, pre-built processors, groups, custom processors, other components, and integrations that pulls logs in any text format from your Kafka server and pushes them to the Exabeam Security Operations Platform. Set up the Kafka collector to collect log data from a Kafka server with multiple brokers and topics. One Kafka collector instance pulls log data from up to five brokers and five topics.

To set up a Kafka collector:

  1. Log in to the Exabeam Security Operations Platform with your registered credentials.

  2. Navigate to Collectors > Site Collectors.

  3. Ensure that the Site Collector instance is installed and running.

  4. On the Site Collector page, click the Collectors Library tab, then click Kafka.

  5. In the Definition section, enter the required information as follows.

    • Collector Name – Specify a name for the Kafka collector.

      Note

      Ensure that the names of the Site Collector instance and the collector are different.

    • Site Collector Instance – Select the site collector instance for which you want to set up the Kafka collector.

    • Broker Hostname or IP – Enter the hostname or IP address of the Kafka broker from which you want the Kafka collector to pull logs.

      Note

      Ensure that you enter the correct broker hostname or IP address and topic name. If you enter an incorrect broker hostname, IP address, or topic name, the Kafka collector does not report an error, because it cannot distinguish between an incorrect broker or topic and a topic that has no messages.

    • Port – Enter the port number of your broker.

    • Add Broker – Click to add more brokers for the collector to pull data from. If your Kafka server uses a multi-broker setup for scalability and resilience, you can add up to five brokers so that the collector pulls data from whichever broker is available.

      Refer to the following example for entering broker and port details.

      Suppose your Kafka server is set up on a machine with IP address 123.231.1.23, and each broker on this server listens on its own port. If the server has five brokers listening on ports 10000, 10001, 10002, 10003, and 10004, and you want to configure a Kafka collector to pull logs from three of them, add the IP address and port details as follows.

      • Broker Hostname or IP: 123.231.1.23 Port: 10000

      • Broker Hostname or IP: 123.231.1.23 Port: 10001

      • Broker Hostname or IP: 123.231.1.23 Port: 10002
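The three broker entries above correspond to the comma-separated bootstrap-server list that Kafka clients generally accept. The helper below is an illustrative sketch of that mapping, using the example values from this guide; it is not part of the Site Collector configuration itself.

```python
# Illustrative only: joins the broker host/port pairs from the example above
# into the comma-separated bootstrap-server form Kafka clients expect.
BROKERS = [
    ("123.231.1.23", 10000),
    ("123.231.1.23", 10001),
    ("123.231.1.23", 10002),
]

def bootstrap_servers(brokers):
    """Join (host, port) pairs into a comma-separated bootstrap-server list."""
    return ",".join(f"{host}:{port}" for host, port in brokers)

print(bootstrap_servers(BROKERS))
# → 123.231.1.23:10000,123.231.1.23:10001,123.231.1.23:10002
```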

  6. Click Next.

  7. In the Authentication section, select the protocol for establishing a connection with the Kafka source, based on the authentication mechanism your Kafka server is set up with.

    Select No Auth or SASL, depending on whether your Kafka server is set up with or without authentication. The Kafka collector supports the following four types of Kafka server authentication: No Authentication, SASL PLAIN, SASL SCRAM SHA-256, and SASL SCRAM SHA-512.

    • No Auth – Select this option if your Kafka server does not require authentication.

    • SASL – Enter the following connection details, which you can obtain from your support team or Kafka administrator.

      • SASL Mechanism – Select the SASL authentication mechanism your Kafka server is set up with: PLAIN, SCRAM SHA-256, or SCRAM SHA-512.

      • Username – Enter the username for the Kafka server.

      • Password – Enter the password for the Kafka server.

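As a rough illustration, the Authentication fields above map onto the connection settings a generic Kafka client (for example, the kafka-python library) would use. The function name and the SASL_PLAINTEXT security protocol are assumptions for this sketch, not the collector's internal configuration; note that client libraries typically spell the SCRAM mechanisms with hyphens (SCRAM-SHA-256, SCRAM-SHA-512).

```python
# Illustrative only: maps the UI's SASL fields onto typical Kafka client
# settings. sasl_config and the SASL_PLAINTEXT choice are assumptions for
# this sketch, not part of the Site Collector product.
def sasl_config(mechanism, username, password):
    """Translate the SASL choices from the UI into client connection settings."""
    supported = {"PLAIN", "SCRAM-SHA-256", "SCRAM-SHA-512"}
    if mechanism not in supported:
        raise ValueError(f"unsupported SASL mechanism: {mechanism}")
    return {
        "security_protocol": "SASL_PLAINTEXT",  # assumes no TLS; SASL_SSL with TLS
        "sasl_mechanism": mechanism,
        "sasl_plain_username": username,  # kafka-python uses these parameter names
        "sasl_plain_password": password,  # for both PLAIN and SCRAM mechanisms
    }

print(sasl_config("SCRAM-SHA-256", "svc-collector", "example-password")["sasl_mechanism"])
# → SCRAM-SHA-256
```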
  8. Click Next.

  9. In the Data section, enter the name of each Kafka topic from which you want the collector to retrieve logs. You can add up to five topics.

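The topic rules described above can be sketched as a simple validation step. Only the five-topic limit comes from this guide; the function name and its exact behavior are illustrative assumptions, not the collector's actual validation logic.

```python
# Illustrative sketch of the topic list rules described above; only the
# five-topic limit is taken from this guide, the rest is assumed.
MAX_TOPICS = 5

def validate_topics(topics):
    """Trim whitespace, drop empty entries, and enforce the five-topic limit."""
    cleaned = [t.strip() for t in topics if t.strip()]
    if not cleaned:
        raise ValueError("at least one topic is required")
    if len(cleaned) > MAX_TOPICS:
        raise ValueError(f"a Kafka collector supports at most {MAX_TOPICS} topics")
    return cleaned

print(validate_topics([" security-logs ", "audit-logs"]))
# → ['security-logs', 'audit-logs']
```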
  10. Click Setup.

    The Kafka collector is set up and ready to pull logs from your Kafka sources.

    If setup fails, the collector is disabled and the configuration is saved. You can check the status of the collector in the UI or by using the support package.