- Exabeam Data Lake Architecture Overview
- Exabeam Product Deployment in On-premises or Virtual Environments
- Considerations for Installing and Deploying Exabeam Products
- Installation Pre-Check for Exabeam Products
- Install Exabeam Software
- Upgrade an Exabeam Product
- Troubleshooting an Installation
- Administrator Operations
- Exabeam Licenses
- Adding Nodes to a Cluster
- Replicating Logs Across Exabeam Data Lake Clusters
- Ingesting Logs into Exabeam Data Lake
- Exabeam Data Lake Retention Settings
- Remote Archiving NAS and AWS S3 from Data Lake
- Set Up LDAP Import
- User Management
- Exabeam Data Lake Role-based Access Control
- Exabeam Data Lake Object-based Access Control
- Exabeam Data Lake Secured Resources Overview
- Single Sign-on and Multi-factor Authentication Using SAML
- Audit Log Management in Data Lake
- Common Access Card (CAC) Authentication and Limitations
- Adding a User to Exabeam Data Lake
- Exabeam User Password Policy
- User Engagement Analytics Policy
- Index Management
- Index Patterns
- Manage Security Content in Exabeam Data Lake
- Saved Objects in Exabeam Data Lake
- Reindex Operations
- Parser Management
- Forwarding to Other Destinations
- Syslog Forwarding Management in Exabeam Data Lake
- Syslog Forwarding Destinations
- Configure Log Forwarding Rate
- How to Forward Syslog to Exabeam Advanced Analytics from Exabeam Data Lake
- How to Forward Syslog from Exabeam Data Lake to Non-Exabeam External Destinations
- Exabeam Data Lake Selective Forwarding using Conditions
- How to Configure Exabeam Data Lake Log Destinations for Correlation Rule Outcomes
- How to Forward Exabeam Data Lake Incident to Exabeam Incident Responder
- Cluster Operations
- Cross-cluster Search in Exabeam Data Lake
- Prerequisites for Exabeam Data Lake Cross-cluster Search
- Remote Cluster Management for Exabeam Data Lake Cross-cluster Search
- Register a Remote Cluster in Exabeam Data Lake for Cross-cluster Search
- Exabeam Data Lake Cross-cluster Health Monitoring and Handling
- How to Enable/Disable/Delete Exabeam Data Lake Remote Clusters for Cross-cluster Search
- Exabeam Data Lake Remote Cluster Data Access Permissions for Cross-cluster Search
- Exabeam Cloud Telemetry Service
- System Health Page
- Proactive and On-Demand System Health Checks
- Data Lake Cluster Health Status
- How to View Exabeam Data Lake Cluster Status
- Get to Know the Exabeam Data Lake Indexing Metrics Tab
- A. Technical Support Information
- B. List of Exabeam Services
- C. Network Ports
- D. Supported Browsers
Exabeam provides out-of-the-box search indices, labeled with the prefix exabeam-*. You can view their details in the Index Patterns menu. By default, all ingested logs, as well as correlation rule alerts, go into exabeam-* indices.
Though you can adjust parameters, we strongly recommend that you do not edit Exabeam-supplied indices.
New filtered data (for example, after importing updated parsers) and contexts introduced to an existing data set will not display in graphs and search results until the next refresh cycle, which runs at most every five minutes. If you want to see results immediately, use Refresh to refresh all graphs and search results.
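Index patterns like exabeam-* select indices by wildcard matching on the index name. A minimal sketch of that matching behavior (the index names are hypothetical, for illustration only):

```python
from fnmatch import fnmatch

# Hypothetical index names; real Exabeam index names may differ.
indices = ["exabeam-2024.06.01", "exabeam-alerts-2024.06.01", "custom-logs-2024.06.01"]

# The exabeam-* pattern matches any index whose name starts with "exabeam-".
pattern = "exabeam-*"
matched = [name for name in indices if fnmatch(name, pattern)]
print(matched)  # the two exabeam-prefixed indices
```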
Manage Security Content in Exabeam Data Lake
Parsers come in the form of security content that filters the ingested logs. Parsers can change over time or be replaced with improved filters. Exabeam offers a curated library of parsers that is continually updated to address the latest threats. If your organization has a specialized set of parsers, you can also upload them to Data Lake. The Content Updates menu is the centralized repository for all security parsers, including pre-existing custom parsers, which are migrated automatically during the upgrade process.
The Content Updates menu helps when:
You want to keep your current system while being able to add content
You have a content package with updates to categories/categorization
You want to install a new content package that has improvements to an existing parser
Manage all your content packages directly in Data Lake under Settings > Admin Operations > Content Updates. Instead of using the Content Installer, which requires you to use the command line and manually restart internal engines, you can retrieve the latest available content packages from the cloud in real time, including both general Exabeam releases and custom fixes you request.
In these settings, a content package that includes custom fixes you requested is called a custom package. A content package from a general Exabeam release is called a default package. It's important that you update your content with each release because the release may contain new parsers and categories, support new log sources and vendors, and other additions and fixes that keep your system running smoothly.
If you have an environment that can access the internet, you can pull the latest content packages manually or automatically, select specific content packages to install, or even schedule content packages to install automatically on a daily or weekly basis, all from the cloud. This includes all existing parser packages.
If you have an environment that can't access the internet, you can't connect to the cloud. You must view and download the latest content packages from the Exabeam Community, then upload them.
Schedule Automated Security Content Package Installation
If you subscribe to Exabeam security content, you can configure automatic download and installation of the latest content package.
Select one of the following options:
If you are creating a new schedule, select Install Schedule.
If you want to automate updates for an existing package, click Last Update Checked, then toggle Auto Updates on.
Enter an installation interval that works best for your organization. Data Lake ingestion applies new packages immediately after installation, with no need for manual service restarts. No logs are dropped during this process.
Click SAVE to apply the schedule.
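The daily or weekly interval determines when the next automatic install runs. A conceptual sketch of that scheduling arithmetic (illustrative only; not Exabeam's actual scheduler implementation):

```python
from datetime import datetime, timedelta

def next_install(last_install: datetime, interval: str) -> datetime:
    """Return the next scheduled install time for a daily or weekly interval."""
    step = {"daily": timedelta(days=1), "weekly": timedelta(weeks=1)}[interval]
    return last_install + step

# Example: a weekly schedule checked after a June 1 install.
print(next_install(datetime(2024, 6, 1, 2, 0), "weekly"))  # 2024-06-08 02:00:00
```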
Manually Upload and Install a Security Content Package
You may choose to upload a security content package manually.
Use the menu that corresponds to the package you are installing:
Use the Default Packages tab if you are installing a security package downloaded from the Exabeam Community
Use the Custom Packages tab if you have a custom-produced security content package
Click UPLOAD THE PACKAGE to open the menu to select the package file to upload. Click SAVE to upload to Data Lake.
Find the uploaded package in the security content listing, then click INSTALL to apply the package. If the package is a default content package and a newer version of one you previously installed, the newer version replaces the old one. However, if needed, you can roll back to the previous version by uninstalling a given package; all other parsers that were in the restored package also revert to their older versions. If the package is a custom content package, ensure that you uninstall the older version.
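The replace-and-rollback behavior for default packages can be pictured as a version history where installing pushes a new active version and uninstalling re-activates the previous one. A conceptual model only, not Exabeam's implementation:

```python
def install(versions: list[str], new_version: str) -> str:
    """Installing a newer default package makes it active; the previous
    version is kept so it can be restored by a rollback (illustrative)."""
    versions.append(new_version)
    return versions[-1]

def rollback(versions: list[str]) -> str:
    """Uninstalling the newest version re-activates the previous one."""
    versions.pop()
    return versions[-1]

history = ["1.0"]            # hypothetical version numbers
install(history, "1.1")      # 1.1 replaces 1.0 as the active package
restored = rollback(history) # uninstalling 1.1 restores 1.0
print(restored)  # 1.0
```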
Uninstall a Custom Security Content Package
Navigate to Settings > Admin Operations > Content Updates > Custom Packages tab.
Find the security content package that you want to remove in the listing and click UNINSTALL. After uninstalling, the parsers in the uninstalled package either disappear from the system and are no longer applied during parsing, or roll back to their previous or default versions.
Migrate Existing Custom Security Content Packages
Hardware and Virtual Deployments Only
We highly recommend uploading all content that has been managed externally to Data Lake.
create-custom-mojito-package is a command line tool that creates a custom content package from the content you have in
/opt/exabeam/config/lms/mojito-kafka-connect/ingest-mojito/custom_mojito.conf. Migrating all security content to Data Lake simplifies package upgrades and management, without the need for separate command line executions.
Here are the usage options:
Command and Usage
Create and install a package; specify the package version if there are other existing custom packages.
The version must be unique and in the form
After a content package is successfully created from custom_mojito.conf, it is uploaded and installed to Data Lake.
The content service URL defaults to
Create and install a new package with versioned
Create/Install if no package has been uploaded.
URL of content service. This defaults to
Create and install a new package and point to the content service host
List the help menu of command options.
Saved Objects in Exabeam Data Lake
Customized objects are objects you can build using examples and templates provided by Exabeam. "Saved objects" are customized objects stored in the objects library during the build process that can be passed between clusters. Customized objects do not automatically synchronize between clusters. Distributing objects between clusters is a manual process.
To see objects available for export as well as access the import tool, navigate to Settings > Index Management > Saved Objects.
The Edit Saved Objects menu provides helpful actions, including:
Export Everything – Generates and downloads a JSON file to your computer.
Import – Deliver saved objects (JSON files) to your cluster.
Edit – Reconfigure object properties.
For example, click Save dashboard Object to make a new object available for export. You can also Delete dashboard Object or View Dashboard.
View – See the output from a given object.
In this example, the object is a visualization. Selecting View displays it in the Chart Builder.
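Because distributing saved objects between clusters is a manual export/import of JSON, the round trip can be sketched as serializing objects on one cluster and loading them on another. The object fields below are hypothetical; the real export schema may differ:

```python
import json

# Hypothetical saved object; the actual exported fields may differ.
saved_objects = [
    {"id": "top-hosts", "type": "visualization", "title": "Top Hosts"},
]

# Export Everything: the objects become the contents of a JSON file.
exported = json.dumps(saved_objects)

# Import on the target cluster: the JSON file is read back into objects.
imported = json.loads(exported)
print(imported[0]["title"])  # Top Hosts
```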
Reindex Operations
Existing data can be reindexed when a new or revised parser is introduced. This is a manually initiated process. Apply it only if you need the new parsed values in your historical (older) data.
Reindexing is an expensive operation and might compete for resources with ongoing ingestion. Reindexing time is directly proportional to the volume of data being reindexed: the larger the time window selected, the more expensive the operation and the longer it takes.
Navigate to Settings > Index Management > Reindex.
Select the start and end dates in the Timeframe field for the block of data you want to reindex.
Narrow the batch to process by selecting the Index and the data set to reprocess with a Search Query.
Click Reindex to initiate reindexing.
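The steps above narrow the reindex batch to a time window plus a search query before reprocessing. A simplified sketch of that selection logic (field names and log lines are hypothetical; Data Lake's actual mechanism is internal):

```python
from datetime import datetime

# Hypothetical stored documents with a timestamp and raw log text.
docs = [
    {"ts": datetime(2024, 6, 1), "raw": "sshd: Accepted password for alice"},
    {"ts": datetime(2024, 7, 1), "raw": "nginx: GET /health 200"},
]

def select_for_reindex(docs, start, end, query):
    """Narrow the batch: keep documents inside the timeframe that also
    match the search query, mirroring the Timeframe + Search Query steps."""
    return [d for d in docs if start <= d["ts"] <= end and query in d["raw"]]

batch = select_for_reindex(docs, datetime(2024, 6, 1), datetime(2024, 6, 30), "sshd")
print(len(batch))  # 1
```

The larger the window passed here, the more documents land in the batch, which is why reindexing cost grows with the selected timeframe.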