
Log Stream Guide


Tokenize Non-Standard Log Files

Some products and environments generate non-standard logs. In these cases, you must adjust the parsing to match the log's format, for example to extract values from logs that use uncommon delimiters.
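As a sketch of the problem, consider a hypothetical log line that joins key-value pairs with "|" and keys to values with "=>" instead of the usual space and "=". A generic parser cannot split such a line until it is told both characters. The following minimal Python illustration uses an invented log line and field names; it is not Log Stream's implementation:

```python
# Hypothetical non-standard log line: "|" separates pairs, "=>" separates key from value.
line = "ts=>2024-05-01T12:00:00Z|user=>alice|action=>login|src_ip=>10.0.0.5"

# Tokenizing requires knowing both delimiters up front.
pairs = dict(p.split("=>", 1) for p in line.split("|"))
print(pairs["user"])  # alice
```

A standard key-value parser expecting space-delimited `key=value` pairs would treat this entire line as one unparseable token, which is why a custom parser is needed.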

To manually tokenize non-standard logs, create a custom parser. You can create one that works for both Advanced Analytics and Data Lake or one that works just for Data Lake.

The following procedure lists the basic steps to creating a custom parser. If you need more detail about any of the steps, see Create a Custom Parser.

  1. Log on to the Exabeam Security Operations Platform and select the Log Stream tile.

    The Log Stream homepage appears. You should be on the Parsers Overview tab.

  2. Click +New Parser.

    The New Parser page appears. You will be on the Add Log Sample stage of creating a new parser.

  3. Select sample logs to import:

    • To select a log file from your file system, select Add a file, then drag and drop a file or click Select a File. You can upload a .gz or .tgz file that is no more than 100 MB.

    • To copy and paste logs, select Copy and paste raw logs, then paste the content into the text box. You can enter up to 100 lines.

  4. Click Upload Log Sample.

  5. Click Find Matching Parsers.

  6. Click +New Parser.

  7. Add parser conditions:

    1. Enter a value in the SELECT CONDITIONS bar, or in the list of raw log lines, highlight a string.

    2. Click Add Condition.

    3. Click Next.

    The Parser Info page appears.

  8. Add basic parser information.

    1. Enter the Parser Name.

    2. Under Activity Types, click Select activity types, select all of the activity types (alert or app) that apply to your custom parser, and then click Select activity types.

    3. Under Time Format, select a format that best matches how dates and times are formatted in the sample logs.

    4. Under Vendor, select the vendor that generated the logs you imported.

    5. Under Product, select the product that generated the logs.

  9. Click NEXT.

    The Extract Event Fields page appears.

  10. In this step, either tokenize individual key-value pairs based on string selection, or tokenize the entire log.

    To tokenize key-value pairs based on string selection:

    1. Select a non-tokenized key-value pair from the sample log lines by highlighting it, and click Tokenize.

      The Tokenize a New String dialog appears.

      (Screenshot: Tokenize a New String dialog)
    2. In the dialog, enter:

      • Delimiter—the character(s) separating key-value pairs from each other.

      • Key—the value of the key.

      • Separator—the character(s) separating the key and value.

      • Quotes—(Optional) the quote marks around the key or value; use this field to instruct the system to remove any redundant escape characters, such as "" or """".

      • Token maps to event field—in the drop-down menu, select the event field to which you want to map the key.

      The regex and value are generated automatically, based on your entries.

    3. Click Generate token.

      The key-value pairs are now tokenized.

    Repeat these steps to tokenize other key-value pairs in the data.
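The dialog entries (delimiter, key, separator, quotes, mapped field) effectively describe a regular expression for one key. The exact regex Log Stream generates is not documented here; the following is only a rough Python illustration, using an invented log line and field names, of what a generated pattern for a quoted key-value pair might look like:

```python
import re

# Hypothetical quoted key-value log line.
line = 'event="user login" user="alice" result="success"'

# Key "user", separator "=", values wrapped in double quotes.
# The capture group becomes the token mapped to an event field.
token_regex = re.compile(r'user="([^"]*)"')

match = token_regex.search(line)
user = match.group(1) if match else None  # "alice"
```

The quotes setting matters because, without it, the captured value would retain surrounding or escaped quote characters instead of the bare field value.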

    To tokenize the entire log:

    1. Click Manage tokenization.

      The tokenization dialog appears.

      (Screenshot: Manage Tokenization dialog)
    2. In the dialog:

      • Tokenization method—in the drop-down menu, select Custom parameters or Default regex.

      • Delimiter—enter the symbol used to separate the key-value pairs in the data.

      • Separator—enter the symbol used to separate the key and value within each key-value pair.

      • Quotes—use this field to instruct the system to remove any redundant escape characters, such as "" or """".

      The system lists all of the key-value pairs in the log that match those definitions.

      (Screenshot: Tokenize Entire Log view)
    3. Click Generate token.

      The key-value pairs are tokenized.

    If the log uses more than one set of delimiters and separators, repeat these steps for each set.
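Conceptually, the Custom parameters method amounts to splitting the log on the delimiter, then splitting each piece on the separator, and stripping redundant quotes. A minimal Python sketch under those assumptions (the log line and keys are invented for illustration):

```python
# Hypothetical log line using ";" as the delimiter and "=" as the separator.
line = 'src=10.1.2.3;dst=10.4.5.6;proto=tcp;action="allow"'

delimiter = ";"   # separates key-value pairs from each other
separator = "="   # separates the key from the value within a pair

tokens = {}
for pair in line.split(delimiter):
    key, _, value = pair.partition(separator)
    tokens[key] = value.strip('"')  # quotes setting: drop redundant quote characters
```

Every pair matching these definitions is tokenized in one pass, which is why this method suits logs with a single, uniform delimiter scheme.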

  11. The customized tokenizations that you add are shown above the Sample Log Lines. Click the icons to edit or delete an individual tokenization.

    (Screenshot: Customized Tokenizations list)
  12. Continue with the creation of your custom parser, using the instructions found in Create a Custom Parser, starting with step 4. Extract Event Fields.