Sublime Security

Scanner supports Sublime Security logs. This guide covers integration for two log sources:

  • Audit Logs, which contain information about actions taken in Sublime Security by users or by the system itself.

  • Message Event Logs, which contain information about email security events, analyses, and triggered detection rules.

For Scanner to see these logs, you need to configure Sublime Security to export them to an S3 bucket that Scanner is linked to.

Step 1: Set up Sublime Security to export logs to S3

You can follow the Sublime Security documentation to export these logs to an S3 bucket you own. See: Export Audit Logs and Message Events.

Step 2: Link the S3 bucket to Scanner

If you haven't done so already, link the S3 bucket containing your Sublime Security logs to Scanner using the Linking AWS Accounts guide.

Step 3: Set up two S3 Import Rules in Scanner, one per log source

Sublime Security exports files to S3 using this directory structure:

<configured_key_prefix>/<log_source_type>/<YYYY>/<MM>/<DD>/<HHMMSSZ>-<ID>.json

  • For Audit Logs, the log_source_type is sublime_platform_audit_log.

  • For Message Event Logs, the log_source_type is sublime_platform_message_events.
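
For example, if your configured key prefix were sublime/ (a hypothetical value; yours will differ), exported objects would appear at keys like the following, with illustrative dates and IDs:

sublime/sublime_platform_audit_log/2025/01/15/083000Z-a1b2c3.json
sublime/sublime_platform_message_events/2025/01/15/083011Z-d4e5f6.json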

For each log source type you want Scanner to read, you need to create a separate S3 Import Rule in Scanner. Using the Duplicate button helps streamline this.

Here is how to create an S3 Import Rule for the Audit Logs data source type, i.e. the files in the folder named sublime_platform_audit_log.

  1. Within Scanner, navigate to Settings > S3 Import Rules.

  2. Click Create Rule.

  3. For Rule name, type a name like my_team_name_sublime_audit_logs.

  4. For Destination Index, choose the index where you want these logs to be searchable in Scanner.

  5. For Status, set to Active if you want to start indexing the data immediately.

  6. For Source Type, we recommend sublime:audit. You are free to choose any name, but out-of-the-box detection rules expect sublime:audit.

    1. For Message Event Logs, set the Source Type to sublime:message_events.

  7. For AWS Account, choose the AWS account that owns the S3 bucket with your Sublime Security logs.

  8. For S3 Bucket, choose the S3 bucket containing Sublime Security logs.

  9. For S3 Key Prefix, type the prefix (i.e. directory path) where Sublime Security is writing your specific log source type.

    1. Example: <configured_key_prefix>/sublime_platform_audit_log/

  10. For File type, choose JsonLines with Gzip compression.

  11. For Timestamp extractors, under Column name, type created_at. This is the field in each log event that contains the timestamp information.

  12. Click Preview rule to try it out. Check that the S3 keys you expect are appearing, and that the log events inside are parsed correctly with timestamps detected as expected.

  13. When you're ready, click Create.
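
If you want to sanity-check an exported object outside of Scanner before previewing the rule, a minimal Python sketch like the one below can confirm that a file is gzip-compressed JSON Lines and that each event carries a created_at timestamp. It assumes the boto3 library is installed and AWS credentials with read access to the bucket are configured locally; the bucket name and key are hypothetical placeholders.

import gzip
import json

import boto3

# Hypothetical bucket and key; substitute values from your own export.
BUCKET = "my-sublime-security-logs"
KEY = "sublime/sublime_platform_audit_log/2025/01/15/083000Z-a1b2c3.json"

s3 = boto3.client("s3")
raw = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

# Per step 10, files are JSON Lines with gzip compression:
# decompress, then parse one JSON object per line.
for line in gzip.decompress(raw).splitlines():
    event = json.loads(line)
    # `created_at` is the field the timestamp extractor reads (step 11).
    print(event["created_at"])

If every line prints a timestamp, the File type and Timestamp extractors settings above should work as configured.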

Once you have created the S3 Import Rule for the Audit Logs, you can duplicate the rule and use find-replace to create an S3 Import Rule for Message Event Logs, making these specific changes:

  • Change Rule name to something like my_team_name_sublime_message_event_logs.

  • Change Source Type to sublime:message_events.

  • Change the S3 Key Prefix to <configured_key_prefix>/sublime_platform_message_events/
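
Continuing the hypothetical sublime/ prefix from earlier, the second rule's S3 Key Prefix would be sublime/sublime_platform_message_events/. All other settings, including the JsonLines-with-Gzip file type and the created_at timestamp extractor, carry over unchanged. As with the first rule, use Preview rule to confirm that the expected keys appear and parse correctly before clicking Create.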
