Custom via Fluentd
Step 1: Publish to S3
Scanner reads JSON logs from S3, so if your application can send its custom logs to a Fluentd agent, Fluentd can write them to an S3 bucket where Scanner can see them.
Fluentd supports multiple log input types, including JSON files and Syslog.
If your custom log source supports sending logs to Fluentd, whether by writing local JSON files or by emitting Syslog data, you can create a workflow that delivers those logs to S3.
Follow the Fluentd documentation to configure input sources, like JSON files and Syslog, and an S3 output that writes JSON logs to your bucket.
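For example, a minimal Fluentd configuration that tails local newline-delimited JSON files and writes them to S3 might look like the sketch below. The paths, tag, bucket name, and region are placeholders; adjust them for your environment.

```
# Sketch only: tail local JSON log files and ship them to S3.
<source>
  @type tail
  path /var/log/myapp/*.log
  pos_file /var/log/fluent/myapp.pos
  tag myapp.logs
  <parse>
    @type json
  </parse>
</source>

<match myapp.**>
  @type s3
  s3_bucket your-scanner-logs-bucket
  s3_region us-east-1
  path logs/myapp/
  <format>
    @type json
  </format>
  <buffer time>
    @type file
    path /var/log/fluent/s3-buffer
    timekey 300
    timekey_wait 60
  </buffer>
</match>
```

The `timekey` settings control how often buffered chunks are flushed to S3; shorter intervals mean fresher data in Scanner at the cost of more, smaller S3 objects.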
Note: Be sure to configure Fluentd to write the timestamp field to the output. For example, for syslog input, you may need to enable settings like keep_time_key.
Timestamp data is essential for Scanner's indexes. If the timestamp field cannot be found in a log event, Scanner will default to the ingestion time, which could be very different from the time when the log event actually happened.
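For syslog input specifically, `keep_time_key` can be enabled in the parser section so the original time field is retained in each record and survives into the S3 output. A minimal sketch, where the port, tag, and message format are assumptions to adapt to your setup:

```
# Sketch only: receive syslog and keep the original time field
# in each record so Scanner can index the event timestamp.
<source>
  @type syslog
  port 5140
  bind 0.0.0.0
  tag system.syslog
  <parse>
    message_format rfc3164
    keep_time_key true
  </parse>
</source>
```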
Step 2: Ingest via Scanner Collect
Follow the instructions here to ingest logs from S3 via Scanner Collect.