HTTP Requests

Scanner supports Cloudflare HTTP Request logs, which contain data about the HTTP requests processed by Cloudflare's edge network. For Scanner to ingest these logs, configure Cloudflare to publish them to S3.

Step 1: Configure Logpush to write to S3

You can follow the Cloudflare documentation to configure Logpush to write HTTP Request logs to S3; see Cloudflare's Enable Amazon S3 guide.
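
If you manage Logpush through the API rather than the dashboard, the job can also be created programmatically. Below is a minimal sketch, assuming a zone-scoped job; the zone ID, API token, bucket path, and field list are placeholders you would replace with your own values. Note that Cloudflare also requires a one-time ownership validation of a new S3 destination, which is omitted here.

```python
import requests

# Placeholders -- substitute your own zone ID and API token.
ZONE_ID = "YOUR_ZONE_ID"
API_TOKEN = "YOUR_API_TOKEN"

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/logpush/jobs",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "name": "http-requests-to-s3",
        "dataset": "http_requests",
        # S3 destination; Cloudflare appends the dataset path to this
        # prefix, which matters for the S3 Key Prefix in Step 3 below.
        "destination_conf": "s3://my-bucket/my_team/my_cloudflare_logs?region=us-east-1",
        "output_options": {
            # EdgeStartTimestamp must be in the field list (see below).
            "field_names": [
                "ClientIP",
                "ClientRequestHost",
                "ClientRequestMethod",
                "ClientRequestURI",
                "EdgeResponseStatus",
                "EdgeStartTimestamp",
            ],
            "timestamp_format": "rfc3339",
        },
        "enabled": True,
        # NOTE: creating a job against a new S3 destination also requires
        # Cloudflare's ownership challenge; see their Logpush docs.
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```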

Make sure the time field is exported

You must make sure the EdgeStartTimestamp field is included in the export of HTTP Request logs. Otherwise, Scanner will not be able to determine the time when the log event occurred and will fall back to ingestion time, which might be incorrect.
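
For context, each exported HTTP Request event is one JSON object per line, and EdgeStartTimestamp is rendered either as an RFC 3339 string or as Unix nanoseconds, depending on the timestamp format configured in Logpush. The sketch below uses an illustrative event (the field values are made up, but the field names are standard Cloudflare HTTP Request fields) to show how either form resolves to a concrete event time:

```python
import json
from datetime import datetime, timezone

# Illustrative log line -- one JSON object per line in the exported files.
line = (
    '{"ClientIP":"203.0.113.7","ClientRequestHost":"example.com",'
    '"EdgeResponseStatus":200,"EdgeStartTimestamp":"2024-05-01T12:34:56Z"}'
)

event = json.loads(line)
ts = event["EdgeStartTimestamp"]

if isinstance(ts, (int, float)):
    # Unix nanoseconds (Logpush "unixnano" timestamp format).
    event_time = datetime.fromtimestamp(ts / 1e9, tz=timezone.utc)
else:
    # RFC 3339 string.
    event_time = datetime.fromisoformat(ts.replace("Z", "+00:00"))

print(event_time)  # 2024-05-01 12:34:56+00:00
```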

Step 2: Link the S3 bucket to Scanner

If you haven't done so already, link the S3 bucket containing your Cloudflare HTTP Request logs to Scanner using the Linking AWS Accounts guide.
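
The Linking AWS Accounts guide is the authoritative setup here. Purely to illustrate the kind of access being granted, the following is a hedged sketch of a bucket policy that lets an external reader list and fetch objects; the role ARN is hypothetical, and the real principal and mechanism come from the guide:

```python
import json
import boto3

# Hypothetical principal -- the real role ARN comes from the Linking AWS
# Accounts flow; this sketch only shows the shape of the read access involved.
SCANNER_ROLE_ARN = "arn:aws:iam::111111111111:role/scanner-read-role"
BUCKET = "my-bucket"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": SCANNER_ROLE_ARN},
            "Action": ["s3:GetObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {
            "Effect": "Allow",
            "Principal": {"AWS": SCANNER_ROLE_ARN},
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```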

Step 3: Set up an S3 Import Rule in Scanner

  1. Within Scanner, navigate to Settings > S3 Import Rules.

  2. Click Create Rule.

  3. For Rule name, type a name like my_team_name_cloudflare_http_request_logs.

  4. For Destination Index, choose the index where you want these logs to be searchable in Scanner.

  5. For Status, set to Active if you want to start indexing the data immediately.

  6. For Source Type, we recommend cloudflare:http_requests. You can choose any name, but out-of-the-box detection rules expect cloudflare:http_requests.

  7. For AWS Account, choose the AWS account that owns the S3 bucket with your Cloudflare logs.

  8. For S3 Bucket, choose the S3 bucket containing the Cloudflare HTTP Request logs.

  9. For S3 Key Prefix, type the prefix (i.e. directory path) where Cloudflare is writing HTTP Request logs. This will be the path you configured in Cloudflare but with /http_requests/ appended to the end.

     For example, if you configured Cloudflare to write logs to the S3 path my_team/my_cloudflare_logs/, then the S3 Key Prefix in your S3 Import Rule in Scanner should be my_team/my_cloudflare_logs/http_requests/.

  10. For File type, choose JsonLines with Gzip compression.

  11. For Timestamp extractors, under Column name, type EdgeStartTimestamp. This is the field in each log event that contains the timestamp information.

  12. Click Preview rule to try it out. Check that the S3 keys you expect appear, and that the log events inside are parsed correctly with the timestamp detected. (For a programmatic spot check, see the sketch after this list.)

  13. When you're ready, click Create.
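
In addition to the built-in preview, you can spot-check the prefix yourself before creating the rule. The sketch below assumes the example bucket and prefix from step 9 and uses boto3 to list a few keys and confirm the files are gzipped JSON Lines containing EdgeStartTimestamp:

```python
import gzip
import json
import boto3

BUCKET = "my-bucket"  # placeholder
PREFIX = "my_team/my_cloudflare_logs/http_requests/"  # example prefix from step 9

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=5)

for obj in resp.get("Contents", []):
    print("found:", obj["Key"])
    body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
    # Logpush files are gzipped JSON Lines: one JSON object per line.
    first_line = gzip.decompress(body).splitlines()[0]
    event = json.loads(first_line)
    assert "EdgeStartTimestamp" in event, "timestamp field missing from export"
    print("sample timestamp:", event["EdgeStartTimestamp"])
```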
