Google Cloud Platform (GCP) Audit
This guide walks through how to set up Google Cloud Platform (GCP) Audit logs in Scanner Collect, using Google Pub/Sub push subscriptions to send logs directly to Scanner’s HTTP receiver.
Step 1: Create a New Source
In the Scanner UI, go to the Collect tab.
Click Create New Source.
Click Select a Source Type.
Choose Google Cloud Platform (GCP).
For Ingest Method, select HTTP Push. This is the most common integration method.
For Destination, select Scanner.
If you only want logs stored in your S3 data lake (without indexing or detection), choose AWS S3 Only. This guide assumes you’re using Scanner as the destination.
Click Next.
Step 2: Configure the Source
Set a Display Name, such as my-org-gcp-logs.
Leave the Payload Format as the default: JSON: Lines.
Click Next.
Step 3: Configure Authentication
Keep the default Authentication Type: JWT Token.
Set JWKS Public Key URL to:
https://www.googleapis.com/oauth2/v3/certs
Under Claims, add the following:
aud - an audience value of your choice (e.g., gcp-logs-scanner-subscription)
email - the service account email that will be pushing logs from GCP
You’ll configure these values in GCP shortly. For now, leave Scanner open on this screen.
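For context on what these claims do: each push request from Pub/Sub carries a Google-signed JWT, and Scanner verifies its signature against the JWKS URL above and checks that its aud and email claims match the values you enter here. A minimal Python sketch of reading those claims (signature verification is Scanner’s job; the token below is a fabricated example, and the service account email is a hypothetical placeholder):

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the claims segment of a JWT without verifying the signature.
    For inspection only -- Scanner verifies signatures against the JWKS URL."""
    payload_b64 = token.split(".")[1]
    # JWT segments are base64url-encoded with padding stripped; restore it.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# The two claims Scanner checks (values here are examples/placeholders).
claims = {
    "aud": "gcp-logs-scanner-subscription",
    "email": "pubsub-push@my-project.iam.gserviceaccount.com",
}

# Build a fake token with that claims segment.
fake_payload = base64.urlsafe_b64encode(
    json.dumps(claims).encode()
).rstrip(b"=").decode()
fake_token = f"header.{fake_payload}.signature"

decoded = jwt_claims(fake_token)
print(decoded["aud"], decoded["email"])
```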
Step 4: Set Up Google Pub/Sub
In a new browser tab, go to your GCP Console:
Log in at cloud.google.com.
Navigate to Pub/Sub.
Click Create Topic.
Choose a Topic ID, such as gcp-logs-topic.
Select Add a default subscription.
Leave all other options unchecked.
Click Create.
Edit the Subscription
Open the newly created default subscription.
Click Edit.
Set Delivery Type to Push.
For Endpoint URL, enter a placeholder (e.g., https://example.com). You’ll replace this later with the Scanner endpoint.
Enable Authentication:
Choose a service account. If you don’t have one, follow Google’s guide.
Set the Audience to the value you want to use in the aud claim (e.g., gcp-logs-scanner-subscription).
Enable Payload Unwrapping.
(Optional) Leave Write metadata unchecked.
Scroll down to Retry Policy:
Set to Retry after exponential backoff delay (defaults are fine).
Do not click Update yet — you’ll finish Scanner setup first to get the actual endpoint URL.
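If you prefer the CLI, the topic and push subscription above can be sketched with gcloud. This is a sketch under assumptions: the project and service account below are hypothetical placeholders, and the endpoint is still the temporary one you’ll replace in Step 10.

```shell
# Create the topic (same as gcp-logs-topic in the console steps).
gcloud pubsub topics create gcp-logs-topic

# Create the push subscription with JWT auth, payload unwrapping,
# and an exponential-backoff retry policy.
gcloud pubsub subscriptions create gcp-logs-topic-sub \
  --topic=gcp-logs-topic \
  --push-endpoint=https://example.com \
  --push-auth-service-account=pubsub-push@my-project.iam.gserviceaccount.com \
  --push-auth-token-audience=gcp-logs-scanner-subscription \
  --push-no-wrapper \
  --min-retry-delay=10s \
  --max-retry-delay=600s
```

These commands require an authenticated gcloud session against your own project, so treat them as a reference, not a copy-paste script.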
Step 5: Complete Authentication in Scanner
Back in the Scanner UI:
Paste the GCP service account email into the email claim.
Paste the Audience value into the aud claim.
Click Next.
Step 6: Configure Destination
Select the S3 bucket where you want raw logs delivered.
(Optional) Enter a bucket prefix. The default is fine for most setups.
Choose the Scanner index where searchable logs should go.
Leave the Source Label as the default: gcp.
Click Next.
Step 7: Transform and Enrich
Keep the default transformation step: Normalize to ECS - GCP Audit.
This maps GCP log fields to the Elastic Common Schema (ECS) to support cross-source queries and detections.
Keep Parse JSON Columns enabled to automatically extract data from any stringified JSON fields.
(Optional) Add additional transformation or enrichment steps as desired.
Click Next.
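As a rough illustration of what ECS normalization does, here is a hand-rolled Python sketch that maps a few GCP Audit log fields onto ECS field names. The mapping below is ours, for illustration only; the real mapping is defined by Scanner’s built-in Normalize to ECS - GCP Audit step, and the log entry is fabricated.

```python
import json

def normalize_gcp_audit(entry: dict) -> dict:
    """Illustrative ECS-style flattening of a GCP Audit log entry.
    Field choices are approximate, not Scanner's actual mapping."""
    proto = entry.get("protoPayload", {})
    return {
        "@timestamp": entry.get("timestamp"),
        "event.action": proto.get("methodName"),
        "user.email": proto.get("authenticationInfo", {}).get("principalEmail"),
        "cloud.provider": "gcp",
        "cloud.service.name": proto.get("serviceName"),
    }

# A fabricated Admin Activity audit entry.
entry = {
    "timestamp": "2024-05-01T12:00:00Z",
    "protoPayload": {
        "serviceName": "iam.googleapis.com",
        "methodName": "google.iam.admin.v1.CreateServiceAccount",
        "authenticationInfo": {"principalEmail": "alice@example.com"},
    },
}

ecs = normalize_gcp_audit(entry)
print(json.dumps(ecs, indent=2))
```

The value of this step is that a detection written against, say, user.email works the same whether the event came from GCP, AWS, or another ECS-normalized source.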
Step 8: Timestamp Extraction
Leave the default setting to extract timestamps from the timestamp field.
Click Next.
Step 9: Review and Create
Review your configuration.
Click Create Source.
After creation, Scanner will display a unique Endpoint URL like:
https://collect.your-org-and-region.scanner.dev/receiver/v1/http/<id>
Step 10: Finalize GCP Subscription
Return to the GCP Console, where you are editing your Pub/Sub Push Subscription.
Replace the placeholder Endpoint URL with the URL provided by Scanner.
Click Update to save the subscription.
Step 11: Route Logs to Pub/Sub
To start sending GCP Audit logs to Scanner:
In the GCP Console, go to Logging > Log Router.
Create a new Sink to route logs to your Pub/Sub topic.
Select the appropriate audit logs to include.
Choose your Pub/Sub topic (e.g., gcp-logs-topic) as the sink destination.
Refer to Google’s routing documentation for full setup details.
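The Log Router sink can likewise be sketched with gcloud. The sink name, project, and log filter below are hypothetical placeholders; note that the sink’s writer identity must be allowed to publish to the topic.

```shell
# Route audit log entries to the Pub/Sub topic (filter is illustrative).
gcloud logging sinks create gcp-audit-to-pubsub \
  pubsub.googleapis.com/projects/my-project/topics/gcp-logs-topic \
  --log-filter='logName:"cloudaudit.googleapis.com"'

# Grant the sink's writer identity permission to publish to the topic.
gcloud pubsub topics add-iam-policy-binding gcp-logs-topic \
  --member="$(gcloud logging sinks describe gcp-audit-to-pubsub \
      --format='value(writerIdentity)')" \
  --role=roles/pubsub.publisher
```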
That’s It
Once routing is complete, logs will flow from GCP → Pub/Sub → Scanner HTTP Receiver → S3 → Scanner index.
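For reference, the Payload Unwrapping option from Step 4 is what makes this flow line up with the JSON: Lines payload format: without it, Pub/Sub wraps each message in a JSON envelope with base64-encoded data. A small Python sketch of the difference (envelope fields follow Pub/Sub’s push format; the log line and IDs are fabricated):

```python
import base64
import json

# A GCP Audit log line as published to the topic by the Log Router.
log_line = json.dumps({
    "timestamp": "2024-05-01T12:00:00Z",
    "logName": "projects/p/logs/cloudaudit.googleapis.com%2Factivity",
})

# Without unwrapping, the push body is an envelope with base64 data.
wrapped = {
    "message": {
        "data": base64.b64encode(log_line.encode()).decode(),
        "messageId": "123",
    },
    "subscription": "projects/p/subscriptions/gcp-logs-topic-sub",
}

# With Payload Unwrapping enabled, the push body is the raw message
# bytes, which Scanner can parse directly as a JSON line.
unwrapped_body = log_line

# Same content either way; only the framing differs.
assert json.loads(base64.b64decode(wrapped["message"]["data"])) == json.loads(unwrapped_body)
```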