
Data access logs in GCP - Commands & Configuration

Introduction
Data access logs show who viewed or used your data in Google Cloud. They record details about data reads and writes, helping you track access and strengthen security.
When you want to know who accessed your Cloud Storage files and when.
When you need to audit BigQuery queries that read sensitive data.
When you want to monitor access to your Cloud SQL databases for compliance.
When you suspect unauthorized data access and want to investigate.
When you want to keep a record of data usage for billing or analysis.
Config File - audit_config.yaml
auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: DATA_READ
  - logType: DATA_WRITE
- service: bigquery.googleapis.com
  auditLogConfigs:
  - logType: DATA_READ
  - logType: DATA_WRITE
- service: sqladmin.googleapis.com
  auditLogConfigs:
  - logType: DATA_READ
  - logType: DATA_WRITE

This file configures audit logging for data access on Cloud Storage, BigQuery, and Cloud SQL services.

auditConfigs: The list of per-service audit configurations.

service: The Google Cloud service name (for example, storage.googleapis.com).

auditLogConfigs: The log types to capture; here, both data reads and data writes.
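Audit configs live in the project's IAM policy, so the file above takes effect only once it is merged into that policy. One way to apply it is sketched below; the project ID my-project is a placeholder, and you must merge the auditConfigs section by hand so the existing bindings and etag are preserved:

```shell
# Download the current IAM policy (my-project is a placeholder project ID)
gcloud projects get-iam-policy my-project --format=yaml > policy.yaml

# Manually merge the auditConfigs block from audit_config.yaml into policy.yaml,
# keeping the existing bindings and etag fields intact, then apply it:
gcloud projects set-iam-policy my-project policy.yaml
```

Never replace the whole policy with just the auditConfigs block; setting a policy without its existing bindings would remove them.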

Commands
Create a logging sink that exports Cloud Storage data access logs to a storage bucket, so the logs are retained for later review.
Terminal
gcloud logging sinks create data-access-sink storage.googleapis.com/my-data-access-logs-bucket --log-filter='resource.type="gcs_bucket" AND protoPayload.methodName:("storage.objects.get" OR "storage.objects.list")'
Expected Output
Created sink [data-access-sink].
--log-filter - Filters logs to only include data read operations on Cloud Storage buckets.
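One point worth checking: a sink writes through its own service account (its writer identity), and logs will not arrive until that account is allowed to write to the destination bucket. A sketch of the grant, assuming the sink and bucket names used above:

```shell
# Look up the sink's writer identity (a service account email)
gcloud logging sinks describe data-access-sink --format='value(writerIdentity)'

# Grant that identity object-creation rights on the destination bucket.
# SERVICE_ACCOUNT_EMAIL is a placeholder; substitute the output of the command above.
gsutil iam ch serviceAccount:SERVICE_ACCOUNT_EMAIL:roles/storage.objectCreator gs://my-data-access-logs-bucket
```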
Check the storage bucket to see if data access logs are arriving correctly.
Terminal
gsutil ls gs://my-data-access-logs-bucket
Expected Output
logs_2024-06-01T00:00:00Z.json
logs_2024-06-01T01:00:00Z.json
Read the last 5 BigQuery job completion logs to see data access events like queries run.
Terminal
gcloud logging read 'resource.type="bigquery_resource" AND protoPayload.methodName="jobservice.jobcompleted"' --limit=5 --format='json'
Expected Output
[
  {
    "protoPayload": {
      "methodName": "jobservice.jobcompleted",
      "serviceName": "bigquery.googleapis.com"
    },
    "resource": { "type": "bigquery_resource" }
  },
  {
    "protoPayload": {
      "methodName": "jobservice.jobcompleted",
      "serviceName": "bigquery.googleapis.com"
    },
    "resource": { "type": "bigquery_resource" }
  }
]
--limit - Limits the number of log entries returned.
--format - Formats the output as JSON for easy reading.
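The same command can also answer the "who" question by projecting the caller's identity out of each entry. A sketch using the standard audit-log field protoPayload.authenticationInfo.principalEmail, here applied to Cloud Storage object reads:

```shell
# List the principals behind recent Cloud Storage object reads, with timestamps
gcloud logging read 'resource.type="gcs_bucket" AND protoPayload.methodName="storage.objects.get"' \
  --limit=10 \
  --format='value(protoPayload.authenticationInfo.principalEmail, timestamp)'
```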
Key Concept

If you remember nothing else from this pattern, remember: data access logs show who used your data and when, helping you keep your cloud data safe and compliant.

Common Mistakes
Not enabling data access audit logs in the IAM policy or audit config.
If they are not enabled, no data access logs are recorded, so you cannot track data usage.
Always configure auditConfigs for the services you want to monitor before expecting logs.
Using a log filter that excludes data access events.
Logs will not include the data read/write events you want to see, missing important info.
Use filters that include protoPayload.methodName for data read/write operations.
Checking logs in the wrong project or storage bucket.
Logs won't appear if you look in the wrong place, causing confusion.
Verify the project and bucket names match your sink configuration.
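The first and third mistakes are easy to rule out from the terminal: confirm that auditConfigs are actually set on the project, and confirm where the sink really sends its logs. A sketch, with my-project as a placeholder project ID:

```shell
# Show only the auditConfigs section of the project's IAM policy
gcloud projects get-iam-policy my-project --format='yaml(auditConfigs)'

# Confirm where the sink actually sends logs
gcloud logging sinks describe data-access-sink --format='value(destination)'
```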
Summary
Configure audit logging to capture data read and write events for your cloud services.
Create a logging sink to export these logs to a storage bucket for long-term access.
Use gcloud logging read commands with filters to view recent data access events.