
Data access logs in GCP - Mini Project: Build & Apply

Enable and Analyze Data Access Logs in GCP
📖 Scenario: You are a cloud administrator for a company using Google Cloud Platform (GCP). You want to enable data access logs for a specific Cloud Storage bucket to monitor who accesses the data. Then, you want to query these logs to find access patterns.
🎯 Goal: Enable data access logs for a Cloud Storage bucket and write a query to analyze the logs in BigQuery.
📋 What You'll Learn
Enable data access logs for a Cloud Storage bucket named my-data-bucket.
Create a sink to export logs to a BigQuery dataset named data_access_logs.
Write a BigQuery SQL query to count the number of accesses per user.
Use exact names: my-data-bucket, data_access_logs, access_logs_sink.
💡 Why This Matters
🌍 Real World
Monitoring data access is critical for security and compliance in cloud environments. This project shows how to enable and analyze data access logs in GCP.
💼 Career
Cloud administrators and security engineers often need to configure logging and analyze access patterns to protect data and meet audit requirements.
1
Enable Data Access Logs for Cloud Storage Bucket
Write the gcloud commands to enable Data Access audit logs (DATA_READ and DATA_WRITE) for Cloud Storage by adding an audit configuration for the storage.googleapis.com service to the project's IAM policy. Note that Data Access logs are enabled per service, not per bucket; you will narrow the results to my-data-bucket with a log filter when you export them in the next step.
Need a hint?

Download the project's IAM policy with gcloud projects get-iam-policy, add an auditConfigs entry for storage.googleapis.com with the DATA_READ and DATA_WRITE log types, then apply it with gcloud projects set-iam-policy.
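The flow above can be sketched as a short command sequence. This is a sketch under assumptions: MY_PROJECT is a placeholder project ID, and the auditConfigs entry is shown as a manual edit to the downloaded policy file.

```shell
# Sketch: enable DATA_READ / DATA_WRITE audit logs for Cloud Storage.
# MY_PROJECT is a placeholder -- substitute your own project ID.

# 1. Download the current IAM policy.
gcloud projects get-iam-policy MY_PROJECT --format=json > policy.json

# 2. Edit policy.json to add an auditConfigs entry for Cloud Storage:
# "auditConfigs": [
#   {
#     "service": "storage.googleapis.com",
#     "auditLogConfigs": [
#       {"logType": "DATA_READ"},
#       {"logType": "DATA_WRITE"}
#     ]
#   }
# ]

# 3. Apply the updated policy.
gcloud projects set-iam-policy MY_PROJECT policy.json
```

Applying the policy enables Data Access logging for every Cloud Storage bucket in the project; the bucket-specific view comes from the filter used in the export step.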

2
Create a Logging Sink to Export Logs to BigQuery
Write a gcloud command to create a logging sink named access_logs_sink that exports the Data Access log entries for my-data-bucket to a BigQuery dataset named data_access_logs.
Need a hint?

Use gcloud logging sinks create with the sink name access_logs_sink and target BigQuery dataset data_access_logs. Filter logs for my-data-bucket.
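One way to write this command is sketched below. MY_PROJECT is a placeholder project ID; the filter restricts the export to Data Access audit entries for the my-data-bucket bucket.

```shell
# Sketch: create a sink that routes my-data-bucket Data Access
# entries to the data_access_logs BigQuery dataset.
# MY_PROJECT is a placeholder -- substitute your own project ID.
gcloud logging sinks create access_logs_sink \
  bigquery.googleapis.com/projects/MY_PROJECT/datasets/data_access_logs \
  --log-filter='logName:"cloudaudit.googleapis.com%2Fdata_access"
    AND resource.type="gcs_bucket"
    AND resource.labels.bucket_name="my-data-bucket"'
```

The BigQuery dataset must already exist; the sink does not create it.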

3
Write a BigQuery SQL Query to Count Accesses per User
Write a BigQuery SQL query that selects the protoPayload.authenticationInfo.principalEmail as user_email and counts the number of accesses as access_count from the data_access_logs dataset's cloudaudit_googleapis_com_data_access table, grouping by user_email and ordering by access_count descending.
Need a hint?

Use protoPayload.authenticationInfo.principalEmail as user_email and count rows grouped by user_email.
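A sketch of the query, run here through the bq CLI so it stays in the same command-line register as the other steps. The unqualified table reference assumes the dataset lives in your default project; fully qualify it if not.

```shell
# Sketch: count Data Access entries per user in BigQuery.
# Assumes the sink has already delivered rows to the table.
bq query --use_legacy_sql=false '
SELECT
  protoPayload.authenticationInfo.principalEmail AS user_email,
  COUNT(*) AS access_count
FROM
  `data_access_logs.cloudaudit_googleapis_com_data_access`
GROUP BY
  user_email
ORDER BY
  access_count DESC'
```

BigQuery's standard SQL allows grouping and ordering by the user_email and access_count aliases, which keeps the query compact.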

4
Complete the Logging Sink with Permissions
Write a gcloud command to grant the logging service account the BigQuery Data Editor role on the data_access_logs dataset to allow the sink access_logs_sink to write logs.
Need a hint?

Use gcloud logging sinks describe access_logs_sink to get the sink's service account, then grant it roles/bigquery.dataEditor on the project.
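The hint above can be sketched in two commands. MY_PROJECT is a placeholder project ID; the sink's writerIdentity already includes the serviceAccount: prefix, so it can be passed to --member directly.

```shell
# Sketch: look up the sink's writer identity and grant it the
# BigQuery Data Editor role so it can write exported log entries.
# MY_PROJECT is a placeholder -- substitute your own project ID.
WRITER=$(gcloud logging sinks describe access_logs_sink \
  --format='value(writerIdentity)')

gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="$WRITER" \
  --role="roles/bigquery.dataEditor"
```

Granting the role at the project level is the simplest option; for tighter scoping you could instead grant it on the data_access_logs dataset alone.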