GCP · Cloud · ~10 mins

Data access logs in GCP - Interactive Code Practice

Practice: 5 tasks
Answer the questions below.
Task 1: fill in the blank (easy)

Complete the code to enable data access logs for a BigQuery dataset.

resource "google_bigquery_dataset" "dataset" {
  dataset_id = "my_dataset"
  project    = "my_project"
  location   = "US"

  access_logs_config {
    [1] = true
  }
}
A. data_access_logs_enabled
B. enable_data_access_logs
C. data_access_logging
D. log_data_access
Common Mistakes
Using incorrect attribute names like 'enable_data_access_logs' or 'log_data_access'.
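Side note, outside this exercise's simplified schema: in the upstream Terraform Google provider, Data Access audit logs are typically enabled through audit configs rather than a per-dataset block. A minimal sketch, assuming the placeholder project ID `my_project`:

```hcl
# Sketch: enable Data Access audit logs for BigQuery at the project
# level via the provider's audit config resource. "my_project" is an
# assumed placeholder.
resource "google_project_iam_audit_config" "bigquery_data_access" {
  project = "my_project"
  service = "bigquery.googleapis.com"

  audit_log_config {
    log_type = "DATA_READ"   # reads of data and metadata
  }

  audit_log_config {
    log_type = "DATA_WRITE"  # inserts, updates, and deletes
  }
}
```

The Data Access log types are DATA_READ, DATA_WRITE, and ADMIN_READ; for BigQuery specifically, data access logging is on by default.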
Task 2: fill in the blank (medium)

Complete the code to specify the log type for data access logs in Cloud Audit Logs.

resource "google_logging_project_sink" "data_access_sink" {
  name        = "data-access-logs"
  destination = "storage.googleapis.com/my-bucket"
  filter      = "logName:[1]"
}
A. cloudaudit.googleapis.com/activity
B. cloudaudit.googleapis.com/data_access
C. cloudaudit.googleapis.com/system_event
D. cloudaudit.googleapis.com/admin_activity
Common Mistakes
Using 'activity' or 'admin_activity' log names instead of 'data_access'.
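In a real Cloud Logging filter, the full log name includes the project path, and the slash inside the log ID is percent-encoded. A sketch of the completed sink using the full name (project and bucket names are placeholders):

```hcl
resource "google_logging_project_sink" "data_access_sink" {
  name        = "data-access-logs"
  destination = "storage.googleapis.com/my-bucket"

  # Full log name: the embedded slash is percent-encoded as %2F.
  filter = "logName:\"projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access\""
}
```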
Task 3: fill in the blank (hard)

Fix the error in the IAM policy binding so that the logging service account can write data access logs.

resource "google_project_iam_member" "logging_writer" {
  project = "my_project"
  role    = "roles/logging.[1]"
  member  = "serviceAccount:logging@my_project.iam.gserviceaccount.com"
}
A. logs.writer
B. logging.logWriter
C. logging.logWriterRole
D. logWriter
Common Mistakes
Adding extra prefixes like 'logging.' or suffixes like 'Role' in the role name.
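A related IAM detail: when a sink exports to a Cloud Storage bucket, the binding usually targets the sink's generated writer identity rather than a hand-created service account. A sketch, assuming the placeholder sink, filter, and bucket names below:

```hcl
resource "google_logging_project_sink" "sink" {
  name                   = "data-access-logs"
  destination            = "storage.googleapis.com/my-bucket"
  filter                 = "logName:\"projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access\""
  unique_writer_identity = true
}

# Let the sink's service account create objects in the destination bucket.
resource "google_storage_bucket_iam_member" "sink_writer" {
  bucket = "my-bucket"
  role   = "roles/storage.objectCreator"
  member = google_logging_project_sink.sink.writer_identity
}
```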
Task 4: fill in the blank (hard)

Fill both blanks to configure a sink that exports data access logs to a Pub/Sub topic.

resource "google_logging_project_sink" "data_access_sink" {
  name        = "data-access-logs"
  destination = "pubsub.googleapis.com/[1]"
  filter      = "logName:[2]"
}
A. projects/my-project/topics/my-topic
B. projects/my-project/subscriptions/my-subscription
C. cloudaudit.googleapis.com/data_access
D. cloudaudit.googleapis.com/activity
Common Mistakes
Using a subscription path instead of a topic path for destination.
Using the wrong log name in the filter.
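For a Pub/Sub destination, the sink's writer identity also needs publish rights on the topic. A sketch, assuming the placeholder topic name `my-topic`:

```hcl
resource "google_pubsub_topic" "logs" {
  name = "my-topic"
}

resource "google_logging_project_sink" "to_pubsub" {
  name                   = "data-access-logs"
  destination            = "pubsub.googleapis.com/${google_pubsub_topic.logs.id}"
  filter                 = "logName:\"projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access\""
  unique_writer_identity = true
}

# Allow the sink's service account to publish exported log entries.
resource "google_pubsub_topic_iam_member" "sink_publisher" {
  topic  = google_pubsub_topic.logs.name
  role   = "roles/pubsub.publisher"
  member = google_logging_project_sink.to_pubsub.writer_identity
}
```

Referencing `google_pubsub_topic.logs.id` yields the full `projects/.../topics/...` path the destination expects.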
Task 5: fill in the blank (hard)

Fill all three blanks to create a BigQuery dataset with data access logs enabled and a sink exporting logs to Cloud Storage.

resource "google_bigquery_dataset" "dataset" {
  dataset_id = "[1]"
  project    = "[2]"
  location   = "US"

  access_logs_config {
    data_access_logs_enabled = true
  }
}

resource "google_logging_project_sink" "sink" {
  name        = "bq-data-access-logs"
  destination = "storage.googleapis.com/[3]"
  filter      = "logName:cloudaudit.googleapis.com/data_access"
}
A. my_dataset
B. my_project
C. my-bucket
D. dataset_logs
Common Mistakes
Mixing up which blank takes the dataset ID, project ID, or bucket name.
Using incorrect bucket names or project IDs.
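Beyond the fill-in answers, the hard-coded strings can be replaced with resource references so the dataset, project, and bucket names stay in sync across resources. A sketch with assumed placeholder names:

```hcl
resource "google_bigquery_dataset" "dataset" {
  dataset_id = "my_dataset"
  project    = "my_project"
  location   = "US"
}

resource "google_storage_bucket" "log_bucket" {
  name     = "my-bucket"
  location = "US"
}

resource "google_logging_project_sink" "sink" {
  name        = "bq-data-access-logs"
  destination = "storage.googleapis.com/${google_storage_bucket.log_bucket.name}"
  filter      = "logName:\"projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access\""
}
```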