Complete the code to create a connection in Airflow using the CLI.
airflow connections add my_gcp_conn --conn-type [1] --conn-extra '{"extra__google_cloud_platform__project": "my-project"}'
The connection type for Google Cloud Platform in Airflow is google_cloud_platform. This tells Airflow which hook to use when the connection is referenced.
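A quick way to avoid quoting mistakes in the --conn-extra payload is to build the JSON with json.dumps. The helper below is hypothetical (not part of Airflow); it just assembles the completed CLI command from the quiz's values:

```python
import json
import shlex

# Hypothetical helper: assemble the full `airflow connections add`
# command, serialising the extras with json.dumps so the --conn-extra
# argument is always valid JSON.
def build_add_connection_cmd(conn_id, conn_type, extras):
    return [
        "airflow", "connections", "add", conn_id,
        "--conn-type", conn_type,
        "--conn-extra", json.dumps(extras),
    ]

cmd = build_add_connection_cmd(
    "my_gcp_conn",
    "google_cloud_platform",
    {"extra__google_cloud_platform__project": "my-project"},
)
# shlex.join quotes each argument so the command is safe to paste
# into a shell.
print(shlex.join(cmd))
```

Running the printed command requires an Airflow installation; the builder itself only produces the argument list.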
Complete the code to retrieve a connection object in an Airflow DAG.
from airflow.hooks.base_hook import BaseHook
conn = BaseHook.get_connection([1])
You need to pass the connection ID as a string to get_connection; here, 'my_gcp_conn' matches the connection you want to retrieve. (In Airflow 2, the preferred import path is airflow.hooks.base; airflow.hooks.base_hook still works but is deprecated.)
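To illustrate the lookup-by-ID pattern without a running Airflow installation, here is a minimal stand-in for BaseHook.get_connection. The Connection class and registry below are simplified sketches, not Airflow's real implementation:

```python
from dataclasses import dataclass

# Minimal stand-in for airflow.models.Connection, for illustration only.
@dataclass
class Connection:
    conn_id: str
    conn_type: str
    password: str = ""

# Stand-in for the connections stored in Airflow's metadata database.
_REGISTRY = {
    "my_gcp_conn": Connection("my_gcp_conn", "google_cloud_platform"),
}

def get_connection(conn_id: str) -> Connection:
    # The real BaseHook.get_connection raises if the ID is unknown;
    # mirror that behaviour here.
    try:
        return _REGISTRY[conn_id]
    except KeyError:
        raise ValueError(f"Connection {conn_id!r} is not defined")

conn = get_connection("my_gcp_conn")
print(conn.conn_type)
```

The key point the quiz tests: the argument is the connection's string ID, and an unknown ID is an error rather than a silently empty connection.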
Fix the error in the code to set the connection password in Airflow's connection object.
conn = BaseHook.get_connection('my_gcp_conn')
conn.[1] = 'new_password'
The password for a connection is stored in the password attribute. Assigning to it updates the in-memory connection object; persisting the change to Airflow's metadata database requires a separate session commit.
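The in-memory nature of the assignment can be shown with a stand-in object (the dataclass below is a sketch, not Airflow's Connection class):

```python
from dataclasses import dataclass

# Stand-in for airflow.models.Connection, for illustration only.
@dataclass
class Connection:
    conn_id: str
    password: str = "old_password"

conn = Connection("my_gcp_conn")
conn.password = "new_password"  # mutates this object only
print(conn.password)
```

With the real class, nothing is written back until the object is merged into a database session and committed, so a DAG task that mutates conn.password affects only its own process.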
Fill both blanks to create a connection dictionary for Google Cloud with project and key path.
conn = Connection(conn_id='my_gcp_conn', conn_type=[1], extra='{"extra__google_cloud_platform__project": "my-project", "extra__google_cloud_platform__key_path": "[2]"}')
The connection type must be 'google_cloud_platform' for GCP. The key path is a string with the path to your JSON key file, like '/path/to/key.json'.
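Because the extra argument is a JSON string, it is easy to get the quoting wrong by hand; building the dict first and serialising it with json.dumps sidesteps that. The key path below is the quiz's placeholder, not a real file:

```python
import json

# Extras for a GCP connection, built as a dict and then serialised to
# the JSON string that Connection(extra=...) expects.
extras = {
    "extra__google_cloud_platform__project": "my-project",
    "extra__google_cloud_platform__key_path": "/path/to/key.json",
}
extra_json = json.dumps(extras)
print(extra_json)
```

The resulting string can be passed directly as the extra keyword argument, or to the CLI's --conn-extra flag.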
Fill all three blanks to create a connection dictionary with conn_id, conn_type, and extra JSON for AWS.
conn = Connection(conn_id=[1], conn_type=[2], extra='{"extra__aws_access_key_id": "[3]"}')
The connection ID is 'my_aws_conn', the type for AWS is 'aws', and the access key ID is a string like AKIAIOSFODNN7EXAMPLE.
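The three answers can be assembled the same way as keyword arguments for the Connection constructor. The access key below is AWS's documented example value, not a real credential, and the extra key name mirrors the quiz's wording:

```python
import json

# The three blanks filled in, as keyword arguments for
# airflow.models.Connection. AKIAIOSFODNN7EXAMPLE is AWS's published
# example access key ID, never a live credential.
conn_kwargs = {
    "conn_id": "my_aws_conn",
    "conn_type": "aws",
    "extra": json.dumps({"extra__aws_access_key_id": "AKIAIOSFODNN7EXAMPLE"}),
}
print(conn_kwargs["extra"])
```

In practice the secret access key would come from an environment variable or secrets backend rather than being written into the extras literally.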