Airflow supports multiple secrets backends like HashiCorp Vault, AWS Secrets Manager, and environment variables. What is the main reason to use a secrets backend in Airflow?
Think about why you wouldn't want to hardcode passwords in your DAG files.
Secrets backends in Airflow are designed to keep sensitive data out of DAG code and version control, supplying it to tasks only at run time.
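To make the contrast concrete, here is a minimal sketch (the variable name and value are illustrative, and a plain environment variable stands in for a real backend): the DAG file only names the secret, and the value is resolved when the task runs.

```python
import os

# Anti-pattern: a credential hardcoded in a DAG file is committed to version
# control and visible to anyone who can read the repository.
# DB_PASSWORD = "s3cr3t"   # never do this

# With a secrets backend, the DAG only references the secret; the value is
# supplied at runtime. Here an environment variable simulates that.
os.environ["DB_PASSWORD"] = "s3cr3t"

def run_task() -> str:
    password = os.environ["DB_PASSWORD"]  # resolved only when the task runs
    return f"connecting with a {len(password)}-char password"

print(run_task())
```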
Given that Airflow is configured with the environment variables secrets backend, what will this command output?
airflow connections get my_postgres_conn
Check how Airflow environment variable secrets are named and accessed.
The CLI command retrieves the connection URI from the environment variable named AIRFLOW_CONN_MY_POSTGRES_CONN when using the environment variables secrets backend.
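The naming rule can be expressed as a small helper (a hypothetical function for illustration, not part of Airflow's API): the environment variables backend prefixes the upper-cased connection id with `AIRFLOW_CONN_`.

```python
def env_var_for_conn(conn_id: str) -> str:
    """Name of the environment variable the env-vars secrets backend reads
    for a given connection id: 'AIRFLOW_CONN_' plus the upper-cased id."""
    return f"AIRFLOW_CONN_{conn_id.upper()}"

print(env_var_for_conn("my_postgres_conn"))  # AIRFLOW_CONN_MY_POSTGRES_CONN
```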
Choose the correct airflow.cfg snippet to configure HashiCorp Vault as the secrets backend with the Vault URL at https://vault.example.com and token authentication.
Check the exact import path and parameter names for VaultBackend in Airflow providers.
The correct backend path is airflow.providers.hashicorp.secrets.vault.VaultBackend and the backend_kwargs keys must match the expected parameters like 'url' and 'token'.
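For reference, a matching airflow.cfg snippet might look like the following (the token value is a placeholder, and `auth_type` is one of several supported `backend_kwargs`; check the HashiCorp provider documentation for the full set):

```ini
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"url": "https://vault.example.com", "token": "<vault-token>", "auth_type": "token"}
```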
An Airflow deployment uses AWS Secrets Manager as the secrets backend. The DAGs fail with errors saying secrets are missing. Which of these is the most likely cause?
Think about AWS permissions needed for Airflow to access secrets.
Airflow needs proper AWS IAM permissions to read secrets from AWS Secrets Manager. Without them, it cannot retrieve secrets and will fail.
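A minimal IAM policy sketch granting read access might look like this (the account id, region, and secret-name prefix in the ARN are placeholders; depending on the deployment, additional permissions such as `secretsmanager:DescribeSecret` or `kms:Decrypt` for customer-managed keys may also be required):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["secretsmanager:GetSecretValue"],
      "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:airflow/*"
    }
  ]
}
```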
Arrange these steps in the correct order to securely use a database password stored in a secrets backend within an Airflow DAG.
Think about what must exist before Airflow can retrieve secrets and how tasks use them.
First, the secret must be stored securely. Then Airflow must be configured to access it. Next, DAGs reference the secret, and finally tasks use it without hardcoding.
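The ordering above can be sketched with a toy in-memory backend (entirely hypothetical; in a real deployment the store is Vault, AWS Secrets Manager, or similar, and the configuration step is the `[secrets]` section of airflow.cfg):

```python
# 1. Store the secret securely (a toy in-memory store stands in for a
#    real backend such as Vault or AWS Secrets Manager).
class ToySecretsBackend:
    def __init__(self):
        self._store = {}

    def put(self, key: str, value: str) -> None:
        self._store[key] = value

    def get(self, key: str) -> str:
        return self._store[key]

# 2. Configure the scheduler/workers to use the backend.
backend = ToySecretsBackend()
backend.put("db_password", "s3cr3t")

# 3. The DAG references the secret by name only...
SECRET_KEY = "db_password"

# 4. ...and the task resolves it at run time, never hardcoding the value.
def task() -> str:
    password = backend.get(SECRET_KEY)
    return f"opened connection using {len(password)}-char password"

print(task())
```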