Complete the code to specify the Airflow operator that runs a dbt command.
dbt_task = [1](task_id='run_dbt', bash_command='dbt run')
The BashOperator in Airflow is used to run shell commands like dbt CLI commands.
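Under the hood, BashOperator hands its bash_command string to a shell. A stdlib-only sketch of that behavior (using echo as a stand-in for dbt, since dbt may not be installed where this runs):

```python
import subprocess

def run_shell(command: str) -> str:
    """Run a shell command the way BashOperator does and return its stdout."""
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

# echo stands in for an actual `dbt run` invocation
output = run_shell("echo dbt run")
```

In a real DAG you would pass the dbt command string straight to BashOperator's bash_command instead of invoking it yourself.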
Complete the code to set the Snowflake connection ID in Airflow for dbt.
snowflake_conn_id = '[1]'
The default Snowflake connection ID in Airflow is usually snowflake_default. Note that dbt itself reads credentials from profiles.yml rather than from Airflow connections, so the Airflow connection is typically used to supply those credentials (for example, via environment variables).
Fix the error in the Airflow DAG code to correctly trigger a dbt run command.
dbt_run = BashOperator(task_id='dbt_run', bash_command='dbt [1]')
The dbt run command executes the project's models, so it is the command to place in bash_command when triggering a dbt run from Airflow.
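One way to keep dbt invocations consistent across several Airflow tasks is to build the bash_command string with a small helper. This is a sketch, and the --project-dir default below is a placeholder path, not a real convention:

```python
def build_dbt_command(subcommand: str, project_dir: str = "/opt/dbt_project") -> str:
    """Return a dbt CLI string suitable for BashOperator's bash_command.

    project_dir is a hypothetical path; point it at your actual dbt project.
    """
    return f"dbt {subcommand} --project-dir {project_dir}"

# e.g. BashOperator(task_id='dbt_run', bash_command=build_dbt_command('run'))
```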
Fill both blanks to define a dbt profile for Snowflake with correct keys.
""" profiles: my_profile: target: dev outputs: dev: type: snowflake account: [1] user: [2] """
The account key should be set to your Snowflake account identifier (e.g., my_account), and user should be your Snowflake username (e.g., my_user).
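A filled-in sketch of a complete Snowflake profile, using placeholder values (my_account, my_user, and so on) and the remaining keys a Snowflake target typically needs:

```yaml
my_profile:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: my_account        # placeholder Snowflake account identifier
      user: my_user              # placeholder username
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: transformer          # placeholder role
      database: analytics        # placeholder database
      warehouse: transforming    # placeholder warehouse
      schema: dbt_dev            # placeholder schema
      threads: 4
```

Keeping the password behind dbt's env_var() function avoids committing a secret to the repository.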
Fill in the blank to complete the Airflow DAG task dependencies for dbt run and test.
with DAG('dbt_dag', start_date=days_ago(1)) as dag:
    dbt_run = BashOperator(task_id='dbt_run', bash_command='dbt run')
    dbt_test = BashOperator(task_id='dbt_test', bash_command='dbt test')
    [1]
In Airflow, you can set task dependencies using the bitshift operator >> or the method set_downstream. The line dbt_run >> dbt_test sets dbt_test to run after dbt_run.
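The >> syntax works because Airflow operators implement Python's __rshift__ method. A toy, Airflow-free model of that mechanic (Task here is a hypothetical stand-in, not the real BaseOperator):

```python
class Task:
    """Toy stand-in for an Airflow operator, just enough to show >> chaining."""

    def __init__(self, task_id: str):
        self.task_id = task_id
        self.downstream = []

    def __rshift__(self, other: "Task") -> "Task":
        # Mirrors Airflow's behavior: left >> right makes right run after left,
        # and returns right so chains like a >> b >> c work.
        self.downstream.append(other)
        return other

dbt_run = Task("dbt_run")
dbt_test = Task("dbt_test")
dbt_run >> dbt_test  # dbt_test now runs after dbt_run
```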