Complete the code to import the Airflow DAG class.
from airflow.[1] import DAG
The DAG class is imported from airflow.models to define workflows.
Complete the code to create a dbt run task using BashOperator.
dbt_run = BashOperator(task_id='dbt_run', bash_command='dbt [1]')
The 'dbt run' command executes the models in the project.
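Conceptually, BashOperator hands its bash_command to a subshell and captures the result. A minimal sketch of that idea (not Airflow's actual implementation; `echo` stands in for `dbt`, which needs an installed dbt project to run):

```python
import subprocess

def run_bash_command(bash_command: str) -> str:
    """Run a shell command and return its stdout,
    roughly what BashOperator does for its task."""
    result = subprocess.run(
        bash_command, shell=True, capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

# 'echo' stands in for the real 'dbt run' command here.
print(run_bash_command("echo dbt run"))  # → dbt run
```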
Fix the error in the DAG definition by completing the default_args dictionary key for retries.
default_args = {
'owner': 'airflow',
'start_date': days_ago(1),
'retries': [1]
}
The 'retries' value must be an integer, not a string.
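The reason 'retries' must be an integer is that the scheduler counts attempts against it numerically. A toy illustration of that retry loop (an assumption-laden sketch, not Airflow's scheduler code):

```python
def execute_with_retries(task, retries: int):
    """Run `task` once, then retry up to `retries` more times on failure.
    The numeric comparison is why retries must be an int, not a string."""
    for attempt in range(retries + 1):  # first try + `retries` retries
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # out of retries: surface the failure

# Hypothetical flaky task that fails twice before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(execute_with_retries(flaky, retries=3))  # → ok
```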
Fill both blanks to set the schedule interval to daily and enable catchup.
dag = DAG('dbt_dag', default_args=default_args, schedule_interval=[1], catchup=[2])
Setting schedule_interval to '@daily' runs the DAG daily. Setting catchup to False disables backfilling.
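The catchup behaviour can be pictured with a small helper: with catchup enabled, every missed daily interval since start_date is backfilled; with it disabled, only the most recent interval runs. This is an illustration of the concept, not Airflow's scheduler logic:

```python
from datetime import date, timedelta

def runs_to_schedule(start_date: date, today: date, catchup: bool):
    """List the daily run dates a scheduler would create.
    catchup=True  -> backfill every missed day since start_date.
    catchup=False -> run only the most recent completed day."""
    all_days = []
    d = start_date
    while d < today:
        all_days.append(d)
        d += timedelta(days=1)
    return all_days if catchup else all_days[-1:]

# Four missed days: catchup=True backfills all of them,
# catchup=False schedules only the latest one.
print(runs_to_schedule(date(2024, 1, 1), date(2024, 1, 5), catchup=True))
print(runs_to_schedule(date(2024, 1, 1), date(2024, 1, 5), catchup=False))
```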
Fill both blanks to define task dependencies: dbt_run runs before dbt_test, which runs before dbt_docs.
dag = DAG('dbt_dag', default_args=default_args, schedule_interval='@daily')
dbt_docs = BashOperator(task_id='dbt_docs', bash_command='dbt docs generate')
dbt_run [1] dbt_test
dbt_test [2] dbt_docs
The '>>' operator sets the order of tasks in Airflow DAGs, meaning the left task runs before the right task.
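Airflow can use '>>' this way because its task classes overload Python's `__rshift__` method to record dependencies. A toy class (not the real Airflow operator) shows the mechanism:

```python
class Task:
    """Toy task showing how '>>' can record dependencies
    via __rshift__; not Airflow's actual implementation."""
    def __init__(self, task_id: str):
        self.task_id = task_id
        self.downstream = []  # tasks that must run after this one

    def __rshift__(self, other: "Task") -> "Task":
        self.downstream.append(other)
        return other  # returning `other` enables chaining: a >> b >> c

dbt_run = Task("dbt_run")
dbt_test = Task("dbt_test")
dbt_docs = Task("dbt_docs")

# Same dependency chain as the exercise above.
dbt_run >> dbt_test >> dbt_docs
print(dbt_run.downstream[0].task_id)   # → dbt_test
print(dbt_test.downstream[0].task_id)  # → dbt_docs
```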