dbt - Production Deployment
You have this Airflow DAG snippet:
dbt_run = BashOperator(task_id='dbt_run', bash_command='dbt run')
dbt_test = BashOperator(task_id='dbt_test', bash_command='dbt test')
dbt_test.set_downstream(dbt_run)

Why does the dbt_run task never execute?
A. Airflow requires PythonOperator for dbt commands, not BashOperator
B. The bash_command syntax is incorrect for running dbt commands
C. The task IDs are not unique and cause a conflict
D. The task dependency is reversed; dbt_test should run after dbt_run
Step-by-Step Solution
  1. Step 1: Analyze task dependencies

    The code uses dbt_test.set_downstream(dbt_run), which means dbt_run runs after dbt_test.
  2. Step 2: Understand intended order

    Typically, dbt_run should execute before dbt_test because tests validate the run results.
  3. Step 3: Identify the error

    The dependency is reversed: dbt_test runs first, and because it tests models that have not yet been built, it fails. Since dbt_run is downstream of the failed dbt_test, it never executes.
  4. Final Answer:

    The task dependency is reversed; dbt_test should run after dbt_run. The fix is to write dbt_run.set_downstream(dbt_test), or equivalently dbt_run >> dbt_test -> Option D
  5. Quick Check:

    Check task dependencies for correct order
Quick Trick: Check whether task dependencies reflect the intended execution order
Common Mistakes:
  • Confusing set_upstream and set_downstream methods
  • Assuming BashOperator cannot run dbt commands
  • Overlooking task ID uniqueness
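The reversed wiring can be sketched without a running Airflow install. This is a minimal stand-in Task class (not the real Airflow API) that models the one rule that matters here: a.set_downstream(b) means b runs after a, and `>>` is shorthand for the same thing.

```python
# Minimal sketch of Airflow-style dependency wiring (not real Airflow).
# Rule modeled: a.set_downstream(b)  =>  b depends on a, so b runs after a.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []  # tasks that must finish before this one starts

    def set_downstream(self, other):
        # self -> other: `other` will only run after `self` succeeds
        other.upstream.append(self)

    def __rshift__(self, other):
        # Airflow-style `>>` shorthand for set_downstream
        self.set_downstream(other)
        return other

# Buggy wiring from the question: dbt_run ends up DOWNSTREAM of dbt_test
dbt_run, dbt_test = Task('dbt_run'), Task('dbt_test')
dbt_test.set_downstream(dbt_run)
# dbt_run.upstream is now [dbt_test] -- the tests gate the run, reversed

# Correct wiring: tests validate the run, so they come after it
dbt_run2, dbt_test2 = Task('dbt_run'), Task('dbt_test')
dbt_run2 >> dbt_test2
# dbt_test2.upstream is now [dbt_run2] -- run first, then test
```

In real Airflow, the same fix is a one-line change: replace dbt_test.set_downstream(dbt_run) with dbt_run >> dbt_test.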
