Apache Airflow · DevOps · ~10 mins

Trigger rules (all_success, one_success, none_failed) in Apache Airflow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
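For context, here is a minimal Airflow 2.x sketch showing where `trigger_rule` sits on an operator; the `dag_id`, task ids, and callables are illustrative, not part of the tasks below.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def my_func():
    pass

with DAG(dag_id="trigger_rule_demo", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    a = PythonOperator(task_id="extract_a", python_callable=my_func)
    b = PythonOperator(task_id="extract_b", python_callable=my_func)
    # 'all_success' is also the default: fire only when both upstream tasks succeed
    load = PythonOperator(task_id="load", python_callable=my_func, trigger_rule="all_success")
    [a, b] >> load
```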
Task 1: Fill in the blank (easy)

Complete the code to set the trigger rule so the task runs only if all upstream tasks succeed.

Apache Airflow
task = PythonOperator(task_id='task1', python_callable=my_func, trigger_rule='[1]')
A. none_failed
B. all_done
C. one_success
D. all_success
Common Mistakes
Using 'all_done', which runs regardless of success or failure.
Using 'one_success', which runs if any upstream task succeeds.
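This is not Airflow's internal implementation, but the semantics of 'all_success' can be sketched as a simple check over upstream task states (the helper is hypothetical; the state names mirror Airflow's):

```python
def all_success(upstream_states):
    # 'all_success' (the default rule): fire only when every upstream task succeeded
    return all(state == "success" for state in upstream_states)

print(all_success(["success", "success"]))  # True
print(all_success(["success", "failed"]))   # False
```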
Task 2: Fill in the blank (medium)

Complete the code to set the trigger rule so the task runs if at least one upstream task succeeds.

Apache Airflow
task = PythonOperator(task_id='task2', python_callable=my_func, trigger_rule='[1]')
A. none_failed
B. all_success
C. one_success
D. all_done
Common Mistakes
Choosing 'all_success', which requires all upstream tasks to succeed.
Choosing 'none_failed', which requires no upstream failures (skipped tasks are allowed).
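Likewise, 'one_success' can be sketched as a hypothetical check that at least one upstream task reached the success state:

```python
def one_success(upstream_states):
    # 'one_success': fire as soon as at least one upstream task succeeded,
    # no matter what happened to the others
    return any(state == "success" for state in upstream_states)

print(one_success(["failed", "success"]))  # True
print(one_success(["failed", "skipped"])) # False
```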
Task 3: Fill in the blank (hard)

Fix the error in the code to make the task run only if no upstream tasks have failed.

Apache Airflow
task = PythonOperator(task_id='task3', python_callable=my_func, trigger_rule='[1]')
A. none_failed
B. one_success
C. all_success
D. all_done
Common Mistakes
Using 'all_success', which requires all upstream tasks to succeed.
Using 'all_done', which runs once all upstream tasks finish, regardless of failure.
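The distinguishing point of 'none_failed' is that skipped upstream tasks do not block the run; a hypothetical sketch of that check (state names mirror Airflow's):

```python
def none_failed(upstream_states):
    # 'none_failed': fire as long as no upstream task failed;
    # succeeded or skipped upstream tasks are both acceptable
    return all(state not in ("failed", "upstream_failed") for state in upstream_states)

print(none_failed(["success", "skipped"]))  # True
print(none_failed(["success", "failed"]))   # False
```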
Task 4: Fill in the blanks (hard)

Fill both blanks to create a PythonOperator that runs if all upstream tasks succeed and has a task_id of 'final_task'.

Apache Airflow
task = PythonOperator(task_id='[1]', python_callable=my_func, trigger_rule='[2]')
A. final_task
B. all_done
C. all_success
D. one_success
Common Mistakes
Using 'all_done' as the trigger_rule, which runs regardless of success.
Using the wrong task_id name.
Task 5: Fill in the blanks (hard)

Fill all three blanks to create a PythonOperator with task_id 'check_task', that runs if no upstream tasks failed, and uses a callable named 'check_status'.

Apache Airflow
task = PythonOperator(task_id='[1]', python_callable=[2], trigger_rule='[3]')
A. check_task
B. check_status
C. none_failed
D. all_success
Common Mistakes
Mixing up trigger rules, e.g. using 'all_success' instead of 'none_failed'.
Using an incorrect callable name.