How to Use none_failed Trigger Rule in Airflow
In Airflow, the none_failed trigger rule allows a task to run only if none of its upstream tasks have failed. The task executes when every upstream task either succeeded or was skipped; if any upstream task failed, it will not run.
Syntax
The none_failed trigger rule is set on a task using the trigger_rule parameter. It controls when the task should run based on the states of its upstream tasks.
Parts explained:
- trigger_rule='none_failed': Runs the task only if no upstream task has failed.
- Upstream tasks: Tasks that must complete before this task runs.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.utils.trigger_rule import TriggerRule

dag = DAG('example_none_failed', start_date=datetime(2024, 1, 1), schedule_interval='@daily')

start = DummyOperator(task_id='start', dag=dag)
success_task = DummyOperator(task_id='success_task', dag=dag)
failure_task = DummyOperator(task_id='failure_task', dag=dag)

# Runs only if neither upstream task failed
final_task = DummyOperator(
    task_id='final_task',
    trigger_rule=TriggerRule.NONE_FAILED,
    dag=dag,
)

start >> [success_task, failure_task] >> final_task
```
Example
This example shows a DAG with two upstream tasks, one that succeeds and one that fails, feeding a final task that uses the none_failed trigger rule. The final task will only run if none of its upstream tasks fail.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.trigger_rule import TriggerRule

def succeed():
    print('Task succeeded')

def fail():
    raise Exception('Task failed intentionally')

dag = DAG('none_failed_example', start_date=datetime(2024, 1, 1), schedule_interval='@once')

success_task = PythonOperator(task_id='success_task', python_callable=succeed, dag=dag)
failure_task = PythonOperator(task_id='failure_task', python_callable=fail, dag=dag)

final_task = PythonOperator(
    task_id='final_task',
    python_callable=lambda: print('Final task runs only if none failed'),
    trigger_rule=TriggerRule.NONE_FAILED,
    dag=dag,
)

success_task >> final_task
failure_task >> final_task
```
Output
Task success_task succeeded
Task failure_task failed
Task final_task did NOT run (it was marked upstream_failed) because failure_task failed
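The rule's decision can be modeled in plain Python. This is a simplified sketch of the idea, not Airflow's actual scheduler code: none_failed checks that no upstream task instance ended in a failed (or upstream_failed) state.

```python
# Simplified model of the none_failed trigger rule (illustrative only,
# not Airflow's real scheduler logic): a task runs when no upstream
# task finished in a 'failed' or 'upstream_failed' state.
def none_failed(upstream_states):
    return all(s not in ('failed', 'upstream_failed') for s in upstream_states)

print(none_failed(['success', 'success']))   # all succeeded -> True (runs)
print(none_failed(['success', 'skipped']))   # skipped is allowed -> True (runs)
print(none_failed(['success', 'failed']))    # one failure -> False (does not run)
```

This mirrors the example above: because failure_task ends in a failed state, the check returns False and final_task does not run.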
Common Pitfalls
Common mistakes when using none_failed trigger rule include:
- Expecting the task to run even when an upstream task failed. It only runs when no upstream task has failed; skipped upstream tasks are allowed.
- Confusing none_failed with all_success. The former allows skipped upstream tasks; the latter requires every upstream task to succeed.
- Not setting the trigger rule explicitly, so the default all_success applies.
```python
from airflow.operators.dummy import DummyOperator
from airflow.utils.trigger_rule import TriggerRule

# Wrong: the default trigger_rule (all_success) will not run the task
# if any upstream task was skipped
final_task = DummyOperator(
    task_id='final_task',
    dag=dag,
)

# Right: none_failed runs the task when upstream tasks were skipped,
# as long as none failed
final_task = DummyOperator(
    task_id='final_task',
    trigger_rule=TriggerRule.NONE_FAILED,
    dag=dag,
)
```
Quick Reference
| Trigger Rule | Description |
|---|---|
| none_failed | Runs if no upstream task failed (allows skipped) |
| all_success | Runs only if all upstream tasks succeeded |
| all_failed | Runs only if all upstream tasks failed |
| one_failed | Runs if at least one upstream task failed |
| none_skipped | Runs if no upstream task was skipped |
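The rules in the table above can be expressed as simple predicates over upstream task states. The sketch below is an illustrative model under that assumption; Airflow's real evaluation also accounts for states such as upstream_failed.

```python
# Simplified predicates for the trigger rules in the quick-reference
# table. Illustrative model only, not Airflow's internal implementation.
RULES = {
    'none_failed':  lambda states: 'failed' not in states,
    'all_success':  lambda states: all(s == 'success' for s in states),
    'all_failed':   lambda states: all(s == 'failed' for s in states),
    'one_failed':   lambda states: 'failed' in states,
    'none_skipped': lambda states: 'skipped' not in states,
}

# One upstream succeeded, one was skipped, none failed:
upstream = ['success', 'skipped']
for name, rule in RULES.items():
    print(f"{name}: {'runs' if rule(upstream) else 'does not run'}")
```

With one skipped upstream task, only none_failed runs; all_success and none_skipped do not, which is exactly the distinction the pitfalls section warns about.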
Key Takeaways
- Use trigger_rule=TriggerRule.NONE_FAILED to run a task only if no upstream task failed.
- none_failed allows upstream tasks to be skipped, but not failed.
- Do not confuse none_failed with all_success; they behave differently with skipped tasks.
- Set the trigger rule explicitly to avoid relying on the default behavior.
- Test your DAG to verify task execution flow with different upstream task outcomes.