
How to Use all_failed Trigger Rule in Airflow

In Airflow, use the all_failed trigger rule to make a task run only if all its upstream tasks have failed. Set trigger_rule="all_failed" in the task definition to apply this behavior.

Syntax

The all_failed trigger rule is set as a parameter in an Airflow task to control when it runs based on the status of upstream tasks.

Key parts:

  • trigger_rule: The parameter to set the condition.
  • all_failed: The value that makes the task run only if all upstream tasks failed.
```python
task = PythonOperator(
    task_id='example_task',
    python_callable=my_function,
    trigger_rule='all_failed',
    dag=dag
)
```

Example

This example shows a DAG with two upstream tasks that fail, and a downstream task that runs only if both upstream tasks fail, using all_failed.

```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.trigger_rule import TriggerRule
from datetime import datetime

def fail_task():
    raise Exception('Failing task intentionally')

def downstream_task():
    print('Downstream task runs only if all upstream tasks failed')

with DAG('all_failed_example', start_date=datetime(2024, 1, 1), schedule_interval=None, catchup=False) as dag:
    task1 = PythonOperator(
        task_id='fail_task1',
        python_callable=fail_task
    )

    task2 = PythonOperator(
        task_id='fail_task2',
        python_callable=fail_task
    )

    downstream = PythonOperator(
        task_id='downstream_task',
        python_callable=downstream_task,
        trigger_rule=TriggerRule.ALL_FAILED
    )

    [task1, task2] >> downstream
```

Output

```
Task fail_task1: Failed
Task fail_task2: Failed
Task downstream_task: Running
Output: Downstream task runs only if all upstream tasks failed
```
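To see why the downstream task runs here, it helps to state the rule's condition in plain Python. The sketch below is a simplified illustration of the all_failed semantics, not Airflow's actual scheduler code: the downstream task is triggered only when every upstream task ended in the failed state.

```python
# Simplified illustration of the all_failed condition (not Airflow's
# actual scheduler logic): run only if every upstream state is "failed".

def all_failed_should_run(upstream_states):
    """Return True if there is at least one upstream task and all failed."""
    return len(upstream_states) > 0 and all(s == "failed" for s in upstream_states)

# Both upstream tasks failed -> downstream runs (matches the DAG above)
print(all_failed_should_run(["failed", "failed"]))   # True

# One upstream succeeded -> downstream does not run
print(all_failed_should_run(["failed", "success"]))  # False
```

With only one of the two upstream tasks failing, the condition is not met and Airflow would not run the downstream task.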

Common Pitfalls

Common mistakes when using the all_failed trigger rule include:

  • Expecting the task to run if only some upstream tasks fail (it runs only if all fail).
  • Not setting the trigger rule explicitly, so the default all_success applies.
  • Confusing all_failed with one_failed, which runs if any upstream task fails.
```python
from airflow.operators.python import PythonOperator
from airflow.utils.trigger_rule import TriggerRule

# Assumes an existing `dag` object, as in the example above.

# Wrong: default trigger_rule (all_success) - task won't run if any upstream fails
wrong_task = PythonOperator(
    task_id='wrong_task',
    python_callable=lambda: print('Runs only if all upstream succeed'),
    dag=dag
)

# Right: set trigger_rule to all_failed
right_task = PythonOperator(
    task_id='right_task',
    python_callable=lambda: print('Runs only if all upstream fail'),
    trigger_rule=TriggerRule.ALL_FAILED,
    dag=dag
)
```

Quick Reference

Summary of trigger rules related to failure conditions:

| Trigger Rule | Description |
| --- | --- |
| all_failed | Runs only if all upstream tasks have failed |
| one_failed | Runs if at least one upstream task has failed |
| all_success | Runs only if all upstream tasks succeeded (default) |
| none_failed | Runs if no upstream tasks have failed (success or skipped) |
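The rules in the table can be sketched as simple predicates over upstream task states. This is a simplified illustration of the semantics, not Airflow's implementation (real evaluation also handles states such as upstream_failed):

```python
# Simplified predicates for the trigger rules above, evaluated over a
# list of upstream task states. Illustration only, not Airflow internals.

def all_failed(states):
    return all(s == "failed" for s in states)

def one_failed(states):
    return any(s == "failed" for s in states)

def all_success(states):
    return all(s == "success" for s in states)

def none_failed(states):
    # Succeeded or skipped upstream tasks are both acceptable.
    return all(s in ("success", "skipped") for s in states)

states = ["failed", "success"]
print(all_failed(states))   # False: not every upstream failed
print(one_failed(states))   # True: at least one upstream failed
print(all_success(states))  # False: one upstream failed
print(none_failed(states))  # False: an upstream failed
```

Running the predicates on a mixed outcome like this makes the all_failed vs one_failed distinction from the pitfalls section concrete.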

Key Takeaways

  • Set trigger_rule='all_failed' to run a task only when all upstream tasks fail.
  • The default trigger rule is all_success, so you must set all_failed explicitly to change the behavior.
  • Use the TriggerRule.ALL_FAILED constant for clarity and to avoid typos.
  • Understand the difference between all_failed and one_failed to avoid logic errors.
  • Test your DAG to confirm tasks run as expected under failure conditions.