
How to Use all_success Trigger Rule in Airflow

In Airflow, the all_success trigger rule makes a task run only when every one of its upstream tasks has succeeded. Set it by passing trigger_rule="all_success" in the task definition, although this is already Airflow's default, so you rarely need to set it explicitly.
📐 Syntax

The all_success trigger rule is set in a task's parameters to control when it runs based on upstream tasks' states.

  • trigger_rule="all_success": Runs the task only if all upstream tasks succeeded.
  • This is the default trigger rule for Airflow tasks.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator  # named DummyOperator before Airflow 2.4

with DAG(dag_id='example_all_success', start_date=datetime(2024, 1, 1), schedule='@daily') as dag:
    task1 = EmptyOperator(task_id='task1')
    task2 = EmptyOperator(task_id='task2')
    # all_success is the default; shown explicitly here for clarity
    final_task = EmptyOperator(task_id='final_task', trigger_rule='all_success')

    [task1, task2] >> final_task
```
💻 Example

This example shows a DAG with two upstream tasks and one downstream task using all_success. The downstream task runs only if both upstream tasks succeed.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator  # named DummyOperator before Airflow 2.4

with DAG(dag_id='all_success_example', start_date=datetime(2024, 1, 1), schedule='@daily') as dag:
    task1 = EmptyOperator(task_id='task1')
    task2 = EmptyOperator(task_id='task2')
    final_task = EmptyOperator(task_id='final_task', trigger_rule='all_success')

    task1 >> final_task
    task2 >> final_task
```
Output
Task final_task runs only if task1 and task2 both succeed.
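What happens when an upstream task fails? With all_success, Airflow does not leave final_task waiting forever: it marks it upstream_failed. The decision can be sketched in plain Python. This is an illustrative model only, not Airflow's actual scheduler code, and the state names are simplified:

```python
def all_success_outcome(upstream_states):
    """Illustrative model of the all_success rule (not Airflow's real code).

    Returns the downstream task's fate: 'run', 'upstream_failed',
    'skipped', or 'wait'.
    """
    if any(s in ("failed", "upstream_failed") for s in upstream_states):
        return "upstream_failed"  # a failed parent blocks the task
    if any(s == "skipped" for s in upstream_states):
        return "skipped"          # skipped parents also block all_success
    if all(s == "success" for s in upstream_states):
        return "run"
    return "wait"                 # some parents are still running

print(all_success_outcome(["success", "success"]))  # run
print(all_success_outcome(["success", "failed"]))   # upstream_failed
```

Note that a skipped parent also prevents the task from running under all_success; if you want skipped parents to be acceptable, use none_failed instead.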
⚠️ Common Pitfalls

Common mistakes when using all_success include:

  • Assuming the downstream task runs if any upstream task succeeds (use one_success for that).
  • Not setting trigger_rule explicitly when you want to override the default behavior.
  • Confusing all_success with all_done, which runs once all upstream tasks have finished, regardless of whether they succeeded or failed.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator  # named DummyOperator before Airflow 2.4

with DAG(dag_id='wrong_trigger_rule', start_date=datetime(2024, 1, 1), schedule='@daily') as dag:
    task1 = EmptyOperator(task_id='task1')
    task2 = EmptyOperator(task_id='task2')
    # Wrong: one_success fires even if the other upstream task fails
    final_task = EmptyOperator(task_id='final_task', trigger_rule='one_success')

    task1 >> final_task
    task2 >> final_task

# Correct usage:
# final_task = EmptyOperator(task_id='final_task', trigger_rule='all_success')
```
📊 Quick Reference

| Trigger Rule | Description |
| --- | --- |
| all_success | Run only if all upstream tasks succeeded (default) |
| one_success | Run if at least one upstream task succeeded |
| all_done | Run once all upstream tasks have finished, regardless of state |
| none_failed | Run if no upstream task failed |
| none_skipped | Run if no upstream task was skipped |
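The table above can be condensed into a small pure-Python comparison. This is an illustrative sketch of how each rule gates on upstream states, not Airflow's actual implementation (which also handles propagated states such as upstream_failed):

```python
def should_run(rule, upstream_states):
    """Illustrative sketch of Airflow trigger-rule logic (not the real scheduler)."""
    if rule == "all_success":
        return all(s == "success" for s in upstream_states)
    if rule == "one_success":
        return any(s == "success" for s in upstream_states)
    if rule == "all_done":
        # every parent has finished, whatever the outcome
        return all(s in ("success", "failed", "skipped") for s in upstream_states)
    if rule == "none_failed":
        return all(s != "failed" for s in upstream_states)
    if rule == "none_skipped":
        return all(s != "skipped" for s in upstream_states)
    raise ValueError(f"unknown rule: {rule}")

states = ["success", "failed"]
print(should_run("all_success", states))  # False
print(should_run("one_success", states))  # True
print(should_run("all_done", states))     # True
```

Running the comparison on the same pair of upstream states makes the differences between the rules concrete: one failed parent blocks all_success but satisfies one_success and all_done.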

Key Takeaways

  • The all_success trigger rule runs a task only if all of its upstream tasks succeed.
  • It is Airflow's default trigger rule, so setting it explicitly is optional unless you are overriding another rule.
  • Use all_success to guarantee a downstream task runs only after complete upstream success.
  • Don't confuse all_success with one_success (one succeeding parent is enough) or all_done (parents merely need to finish).
  • Review task dependencies carefully when choosing a trigger rule.