Airflow · How-To · Beginner · 3 min read

How to Use one_success Trigger Rule in Airflow

In Airflow, the one_success trigger rule allows a task to run when at least one of its upstream tasks has succeeded. You set it by assigning trigger_rule="one_success" in the task definition, enabling flexible workflows that proceed after any upstream success.

📐 Syntax

The one_success trigger rule is set as a parameter in an Airflow task to control when it runs based on upstream task states.

  • trigger_rule="one_success": runs the task if at least one upstream task succeeded.
  • Passed as a keyword argument to any operator.
  • Overrides the default all_success rule, which requires all upstream tasks to succeed.
python
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.utils.trigger_rule import TriggerRule
from datetime import datetime

dag = DAG('example_one_success', start_date=datetime(2024, 1, 1), schedule=None)

upstream_task1 = EmptyOperator(task_id='upstream_task1', dag=dag)
upstream_task2 = EmptyOperator(task_id='upstream_task2', dag=dag)

# Task runs if at least one upstream task succeeds
triggered_task = EmptyOperator(
    task_id='triggered_task',
    dag=dag,
    trigger_rule=TriggerRule.ONE_SUCCESS,
)

upstream_task1 >> triggered_task
upstream_task2 >> triggered_task

💻 Example

This example shows a DAG with two upstream tasks and one downstream task that uses one_success. The downstream task runs if either upstream task succeeds, even if the other fails.

python
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.utils.trigger_rule import TriggerRule
from datetime import datetime

dag = DAG('one_success_example', start_date=datetime(2024, 1, 1), schedule=None)

upstream1 = EmptyOperator(task_id='upstream1', dag=dag)
upstream2 = EmptyOperator(task_id='upstream2', dag=dag)

# This task runs if at least one upstream task succeeds
downstream = EmptyOperator(
    task_id='downstream',
    dag=dag,
    trigger_rule=TriggerRule.ONE_SUCCESS,
)

upstream1 >> downstream
upstream2 >> downstream
Output
When running this DAG, the 'downstream' task will execute if either 'upstream1' or 'upstream2' succeeds, even if the other fails or is skipped.
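The decision the scheduler makes here can be sketched in plain Python. This is a hedged, Airflow-free illustration (the one_success helper below is a hypothetical stand-in, not an Airflow API):

```python
# Hypothetical stand-in for the scheduler's one_success check (not an Airflow API).
def one_success(upstream_states):
    """True if at least one upstream task instance finished in 'success'."""
    return any(state == "success" for state in upstream_states)

print(one_success(["success", "failed"]))   # True: 'downstream' runs
print(one_success(["failed", "skipped"]))   # False: 'downstream' does not run
```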

⚠️ Common Pitfalls

  • Leaving trigger_rule unset defaults to all_success, so the task won't run if any upstream task fails.
  • Using one_success when you expect all upstream tasks to succeed can cause the task to run earlier than intended.
  • Mixing trigger rules in complex DAGs can make task states hard to reason about.
python
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.utils.trigger_rule import TriggerRule
from datetime import datetime

dag = DAG('pitfall_example', start_date=datetime(2024, 1, 1), schedule=None)

up1 = EmptyOperator(task_id='up1', dag=dag)
up2 = EmptyOperator(task_id='up2', dag=dag)

# Wrong: no trigger_rule set, so it defaults to all_success
wrong_task = EmptyOperator(task_id='wrong_task', dag=dag)

# Right: explicitly set one_success
right_task = EmptyOperator(task_id='right_task', dag=dag, trigger_rule=TriggerRule.ONE_SUCCESS)

up1 >> wrong_task
up2 >> wrong_task
up1 >> right_task
up2 >> right_task

📊 Quick Reference

Trigger Rule    Description
all_success     Runs only if all upstream tasks succeeded (default)
one_success     Runs if at least one upstream task succeeded
all_failed      Runs only if all upstream tasks failed
one_failed      Runs if at least one upstream task failed
none_failed     Runs if no upstream task failed (succeeded or skipped)
none_skipped    Runs if no upstream task was skipped
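As a sanity check on the table, the conditions can be modeled in a few lines of plain Python. This evaluate function is a hypothetical sketch of each rule's condition over upstream terminal states, not Airflow's actual scheduler logic (which also handles states such as upstream_failed):

```python
# Hypothetical sketch of trigger-rule conditions over upstream terminal states.
# Not Airflow's real scheduler code; it mirrors only the conditions in the table above.
def evaluate(rule, states):
    if rule == "all_success":
        return all(s == "success" for s in states)
    if rule == "one_success":
        return any(s == "success" for s in states)
    if rule == "all_failed":
        return all(s == "failed" for s in states)
    if rule == "one_failed":
        return any(s == "failed" for s in states)
    if rule == "none_failed":
        return all(s != "failed" for s in states)
    if rule == "none_skipped":
        return all(s != "skipped" for s in states)
    raise ValueError(f"unknown rule: {rule}")

states = ["success", "failed"]
print(evaluate("one_success", states))  # True
print(evaluate("all_success", states))  # False
print(evaluate("none_failed", states))  # False
```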

Key Takeaways

  • Set trigger_rule="one_success" to run a task when any upstream task succeeds.
  • The default trigger rule is all_success, which requires every upstream task to succeed.
  • Use one_success for flexible workflows that tolerate some upstream failures.
  • Set trigger_rule explicitly to avoid unexpected task behavior.
  • Understand the other trigger rules to control task execution precisely.