
Manual triggers and parameters in Apache Airflow - Commands & Configuration

Introduction
Sometimes you want to start a workflow yourself and hand it information to use. Manual triggers make this possible in Airflow: you can run a DAG on demand and pass it specific parameters.
When you want to run a data pipeline only after a manual check or approval.
When you need to test a workflow with different input values without changing the code.
When you want to rerun a failed task with updated parameters.
When you want to start a report generation with a chosen date range.
When you want to trigger a workflow from the Airflow UI with custom settings.
Config File - manual_trigger_dag.py
manual_trigger_dag.py
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago  # deprecated in newer Airflow; a fixed datetime start_date also works

def print_params(**kwargs):
    # Read values passed at trigger time; fall back to defaults if absent
    param1 = kwargs['dag_run'].conf.get('param1', 'default1')
    param2 = kwargs['dag_run'].conf.get('param2', 'default2')
    print(f"Parameter 1: {param1}")
    print(f"Parameter 2: {param2}")

with DAG(
    dag_id='manual_trigger_dag',
    start_date=days_ago(1),
    schedule_interval=None,  # no schedule: the DAG runs only when triggered manually
    catchup=False
) as dag:
    task = PythonOperator(
        task_id='print_params_task',
        python_callable=print_params
    )

This DAG defines a workflow that does not run on a schedule (schedule_interval=None), so it only runs when manually triggered.

The print_params function reads parameters param1 and param2 from the manual trigger input and prints them.

The PythonOperator runs this function when the DAG is triggered.
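Because the callable only touches kwargs['dag_run'].conf, its default-value behaviour can be checked outside Airflow by calling it with a stand-in object (a minimal sketch; SimpleNamespace is a hypothetical substitute for the real DagRun, and the return value is added here purely to make the result observable):

```python
from types import SimpleNamespace

def print_params(**kwargs):
    # same lookup as the task callable: trigger-time values with fallbacks
    param1 = kwargs['dag_run'].conf.get('param1', 'default1')
    param2 = kwargs['dag_run'].conf.get('param2', 'default2')
    print(f"Parameter 1: {param1}")
    print(f"Parameter 2: {param2}")
    return param1, param2

# simulate a manual trigger that supplies only param1
fake_run = SimpleNamespace(conf={'param1': 'hello'})
result = print_params(dag_run=fake_run)  # param2 falls back to 'default2'
```

Supplying only part of the conf shows the fallbacks working: param1 comes from the trigger, param2 from the default.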

Commands
This command manually starts the DAG named 'manual_trigger_dag' and passes parameters 'param1' and 'param2' with values 'hello' and 'world'.
Terminal
airflow dags trigger manual_trigger_dag --conf '{"param1":"hello","param2":"world"}'
Expected Output
Created <DagRun manual_trigger_dag @ 2024-06-01T12:00:00+00:00: manual__2024-06-01T12:00:00+00:00, externally triggered: True>
--conf - Pass JSON parameters to the DAG run
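The CLI is not the only entry point: the same conf payload can be posted to Airflow's stable REST API (POST /api/v1/dags/{dag_id}/dagRuns). A hedged sketch in Python; the host, port, and basic-auth credentials are assumptions for a local install:

```python
import json

dag_id = "manual_trigger_dag"
conf = {"param1": "hello", "param2": "world"}

# Request body accepted by POST /api/v1/dags/{dag_id}/dagRuns
payload = json.dumps({"conf": conf})

# With the requests library (not executed here), assuming a local webserver:
# requests.post(
#     f"http://localhost:8080/api/v1/dags/{dag_id}/dagRuns",
#     data=payload,
#     headers={"Content-Type": "application/json"},
#     auth=("admin", "admin"),  # hypothetical credentials
# )
print(payload)
```

The response describes the created DagRun, much like the CLI output above.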
To confirm the task received the parameters, check its log. The Airflow CLI has no subcommand for fetching task logs, so open the run in the web UI (task instance → Log) or read the worker's log file directly; with the default log layout (Airflow 2.3+) the path is:
Terminal
cat "$AIRFLOW_HOME/logs/dag_id=manual_trigger_dag/run_id=manual__2024-06-01T12:00:00+00:00/task_id=print_params_task/attempt=1.log"
Expected Output
Parameter 1: hello
Parameter 2: world
run_id=manual__<timestamp> - The run ID in the path pins the log to the exact manually triggered run
Key Concept

If you remember nothing else from this pattern, remember: manual triggers let you start workflows anytime with custom input parameters.

Common Mistakes
Not setting schedule_interval to None in the DAG definition.
The DAG might run automatically on a schedule, which is not desired for manual triggers.
Always set schedule_interval=None to ensure the DAG only runs when manually triggered.
Passing parameters with incorrect JSON format in the --conf flag.
Airflow will reject the trigger or ignore parameters if JSON is invalid.
Use proper JSON syntax with double quotes and escape characters if needed.
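One way to sidestep quoting mistakes is to let Python build both the JSON and the shell escaping (a small sketch using only the standard library; the command string is printed, not executed):

```python
import json
import shlex

conf = {"param1": "hello world", "param2": 'say "hi"'}

# json.dumps guarantees valid JSON (double quotes, proper escapes);
# shlex.quote wraps the result safely for the shell
cmd = f"airflow dags trigger manual_trigger_dag --conf {shlex.quote(json.dumps(conf))}"
print(cmd)
```

Values containing spaces or embedded quotes survive the round trip intact, which is exactly what hand-written quoting tends to get wrong.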
Trying to access parameters without checking if they exist in dag_run.conf.
This causes errors if parameters are missing during manual or scheduled runs.
Use .get() with default values to safely access parameters.
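The difference is easy to see with a plain dict standing in for dag_run.conf on a run that received no parameters:

```python
conf = {}  # what dag_run.conf typically looks like when no --conf was passed

# Direct indexing raises KeyError on a missing parameter
try:
    value = conf['param1']
except KeyError:
    value = None

# .get() with a default never raises
safe = conf.get('param1', 'default1')
print(value, safe)
```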
Summary
Define a DAG with schedule_interval=None to allow manual triggering only.
Use the --conf flag with airflow dags trigger to pass parameters as JSON.
Access parameters inside tasks via dag_run.conf safely with default values.
Check task logs to verify parameters were received and used correctly.