Complete the code to trigger a DAG manually using the Airflow CLI.
airflow dags [1] example_dag
Use trigger to start a DAG run manually from the CLI.
Complete the code to pass a parameter named 'run_date' when triggering a DAG manually.
airflow dags trigger example_dag --conf '{"[1]": "2024-06-01"}'
The parameter key 'run_date' is commonly used to pass a date when triggering DAGs manually.
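The JSON string given to --conf is parsed into a plain dict that the DAG run carries. A minimal sketch of that parsing step, using only the standard library (the key 'run_date' matches the exercise above):

```python
import json

# The string passed to --conf on the CLI (single quotes protect it from the shell)
conf_arg = '{"run_date": "2024-06-01"}'

# Airflow parses this JSON and stores the resulting dict on the DagRun
conf = json.loads(conf_arg)
print(conf["run_date"])  # 2024-06-01
```

Any value that survives JSON serialization (strings, numbers, nested objects) can be passed the same way.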
Fix the error in the code so it correctly accesses the parameter 'user' passed during a manual trigger, from inside the DAG.
user_param = [1]['user']
Parameters passed during manual trigger are accessed via dag_run.conf inside the DAG.
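A sketch of that access pattern; dag_run here is a stand-in object rather than a real Airflow DagRun, so the example stays runnable on its own (the value 'alice' is illustrative):

```python
from types import SimpleNamespace

# Stand-in for the DagRun object Airflow injects into the task context;
# its .conf attribute holds the dict passed at trigger time.
dag_run = SimpleNamespace(conf={"user": "alice"})

# Inside the DAG, the parameter is read from dag_run.conf
user_param = dag_run.conf['user']
print(user_param)  # alice
```

Using dag_run.conf.get('user') instead of indexing avoids a KeyError when the DAG is started on a schedule with no conf supplied.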
Fill both blanks to define a PythonOperator whose callable accesses the parameter 'task_id' passed during a manual trigger.
def process_task(**kwargs):
    task = kwargs['[1]'].conf.get('[2]')

process = PythonOperator(
    task_id='process_task',
    python_callable=process_task,
    provide_context=True,
    dag=dag
)
Inside the function, dag_run.conf.get('task_id') accesses the parameter 'task_id' passed during manual trigger.
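With both blanks filled ('dag_run' and 'task_id'), the callable reads the parameter from the keyword arguments Airflow passes in. The sketch below fakes that context with a plain dict and a stand-in DagRun so it runs without Airflow installed; the value 'nightly_load' is hypothetical:

```python
from types import SimpleNamespace

def process_task(**kwargs):
    # 'dag_run' fills the first blank; 'task_id' fills the second
    task = kwargs['dag_run'].conf.get('task_id')
    return task

# Simulated context: Airflow builds a dict like this for the callable
fake_context = {"dag_run": SimpleNamespace(conf={"task_id": "nightly_load"})}
result = process_task(**fake_context)
print(result)  # nightly_load
```

Note that provide_context=True is only needed on Airflow 1.x; in Airflow 2+ the context is passed to the callable automatically.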
Fill all three blanks to trigger a DAG named 'data_pipeline' with parameters 'env' set to 'prod' and 'version' set to 'v2' using the CLI.
airflow dags [1] [2] --conf '{"[3]": "prod", "version": "v2"}'
Use trigger to start the DAG data_pipeline and pass the parameter env with value 'prod'.
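Building the --conf argument programmatically avoids JSON quoting mistakes in the shell. A sketch using json.dumps, with the dag_id and keys from the exercise above:

```python
import json

dag_id = "data_pipeline"
conf = {"env": "prod", "version": "v2"}

# Serialize the dict so it can be passed as a single CLI argument
command = ["airflow", "dags", "trigger", dag_id, "--conf", json.dumps(conf)]
print(" ".join(command))
```

A list like this can be handed directly to subprocess.run, which sidesteps shell quoting entirely.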