Apache Airflow · DevOps · ~10 mins

Manual triggers and parameters in Apache Airflow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to trigger a DAG manually using the Airflow CLI.

airflow dags [1] example_dag
A. list
B. trigger
C. pause
D. delete
Common Mistakes
Using 'pause' instead of 'trigger'
Trying to list DAGs instead of triggering
Task 2: Fill in the blank (medium)

Complete the code to pass a parameter named 'run_date' when triggering a DAG manually.

airflow dags trigger example_dag --conf '{"[1]": "2024-06-01"}'
A. date
B. execution_date
C. start_date
D. run_date
Common Mistakes
Using 'date' or 'start_date' which are not the expected keys
Using 'execution_date' which is deprecated
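The value handed to `--conf` must be valid JSON. A minimal Python sketch (independent of any Airflow installation) showing how that payload round-trips into the `dag_run.conf` dictionary:

```python
import json

# Build the --conf payload; json.dumps guarantees valid JSON and
# sidesteps shell-quoting mistakes.
conf = {"run_date": "2024-06-01"}
payload = json.dumps(conf)  # '{"run_date": "2024-06-01"}'

# Airflow parses the same JSON back into dag_run.conf (a plain dict),
# so the key you pass on the CLI is exactly the key you read later.
assert json.loads(payload)["run_date"] == "2024-06-01"
```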
Task 3: Fill in the blank (hard)

Fix the error in the code so it correctly accesses the 'user' parameter passed during a manual trigger of the DAG.

user_param = [1]['user']
A. dag_run.conf
B. params
C. context['params']
D. dag_run.conf.get
Common Mistakes
Using 'context["params"]' which is for template params
Using 'params' directly which is undefined
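For context, here is a runnable sketch of how a task reads that parameter. `FakeDagRun` is a hypothetical stand-in for Airflow's real `DagRun` object so the snippet runs without an Airflow installation; in a real task, Airflow injects `dag_run` into the context for you.

```python
# Hypothetical stand-in for Airflow's DagRun, carrying only the
# .conf attribute this example needs.
class FakeDagRun:
    def __init__(self, conf=None):
        self.conf = conf or {}

def my_task(**kwargs):
    dag_run = kwargs["dag_run"]        # injected by Airflow at runtime
    user_param = dag_run.conf["user"]  # raises KeyError if 'user' was not passed
    # dag_run.conf.get("user", "fallback") is the defensive variant
    return user_param

print(my_task(dag_run=FakeDagRun({"user": "alice"})))  # prints: alice
```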
Task 4: Fill in the blank (hard)

Fill both blanks to define a PythonOperator that uses a function accessing a manual trigger parameter 'task_id'.

def process_task(**kwargs):
    task = kwargs['[1]'].conf.get('[2]')

process = PythonOperator(
    task_id='process_task',
    python_callable=process_task,
    provide_context=True,
    dag=dag
)
A. dag_run
B. context
C. task_id
D. params
Common Mistakes
Using 'context' instead of 'dag_run'
Using 'params' which is not the correct object here
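The callable above can be exercised outside Airflow with a simulated context. `FakeDagRun` below is a hypothetical stub for the real `DagRun`; note also that in Airflow 2.x the context is passed to the callable automatically, so `provide_context=True` is only needed on Airflow 1.x.

```python
# Hypothetical stub for Airflow's DagRun object.
class FakeDagRun:
    def __init__(self, conf=None):
        self.conf = conf or {}

def process_task(**kwargs):
    # .get() returns None (instead of raising) when the key is missing
    return kwargs["dag_run"].conf.get("task_id")

assert process_task(dag_run=FakeDagRun({"task_id": "t42"})) == "t42"
assert process_task(dag_run=FakeDagRun()) is None
```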
Task 5: Fill in the blank (hard)

Fill all three blanks to trigger a DAG named 'data_pipeline' with parameters 'env' set to 'prod' and 'version' set to 'v2' using the CLI.

airflow dags [1] [2] --conf '{"[3]": "prod", "version": "v2"}'
A. trigger
B. data_pipeline
C. env
D. run
Common Mistakes
Using 'run' instead of 'trigger'
Passing wrong parameter keys
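Putting it together, the full command can be assembled programmatically. Building the argv as a list and serializing the conf with `json.dumps` avoids shell-quoting errors (a sketch only; actually running it of course requires an Airflow installation):

```python
import json

conf = {"env": "prod", "version": "v2"}
cmd = ["airflow", "dags", "trigger", "data_pipeline",
       "--conf", json.dumps(conf)]

# e.g. pass `cmd` to subprocess.run(cmd) on a machine with Airflow installed
print(" ".join(cmd))
```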