Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)
Apache Airflow
Complete the code to import the Airflow DAG class.

from airflow.[1] import DAG

Common mistake: importing DAG from airflow.operators instead of airflow.models.
Explanation: the DAG class is imported from airflow.models to define workflows.
2. Fill in the blank (medium)
Complete the code to set the default arguments for the DAG.

default_args = {'owner': 'airflow', 'start_date': [1](2024, 1, 1)}

Common mistake: using timedelta instead of datetime for start_date.
Explanation: start_date requires a datetime object from the datetime module.
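As the explanation says, start_date must be a datetime object, not a timedelta. A minimal, stdlib-only sketch of the default_args from this exercise:

```python
from datetime import datetime

# default_args as used in this exercise; 'owner' and the
# 2024-01-01 start date mirror the question above.
default_args = {
    "owner": "airflow",
    "start_date": datetime(2024, 1, 1),  # a datetime, not a timedelta
}

print(default_args["start_date"].isoformat())  # → 2024-01-01T00:00:00
```

A timedelta (e.g. timedelta(days=1)) represents a duration, not a point in time, which is why it cannot serve as a start date.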
3. Fill in the blank (hard)
Fix the error in the DAG definition by completing the missing argument.

dag = DAG('example_dag', default_args=default_args, schedule_interval=[1])

Common mistake: passing a timedelta object where this exercise expects a cron string or preset.
Explanation: schedule_interval accepts a cron string or a preset such as '@daily', which runs the DAG once per day. (Airflow also accepts timedelta objects here, but the answer this exercise expects is the '@daily' preset.)
4. Fill in the blank (hard)
Fill both blanks to create a simple task using BashOperator.

from airflow.operators.bash import [1]
task1 = [2](task_id='print_date', bash_command='date', dag=dag)

Common mistake: using PythonOperator or EmailOperator instead of BashOperator for bash commands.
Explanation: BashOperator is used to run bash commands as tasks in Airflow.
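Putting the pieces from questions 1–4 together, a complete DAG file could look like the sketch below. It assumes Apache Airflow 2.x is installed; the '@daily' preset and the task details mirror the exercises.

```python
from datetime import datetime

from airflow.models import DAG
from airflow.operators.bash import BashOperator

# Default arguments shared by all tasks in the DAG (question 2)
default_args = {"owner": "airflow", "start_date": datetime(2024, 1, 1)}

# DAG definition with a preset schedule (question 3)
dag = DAG(
    "example_dag",
    default_args=default_args,
    schedule_interval="@daily",  # run once per day
)

# BashOperator runs a shell command as a task (question 4)
task1 = BashOperator(task_id="print_date", bash_command="date", dag=dag)
```

Saving this file in Airflow's dags/ folder is enough for the scheduler to pick it up; no code runs until the schedule triggers it.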
5. Fill in the blank (hard)
Fill all three blanks to set task dependencies so that task1 runs before task2 and task3, and task2 runs before task3.

task1 [1] task2
task1 [2] task3
task2 [3] task3

Common mistake: using '<<' (which sets the dependency in the opposite direction, so the right-hand task runs first) or '==' (which does not set dependencies at all).
Explanation: the '>>' operator sets the order of tasks in Airflow DAGs; the task on the left runs before the task on the right.
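Under the hood, Airflow supports '>>' by defining Python's __rshift__ method on its operators. A toy, non-Airflow illustration of the same idea (this Task class is hypothetical, written only to show the mechanism):

```python
# Toy sketch of how '>>' can record dependencies, as Airflow operators do.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []           # tasks that must run before this one

    def __rshift__(self, other):
        other.upstream.append(self)  # left >> right: left runs first
        return other                 # returning the right side allows chaining

t1, t2, t3 = Task("task1"), Task("task2"), Task("task3")
t1 >> t2
t1 >> t3
t2 >> t3

print([t.task_id for t in t3.upstream])  # → ['task1', 'task2']
```

This is also why '<<' (implemented via __lshift__ in Airflow) reverses the direction rather than failing outright, while '==' merely compares the two objects and records nothing.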