Complete the code to import the DAG class from Airflow.
from airflow.[1] import DAG
The DAG class lives in the airflow.models module, so [1] is models. (In Airflow 2 you can also import DAG directly from the top-level airflow package.)
Complete the code to define the DAG's start date using Airflow's datetime utility.
from airflow.utils.dates import [1]
start_date = [1](1)
The days_ago function sets a relative start date: it returns a timezone-aware datetime at midnight UTC the given number of days before today, which avoids the pitfalls of naive start dates. (Note that days_ago is deprecated in newer Airflow releases in favor of explicit timezone-aware datetimes.)
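To see what a relative, timezone-aware start date means in plain Python, here is a minimal sketch that approximates days_ago using only the standard library (an illustration of the idea, not Airflow's actual implementation):

```python
from datetime import datetime, timedelta, timezone

def days_ago_sketch(n):
    """Approximation of days_ago(n): midnight UTC, n days before today."""
    today = datetime.now(timezone.utc).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return today - timedelta(days=n)

start_date = days_ago_sketch(1)
print(start_date.tzinfo)  # timezone-aware: tzinfo is set, not None
```

Because the result is anchored to midnight UTC, every machine parsing the DAG computes the same start date regardless of its local timezone.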
Fix the error in the operator import statement so the DAG file parses without errors.
from airflow.operators.bash import [1]Operator
bash_task = [1]Operator(task_id='print_date', bash_command='date')
The correct class name is BashOperator, with an uppercase B and O. The import path uses the lowercase bash module, but the class name must match exactly or Python raises an ImportError when the DAG file is parsed.
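Conceptually, BashOperator hands its bash_command to a shell and fails the task if the command exits non-zero. A rough stdlib sketch of that behavior (not Airflow's actual implementation):

```python
import subprocess

# What running bash_command='date' amounts to: execute it in a shell,
# capture stdout, and treat a non-zero exit code as task failure.
result = subprocess.run('date', shell=True, capture_output=True, text=True)
print(result.stdout.strip())
assert result.returncode == 0  # a non-zero code would mark the task failed
```

This is why bash_command is just a string: it is interpreted by the shell, not by Python.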
Fill both blanks to correctly define a DAG with a daily schedule and a default argument for retries.
default_args = {'retries': [1]}
dag = DAG('example_dag', default_args=default_args, schedule_interval=[2])
Setting retries to 1 means a failed task will be retried once before being marked failed. The schedule_interval '@daily' runs the DAG once per day.
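The retries semantics can be illustrated with a small stand-alone sketch: one initial attempt plus up to retries re-attempts. (Illustration only; in Airflow the scheduler handles this, with configurable delays between attempts.)

```python
def run_with_retries(task, retries):
    """Run task; on failure, re-attempt up to `retries` more times."""
    attempts = 0
    while True:
        attempts += 1
        try:
            return task(), attempts
        except Exception:
            if attempts > retries:
                raise  # retries exhausted: the task is marked failed

calls = {'n': 0}
def flaky():
    calls['n'] += 1
    if calls['n'] < 2:
        raise RuntimeError('transient failure')
    return 'ok'

result, attempts = run_with_retries(flaky, retries=1)
print(result, attempts)  # first attempt fails, the single retry succeeds
```

With retries=1 the task body may therefore run up to twice in total, so task logic should be idempotent.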
Fill all three blanks to create a task using PythonOperator with a callable function and assign it to the DAG.
def my_task():
    print('Hello Airflow')

from airflow.operators.python import [1]

task = [1](task_id=[2], python_callable=[3], dag=dag)
The PythonOperator runs Python functions. The task_id is a string identifier, here 'print_hello'. The python_callable is the function my_task defined above.
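The key idea behind python_callable is that the function object itself is passed (my_task, with no parentheses) and the operator calls it later at execution time. A minimal sketch of that pattern without Airflow, using a hypothetical stand-in class:

```python
def my_task():
    return 'Hello Airflow'

class PythonOperatorSketch:
    """Hypothetical stand-in for PythonOperator, to show the pattern."""
    def __init__(self, task_id, python_callable):
        self.task_id = task_id
        self.python_callable = python_callable  # stored, not called yet

    def execute(self):
        # The callable is only invoked when the task actually runs.
        return self.python_callable()

task = PythonOperatorSketch(task_id='print_hello', python_callable=my_task)
print(task.execute())  # -> Hello Airflow
```

Writing python_callable=my_task() instead would call the function at DAG-parse time and pass its return value, which is a common mistake.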