Complete the code to set the task failure callback in Airflow.
task = PythonOperator(task_id='my_task', python_callable=my_func, on_failure_callback=[1])
The on_failure_callback parameter takes a callable (for example, a send_email helper) that Airflow invokes when the task fails, making failures visible instead of silent.
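A filled-in sketch of the answer, assuming Airflow 2.x; notify_failure and my_func are placeholder names introduced here, not part of the exercise:

```python
from airflow.operators.python import PythonOperator

def notify_failure(context):
    # Airflow passes the task-instance context dict to the callback.
    ti = context["task_instance"]
    print(f"Task {ti.task_id} failed in run {context['run_id']}")

def my_func():
    pass

task = PythonOperator(
    task_id="my_task",
    python_callable=my_func,
    on_failure_callback=notify_failure,  # invoked once when the task fails
)
```

The callback runs inside the worker process, so anything it does (emailing, posting to Slack) should itself be quick and failure-tolerant.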
Complete the code to add a sensor that waits for a file before running the task.
wait_for_file = [1](task_id='wait_for_file', filepath='/data/input.csv')
FileSensor waits for a file to appear before the downstream task runs, preventing silent failures due to missing input.
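A complete sketch with the blank filled in, assuming Airflow 2.x and the default filesystem connection; the poke_interval and timeout values are illustrative choices, not part of the exercise:

```python
from airflow.sensors.filesystem import FileSensor

wait_for_file = FileSensor(
    task_id="wait_for_file",
    filepath="/data/input.csv",
    poke_interval=60,   # re-check for the file every 60 seconds
    timeout=60 * 60,    # fail the sensor after one hour of waiting
)
# wait_for_file >> process_task  # downstream task runs only once the file exists
```

Without a timeout, a sensor waits indefinitely (Airflow's default is 7 days), so setting one turns a missing input into a loud, scheduled failure.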
Fix the error in the DAG definition to enable email alerts on failure.
default_args = {'owner': 'airflow', 'email': ['admin@example.com'], 'email_on_failure': [1]}
dag = DAG('my_dag', default_args=default_args, schedule_interval='@daily')
Setting email_on_failure to True enables sending email alerts when a task fails (Airflow's SMTP settings must be configured for the emails to actually go out).
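A filled-in sketch, assuming Airflow 2.x; the start_date is added here because a scheduled DAG requires one, and it is an illustrative value rather than part of the exercise:

```python
from datetime import datetime
from airflow import DAG

default_args = {
    "owner": "airflow",
    "email": ["admin@example.com"],
    "email_on_failure": True,  # email the listed addresses when any task fails
}

dag = DAG(
    "my_dag",
    default_args=default_args,
    start_date=datetime(2024, 1, 1),  # required for scheduling; illustrative
    schedule_interval="@daily",
)
```

Entries in default_args are applied to every task in the DAG, so email_on_failure set here covers all tasks without repeating it per operator.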
Fill both blanks to create a task that retries 3 times with a 5-minute delay.
task = PythonOperator(task_id='retry_task', python_callable=my_func, retries=[1], retry_delay=timedelta(minutes=[2]))
Setting retries=3 and retry_delay=timedelta(minutes=5) makes the task retry 3 times with 5 minutes between attempts, helping catch transient failures.
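Both blanks filled in, as a sketch assuming Airflow 2.x; my_func is a placeholder callable:

```python
from datetime import timedelta
from airflow.operators.python import PythonOperator

def my_func():
    pass

task = PythonOperator(
    task_id="retry_task",
    python_callable=my_func,
    retries=3,                          # up to 3 extra attempts after the first failure
    retry_delay=timedelta(minutes=5),   # wait 5 minutes between attempts
)
```

retry_delay accepts any timedelta, and retries counts attempts after the first, so this task may run up to four times in total.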
Fill all three blanks to create a DAG with a start date, schedule, and catchup disabled.
dag = DAG('example_dag', start_date=datetime([1], 1, 1), schedule_interval='[2]', catchup=[3])
Setting start_date=datetime(2024, 1, 1), schedule_interval='@daily', and catchup=False creates a DAG that runs daily from 2024 without backfilling missed runs; disabling catchup prevents a surprise burst of backfill runs when the DAG is first enabled.
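All three blanks filled in, as a sketch assuming Airflow 2.x:

```python
from datetime import datetime
from airflow import DAG

dag = DAG(
    "example_dag",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,  # only schedule runs from "now" onward; skip the 2024 backlog
)
```

With catchup=True (the historical default), enabling this DAG would immediately queue one run per day between start_date and today, which is rarely what you want for a new pipeline.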