Given the following Airflow custom operator code snippet, what will be the output when the operator runs successfully?
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

class HelloOperator(BaseOperator):
    @apply_defaults
    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        message = f"Hello, {self.name}!"
        print(message)
        return message

# Assume this operator is instantiated and run with name='Airflow'
Look at the execute method and what it prints and returns.
The execute method prints and returns a message built from the name passed at initialization. Since name='Airflow', the operator prints "Hello, Airflow!" and returns that same string.
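This behavior can be checked without a running Airflow installation. The sketch below stubs out BaseOperator entirely and keeps only the logic of execute, so the class name and signature here are stand-ins for illustration:

```python
# Minimal stand-in for the operator above; BaseOperator plumbing is
# omitted so this runs without Airflow installed.
class HelloOperator:
    def __init__(self, name: str, **kwargs):
        self.name = name

    def execute(self, context):
        message = f"Hello, {self.name}!"
        print(message)
        return message

op = HelloOperator(name='Airflow')
result = op.execute(context={})
# result == "Hello, Airflow!"
```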
Which DAG configuration snippet correctly uses a custom operator MyOperator with task_id task1?
from airflow import DAG
from datetime import datetime
from my_operators import MyOperator

default_args = {
    'start_date': datetime(2024, 1, 1),
}

dag = DAG('my_dag', default_args=default_args, schedule_interval='@daily')

# Fill in the correct task definition below
Remember that task_id is a required named argument for operators.
The correct way to instantiate an operator in a DAG is to pass task_id as a named argument and attach the task to the DAG (via the dag argument or a with DAG(...) block). Option D does this correctly.
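Assuming the imports and dag object from the snippet above, the task definition would look like the following sketch (not runnable outside an Airflow environment):

```python
# task_id is a required named argument; dag attaches the task to the DAG.
task1 = MyOperator(task_id='task1', dag=dag)
```

Equivalently, instantiating the operator inside a with DAG(...) block attaches it to that DAG without the explicit dag argument.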
What error will Airflow raise if a custom operator class does not implement the execute method?
from airflow.models import BaseOperator

class BrokenOperator(BaseOperator):
    pass

# When Airflow tries to run BrokenOperator, what happens?
BaseOperator uses Python's abstract base class mechanism to enforce execute implementation.
BaseOperator declares execute as an abstract method, so a subclass that does not implement it raises a TypeError as soon as it is instantiated.
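The abstract-base-class mechanism behind this answer can be demonstrated with plain Python, independent of Airflow. The class names below are illustrative stand-ins, not Airflow's own:

```python
from abc import ABC, abstractmethod

# Stand-in base class using Python's ABC machinery.
class Base(ABC):
    @abstractmethod
    def execute(self, context):
        ...

class Broken(Base):
    pass  # does not implement execute

err = ""
try:
    Broken()  # instantiation fails, not the call to execute
except TypeError as exc:
    err = str(exc)  # e.g. "Can't instantiate abstract class Broken ..."
```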
Given a DAG with two custom operators op1 and op2 where op2 depends on op1, what is the correct order of execution?
op1 = MyOperator(task_id='op1', dag=dag)
op2 = MyOperator(task_id='op2', dag=dag)
op1 >> op2
The >> operator sets task dependencies in Airflow.
op1 >> op2 means op1 must finish before op2 starts, so op1 runs first, then op2.
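A toy model shows how >> can record a downstream dependency and why the upstream task runs first. This is a simplified sketch of the idea, not Airflow's actual scheduler or Task API:

```python
# Toy task: __rshift__ implements `>>` by recording downstream tasks.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []

    def __rshift__(self, other):
        self.downstream.append(other)
        return other  # lets chains like a >> b >> c work

def run_in_order(root):
    """Run tasks breadth-first from the root, upstream before downstream."""
    order, queue = [], [root]
    while queue:
        task = queue.pop(0)
        order.append(task.task_id)
        queue.extend(task.downstream)
    return order

op1 = Task('op1')
op2 = Task('op2')
op1 >> op2
order = run_in_order(op1)
# order == ['op1', 'op2']
```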
Which approach is best for unit testing the execute method of a custom Airflow operator?
Unit tests should be automated and isolated from external systems.
Mocking the Airflow context and dependencies allows isolated, automated testing of the execute method without needing a full Airflow environment.
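A sketch of such a test is below. The operator is redefined without Airflow so the example is self-contained; in a real project you would import your operator and pass a mocked context object:

```python
from unittest.mock import MagicMock

# Stand-in operator under test (in practice, imported from your project).
class HelloOperator:
    def __init__(self, name):
        self.name = name

    def execute(self, context):
        return f"Hello, {self.name}!"

def test_execute_returns_greeting():
    op = HelloOperator(name='Airflow')
    context = MagicMock()  # mocked Airflow context; no scheduler needed
    assert op.execute(context) == "Hello, Airflow!"

test_execute_returns_greeting()
```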