
Testing custom operators in Apache Airflow - Commands & Configuration

Introduction
When you create your own custom operator in Airflow, you need to test it to make sure it works correctly before using it in real workflows. Testing helps catch mistakes early and ensures your operator behaves as expected.
When you build a new operator to perform a specific task in your data pipeline.
When you want to verify that your operator handles inputs and outputs correctly.
When you need to check that your operator interacts properly with other Airflow components.
When you want to prevent errors in production by catching bugs during development.
When you update an existing operator and want to confirm it still works as intended.
Config File - test_custom_operator.py
import unittest
from datetime import datetime

from airflow.models import DAG
from airflow.operators.python import PythonOperator


# Example custom operator that overrides execute
class MyCustomOperator(PythonOperator):
    def execute(self, context):
        return 'custom operator executed'


class TestMyCustomOperator(unittest.TestCase):
    def setUp(self):
        # A DAG object provides the context the operator expects
        self.dag = DAG(dag_id='test_dag', start_date=datetime(2024, 1, 1))

    def test_execute(self):
        operator = MyCustomOperator(
            task_id='test_task',
            python_callable=lambda: 'custom operator executed',
            dag=self.dag,
        )
        # Call execute directly with a context dict for fast feedback
        result = operator.execute(context={})
        self.assertEqual(result, 'custom operator executed')


if __name__ == '__main__':
    unittest.main()

This Python file contains a simple custom operator class MyCustomOperator that extends Airflow's PythonOperator. It overrides the execute method to return a fixed string.

The TestMyCustomOperator class uses Python's unittest framework to test the operator. It creates a DAG for context, runs the operator's execute method, and checks that the output matches the expected string.

This test file can be run directly to verify the custom operator works as intended.
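A related pattern worth a sketch (transform_payload is a hypothetical helper, not part of the file above): factor the operator's core logic into a plain Python function that execute delegates to. That logic can then be unit-tested without importing Airflow at all, keeping the test fast and dependency-free.

```python
import unittest

# Hypothetical helper holding the core logic a custom operator's
# execute() method would delegate to; testable without Airflow.
def transform_payload(payload):
    return {key: str(value).upper() for key, value in payload.items()}

class TestTransformPayload(unittest.TestCase):
    def test_uppercases_values(self):
        result = transform_payload({'status': 'ok', 'env': 'dev'})
        self.assertEqual(result, {'status': 'OK', 'env': 'DEV'})

# Run with: python -m unittest <this file>
```

Inside the real operator, execute(self, context) would simply return transform_payload(...), so the Airflow-dependent surface stays thin.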

Commands
Run the test file to execute the unit test for the custom operator. This checks if the operator returns the expected result.
Terminal
python test_custom_operator.py
Expected Output
.
----------------------------------------------------------------------
Ran 1 test in 0.001s

OK
List all DAGs to confirm Airflow recognizes the test DAG after you add the custom operator.
Terminal
airflow dags list
Expected Output
dag_id
-------
test_dag
Key Concept

If you remember nothing else from this pattern, remember: always write unit tests that run your custom operator's execute method and check its output to catch errors early.

Common Mistakes
Not providing a DAG context when creating the operator instance in tests.
Airflow operators require a DAG context; without it, the operator may fail or behave unexpectedly.
Always create a DAG object and pass it to the operator during testing.
Not running the execute method directly in tests, instead relying on the Airflow scheduler.
The scheduler runs asynchronously and is harder to test; unit tests should run execute directly for fast feedback.
Call the execute method directly with a context dictionary in your unit tests.
Ignoring the output of the execute method in tests.
Without checking output, you cannot confirm the operator behaves as expected.
Use assertions to compare the execute method's return value to the expected result.
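The second and third fixes can be sketched together in one test (FakeOperator is a stand-in class for illustration, not an Airflow API; the DAG-context fix is already shown in test_custom_operator.py above):

```python
import unittest

# FakeOperator is a stand-in for illustration only; in a real test
# you would instantiate your own Airflow operator instead.
class FakeOperator:
    def execute(self, context):
        # Echo a value from the context so the test has output to check
        return f"processed {context.get('ds', 'unknown')}"

class TestFakeOperator(unittest.TestCase):
    def test_execute_returns_expected_string(self):
        operator = FakeOperator()
        # Call execute directly instead of going through the scheduler
        result = operator.execute(context={'ds': '2024-01-01'})
        # Assert on the return value instead of ignoring it
        self.assertEqual(result, 'processed 2024-01-01')

# Run with: python -m unittest <this file>
```

Because execute is called directly with an explicit context dict, the test gives immediate feedback without a running scheduler or metadata database.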
Summary
Create a test Python file that defines your custom operator and a unittest class.
Write a test method that creates the operator with a DAG context and calls execute.
Run the test file with Python to verify the operator works as expected.
Use airflow CLI commands to check your DAGs are loaded correctly.