Airflow · How-To · Beginner · 3 min read

How to Use @task Decorator in Airflow for Simple Task Creation

In Airflow, use the @task decorator to turn a Python function into a task within a DAG. This simplifies task creation by avoiding explicit Operators and lets you write clean, readable workflows.

📐 Syntax

The @task decorator is placed above a Python function to convert it into an Airflow task. The function can then be called inside a DAG context to create task instances.

Key parts:

  • @task: Decorator to mark a function as a task.
  • Function definition: Your task logic goes here.
  • Calling the function inside the DAG: Creates the task instance.
```python
from airflow.decorators import task
from airflow import DAG
from datetime import datetime

default_args = {
    'start_date': datetime(2024, 1, 1)
}

with DAG('example_task_decorator', default_args=default_args, schedule_interval='@daily') as dag:
    @task
    def greet():
        print('Hello from Airflow task!')

    greet_task = greet()
```

💻 Example

This example shows a DAG with two tasks created using the @task decorator. The first task returns a message, and the second task prints it. The tasks are linked by calling the second task with the output of the first.

```python
from airflow.decorators import task
from airflow import DAG
from datetime import datetime

default_args = {
    'start_date': datetime(2024, 1, 1)
}

with DAG('task_decorator_example', default_args=default_args, schedule_interval='@daily') as dag:

    @task
    def get_message():
        return 'Hello from Airflow!'

    @task
    def print_message(msg):
        print(msg)

    message = get_message()
    print_message(message)
```

Output:

```
Hello from Airflow!
```

⚠️ Common Pitfalls

Common mistakes when using @task include:

  • Not calling the decorated function inside the DAG context, so no task is created.
  • Relying on print for output: stdout is usually captured in task logs, but Airflow's logging module gives you log levels and consistent formatting.
  • Mixing @task with traditional Operators incorrectly.

Always call the decorated function to create the task instance, and use logging for output in production.

```python
from airflow.decorators import task
from airflow import DAG
from datetime import datetime
import logging

default_args = {
    'start_date': datetime(2024, 1, 1)
}

with DAG('common_pitfalls', default_args=default_args, schedule_interval='@daily') as dag:

    @task
    def wrong_task():
        print('print output is usually captured in task logs, but logging is preferred')

    @task
    def right_task():
        logging.info('This is the proper way to log in Airflow')

    # Missing call - no task created
    # wrong_task()

    # Correct call
    right_task()
```

📊 Quick Reference

Tips for using @task decorator:

  • Use @task to simplify task creation without Operators.
  • Always call the decorated function inside the DAG to create tasks.
  • Use return values to pass data between tasks.
  • Use Airflow's logging module for output instead of print.
  • Combine with @dag decorator for clean DAG definitions.

Key Takeaways

  • Use the @task decorator to convert Python functions into Airflow tasks easily.
  • Always call the decorated function inside the DAG context to create the task instance.
  • Return values from @task functions can be used to pass data between tasks.
  • Use Airflow's logging module for task output instead of print statements.
  • Combine @task with the @dag decorator for clean and readable DAG code.