In Apache Airflow, the Celery Executor allows distributed task execution. Which statement best describes how the Celery Executor distributes tasks?
Think about how tasks are sent to different machines to run at the same time.
The Celery Executor sends tasks to multiple worker nodes through a message broker like RabbitMQ or Redis. This allows tasks to run in parallel on different machines, improving scalability.
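The pieces described above can be sketched as a minimal deployment; the hostnames and Redis port are illustrative:

```shell
# On the broker host: start Redis, which holds the queue of task messages
redis-server --port 6379

# On each worker machine: start an Airflow Celery worker that pulls
# task messages from the broker and executes them locally
airflow celery worker

# On the scheduler machine: the scheduler enqueues runnable task
# instances onto the broker for any worker to pick up
airflow scheduler
```

Adding capacity is then just a matter of running `airflow celery worker` on more machines pointed at the same broker.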
What is the expected output when you start an Airflow Celery worker with the command airflow celery worker?
airflow celery worker
Consider what a worker process does when it starts.
When you run airflow celery worker, Airflow starts a Celery worker process that connects to the message broker, registers itself, and waits for task messages. The startup output includes the worker's hostname, the broker and result-backend URLs, the concurrency setting, and the queues it listens on, ending with a 'ready' message.
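The startup banner looks roughly like the following (the hostname, version, and URLs are illustrative, and the exact layout varies by Celery version):

```
celery@worker-1 v5.3.x

[config]
.> transport:   redis://localhost:6379/0
.> results:     db+sqlite:///results.db
.> concurrency: 16 (prefork)

[queues]
.> default

[INFO/MainProcess] celery@worker-1 ready.
```

The final 'ready' line is the signal that the worker has connected to the broker and can start receiving tasks.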
Which airflow.cfg snippet correctly configures Airflow to use the Celery Executor with Redis as the broker?
Check the executor type and the broker URL format carefully.
The executor must be set to CeleryExecutor in the [core] section, while broker_url and result_backend belong in the [celery] section. The broker_url for Redis includes the database number (/0), and result_backend takes a SQLAlchemy-style URL with a db+ prefix, which can point at SQLite for a single-machine test or at PostgreSQL/MySQL in production.
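A minimal airflow.cfg snippet consistent with this explanation might look as follows (host, port, and the SQLite path are placeholders):

```ini
[core]
executor = CeleryExecutor

[celery]
# Redis broker URL: redis://<host>:<port>/<database number>
broker_url = redis://localhost:6379/0
# SQLAlchemy-style result backend (db+ prefix); SQLite is fine for a
# single-machine test, use PostgreSQL/MySQL for real deployments
result_backend = db+sqlite:///results.db
```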
You start a Celery worker but it fails to connect to the message broker. Which log message indicates the cause?
Look for connection or network errors in logs.
A 'Connection refused' error in the worker log means the Celery worker cannot reach the Redis broker, typically because Redis is not running, is listening on a different host or port, or the broker_url is wrong.
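Assuming Redis is the broker, a quick way to confirm it is reachable is with redis-cli, which ships with Redis (host and port here are illustrative):

```shell
# Check that Redis is up at the address from broker_url;
# a healthy server replies with PONG
redis-cli -h localhost -p 6379 ping

# If this also fails with "Connection refused", start Redis first
redis-server --daemonize yes
```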
Arrange the steps in the correct order to set up the Celery Executor for distributed task execution in Airflow.
Think about what must be ready before starting workers and scheduler.
First, the message broker must be installed and running. Then configure Airflow to use CeleryExecutor with the broker URL. Next, start the Celery workers so they can receive tasks. Finally, start the scheduler to send tasks to workers.
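The ordering above can be sketched as a command sequence (a single-machine sketch; the broker choice and URL are illustrative, and the configuration could equally be set by editing airflow.cfg):

```shell
# 1. Install and start the message broker (Redis here)
redis-server --daemonize yes

# 2. Configure Airflow to use CeleryExecutor with the broker URL
#    (environment variables override the matching airflow.cfg keys)
export AIRFLOW__CORE__EXECUTOR=CeleryExecutor
export AIRFLOW__CELERY__BROKER_URL=redis://localhost:6379/0

# 3. Start one or more Celery workers so they can receive tasks
airflow celery worker &

# 4. Start the scheduler, which sends tasks to the workers
airflow scheduler
```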