How to Use Celery with FastAPI for Background Tasks
To use Celery with FastAPI, set up a Celery app with a message broker like Redis, then define Celery tasks and call them asynchronously from FastAPI endpoints. This lets FastAPI respond to requests quickly while Celery processes long-running jobs in the background.
Syntax
Here is the basic syntax to integrate Celery with FastAPI:
- Celery(app_name, broker='redis://localhost:6379/0'): Creates a Celery instance with Redis as the message broker.
- @celery_app.task: Decorator to define a background task.
- task.delay(args): Call to run the task asynchronously.
- FastAPI(): Creates the FastAPI app.
- @app.post(): Defines an API endpoint that triggers the Celery task.
```python
from fastapi import FastAPI
from celery import Celery

# Celery instance using Redis as the message broker
celery_app = Celery('worker', broker='redis://localhost:6379/0')

@celery_app.task
def add(x, y):
    return x + y

app = FastAPI()

@app.post('/add')
def call_add(x: int, y: int):
    # .delay() queues the task and returns immediately
    task = add.delay(x, y)
    return {'task_id': task.id, 'status': 'Task started'}
```
Example
This example shows a FastAPI app that triggers a Celery task to add two numbers asynchronously. The task runs in the background, letting FastAPI respond immediately with the task ID.
```python
from fastapi import FastAPI
from celery import Celery

celery_app = Celery('worker', broker='redis://localhost:6379/0')

@celery_app.task
def add(x, y):
    return x + y

app = FastAPI()

@app.post('/add')
def call_add(x: int, y: int):
    task = add.delay(x, y)
    return {'task_id': task.id, 'status': 'Task started'}

# To run:
# 1. Start the Redis server
# 2. Start a Celery worker: celery -A <filename_without_py> worker --loglevel=info
# 3. Run the FastAPI app:   uvicorn <filename_without_py>:app --reload
```
Output
{"task_id": "some-task-id", "status": "Task started"}
Common Pitfalls
- Not running a message broker: Celery needs Redis or RabbitMQ running; otherwise, tasks won't queue.
- Forgetting to start the Celery worker: Tasks won't execute if the worker process is not running.
- Calling tasks directly: add(x, y) runs synchronously; use add.delay(x, y) to run asynchronously.
- Incorrect broker URL: Make sure the broker URL matches your Redis or RabbitMQ setup.
```python
# Wrong way (runs synchronously, blocking the FastAPI endpoint):
# result = add(2, 3)

# Right way (queues the task for the worker):
# result = add.delay(2, 3)
```
Quick Reference
Summary tips for using Celery with FastAPI:
- Use Redis or RabbitMQ as the broker.
- Define tasks with @celery_app.task.
- Call tasks with task.delay() to run asynchronously.
- Run a Celery worker process separately.
- Keep FastAPI endpoints fast by offloading heavy work to Celery.
Key Takeaways
- Set up Celery with a message broker like Redis to handle background tasks.
- Define tasks with @celery_app.task and call them asynchronously using task.delay().
- Always run a separate Celery worker process to execute tasks.
- Use FastAPI endpoints to trigger tasks without blocking request handling.
- Check the broker URL and worker status to avoid common errors.