How to Use Celery with Flask for Background Tasks
To use Celery with Flask, create a Celery instance configured with a message broker such as Redis, then define tasks as functions decorated with @celery.task. Run the Celery worker in a separate process so it executes tasks asynchronously while your Flask app handles web requests.
Syntax
To integrate Celery with Flask, first create a Celery object and configure it with a broker URL (such as a Redis URL). Then define tasks using the @celery.task decorator. Finally, call these tasks asynchronously using .delay().
- Celery(app_name, broker=broker_url): Creates a Celery instance linked to your Flask app.
- @celery.task: Decorates a function to mark it as a background task.
- task.delay(args): Calls the task asynchronously.
```python
from flask import Flask
from celery import Celery

def make_celery(app):
    celery = Celery(
        app.import_name,
        broker=app.config['CELERY_BROKER_URL']
    )
    celery.conf.update(app.config)
    return celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
celery = make_celery(app)

@celery.task
def add(x, y):
    return x + y

result = add.delay(4, 6)
```
Example
This example shows a simple Flask app integrated with Celery using Redis as the broker. It defines a task to add two numbers asynchronously and returns the task ID immediately. The Celery worker processes the task in the background.
```python
from flask import Flask, jsonify, request
from celery import Celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

@celery.task
def add(x, y):
    return x + y

@app.route('/add', methods=['POST'])
def add_numbers():
    data = request.get_json()
    task = add.delay(data['x'], data['y'])
    return jsonify({'task_id': task.id}), 202

@app.route('/result/<task_id>')
def get_result(task_id):
    task = add.AsyncResult(task_id)
    if task.state == 'PENDING':
        response = {'state': task.state, 'result': None}
    elif task.state != 'FAILURE':
        response = {'state': task.state, 'result': task.result}
    else:
        response = {'state': task.state, 'result': str(task.info)}
    return jsonify(response)

if __name__ == '__main__':
    app.run(debug=True)
```
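To try the example end to end, three processes must run at once: Redis, the Celery worker, and the Flask app. A typical setup, assuming the code above is saved as app.py and Redis is installed locally, looks like this (the module name app is an assumption based on that filename):

```shell
# Terminal 1: start Redis (the message broker)
redis-server

# Terminal 2: start the Celery worker, pointing -A at the celery
# instance defined in app.py
celery -A app.celery worker --loglevel=info

# Terminal 3: run the Flask development server
python app.py
```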
Output
Running Flask app on http://127.0.0.1:5000
POST /add with JSON {"x": 5, "y": 7} returns {"task_id": "some-task-id"}
GET /result/some-task-id returns {"state": "SUCCESS", "result": 12}
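The submit-then-poll flow shown above (delay returns a task handle immediately; the result is fetched later) mirrors Python's own futures. As a rough stdlib analogy that needs no Celery or Redis, the same pattern looks like:

```python
# Stdlib analogy of Celery's delay/AsyncResult pattern: submit work,
# get a handle back immediately, collect the result later.
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

with ThreadPoolExecutor() as pool:
    future = pool.submit(add, 5, 7)   # returns at once, like add.delay(5, 7)
    print(future.done())              # may still be False while work is pending
    print(future.result())            # blocks until finished, like task.result -> 12
```

The key difference is that Celery runs the function in a separate worker process (often on another machine) and ships arguments and results through the broker, whereas a ThreadPoolExecutor stays inside the current process.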
Common Pitfalls
Common mistakes when using Celery with Flask include:
- Not running the Celery worker separately, so tasks never execute.
- Misconfiguring the broker URL or result backend, causing connection errors.
- Defining tasks inside the Flask app without proper Celery context, leading to import or serialization issues.
- Calling tasks synchronously instead of using .delay(), which blocks the Flask app.
Always start the Celery worker with celery -A your_module.celery worker --loglevel=info in a separate terminal.
```python
# Wrong: calling the task function directly (runs synchronously, blocks Flask)
result = add(4, 6)

# Right: call the task asynchronously (runs in the background worker)
result = add.delay(4, 6)
```
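Broker misconfiguration is easiest to avoid once the URL's anatomy is clear. Redis URLs follow the standard scheme://host:port/db shape, so they can be sanity-checked with the stdlib urlparse (the password shown in the comment is a hypothetical illustration):

```python
# Anatomy of a Redis broker URL:
#   redis://localhost:6379/0
#   redis://:mypassword@redis-host:6379/1   (with auth and a non-default db)
from urllib.parse import urlparse

url = urlparse('redis://localhost:6379/0')
print(url.scheme)    # redis
print(url.hostname)  # localhost
print(url.port)      # 6379
print(url.path)      # /0  (the Redis database number)
```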
Quick Reference
Summary tips for using Celery with Flask:
- Configure CELERY_BROKER_URL and CELERY_RESULT_BACKEND in the Flask config.
- Create the Celery instance with the Flask app's configuration.
- Define tasks with @celery.task.
- Call tasks asynchronously with .delay().
- Run the Celery worker separately to process tasks.
Key Takeaways
- Configure Celery with Flask using a message broker like Redis for asynchronous task processing.
- Define background tasks with @celery.task and call them using .delay() to avoid blocking Flask requests.
- Always run the Celery worker separately to execute tasks in the background.
- Ensure broker and result backend URLs are correctly set in the Flask config to avoid connection issues.
- Use Flask routes to trigger tasks and check their status asynchronously.