Given this Celery task defined in a Flask app, what will result.get() return?
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y

result = add.delay(4, 5)
Remember that delay() queues the task and get() waits for the result.
The add task adds two numbers. Calling add.delay(4, 5) queues the task. result.get() blocks until the task finishes and returns the sum, which is 9. (In practice this also requires a running worker and a configured result backend; with only a broker configured, get() cannot retrieve the result.)
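The queue-then-wait flow can be sketched with a toy stand-in for Celery (the names below mimic Celery's API but nothing is imported from it; real Celery hands the call to a broker and a worker runs it, while this sketch runs it inline so it is self-contained):

```python
# Toy model of Celery's delay()/get() flow, for illustration only.
class FakeAsyncResult:
    def __init__(self, value):
        self._value = value

    def get(self, timeout=None):
        # Real get() blocks until a worker finishes; our "worker" already ran.
        return self._value

class FakeTask:
    def __init__(self, fn):
        self._fn = fn

    def delay(self, *args, **kwargs):
        # Real delay() serializes the call onto the broker queue.
        return FakeAsyncResult(self._fn(*args, **kwargs))

def task(fn):
    return FakeTask(fn)

@task
def add(x, y):
    return x + y

result = add.delay(4, 5)
print(result.get())  # 9
```

The shape of the call site is identical to real Celery; only the plumbing behind delay() and get() differs.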
Choose the correct way to define a Celery task function in a Flask app.
Remember that a decorator must appear directly above the function definition, and that @app.task can be written with or without parentheses.
Option A correctly places the @app.task decorator directly above the function definition. Option B places the decorator after the function, which is invalid syntax. Option C uses @app.task() with parentheses, which Celery also accepts. Option D prints instead of returning, so the task's result is None.
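Celery's task decorator accepts both the bare and the parenthesized form because it checks whether it was handed a function directly or called with options first. A minimal sketch of that dual behavior (toy code, not Celery's actual implementation):

```python
def task(fn=None, **options):
    """Toy decorator usable as @task or @task(...), like Celery's app.task."""
    def wrap(f):
        f.options = options   # stand-in for task registration
        f.is_task = True
        return f
    if fn is not None:        # used bare: @task
        return wrap(fn)
    return wrap               # used with parentheses: @task(...)

@task
def add(x, y):
    return x + y

@task(name='tasks.mul')
def mul(x, y):
    return x * y

print(add.is_task, mul.options)  # True {'name': 'tasks.mul'}
```

Both functions end up decorated; the parenthesized form simply lets you pass options through before the function is wrapped.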
Consider this Celery task call:
result = add.delay(2, 3)
What is the value of result.status right after this line?
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y

result = add.delay(2, 3)
Think about the task lifecycle immediately after queuing.
When a task is queued with delay(), its initial status is PENDING until a worker picks it up. So immediately after calling delay(), result.status is "PENDING".
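The status transition can be mimicked with a toy result object (illustrative only; real Celery results also pass through states such as STARTED, RETRY, or FAILURE):

```python
# Toy sketch of a task result's status lifecycle.
class FakeResult:
    def __init__(self):
        self.status = 'PENDING'   # initial state right after delay()
        self._value = None

    def finish(self, value):
        # What a worker would do when it completes the task.
        self._value = value
        self.status = 'SUCCESS'

result = FakeResult()
print(result.status)   # PENDING: no worker has picked it up yet
result.finish(5)
print(result.status)   # SUCCESS
```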
Given this task definition, why does calling multiply.delay(3, 4) raise an error?
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

def multiply(x, y):
    return x * y

app.task(multiply)
Check how the task decorator is used in Celery.
Option A is correct: app.task(multiply) returns a new task object, but that return value is discarded, so the name multiply still refers to the plain, undecorated function, which has no delay attribute. The fix is to apply @app.task as a decorator above the function definition (or to rebind the name: multiply = app.task(multiply)).
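The underlying issue is plain Python decorator semantics: calling a decorator without rebinding the name leaves the original function untouched. A minimal demonstration with a toy decorator (not Celery itself):

```python
class FakeTask:
    """Toy wrapper standing in for the object app.task() returns."""
    def __init__(self, fn):
        self._fn = fn

    def delay(self, *args):
        return self._fn(*args)

def task(fn):
    return FakeTask(fn)

def multiply(x, y):
    return x * y

task(multiply)                     # return value discarded: name unchanged
print(hasattr(multiply, 'delay'))  # False -> multiply.delay raises AttributeError

multiply = task(multiply)          # rebinding fixes it (what @task does for you)
print(multiply.delay(3, 4))        # 12
```

The @ syntax is just sugar for that final rebinding line, which is exactly what the broken code skips.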
In a Flask app using Celery, how does Celery find and register tasks defined in different modules?
Think about Python import behavior and how decorators work.
Celery does not register tasks by scanning source files. A task is registered when its module is imported, because that is when the @app.task decorator runs. If a module is never imported (for example, listed via the include= argument to Celery() or picked up by autodiscover_tasks()), Celery won't know about the tasks it defines.
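Why importing matters can be shown with a toy registry: the decorator adds an entry only when the defining module is actually executed, i.e. imported (in real apps, Celery's include= argument or autodiscover_tasks() triggers those imports). A self-contained sketch:

```python
# Toy task registry: registration is a side effect of the decorator running,
# which happens when the module containing the task is imported.
REGISTRY = {}

def task(fn):
    REGISTRY[fn.__name__] = fn   # runs at import time
    return fn

# Imagine this block lives in tasks_a.py; importing that module runs it:
@task
def add(x, y):
    return x + y

print('add' in REGISTRY)   # True: the decorator ran
print('mul' in REGISTRY)   # False: its defining module was never imported
```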