Function score query in Elasticsearch - Time & Space Complexity
When you use a `function_score` query in Elasticsearch, it's important to know how the query's execution time changes as the data grows. In particular, we want to understand how the scoring functions affect the total work Elasticsearch does.
Analyze the time complexity of the following function score query.
```json
{
  "query": {
    "function_score": {
      "query": { "match": { "field": "value" } },
      "functions": [
        { "weight": 2 },
        { "random_score": {} }
      ],
      "score_mode": "sum"
    }
  }
}
```
This query matches documents on a field and then adjusts each document's score by applying multiple functions whose results are combined by summing (`"score_mode": "sum"`).
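To make the combination concrete, here is a minimal Python sketch of what the query above does per matched document. This is not Elasticsearch's implementation; the function name is illustrative, and it assumes the default `boost_mode` of `"multiply"` (function result multiplied into the query score) and a base query score of 1.0.

```python
import random

def function_score_sum(base_score: float, weight: float, rng: random.Random) -> float:
    """Simplified sketch of score_mode "sum" for the two functions above:
    a constant weight function plus a random_score function."""
    weight_component = weight          # the { "weight": 2 } function
    random_component = rng.random()    # the { "random_score": {} } function, in [0, 1)
    combined = weight_component + random_component  # "score_mode": "sum"
    return base_score * combined       # default "boost_mode": "multiply"

# Usage: score three matched documents with a fixed seed for reproducibility
rng = random.Random(42)
scores = [function_score_sum(1.0, 2.0, rng) for _ in range(3)]
```

With a base score of 1.0 and a weight of 2, every combined score lands in the range [2.0, 3.0): the fixed weight plus a random component below 1.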
Look for repeated work inside the query execution.
- Primary operation: Scoring each matched document by applying all scoring functions.
- How many times: Once for every document that matches the inner query.
As the number of matched documents grows, the scoring work grows too.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 documents x 2 functions = 20 scoring operations |
| 100 | 100 documents x 2 functions = 200 scoring operations |
| 1000 | 1000 documents x 2 functions = 2000 scoring operations |
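The table's arithmetic can be sketched as a tiny cost model. The helper name is illustrative, not an Elasticsearch API; it simply encodes "each matched document passes through every scoring function once."

```python
def scoring_operations(matched_docs: int, num_functions: int) -> int:
    # Each matched document is evaluated by every scoring function once.
    return matched_docs * num_functions

# Reproduce the table rows for the two-function query above
for n in (10, 100, 1000):
    print(n, "documents ->", scoring_operations(n, 2), "scoring operations")
```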
Pattern observation: The total scoring work grows directly with the number of matched documents.
Time Complexity: O(n)
This means the time to score grows in a straight line with the number of matched documents.
[X] Wrong: "Adding more scoring functions does not affect query time much."
[OK] Correct: Each scoring function runs for every matched document, so adding functions multiplies the work: n documents with f functions means roughly n × f scoring operations.
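You can verify the multiplication by counting function calls in a toy scorer. This is a hypothetical sketch, not Elasticsearch code: each "function" is a trivial Python callable, and we count how many times it runs.

```python
call_count = 0

def counted_function(doc):
    """A stand-in scoring function that just counts its invocations."""
    global call_count
    call_count += 1
    return 1.0

def score_all(docs, functions):
    # Every function runs once per matched document: n * f calls total,
    # combined per document with sum() to mirror score_mode "sum".
    return [sum(f(d) for f in functions) for d in docs]

docs = list(range(100))
score_all(docs, [counted_function] * 3)   # 3 functions over 100 documents
# call_count is now 100 documents x 3 functions = 300
```

Doubling the number of functions doubles `call_count` for the same document set, which is exactly why "more functions don't matter" is wrong.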
Understanding how scoring functions affect query time helps you design efficient searches and explain performance clearly.
What if we added a filter that reduces matched documents by half? How would the time complexity change?