Perplexity for research and fact-checking in AI for Everyone - Time & Space Complexity
We want to understand how the effort of fact-checking or researching with Perplexity grows as the number of questions or the amount of data increases.
How does the time needed change as we ask more, or more complex, questions?
Analyze the time complexity of the following code snippet.
function fetchAnswers(questions) {
  let results = [];
  for (let q of questions) {
    let answer = queryPerplexityAPI(q); // one API call per question
    results.push(answer);
  }
  return results;
}
This code sends each question to Perplexity's API one by one and collects the answers.
- Primary operation: Sending a query to Perplexity API for each question.
- How many times: Once for each question in the input list.
Each new question adds one more API call, so the total work grows directly with the number of questions.
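To see this growth concretely, here is a minimal runnable sketch. It stubs out `queryPerplexityAPI` with a local counter (the real function would be a network call), so we can count exactly how many calls `fetchAnswers` makes:

```javascript
// Stub standing in for the real network call, so we can count invocations.
let apiCalls = 0;
function queryPerplexityAPI(question) {
  apiCalls++; // count each call instead of hitting the network
  return `answer to: ${question}`;
}

function fetchAnswers(questions) {
  let results = [];
  for (let q of questions) {
    results.push(queryPerplexityAPI(q)); // one call per question
  }
  return results;
}

// 100 questions in, 100 API calls out.
const questions = Array.from({ length: 100 }, (_, i) => `question ${i}`);
fetchAnswers(questions);
console.log(apiCalls); // 100
```

Doubling the length of `questions` doubles `apiCalls`: the hallmark of linear, O(n) growth.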
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 API calls |
| 100 | 100 API calls |
| 1000 | 1000 API calls |
Pattern observation: The time grows in a straight line as the number of questions increases.
Time Complexity: O(n)
This means the time needed grows directly in proportion to how many questions you ask.
[X] Wrong: "Adding more questions won't affect the total time much because the API is fast."
[OK] Correct: Even if each call is fast, doing many calls one after another adds up, so more questions mean more total time.
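A quick back-of-the-envelope calculation makes the point. Assuming a hypothetical 200 ms per API call (the real latency will vary), total time scales with n:

```javascript
// Illustrative arithmetic: total time = n calls x time per call.
// 200 ms is an assumed figure, not a measured Perplexity latency.
const msPerCall = 200;
const totals = {};
for (const n of [10, 100, 1000]) {
  totals[n] = (n * msPerCall) / 1000; // seconds
  console.log(`${n} questions -> ~${totals[n]} s total`);
}
```

Even a "fast" per-call time of 0.2 s turns into over three minutes for 1000 sequential questions.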
Understanding how running time grows with input size helps you predict how your code will behave on larger data, a key skill in real projects and interviews.
"What if we sent all questions in one batch request instead of one by one? How would the time complexity change?"