Best, average, and worst case analysis in Data Structures Theory - Time & Space Complexity
When we analyze time complexity, we want to know how the running time grows with the input. For the same input size, the time can also vary with the input's contents — for example, where a target value sits in a list.
So we ask: what are the fastest (best case), typical (average case), and slowest (worst case) times this task can take?
Analyze the time complexity of the following code snippet.
```javascript
function findNumber(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) {
      return i; // found: return the index immediately
    }
  }
  return -1; // not found after checking every element
}
```
This code searches for a number in a list and stops when it finds it or reaches the end.
Identify the repeating work: loops, recursion, or array traversals.
- Primary operation: Checking each item in the list one by one.
- How many times: Up to the length of the list, depending on when the target is found.
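To make the "up to the length of the list" point concrete, here is a short usage sketch (the array contents are made up for illustration) showing how the number of checks depends on where the target sits:

```javascript
function findNumber(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) {
      return i; // stop as soon as the target is found
    }
  }
  return -1; // target is not in the array
}

const data = [7, 3, 9, 4, 1];

console.log(findNumber(data, 7)); // 0  — best case: found at the first index, 1 check
console.log(findNumber(data, 1)); // 4  — found at the last index, 5 checks
console.log(findNumber(data, 8)); // -1 — worst case: not found, all 5 items checked
```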
Explain the growth pattern intuitively.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | Between 1 and 10 checks |
| 100 | Between 1 and 100 checks |
| 1000 | Between 1 and 1000 checks |
Pattern observation: The number of checks depends on where the target is. It can be very fast if found early or take longer if found late or not at all.
Time Complexity: O(1) best case (target at the first position), O(n) average and worst case.
This means in the worst case, the time grows linearly with the size of the input list.
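You can verify the growth pattern directly with an instrumented variant (a hypothetical helper, not part of the original code) that counts comparisons; in the worst case — target absent — the count equals the array length:

```javascript
// Same linear search, but also reporting how many checks it performed.
function findNumberCounted(arr, target) {
  let checks = 0;
  for (let i = 0; i < arr.length; i++) {
    checks++;
    if (arr[i] === target) {
      return { index: i, checks };
    }
  }
  return { index: -1, checks };
}

for (const n of [10, 100, 1000]) {
  const arr = Array.from({ length: n }, (_, i) => i); // [0, 1, ..., n-1]
  const { checks } = findNumberCounted(arr, -1);      // -1 is never present
  console.log(`n = ${n}: ${checks} checks`);          // checks === n
}
```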
[X] Wrong: "The search always takes the same time no matter what."
[OK] Correct: The time depends on where the target is. If it is at the start, the search is quick; if it is absent, every element must be checked.
Understanding best, average, and worst cases helps you explain how your code behaves in different situations. This shows you think about real-world use, not just theory.
"What if the list was sorted and you used a different search method? How would the time complexity change?"