Log file analysis in SEO Fundamentals - Time & Space Complexity
When analyzing log files, it's important to understand how processing time grows with log size: how does the work change as more log entries must be checked?
Analyze the time complexity of the following code snippet.
```javascript
// Assume logs is an array of log entry strings
for (let i = 0; i < logs.length; i++) {
  if (logs[i].includes('ERROR')) {
    console.log('Error found:', logs[i]);
  }
}
```
This code checks each log entry to find and print those containing the word "ERROR".
Identify the repeated operations: loops, recursion, or array traversals.
- Primary operation: Looping through each log entry once.
- How many times: Exactly once for each log entry, so as many times as there are logs.
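To make the "one check per entry" claim concrete, here is a minimal sketch that counts the checks as it scans. The `findErrors` helper and its `checks` counter are hypothetical names added for illustration; they are not part of the original snippet.

```javascript
// Hypothetical helper: collect ERROR entries and count how many checks the loop performs.
// Assumes each log entry is a plain string.
function findErrors(logs) {
  let checks = 0;
  const errors = [];
  for (let i = 0; i < logs.length; i++) {
    checks++; // exactly one check per entry
    if (logs[i].includes('ERROR')) {
      errors.push(logs[i]);
    }
  }
  return { errors, checks };
}

const logs = ['INFO start', 'ERROR disk full', 'INFO done'];
const result = findErrors(logs);
// checks always equals logs.length: one pass, one check per entry.
```

Whatever the contents of the entries, `checks` comes out equal to `logs.length`, which is what makes the running time a function of n.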
As the number of log entries grows, the time to check all of them grows at the same rate.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 checks |
| 100 | 100 checks |
| 1000 | 1000 checks |
Pattern observation: Doubling the log entries roughly doubles the work needed.
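You can verify the doubling pattern directly. This sketch builds a synthetic log, doubles it, and compares the number of checks; `countChecks` and the generated entries are illustrative assumptions, not part of the original exercise.

```javascript
// Count the per-entry checks for a given log array.
function countChecks(logs) {
  let checks = 0;
  for (let i = 0; i < logs.length; i++) {
    checks++;
    logs[i].includes('ERROR'); // the per-entry check from the snippet
  }
  return checks;
}

// Synthetic log of 100 entries, then a log twice as long.
const small = Array.from({ length: 100 }, (_, i) => `INFO entry ${i}`);
const large = small.concat(small);

console.log(countChecks(small)); // 100
console.log(countChecks(large)); // 200 — double the entries, double the work
```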
Time Complexity: O(n)
This means the time to analyze the logs grows in direct proportion to the number of log entries, n. (Strictly speaking, `includes` also scans the characters of each entry, so with average entry length m a tighter bound is O(n * m); treating the per-entry check as a constant-time operation gives O(n).)
[X] Wrong: "Checking for errors is constant time no matter how many logs there are."
[OK] Correct: Each log entry must be checked, so more logs mean more work, not the same amount.
Understanding how processing time grows with log size helps you explain efficiency clearly and shows you can reason about real data tasks.
"What if the code searched for errors inside each log entry multiple times? How would the time complexity change?"
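One way to explore that question: have the loop search each entry for several keywords instead of one. The keyword list and the `findMatches` helper below are hypothetical, added only to illustrate the change in growth rate.

```javascript
// Hypothetical extension: search each entry for k keywords instead of one.
const keywords = ['ERROR', 'FATAL', 'WARN']; // assumed list, k = 3

function findMatches(logs) {
  const matches = [];
  for (let i = 0; i < logs.length; i++) { // n iterations
    for (const kw of keywords) {          // k checks per entry
      if (logs[i].includes(kw)) {
        matches.push({ entry: logs[i], keyword: kw });
      }
    }
  }
  return matches; // total work is about n * k checks
}

const sample = ['INFO boot', 'ERROR disk', 'FATAL crash'];
console.log(findMatches(sample).length); // 2 matching (entry, keyword) pairs
```

With a fixed keyword list, k is a constant and the complexity stays O(n); if the number of searches per entry grows with some input k, the work becomes O(n * k).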