Identifying content decay in SEO Fundamentals - Time & Space Complexity
When checking for content decay, we want to know how much work it takes to find outdated or underperforming pages.
How does the effort grow as the website gets bigger?
Analyze the time complexity of the following SEO process.
```python
# For each page in the website
for page in website_pages:
    # Check page metrics (traffic, rankings)
    analyze_metrics(page)
    # Compare with past data
    compare_with_history(page)
    # Flag if performance dropped
    if performance_dropped(page):
        add_to_decay_list(page)
```
This code checks every page's performance to find which ones have decayed over time.
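The walkthrough above can be turned into a minimal runnable sketch. The page records, traffic fields, and the 50% drop threshold are illustrative assumptions, not a real SEO API:

```python
# A toy version of the decay check: field names ("past_traffic",
# "current_traffic") and the 0.5 threshold are assumptions for illustration.

def performance_dropped(page, threshold=0.5):
    # Flag the page if current traffic fell below half of its past traffic.
    return page["current_traffic"] < threshold * page["past_traffic"]

def find_decayed_pages(website_pages):
    decay_list = []
    for page in website_pages:          # one pass over all pages: O(n)
        if performance_dropped(page):   # constant work per page
            decay_list.append(page)
    return decay_list

pages = [
    {"url": "/guide", "past_traffic": 1000, "current_traffic": 300},
    {"url": "/blog",  "past_traffic": 500,  "current_traffic": 450},
    {"url": "/news",  "past_traffic": 800,  "current_traffic": 200},
]
print([p["url"] for p in find_decayed_pages(pages)])  # → ['/guide', '/news']
```

Each page costs a constant amount of work, so the loop's single pass is what determines the overall growth rate.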
- Primary operation: the loop that visits each page in the website.
- How many times it runs: once per page, so n iterations for a site with n pages.
As the number of pages grows, the checks grow at the same pace.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 checks |
| 100 | About 100 checks |
| 1000 | About 1000 checks |
Pattern observation: The work grows directly with the number of pages.
Time Complexity: O(n)
This means the time to identify content decay grows in a straight line with the number of pages.
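The pattern in the table can be verified empirically by counting iterations (a toy sketch; the page list contents are placeholders):

```python
# Count how many per-page checks the loop performs for a site of n pages.
def checks_needed(pages):
    count = 0
    for _ in pages:   # one check per page
        count += 1
    return count

for n in (10, 100, 1000):
    print(f"{n} pages -> {checks_needed([None] * n)} checks")
# → 10 pages -> 10 checks, 100 pages -> 100 checks, 1000 pages -> 1000 checks
```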
[X] Wrong: "Checking one page is fast, so the whole process is always fast regardless of site size."
[OK] Correct: Each individual check is fast, but because every page must be checked, the total time grows with the site rather than staying constant.
Understanding how work grows with site size helps you plan SEO tasks and explain your approach clearly in discussions.
"What if we only checked pages updated in the last 6 months? How would the time complexity change?"