# Log Cleanup Automation in PowerShell: Time & Space Complexity
When automating log cleanup, it is important to understand how the script's running time grows as the number of log files increases. Analyze the time complexity of the following snippet.
```powershell
# Get all log files older than 7 days
$oldLogs = Get-ChildItem -Path 'C:\Logs' -Filter '*.log' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }

# Delete each old log file
foreach ($log in $oldLogs) {
    Remove-Item $log.FullName
}
```
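Before running a destructive loop like this for real, it can help to preview what would be removed. A minimal sketch using `Remove-Item`'s built-in `-WhatIf` switch (same path and filter as above, assumed to exist):

```powershell
# Preview which files the cleanup would delete, without deleting anything
$cutoff = (Get-Date).AddDays(-7)
Get-ChildItem -Path 'C:\Logs' -Filter '*.log' |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -WhatIf   # prints a "What if" line per file instead of deleting
```

The dry run performs the same per-file work as the real script, so it also previews the size of n you are about to process.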
This script finds every log file older than 7 days and deletes the files one by one.
- Primary operation: the `Remove-Item` call inside the loop.
- How many times it runs: once for each log file older than 7 days.
As the number of old log files increases, the number of delete operations increases in step with it:
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 delete operations |
| 100 | About 100 delete operations |
| 1000 | About 1000 delete operations |
Pattern observation: The work grows directly with the number of files to delete.
Time Complexity: O(n), where n is the number of old log files.
The time to clean logs grows linearly with n. Space usage is also O(n) here, because `$oldLogs` materializes one object per matching file before the loop begins.
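You can observe the linear trend empirically. The sketch below (an illustration only: it times deletions of dummy files in a temporary directory, never real logs) uses `Measure-Command` to time the same loop for increasing n:

```powershell
# Rough timing sketch: create n dummy .log files in a temp folder,
# then time the delete loop. Expect roughly linear growth in elapsed time.
foreach ($n in 10, 100, 1000) {
    $dir = Join-Path $env:TEMP "logtest_$n"
    New-Item -ItemType Directory -Path $dir -Force | Out-Null
    1..$n | ForEach-Object {
        New-Item -ItemType File -Path (Join-Path $dir "log$_.log") | Out-Null
    }

    $elapsed = Measure-Command {
        foreach ($log in Get-ChildItem -Path $dir -Filter '*.log') {
            Remove-Item $log.FullName
        }
    }
    "{0,5} files: {1:N0} ms" -f $n, $elapsed.TotalMilliseconds

    Remove-Item $dir -Recurse -Force   # clean up the test directory
}
```

Real timings are noisy (disk caching, antivirus scanning), so expect the trend rather than exact proportionality.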
[X] Wrong: "Deleting files all at once is instant and does not depend on how many files there are."
[OK] Correct: Each file must be deleted separately, so more files mean more work and more time.
Understanding how scripts scale with input size helps you write efficient automation and explain your code clearly in interviews.
"What if we used a command that deletes all files in one call instead of looping? How would the time complexity change?"
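One way to explore that question is to pipe the filtered files straight into `Remove-Item`, so the script contains no explicit loop. This is a sketch of that variant; note that while the code is tidier, the file system still performs one delete per file, so the time complexity remains O(n):

```powershell
# Single-pipeline version: no visible loop in the script,
# but Remove-Item still deletes each file individually, so it is still O(n).
Get-ChildItem -Path 'C:\Logs' -Filter '*.log' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
    Remove-Item
```

The lesson for interviews: hiding the loop inside a cmdlet or API call changes the code's shape, not its asymptotic cost.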