Scheduled scripts with Task Scheduler in PowerShell - Time & Space Complexity
When we schedule scripts with Task Scheduler, we want to know how a script's run time changes as the amount of work it does grows. In other words: how does execution time grow as the script processes larger inputs?
Analyze the time complexity of the following code snippet.
```powershell
# Example scheduled script: count lines containing "ERROR" in each log file
$files = Get-ChildItem -Path "C:\Logs" -File
foreach ($file in $files) {
    $content = Get-Content $file.FullName
    $lines = $content | Where-Object { $_ -match "ERROR" }
    Write-Output "$($file.Name): $($lines.Count) errors found"
}
```
This script runs through all files in a folder, reads each file, and counts lines containing "ERROR".
- Primary operation: Looping through each file and reading its content line by line.
- How many times: Once for each file, and inside that, once for each line in the file.
The time grows with the number of files and the number of lines in each file.
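The counting described above can be sketched as two explicit nested loops, conceptually equivalent to the pipeline in the example (this assumes `$files` is populated as in the snippet above):

```powershell
# Conceptual sketch of the work done: n files, roughly m lines per file
$errorCount = 0
foreach ($file in $files) {                          # runs n times (once per file)
    foreach ($line in Get-Content $file.FullName) {  # runs about m times per file
        if ($line -match "ERROR") { $errorCount++ }  # constant-time check per line
    }
}
```

The inner loop body runs roughly n × m times in total, which is exactly the pattern the table below illustrates.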
| Input Size (n files) | Approx. Operations |
|---|---|
| 10 files (100 lines each) | About 1,000 line checks |
| 100 files (100 lines each) | About 10,000 line checks |
| 1000 files (100 lines each) | About 100,000 line checks |
Pattern observation: the total work is roughly the number of files multiplied by the number of lines per file.
Time Complexity: O(n * m)
This means the script's run time grows with both the number of files (n) and the number of lines per file (m).
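As a rough empirical check, PowerShell's built-in `Measure-Command` can time the loop. This is only a sketch: it reuses the example's `C:\Logs` path, so adjust it to a folder that exists on your machine, and expect timings to be noisy for small inputs.

```powershell
# Time the whole scan; doubling the number of files (or lines per file)
# should roughly double the elapsed time if the O(n * m) analysis holds
$elapsed = Measure-Command {
    Get-ChildItem -Path "C:\Logs" -File | ForEach-Object {
        $lines = @(Get-Content $_.FullName | Where-Object { $_ -match "ERROR" })
        Write-Output "$($_.Name): $($lines.Count) errors found"
    }
}
Write-Output "Elapsed: $($elapsed.TotalSeconds) seconds"
```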
[X] Wrong: "The script only depends on the number of files, so time grows linearly with files only."
[OK] Correct: Each file's size matters too because reading lines inside files takes time, so both files and lines affect total time.
Understanding how scheduled scripts scale helps you explain how your automation handles growing data, a useful skill in many real-world tasks.
"What if the script only reads the first 10 lines of each file? How would the time complexity change?"