git restore to discard working changes - Time & Space Complexity
We want to understand how the time to discard working-directory changes grows as the number of files increases.
How does git handle undoing changes when many files are involved?
Analyze the time complexity of the following git command.
```shell
git restore file1.txt file2.txt file3.txt
# or to discard all changes:
git restore .
```
This command discards changes in the working directory by restoring files to their last committed state.
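As a concrete illustration, here is a minimal sketch in a throwaway repository (the file name, contents, and commit message are hypothetical):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

echo "committed content" > file1.txt
git add file1.txt
git commit -q -m "initial commit"

echo "uncommitted edit" > file1.txt   # modify the working copy
git restore file1.txt                 # discard the edit
cat file1.txt                         # prints "committed content"
```

After `git restore file1.txt`, the working copy matches the last committed version again, and the uncommitted edit is gone for good.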
Identify the repeated work: the loops, recursion, or traversals over the set of files.
- Primary operation: Restoring each file by reading its last committed version and overwriting the working copy.
- How many times: Once per file specified, or once per changed file when restoring everything with `git restore .`.
As the number of files to restore increases, the total work grows roughly in direct proportion.
| Input Size (n files) | Approx. Operations |
|---|---|
| 10 | 10 file restores |
| 100 | 100 file restores |
| 1000 | 1000 file restores |
Pattern observation: Doubling the number of files doubles the work needed to restore them.
Time Complexity: O(n)
This means the time to discard changes grows linearly with the number of files you restore.
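The linear growth can be observed directly: with n modified files, `git restore .` performs one per-file restore for each of them. A sketch in a scratch repository (file names and the choice of n = 100 are arbitrary):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

n=100
for i in $(seq 1 "$n"); do echo "v1" > "file$i.txt"; done
git add .
git commit -q -m "commit $n files"

# Modify every file, then count the changed files before and after restoring.
for i in $(seq 1 "$n"); do echo "v2" > "file$i.txt"; done
changed_before=$(git status --porcelain | wc -l | tr -d ' ')

git restore .   # one restore operation per changed file: O(n) total work
changed_after=$(git status --porcelain | wc -l | tr -d ' ')
echo "$changed_before -> $changed_after"   # prints "100 -> 0"
```

Doubling `n` doubles the number of entries `git restore .` must process, which is exactly the linear pattern in the table above.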
[X] Wrong: "Restoring many files happens instantly no matter how many files there are."
[OK] Correct: Each file must be individually restored, so more files mean more work and more time.
Understanding how commands scale with input size helps you explain performance and make better decisions in real projects.
"What if we used git restore with a large directory instead of individual files? How would the time complexity change?"