Working directory state in Git - Time & Space Complexity
We want to understand how the work git does to detect changes in the working directory scales as the number of files grows.
Analyze the time complexity of the following git command.
```
git status --short
```
This command shows which files in the working directory are changed, added, or deleted compared to the last commit.
Git checks each file in the working directory to see if it differs from the last commit.
- Primary operation: Comparing each file's current state to its committed state.
- How many times: Once for every file in the directory.
As the number of files grows, git must check more files one by one.
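The per-file comparison can be sketched as follows. This is a simplified illustration, not git's actual internals: the dicts stand in for the committed snapshot and the filesystem, and the single-letter states are simplified versions of git's status codes.

```python
# Sketch: compare every working-directory file against the last commit.
# `committed` and `working` map filename -> content hash; both are
# stand-ins for git's object database and the filesystem.

def short_status(committed, working):
    """Return (filename, state) pairs; one check per file, so O(n)."""
    changes = []
    for name in committed.keys() | working.keys():  # every file is visited
        if name not in committed:
            changes.append((name, "A"))   # added
        elif name not in working:
            changes.append((name, "D"))   # deleted
        elif committed[name] != working[name]:
            changes.append((name, "M"))   # modified
    return changes

committed = {"a.txt": "h1", "b.txt": "h2", "c.txt": "h3"}
working = {"a.txt": "h1", "b.txt": "h2x", "d.txt": "h4"}
print(sorted(short_status(committed, working)))
# -> [('b.txt', 'M'), ('c.txt', 'D'), ('d.txt', 'A')]
```

Note that the loop visits the union of both file sets, so even unchanged files (`a.txt` here) cost one comparison each.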
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 file checks |
| 100 | 100 file checks |
| 1000 | 1000 file checks |
Pattern observation: The work grows directly with the number of files.
Time Complexity: O(n)
This means the time to check the working directory grows linearly, in direct proportion to the number of files.
[X] Wrong: "Git checks only changed files, so time is constant no matter how many files exist."
[OK] Correct: Git must look at every file to know if it changed, so more files mean more checks.
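The point above can be demonstrated with a small simulation (a counting sketch with made-up file names, not a measurement of git itself): even when only one file changed, every file must still be checked to discover that.

```python
def count_checks(n):
    """Simulate a status scan over n files; return (checks, changed files)."""
    committed = {f"file{i}": f"hash{i}" for i in range(n)}
    working = dict(committed)
    working["file0"] = "changed"          # exactly one modified file
    checks = 0
    changed = []
    for name, h in committed.items():
        checks += 1                       # one comparison per file, no skipping
        if working[name] != h:
            changed.append(name)
    return checks, changed

for n in (10, 100, 1000):
    checks, changed = count_checks(n)
    print(f"{n} files -> {checks} checks, changed: {changed}")
```

The output mirrors the table above: 10, 100, and 1000 checks, while the list of changed files stays the same size.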
Understanding how git detects file changes helps you reason about performance in large real-world repositories and shows you know how everyday tools work under the hood.
"What if git used a cache to remember unchanged files? How would that affect the time complexity?"
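One hedged answer to this question: git's real index already caches each file's stat metadata (such as size and modification time), so `git status` can skip reading the content of files whose metadata is unchanged. The scan still touches every file, so it remains O(n), but with a much smaller cost per file. A toy sketch of that idea follows; all names and structures here are simplified illustrations, not git's actual data structures.

```python
# Toy stat-cache: skip the expensive content comparison when a file's
# current mtime matches the cached one. Loosely inspired by git's index,
# but everything here is a simplified illustration.

def status_with_cache(cache, files, hash_file):
    """cache: name -> (mtime, content_hash); files: name -> current mtime.

    Returns (changed files, number of expensive content reads).
    """
    changed = []
    expensive_reads = 0
    for name, mtime in files.items():
        cached_mtime, cached_hash = cache[name]
        if mtime == cached_mtime:
            continue                  # metadata unchanged: skip the content read
        expensive_reads += 1          # must actually re-hash the file
        if hash_file(name) != cached_hash:
            changed.append(name)
    return changed, expensive_reads

cache = {"a": (100, "h_a"), "b": (100, "h_b"), "c": (100, "h_c")}
files = {"a": 100, "b": 250, "c": 100}   # only b was touched
changed, reads = status_with_cache(cache, files, lambda name: "h_new")
print(changed, reads)   # -> ['b'] 1
```

Every file is still visited once (the loop is O(n)), but only one file pays the content-hashing cost. Avoiding even the per-file metadata check requires something like git's filesystem-monitor support, which learns about changes from the operating system instead of scanning.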