Git LFS for large files - Time & Space Complexity
When working with large files in Git, it is important to understand how operations scale as file size grows.
We want to see how Git LFS handles large files differently from plain Git.
Analyze the time complexity of the following Git LFS commands.
```shell
git lfs track "*.psd"              # tell Git LFS to manage .psd files
git add .gitattributes             # stage the tracking rule LFS just wrote
git add largefile.psd              # stage the large file (stored as an LFS pointer in Git)
git commit -m "Add large PSD file"
git push origin main               # uploads the full file contents to LFS storage
```
This code tracks large files with Git LFS, adds them, commits, and pushes to remote.
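To make the tracking step concrete, here is a minimal sketch of what `git lfs track "*.psd"` records in `.gitattributes` (the attribute line is the standard Git LFS rule; the temp directory is just for illustration):

```shell
# Simulate the .gitattributes entry that `git lfs track "*.psd"` appends.
workdir=$(mktemp -d)
printf '%s\n' '*.psd filter=lfs diff=lfs merge=lfs -text' > "$workdir/.gitattributes"
cat "$workdir/.gitattributes"
```

This rule tells Git to run `.psd` files through the LFS filter, so the repository itself stores only a small pointer file while the real contents go to LFS storage.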
Identify the operations that repeat or scale with input size (the analogue of loops and traversals in code).
- Primary operation: Uploading large file data to remote storage.
- How many times: Once per large file during push, but file size affects upload time.
As file size grows, the time to upload the file grows roughly in direct proportion.
| Input Size (MB) | Relative Upload Time |
|---|---|
| 10 | Baseline (1x) |
| 100 | ~10x the baseline |
| 1000 | ~100x the baseline |
Pattern observation: Upload time grows linearly with file size.
Time Complexity: O(n), where n is the file size in bytes.
This means the time to push a large file grows in direct proportion to its size; LFS adds only a small constant overhead per push.
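The linear pattern can be sketched with a toy model. Here `bandwidth_mb_s` is an assumed constant upload speed, so the numbers are illustrative only, not measurements of Git LFS itself:

```shell
# Toy model: upload time scales linearly with file size (O(n)).
bandwidth_mb_s=10          # assumed constant upload speed in MB/s
for size_mb in 10 100 1000; do
  upload_s=$(( size_mb / bandwidth_mb_s ))
  echo "${size_mb} MB -> ~${upload_s} s"
done
```

Doubling the file size doubles the estimated upload time, which is exactly the O(n) growth in the table above.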
[X] Wrong: "Git LFS makes pushing large files instant regardless of size."
[OK] Correct: Git LFS stores large files outside Git but still uploads the full file, so bigger files take longer to push.
Understanding how Git LFS handles large files helps you explain efficient version control in real projects.
What if we used Git LFS with many small files instead of one large file? How would the time complexity change?
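One way to reason about the many-small-files case: total upload time becomes per-file overhead times the number of files, plus total bytes over bandwidth. The overhead and bandwidth values below are assumptions chosen for illustration:

```shell
# Toy model for k small files: with enough files, per-file overhead can dominate.
overhead_ms=200            # assumed fixed cost per file (handshake, pointer bookkeeping)
bandwidth_mb_s=10          # assumed upload speed in MB/s
k=100; size_mb=1           # 100 files of 1 MB each
total_ms=$(( k * overhead_ms + (k * size_mb * 1000) / bandwidth_mb_s ))
echo "${total_ms} ms"
```

Under this model the work is roughly O(k + n): linear in the total bytes n, but with an extra linear term in the file count k, so many tiny files can be slower to push than one large file of the same total size.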