Why file operations matter in C# - Performance Analysis
When working with files, the time it takes to read or write data depends heavily on how much data there is. We want to understand how that time grows as the file size grows.
Analyze the time complexity of the following code snippet.
```csharp
using System.IO;

void WriteNumbersToFile(string path, int n)
{
    using StreamWriter writer = new StreamWriter(path);
    for (int i = 0; i < n; i++)
    {
        writer.WriteLine(i);
    }
}
```
This code writes numbers from 0 up to n-1 into a file, one number per line.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: Writing a line to the file inside a loop.
- How many times: The loop runs n times, so the write happens n times.
As n grows, the number of write operations grows in direct proportion.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 writes |
| 100 | 100 writes |
| 1000 | 1000 writes |
Pattern observation: The time grows directly with the number of lines written.
Time Complexity: O(n)
This means the time to write grows linearly with the number of lines: doubling the number of lines roughly doubles the time.
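One way to see this linear growth empirically is to time the function at increasing input sizes. Below is a minimal sketch using `System.Diagnostics.Stopwatch`; the specific sizes and the temporary file path are arbitrary choices for illustration:

```csharp
using System;
using System.Diagnostics;
using System.IO;

class ScalingDemo
{
    static void WriteNumbersToFile(string path, int n)
    {
        using StreamWriter writer = new StreamWriter(path);
        for (int i = 0; i < n; i++)
        {
            writer.WriteLine(i);
        }
    }

    static void Main()
    {
        string path = Path.GetTempFileName();
        // Time the write at growing input sizes; elapsed time should
        // grow roughly in proportion to n.
        foreach (int n in new[] { 10_000, 100_000, 1_000_000 })
        {
            var sw = Stopwatch.StartNew();
            WriteNumbersToFile(path, n);
            sw.Stop();
            Console.WriteLine($"n = {n,9}: {sw.ElapsedMilliseconds} ms");
        }
        File.Delete(path);
    }
}
```

Real measurements will be noisy: operating-system caching, disk speed, and JIT warm-up all affect individual runs, so the proportionality only shows up as a rough trend across sizes.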
[X] Wrong: "Writing to a file is always instant and does not depend on how much data is written."
[OK] Correct: Writing takes more time as you write more lines because each line is a separate operation that the computer must handle.
Understanding how file operations scale helps you write programs that handle data efficiently and avoid slowdowns when files get big.
"What if we buffered multiple lines before writing to the file? How would the time complexity change?"