Why file handling is required in C - Performance Analysis
We want to understand how the time to run file handling operations changes as the size of the file grows.
How does reading or writing more data affect the time it takes?
Analyze the time complexity of the following code snippet.
```c
#include <stdio.h>

int main(void) {
    FILE *file = fopen("data.txt", "r");
    if (file == NULL) {   /* always check that fopen succeeded */
        perror("fopen");
        return 1;
    }
    int ch;
    while ((ch = fgetc(file)) != EOF) {
        putchar(ch);
    }
    fclose(file);
    return 0;
}
```
This code reads a file character by character and prints it to the screen.
Identify the loops, recursion, or array traversals that repeat.
- Primary operation: Reading each character from the file one by one.
- How many times: Once for every character in the file until the end.
As the file size grows, the program reads more characters, so it takes more time.
| Input Size (characters) | Approx. Operations |
|---|---|
| 10 | About 10 reads |
| 100 | About 100 reads |
| 1000 | About 1000 reads |
Pattern observation: The time grows directly with the number of characters in the file.
Time Complexity: O(n)
This means the time to read the file grows linearly with the file size: doubling the number of characters roughly doubles the reading time.
[X] Wrong: "Reading a file always takes the same time no matter how big it is."
[OK] Correct: The program reads each character one by one, so bigger files take more time.
Understanding how file size affects reading time helps you write efficient programs that handle data well.
"What if we read the file in larger chunks instead of one character at a time? How would the time complexity change?"