Using printf for output in C - Time & Space Complexity
We want to understand how the time taken by a program changes when it uses printf to show output.
How does the number of printf calls affect the program's running time?
Analyze the time complexity of the following code snippet.
```c
#include <stdio.h>

int main(void) {
    int n = 1000;
    for (int i = 0; i < n; i++) {
        printf("%d\n", i);
    }
    return 0;
}
```
This code prints numbers from 0 up to n-1, one per line.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: the `printf` call inside the loop.
- How many times it runs: exactly `n` times, once per loop iteration.
As n increases, the number of printf calls increases in direct proportion.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 calls to printf |
| 100 | 100 calls to printf |
| 1000 | 1000 calls to printf |
Pattern observation: The number of output operations grows directly with n.
Time Complexity: O(n)
This means the running time grows linearly: printing twice as many lines takes roughly twice as long.
[X] Wrong: "Using printf inside a loop is always very slow and constant time."
[OK] Correct: The time depends on how many times printf runs, so it grows with the loop size rather than staying fixed.
Understanding how output commands affect running time helps you explain program speed clearly and shows you know how loops and output work together.
"What if we replaced printf with a function that prints only once after the loop? How would the time complexity change?"