Why input and output are required in C - Performance Analysis
When a program takes input and produces output, it does some amount of work based on that input. We want to know how the running time changes as the input size changes.
How does the program's work grow when we give it more, or bigger, inputs?
Analyze the time complexity of the following code snippet.
```c
#include <stdio.h>

int main() {
    int n, sum = 0;
    scanf("%d", &n);                 /* read the input size n */
    for (int i = 1; i <= n; i++) {   /* loop body runs n times */
        sum += i;
    }
    printf("%d\n", sum);             /* print the total */
    return 0;
}
```
This code reads a number n, adds all numbers from 1 to n, and prints the result.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: The for-loop adding numbers from 1 to n.
- How many times: It runs exactly n times, once for each number.
As n gets bigger, the loop runs more times, so the work grows with n.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
Pattern observation: The number of operations grows directly with the input size.
Time Complexity: O(n)
This means the running time grows linearly: doubling n doubles the number of additions the program performs.
[X] Wrong: "Input and output do not affect time because they are just one step each."
[OK] Correct: Input size controls how many times the loop runs, so bigger input means more work and longer time.
Understanding how input size affects time helps you explain program efficiency clearly and confidently in interviews.
"What if we changed the loop to run from 1 to n squared? How would the time complexity change?"