Basic data types - Time & Space Complexity
We want to see how a program's running time behaves when it uses only basic data types.
How does the amount of work the program does grow as the input size changes?
Analyze the time complexity of the following code snippet.
```c
#include <stdio.h>

int main(void) {
    int a = 5;
    float b = 3.14f;
    char c = 'x';
    double d = 2.71828;
    return 0;
}
```
This code declares one variable of each of four basic data types and assigns each an initial value.
Look for any repeated actions or loops.
- Primary operation: Variable declarations and assignments.
- How many times: Each happens once, no loops or repetition.
Since there are no loops or repeated steps, the work stays the same no matter the input size.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 4 (fixed declarations) |
| 100 | 4 (still fixed declarations) |
| 1000 | 4 (no change) |
Pattern observation: The number of operations does not grow with input size.
Time Complexity: O(1)
This means the program takes the same amount of time no matter how big the input is.
[X] Wrong: "Declaring more variables makes the program slower as input grows."
[OK] Correct: Variable declarations happen once and do not depend on input size, so they do not slow down the program as input grows.
Understanding that simple variable declarations run in constant time helps you explain how programs start and manage data efficiently.
"What if we added a loop that initializes an array of size n? How would the time complexity change?"