# The `#define` Macro: Time & Space Complexity
Let's see how using a macro affects a program's running time. Specifically, we want to know how the number of steps the program performs grows when a macro is involved.
Analyze the time complexity of the following code snippet.
```c
#include <stdio.h>

#define SQUARE(x) ((x) * (x))

int main() {
    int n = 5;
    int result = SQUARE(n);
    printf("%d\n", result);
    return 0;
}
```
This code defines a macro to square a number and uses it once.
Identify any repeated work: loops, recursion, or array traversals. Here there are none.
- Primary operation: Single multiplication done by the macro.
- How many times: Exactly once in this example.
Since the macro is pure textual substitution performed by the preprocessor, the number of operations stays the same no matter the input size.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 1 multiplication |
| 100 | 1 multiplication |
| 1000 | 1 multiplication |
Pattern observation: The work does not increase with input size here.
Time Complexity: O(1)
This means the program does a fixed amount of work regardless of input size.
[X] Wrong: "Macros slow down the program because they add extra steps at runtime."
[OK] Correct: Macros are replaced before running, so they don't add runtime steps themselves.
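You can verify this yourself by asking the compiler to stop after preprocessing. This sketch assumes `gcc` is installed and writes the program to a hypothetical file `square.c`; `gcc -E` prints the preprocessed source, in which the macro call has already been replaced:

```shell
# Write the program to a file, then print the preprocessor's output.
# The SQUARE(n) call is gone, replaced by ((n) * (n)), before
# compilation proper even begins.
cat > square.c <<'EOF'
#include <stdio.h>
#define SQUARE(x) ((x) * (x))
int main(void) {
    int n = 5;
    printf("%d\n", SQUARE(n));
    return 0;
}
EOF
gcc -E square.c | tail -n 6
```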
Understanding how macros work helps you reason about runtime cost, and about how a compile-time substitution shapes the code that actually executes.
What if we used the macro inside a loop that runs n times? How would the time complexity change?