Why timers are needed in Embedded C - Performance Analysis
Timers help control when and how often certain actions happen in embedded systems.
We want to understand how the cost of a timer-based delay changes as its parameters change.
Analyze the time complexity of the following code snippet.
// Simple software delay: busy-wait loops calibrated to ~1 ms per outer iteration
void delay_ms(int ms) {
    for (int i = 0; i < ms; i++) {
        // 'volatile' keeps the compiler from optimizing the empty loop away
        for (volatile int j = 0; j < 1000; j++) {
            // wait roughly 1 microsecond per iteration
        }
    }
}

int main() {
    delay_ms(500); // delay ~500 milliseconds
    return 0;
}
This code creates a delay by running loops to wait a certain number of milliseconds.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: nested loops that busy-wait to create the delay.
- How many times: the outer loop runs `ms` times; the inner loop runs 1000 times per outer iteration, for roughly 1000 × ms operations total.
Explain the growth pattern intuitively.
| Input Size (ms) | Approx. Operations |
|---|---|
| 10 | 10,000 |
| 100 | 100,000 |
| 1000 | 1,000,000 |
Pattern observation: The number of operations grows directly with the delay time; doubling delay doubles work.
Time Complexity: O(n), where n is the requested delay in milliseconds.
This means the time to complete the delay grows linearly with the input value.
[X] Wrong: "The delay time stays the same no matter the input value."
[OK] Correct: The loops run more times as the input increases, so the delay grows longer.
Understanding how timers affect program timing helps you write reliable embedded code that works well in real devices.
"What if we replaced the nested loops with a hardware timer interrupt? How would the time complexity change?"
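One way to reason about that question: with a hardware timer, the CPU configures the peripheral in constant time and then waits (or does other work) until the interrupt fires, so the CPU-side work is O(1) regardless of the delay length. A rough sketch, with all register and ISR names as hypothetical placeholders since they vary by microcontroller and toolchain:

```c
volatile int timer_done = 0;

// Hypothetical interrupt service routine; the actual vector name
// and registration mechanism depend on the target hardware.
void TIMER_ISR(void) {
    timer_done = 1;
}

void delay_ms_hw(int ms) {
    TIMER_COMPARE = ms * TICKS_PER_MS; // O(1): program the compare value
    TIMER_CTRL  |= TIMER_START;        // O(1): start the hardware counter
    while (!timer_done) {
        // the CPU could enter a sleep mode here instead of spinning
    }
    timer_done = 0;
}
```

The elapsed wall-clock time still scales with `ms`, but the CPU no longer executes ms × 1000 loop iterations: the counting is offloaded to the hardware peripheral.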