Stack overflow detection (methods 1 and 2) in FreeRTOS - Time & Space Complexity
We want to understand what it costs, at run time, to check for stack overflow as tasks run in FreeRTOS.
How does the time to detect an overflow grow as a task's stack size changes?
Analyze the time complexity of the two stack overflow detection methods below.
```c
/* Method 1: compare the task's saved stack pointer to the start of its
 * stack area, as FreeRTOS does when configCHECK_FOR_STACK_OVERFLOW == 1.
 * (pxCurrentTCB is kernel-internal; shown conceptually here, for a
 * descending stack.) */
if (pxCurrentTCB->pxTopOfStack <= pxCurrentTCB->pxStack) {
    /* Stack overflow detected */
}
```

```c
/* Method 2: scan for the known fill pattern written at task creation;
 * the first byte that no longer matches marks the deepest stack usage. */
for (int i = 0; i < stackSize; i++) {
    if (stack[i] != STACK_FILL_PATTERN) {
        break; /* stack has been used up to index i */
    }
}
```

Method 1 performs a single comparison between the stack pointer and the stack's limit. Method 2 scans the stack memory byte by byte to find how much has been used. (The FreeRTOS helper `uxTaskGetStackHighWaterMark()` belongs in the second category: it also scans for the fill pattern, so it is not a constant-time check.)
Look at what repeats in each method.
- Primary operation in Method 1: A single comparison between the stack pointer and the stack limit.
- Primary operation in Method 2: A loop scanning each stack element until a pattern changes.
- How many times: Method 1 runs once per check; Method 2 may scan up to the entire stack size.
Method 1's time stays about the same no matter the stack size.
Method 2's time grows as the stack size grows, because it may check each stack element.
| Input Size (stackSize) | Method 1 Ops | Method 2 Ops |
|---|---|---|
| 10 | 1 | up to 10 |
| 100 | 1 | up to 100 |
| 1000 | 1 | up to 1000 |
Pattern observation: Method 1 is constant time; Method 2 grows linearly with stack size.
Time Complexity: O(1) for Method 1, O(n) for Method 2
Space Complexity: O(1) for both — neither method needs more than a few local variables beyond the stack itself.
Method 1 takes the same time regardless of stack size; Method 2 takes longer as stack size grows.
[X] Wrong: "Both methods take the same time because they both detect overflow."
[OK] Correct: Method 1 performs one constant-time comparison; Method 2 scans memory, so its worst-case time grows linearly with the stack size.
Understanding how different detection methods scale helps you explain trade-offs clearly in real projects.
What if Method 2 used a binary search on the stack pattern instead of a linear scan? How would the time complexity change?