Why bitwise operations are essential in Embedded C - Performance Analysis
Bitwise operations are very common in embedded programming because they act directly on bits, the smallest units of data. Here we analyze how the running time of bitwise operations changes as the input size grows.
Analyze the time complexity of the following code snippet.
```c
unsigned int set_bit(unsigned int num, int pos) {
    return num | (1U << pos);
}

unsigned int clear_bit(unsigned int num, int pos) {
    return num & ~(1U << pos);
}

unsigned int toggle_bit(unsigned int num, int pos) {
    return num ^ (1U << pos);
}
```
This code sets, clears, or toggles a single bit in an integer using bitwise operations.
Identify any loops, recursion, or array traversals that repeat work. Here there are none: each function body is a single expression.
- Primary operation: Single bitwise operation per function call.
- How many times: Each function runs a fixed number of steps regardless of input size.
Each bitwise operation acts on a fixed-size machine word (e.g., 32 bits), so the running time does not grow with the value being manipulated.
| Input Size (bits) | Approx. Operations |
|---|---|
| 8 | 3 |
| 16 | 3 |
| 32 | 3 |
Pattern observation: The number of operations stays the same no matter how many bits the number has.
Time Complexity: O(1)
This means the time to perform bitwise operations is constant and does not increase with input size.
[X] Wrong: "Bitwise operations take longer as the number gets bigger because there are more bits to handle."
[OK] Correct: Bitwise operations work on fixed-size registers, so they take the same time regardless of the number's size.
Understanding that bitwise operations run in constant time helps you write efficient embedded code and answer questions about low-level performance.
"What if we used a loop to set multiple bits one by one? How would the time complexity change?"