Transmitting a byte over UART in Embedded C - Time & Space Complexity
We want to understand how long it takes to send a byte using UART communication.
How does the time to send data change as we send more bytes?
Analyze the time complexity of the following code snippet.
```c
/* UART_STATUS, UART_TX_READY, and UART_DATA stand for memory-mapped,
 * platform-specific hardware registers. */
void UART_TransmitByte(unsigned char data) {
    /* Busy-wait (poll) until the transmit-ready flag is set */
    while (!(UART_STATUS & UART_TX_READY)) {
        /* wait until the transmitter is ready */
    }
    UART_DATA = data;  /* write the byte to the data register */
}
```
This code waits until the UART transmitter is ready, then sends one byte.
Identify the operations that repeat: loops, recursion, or array traversals.
- Primary operation: The while loop polling the UART status register until the transmitter is ready.
- How many times: It spins until the hardware finishes shifting out the previous frame. For a fixed baud rate that wait is bounded by one frame time, so each byte costs a constant amount of work.
Sending one byte therefore takes constant time: a bounded wait for readiness plus the write to the data register.
| Input Size (n bytes) | Approx. Operations |
|---|---|
| 10 | About 10 waits and sends |
| 100 | About 100 waits and sends |
| 1000 | About 1000 waits and sends |
Pattern observation: The time grows directly with the number of bytes sent.
Time Complexity: O(n)
This means the total time to send data grows linearly with the number of bytes.
Space Complexity: O(1)
The function holds only the single byte being sent; no storage grows with the amount of data.
[X] Wrong: "Sending 1000 bytes takes about the same total time as sending one byte."
[OK] Correct: Each byte requires its own wait-and-send cycle, so the total time adds up linearly as you send more bytes.
Understanding how hardware communication time scales helps you write efficient embedded code and explain performance clearly.
"What if we used interrupts instead of waiting in a loop? How would the time complexity change?"