ADC Conversion Time in Embedded C - Performance Analysis
We want to understand how the time to convert an analog signal to a digital value changes as the ADC's settings or input change: does the conversion time grow, and if so, with what?
Analyze the time complexity of the following ADC conversion code snippet.
```c
// Start a single ADC conversion
ADC_StartConversion();

// Poll until the hardware signals that the conversion is complete
while (!ADC_ConversionComplete()) {
    ; // busy wait
}

// Read the converted digital value
int result = ADC_ReadResult();
```
This code starts an ADC conversion, waits for it to finish, then reads the digital value.
Look for loops or repeated checks in the code.
- Primary operation: The while loop that waits for conversion to complete.
- How many times: It runs until the ADC finishes converting one analog value.
The number of loop iterations depends on the ADC's resolution and clock speed, not on an input size such as an array length.
| ADC Resolution Setting | Approx. Wait per Conversion |
|---|---|
| Low (e.g. 8-bit) | Short wait, few loop checks |
| Medium (e.g. 10-bit) | Longer wait, more loop checks |
| High (e.g. 12-bit) | Even longer wait, many loop checks |
Pattern observation: The wait grows roughly with the ADC's resolution setting, not with the amount of data to be converted.
Time Complexity: O(1)
This means the conversion time is constant for each reading, independent of data size.
[X] Wrong: "ADC conversion time grows with the amount of data to convert."
[OK] Correct: ADC converts one analog value at a time, so each conversion takes a fixed time regardless of data size.
Understanding ADC timing helps you explain how embedded systems handle real-world signals efficiently and predictably.
"What if we changed the ADC resolution to a higher bit count? How would the conversion time change?"