Donut chart variation in Matplotlib - Time & Space Complexity
We want to understand how the time to draw a donut chart changes as the amount of data grows.
How does adding more slices affect the work matplotlib does?
Analyze the time complexity of the following code snippet.
```python
import matplotlib.pyplot as plt

sizes = [15, 30, 45, 10]
labels = ['A', 'B', 'C', 'D']
# A 'width' less than 1 in wedgeprops hollows out the pie into a donut
plt.pie(sizes, labels=labels, wedgeprops={'width': 0.3})
plt.show()
```
This code draws a donut chart with 4 slices, each slice sized by the corresponding value in `sizes`.
Identify the loops, recursion, or array traversals that repeat.
- Primary operation: Drawing each slice of the donut chart.
- How many times: Once for each slice in the `sizes` list.
As the number of slices increases, the work to draw each slice adds up.
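One way to see the "once per slice" pattern directly is to count the wedge artists matplotlib creates. The sketch below (the helper name `count_wedges` and the equal-sized slices are illustrative choices, not from the original snippet) draws a donut with n slices and reports how many wedge objects were built:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; no window needed
import matplotlib.pyplot as plt

def count_wedges(n):
    """Draw a donut chart with n equal slices and return how many
    wedge artists matplotlib created -- one per slice."""
    fig, ax = plt.subplots()
    wedges, _ = ax.pie([1] * n, wedgeprops={'width': 0.3})
    plt.close(fig)
    return len(wedges)

print(count_wedges(4))   # 4 wedge artists for 4 slices
print(count_wedges(10))  # 10 wedge artists for 10 slices
```

Because one wedge is created per entry in the input list, the drawing work scales with the list length.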
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 drawing steps |
| 100 | 100 drawing steps |
| 1000 | 1000 drawing steps |
Pattern observation: The work grows directly with the number of slices.
Time Complexity: O(n)
This means the time to draw the donut chart grows linearly: doubling the number of slices roughly doubles the drawing work. Space complexity is also O(n), since matplotlib keeps one wedge artist (plus any label text) per slice.
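You can check the linear trend informally by timing the chart-building step for growing inputs. This is a rough sketch (the helper `time_donut` is an assumed name, and absolute timings depend entirely on the machine), not a rigorous benchmark:

```python
import time
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; no window needed
import matplotlib.pyplot as plt

def time_donut(n):
    """Time how long ax.pie takes to build a donut chart
    with n equal slices. Numbers are machine-dependent."""
    fig, ax = plt.subplots()
    start = time.perf_counter()
    ax.pie([1] * n, wedgeprops={'width': 0.3})
    elapsed = time.perf_counter() - start
    plt.close(fig)
    return elapsed

for n in (10, 100, 1000):
    print(f"{n:>5} slices: {time_donut(n):.4f} s")
```

Expect the times to grow roughly in proportion to n, with some noise at small sizes where fixed setup costs dominate.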
[X] Wrong: "Drawing a donut chart always takes the same time no matter how many slices there are."
[OK] Correct: Each slice requires separate drawing steps, so more slices mean more work.
Understanding how drawing time grows with data size helps you explain performance in data visualization tasks clearly and confidently.
"What if we added animation to the donut chart? How would that affect the time complexity?"
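One way to reason about the animation question: if every frame redraws all n slices, the per-frame cost stays O(n), and f frames cost O(n × f) in total. A minimal sketch using matplotlib's `FuncAnimation` is below; the rotation logic and frame count are illustrative assumptions, not part of the original example:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; no window needed
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

sizes = [15, 30, 45, 10]
fig, ax = plt.subplots()

def update(frame):
    # Each frame clears the axes and redraws every slice:
    # O(n) work per frame, so f frames cost O(n * f) overall.
    ax.clear()
    shift = frame % len(sizes)
    rotated = sizes[shift:] + sizes[:shift]
    ax.pie(rotated, wedgeprops={'width': 0.3})

anim = FuncAnimation(fig, update, frames=8, interval=200)
```

So animation does not change the per-frame complexity class, but it multiplies the total work by the number of frames rendered.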