Consider the following Python code that creates and closes a large figure repeatedly. What will be the output regarding memory usage?
import matplotlib.pyplot as plt
import tracemalloc

tracemalloc.start()
for i in range(3):
    fig, ax = plt.subplots(figsize=(20, 20))
    ax.plot(range(100000))
    plt.close(fig)

current, peak = tracemalloc.get_traced_memory()
print(f"Current memory usage: {current / 10**6:.2f} MB")
print(f"Peak memory usage: {peak / 10**6:.2f} MB")
tracemalloc.stop()
Think about how closing figures affects memory in matplotlib.
plt.close() removes each figure from pyplot's registry so it can be garbage-collected, which is why the current usage reported after the loop stays low. Peak usage reflects the temporary memory each large figure consumed while it was alive.
When plotting large datasets repeatedly, why should you close figures explicitly in matplotlib?
Think about what happens if figures stay open in memory.
Every open figure is held in memory by pyplot's global state. If you never close them, memory grows with each new plot until the program slows down or crashes.
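As a concrete illustration (using the non-interactive Agg backend, assumed here so the example runs without a display), plt.close("all") releases every open figure in one call:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, assumed for this sketch
import matplotlib.pyplot as plt

# Open several figures without closing them.
for _ in range(4):
    plt.figure()

print(len(plt.get_fignums()))  # → 4: all four figures are still open
plt.close("all")               # close every open figure at once
print(len(plt.get_fignums()))  # → 0: memory for all figures is released
```

plt.close(fig) closes one specific figure; plt.close("all") is the blunt instrument for cleaning up everything at once.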
Look at this code snippet that plots data in a loop. Memory usage increases after each iteration. What is the cause?
import matplotlib.pyplot as plt

for i in range(5):
    fig, ax = plt.subplots()
    ax.plot(range(10000))
    # Missing plt.close(fig)
    print(f"Iteration {i} done")
Think about what happens to figures that are not closed.
pyplot keeps a reference to every figure until it is closed, so the unclosed figures can never be garbage-collected and memory usage grows with each loop iteration.
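A minimal fix is to close each figure at the end of its iteration; this sketch assumes the Agg backend so it runs headless:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, assumed for this sketch
import matplotlib.pyplot as plt

for i in range(5):
    fig, ax = plt.subplots()
    ax.plot(range(10000))
    plt.close(fig)  # release the figure before the next iteration
    print(f"Iteration {i} done")

print(plt.get_fignums())  # → []: nothing left open, so memory stays flat
```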
After running the following code, how many figures remain open in matplotlib?
import matplotlib.pyplot as plt

for _ in range(3):
    plt.figure()

plt.close(1)
plt.close(3)
Figure numbers start at 1 and increase with each new figure.
Three figures are created with numbers 1, 2, and 3. Figures 1 and 3 are closed, so only figure 2 remains open.
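You can verify this with plt.get_fignums(), which returns the numbers of the figures that are still open:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, assumed so the example runs without a display
import matplotlib.pyplot as plt

for _ in range(3):
    plt.figure()       # figures are numbered 1, 2, 3 as they are created

plt.close(1)           # plt.close() also accepts a figure number
plt.close(3)
print(plt.get_fignums())  # → [2]: only figure 2 remains open
```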
You need to generate 100 large plots in a script. Which approach best manages memory to avoid crashes?
Think about when memory is freed during plotting.
Closing each figure immediately after saving frees memory, preventing accumulation and crashes.
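A sketch of that save-then-close pattern, assuming the Agg backend and a hypothetical render_plot helper that writes each figure to an in-memory PNG buffer (a stand-in for fig.savefig(path) to disk):

```python
import io
import matplotlib
matplotlib.use("Agg")  # headless backend, assumed for this sketch
import matplotlib.pyplot as plt

def render_plot(data):
    # Hypothetical helper: render one plot to an in-memory PNG
    # (stands in for saving to a file on disk).
    fig, ax = plt.subplots(figsize=(8, 8))
    ax.plot(data)
    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    plt.close(fig)  # free the figure as soon as it has been saved
    return buf.getvalue()

# Ten plots here stand in for the 100 in the question; memory stays
# flat because each figure is closed right after it is rendered.
for i in range(10):
    png_bytes = render_plot(range(100))

print(len(plt.get_fignums()))  # → 0
```

Because only one figure is ever alive at a time, peak memory is bounded by a single plot rather than growing with the total number of plots.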