Physical constants (speed of light, Planck) in SciPy - Time & Space Complexity
We want to see how the time to get physical constants changes as we ask for more constants.
How does the work grow when we retrieve many constants like speed of light or Planck's constant?
Analyze the time complexity of the following code snippet.
```python
from scipy.constants import physical_constants

constants_to_get = ["speed of light in vacuum", "Planck constant", "electron mass"]
results = []
for name in constants_to_get:
    # physical_constants maps each name to a (value, unit, uncertainty) tuple
    value = physical_constants[name]
    results.append(value)
```
This code fetches each named constant, one at a time, from SciPy's `physical_constants` dictionary.
- Primary operation: Looping over the list of constant names and fetching each constant.
- How many times: Once for each constant in the list.
Each new constant adds one more fetch operation, so the work grows directly with the number of constants.
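We can confirm the one-fetch-per-constant pattern directly. The sketch below (the `fetch_all` helper and `fetch_count` counter are illustrative names, not part of the original snippet) counts how many dictionary lookups happen for a given list of names:

```python
from scipy.constants import physical_constants

def fetch_all(names):
    """Fetch each constant and count the lookups performed."""
    fetch_count = 0
    results = []
    for name in names:
        # one dictionary lookup per requested constant
        results.append(physical_constants[name])
        fetch_count += 1
    return results, fetch_count

names = ["speed of light in vacuum", "Planck constant", "electron mass"]
results, count = fetch_all(names)
print(count)   # → 3: exactly one fetch per name
```

Three names produce exactly three fetches; ten names would produce ten.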
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 fetches |
| 100 | 100 fetches |
| 1000 | 1000 fetches |
Pattern observation: The time grows linearly, a straight line when plotted against the number of constants requested.
Time Complexity: O(n)
Each dictionary lookup is O(1) on average, so retrieving n constants takes O(n) total: the time grows directly with how many constants you ask for.
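To see the linear trend on a real machine, we can time the loop at increasing sizes. This is a rough sketch, not from the original: it uses a plain dict with made-up keys (`const_0`, `const_1`, …) as a stand-in for `physical_constants` so n can be scaled freely, and absolute times will vary by machine; only the growth pattern matters.

```python
import timeit

# Stand-in dictionary: 10,000 fake constants
table = {f"const_{i}": float(i) for i in range(10_000)}

def fetch_n(n):
    # One O(1) lookup per name, n lookups total
    return [table[f"const_{i}"] for i in range(n)]

for n in (10, 100, 1000):
    t = timeit.timeit(lambda: fetch_n(n), number=1000)
    print(f"n={n:5d}: {t:.4f}s")
```

Running this, each tenfold increase in n should roughly multiply the elapsed time by ten, matching the table above.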
[X] Wrong: "Fetching multiple constants is instant and does not depend on how many I ask for."
[OK] Correct: Each constant requires a separate lookup, so more constants mean more work and more time.
Understanding how work grows with input size helps you explain efficiency clearly and shows you can think about performance in real tasks.
"What if we stored all constants in a dictionary once and then accessed them repeatedly? How would the time complexity change?"