Global optimization techniques in Compiler Design - Time & Space Complexity
Global optimization techniques improve the whole program's performance by analyzing all parts together.
We want to know how the time to optimize grows as the program size increases.
Analyze the time complexity of the following global optimization process.
```
repeat until no changes:
    for each function in program:
        build control flow graph
        for each basic block in function:
            analyze data flow
            apply optimizations
```
This pseudocode shows a global optimizer repeatedly scanning every function and every basic block, and stopping only once a full pass produces no further changes (a fixed point).
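The loop above can be sketched in Python. This is a minimal illustration, not a real compiler: the program is a list of functions, each function a list of basic blocks, each block a list of `(dest, op, args)` tuples, and the only "optimization" is folding an `add` of two integer literals. All names here are invented for the example.

```python
def fold_constants(block):
    """One local optimization: fold 'add' of two integer literals."""
    changed = False
    for i, (dest, op, args) in enumerate(block):
        if op == "add" and all(isinstance(a, int) for a in args):
            block[i] = (dest, "const", [args[0] + args[1]])
            changed = True
    return changed

def optimize(program):
    """Repeat over all functions and blocks until a pass changes nothing."""
    passes = 0
    while True:                      # repeat until no changes
        changed = False
        passes += 1
        for function in program:     # every function...
            for block in function:   # ...and every basic block
                changed |= fold_constants(block)
        if not changed:
            return passes

prog = [[[("t0", "add", [2, 3]), ("t1", "add", [4, 5])]]]
print(optimize(prog))   # 2 passes: one that folds, one that confirms
print(prog[0][0][0])    # ('t0', 'const', [5])
```

Note that even when everything folds in the first pass, one extra pass is always needed just to confirm nothing changed, which is characteristic of fixed-point algorithms.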
Look for loops and repeated analysis steps.
- Primary operation: Iterating over all functions and their blocks to analyze and optimize.
- How many times: Repeated passes until a fixed point is reached (no further changes); the number of passes depends on the program's size and structure.
As the program grows, the number of functions and blocks grows, increasing work each pass.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 functions | Few hundred operations |
| 100 functions | Thousands of operations |
| 1000 functions | Hundreds of thousands of operations |
Pattern observation: Work grows roughly with the number of functions times blocks, multiplied by the number of passes.
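The rough figures in the table can be reproduced with a back-of-envelope calculation. The block counts and pass counts below are assumed illustrative values (both taken to grow slowly with program size), not measurements:

```python
def estimated_ops(n_functions):
    """Rough operation count: functions * blocks per function * passes.
    The per-function block count and pass count are assumed figures."""
    blocks_per_fn = 5 if n_functions <= 100 else 10               # assumed
    passes = 5 if n_functions <= 10 else (10 if n_functions <= 100 else 20)
    return n_functions * blocks_per_fn * passes

for n in (10, 100, 1000):
    print(n, estimated_ops(n))
# 10 → 250, 100 → 5000, 1000 → 200000
```

Because blocks per function and passes tend to grow alongside the function count, the total work grows faster than linearly in practice.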
Time Complexity: O(n * m * p)
Here n is the number of functions, m the average number of basic blocks per function, and p the number of passes needed to reach a fixed point. For large programs, p itself tends to grow, so total optimization time grows faster than the code size alone would suggest.
[X] Wrong: "Global optimization always takes the same time regardless of program size."
[OK] Correct: Larger programs have more functions and blocks, so the optimizer must do more work, increasing time.
Understanding how optimization time grows helps you explain compiler performance and trade-offs clearly in interviews.
What if the optimizer only made one pass instead of repeating until no changes? How would the time complexity change?
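One way to see the difference is a chain of instructions where each fold enables the next. In the illustrative sketch below (same toy IR as before; `one_pass` is an invented helper), a single pass folds only the first link, while iterating to a fixed point folds the whole chain at the cost of p passes:

```python
def one_pass(block, known):
    """Substitute constants discovered in *previous* passes, then fold
    literal adds. Returns True if this pass changed anything."""
    changed = False
    new_known = {}
    for i, (dest, op, args) in enumerate(block):
        args = [known.get(a, a) for a in args]    # only last pass's facts
        if op == "add" and all(isinstance(a, int) for a in args):
            new_known[dest] = sum(args)
            block[i] = (dest, "const", [sum(args)])
            changed = True
        else:
            block[i] = (dest, op, args)
    known.update(new_known)
    return changed

# t1 depends on t0, t2 depends on t1: each pass folds one more link.
block = [("t0", "add", [1, 2]),
         ("t1", "add", ["t0", 3]),
         ("t2", "add", ["t1", 4])]
known = {}
passes = 0
while one_pass(block, known):
    passes += 1
print(passes)     # 3 changing passes before the fixed point
print(block[2])   # ('t2', 'const', [10])
```

With only one pass, `t1` and `t2` would remain unfolded: dropping the outer loop makes the complexity O(n * m) but sacrifices optimizations that depend on the results of earlier optimizations.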