# Available Expressions Analysis in Compiler Design: Time & Space Complexity
Analyzing time complexity helps us understand how the cost of available expressions analysis grows as the program size increases.
We want to know how the number of operations changes when the program has more expressions or statements.
Analyze the time complexity of the following pseudocode for available expressions analysis.
```
for each basic block B in the program:
    IN[B]  = empty set
    OUT[B] = set of all expressions    // except the entry block, where OUT[entry] = GEN[entry]

repeat
    for each basic block B other than the entry:
        IN[B]  = intersection of OUT[P] over all predecessors P of B
        OUT[B] = GEN[B] union (IN[B] - KILL[B])
until no IN or OUT set changes
```
Note the entry block: because available expressions is a "must" analysis (an expression must reach along every path), OUT of the entry block starts from GEN[entry], not from the full universe; all other OUT sets start optimistically at the full universe and can only shrink.
This algorithm computes which expressions are available at each program point by repeatedly updating the IN and OUT sets until they reach a fixed point, i.e. until a full pass produces no change.
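The pseudocode above can be sketched in Python. This is a minimal illustration, not a production implementation: the block names, GEN/KILL sets, and the three-block straight-line CFG are assumptions chosen to keep the example small.

```python
# Minimal sketch of iterative available-expressions analysis.
# GEN/KILL sets and the CFG below are illustrative assumptions.

def available_expressions(blocks, preds, gen, kill, universe, entry):
    """Iterate until IN/OUT stabilize; `universe` is the set of all expressions."""
    out = {b: set(universe) for b in blocks}  # optimistic initialization
    out[entry] = set()                        # entry's OUT is recomputed to GEN[entry] below
    in_ = {b: set() for b in blocks}          # nothing is available at program entry
    changed = True
    while changed:
        changed = False
        for b in blocks:
            if b != entry and preds[b]:
                # "Must" analysis: available only if it reaches along every path.
                in_[b] = set.intersection(*(out[p] for p in preds[b]))
            new_out = gen[b] | (in_[b] - kill[b])
            if new_out != out[b]:
                out[b] = new_out
                changed = True
    return in_, out

# Toy CFG: B1 -> B2 -> B3, where "a+b" is computed in B1 and killed in B2.
blocks = ["B1", "B2", "B3"]
preds = {"B1": [], "B2": ["B1"], "B3": ["B2"]}
gen = {"B1": {"a+b"}, "B2": set(), "B3": set()}
kill = {"B1": set(), "B2": {"a+b"}, "B3": set()}
in_, out = available_expressions(blocks, preds, gen, kill, {"a+b"}, "B1")
```

Here `in_["B2"]` contains `"a+b"` (it is available on entry to B2), while `in_["B3"]` is empty because B2 killed the expression.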
Look for loops and repeated updates in the code.
- Primary operation: The repeated loop that updates IN and OUT sets for each basic block.
- How many times: The repeat loop runs until a full pass produces no change. Because each OUT set starts at the full universe and can only shrink, the number of passes is bounded; with a good block ordering it is usually small.
As the number of basic blocks and expressions grows, the number of updates increases.
| Input Size (n = number of expressions) | Approx. Operations |
|---|---|
| 10 | Few hundred updates |
| 100 | Thousands of updates |
| 1000 | Hundreds of thousands of updates |
Pattern observation: The work grows roughly with the product of expressions and blocks times the number of iterations until stabilization.
Time Complexity: O(n * b * i)
Each pass visits all b basic blocks and performs set operations whose cost is proportional to the number of expressions n; i is the number of passes needed to reach stability. On reducible control-flow graphs visited in reverse postorder, i is typically small (roughly the loop-nesting depth plus two).
[X] Wrong: "The analysis finishes in just one pass over the blocks."
[OK] Correct: The sets depend on each other and usually require multiple passes until no changes happen.
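To see why one pass is not enough in general, consider a straight-line chain of blocks visited in the worst possible order; facts then move only one block per sweep. The sketch below counts sweeps under assumed GEN/KILL placements (the single expression `a+b` and the reverse visiting order are illustrative choices, not part of the original algorithm):

```python
# Count full sweeps the naive algorithm needs on a chain B1 -> B2 -> ... -> Bk
# when blocks are visited in reverse order (a worst-case ordering).
# The expression "a+b" and its GEN/KILL placement are illustrative assumptions.

def count_sweeps(k):
    blocks = [f"B{i}" for i in range(1, k + 1)]
    preds = {blocks[0]: []}
    for i in range(1, k):
        preds[blocks[i]] = [blocks[i - 1]]
    gen = {b: set() for b in blocks}
    kill = {b: set() for b in blocks}
    gen["B1"] = {"a+b"}       # a+b becomes available in B1...
    kill["B2"] = {"a+b"}      # ...and is killed again in B2
    out = {b: {"a+b"} for b in blocks}  # optimistic initialization
    out["B1"] = set()                   # entry block
    in_ = {b: set() for b in blocks}
    sweeps = 0
    changed = True
    while changed:
        changed = False
        sweeps += 1
        for b in reversed(blocks):      # worst-case visiting order
            if preds[b]:
                in_[b] = set.intersection(*(out[p] for p in preds[b]))
            new_out = gen[b] | (in_[b] - kill[b])
            if new_out != out[b]:
                out[b] = new_out
                changed = True
    return sweeps
```

The "killed" fact propagates one block per sweep, so a chain of k blocks takes about k sweeps in this order; visiting the blocks in forward order would stabilize in just two.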
Understanding this time complexity shows you can reason about iterative data flow analyses, a key skill in compiler design and optimization.
"What if we used a worklist algorithm that only updates changed blocks? How would the time complexity change?"
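One way to explore that question: a worklist keeps only the blocks whose inputs may have changed, so already-stable regions of the CFG are never rescanned. The sketch below is one possible formulation under assumed inputs; the successor map `succs` and the toy CFG are illustrative additions:

```python
from collections import deque

# Worklist variant: revisit only blocks whose predecessors' OUT sets changed.

def available_expressions_worklist(blocks, preds, succs, gen, kill, universe, entry):
    out = {b: set(universe) for b in blocks}
    out[entry] = set()
    in_ = {b: set() for b in blocks}
    work = deque(blocks)                 # start with every block pending
    while work:
        b = work.popleft()
        if b != entry and preds[b]:
            in_[b] = set.intersection(*(out[p] for p in preds[b]))
        new_out = gen[b] | (in_[b] - kill[b])
        if new_out != out[b]:
            out[b] = new_out
            # Only successors can be affected by a changed OUT[b].
            work.extend(s for s in succs[b] if s not in work)
    return in_, out

# Same toy CFG as before: B1 -> B2 -> B3.
blocks = ["B1", "B2", "B3"]
preds = {"B1": [], "B2": ["B1"], "B3": ["B2"]}
succs = {"B1": ["B2"], "B2": ["B3"], "B3": []}
gen = {"B1": {"a+b"}, "B2": set(), "B3": set()}
kill = {"B1": set(), "B2": {"a+b"}, "B3": set()}
in_, out = available_expressions_worklist(blocks, preds, succs, gen, kill, {"a+b"}, "B1")
```

Instead of b blocks per pass, each block is processed only as many times as its inputs actually change, so total work is proportional to the number of set changes rather than to full sweeps of the CFG.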