
Local optimization (peephole) in Compiler Design - Time & Space Complexity

Time Complexity: Local optimization (peephole)
O(n)
Understanding Time Complexity

Local optimization, or peephole optimization, rewrites small, fixed-size groups of instructions into faster equivalents.

We want to know how the time to optimize grows as the code size grows.

Scenario Under Consideration

Analyze the time complexity of the following peephole optimization code snippet.


# Assumes optimize() returns the same number of instructions it consumed;
# otherwise the range computed up front goes stale as the list shrinks or grows.
for i in range(len(instructions) - window_size + 1):
    window = instructions[i:i+window_size]
    if can_optimize(window):
        optimized = optimize(window)
        instructions[i:i+window_size] = optimized

This code looks at small groups of instructions and tries to replace them with faster ones.
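To make the idea concrete, here is a minimal runnable sketch. The rule it applies is hypothetical, chosen for illustration: a multiply-by-2 is rewritten as a cheaper left shift (classic strength reduction). The instruction strings, `can_optimize`, and `optimize` are assumptions, not part of the original snippet.

```python
# A toy peephole pass with window_size = 2 and one hypothetical rule:
# "LOAD r1" followed by "MUL r1 2" becomes "LOAD r1" then "SHL r1 1".
def can_optimize(window):
    return window == ["LOAD r1", "MUL r1 2"]

def optimize(window):
    # Strength reduction: multiply by 2 -> shift left by 1 (same length).
    return ["LOAD r1", "SHL r1 1"]

def peephole(instructions, window_size=2):
    i = 0
    # A while loop re-reads len() each pass, so it stays correct even if
    # a rule were to change the instruction count.
    while i <= len(instructions) - window_size:
        window = instructions[i:i + window_size]
        if can_optimize(window):
            instructions[i:i + window_size] = optimize(window)
        i += 1
    return instructions

code = ["LOAD r1", "MUL r1 2", "STORE r1"]
print(peephole(code))  # ['LOAD r1', 'SHL r1 1', 'STORE r1']
```

Each call to `can_optimize` inspects only `window_size` items, which is why a single pass stays linear in the number of instructions.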

Identify Repeating Operations
  • Primary operation: sliding a fixed-size window over the instruction list.
  • How many times: n − window_size + 1 times, where n is the number of instructions.
How Execution Grows With Input

As the number of instructions grows, the optimizer checks more windows, but each window is small and fixed.

Input Size (n) | Approx. Operations
10             | About 10 checks
100            | About 100 checks
1000           | About 1000 checks

Pattern observation: The number of checks grows directly with the number of instructions.
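The table above can be reproduced with a one-line count. This sketch assumes a fixed window of 3 (the exact window size does not matter for the growth rate, only that it is constant):

```python
# Number of windows examined for a list of n instructions with a fixed window.
def num_checks(n, window_size=3):
    # max() guards the degenerate case where the list is shorter than the window.
    return max(0, n - window_size + 1)

for n in (10, 100, 1000):
    print(n, num_checks(n))  # 8, 98, 998 — always within a constant of n
```

Because the count differs from n only by the fixed constant `window_size - 1`, the growth is linear.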

Final Time Complexity

Time Complexity: O(n)

This means optimization time grows linearly: doubling the length of the code roughly doubles the number of window checks.

Common Mistake

[X] Wrong: "Peephole optimization takes constant time no matter how big the code is."

[OK] Correct: The optimizer must check each part of the code, so more code means more checks.

Interview Connect

Understanding how local optimization scales helps you explain compiler efficiency clearly and confidently.

Self-Check

"What if the window size grew with the input size? How would the time complexity change?"