Left recursion elimination in Compiler Design - Time & Space Complexity
When we remove left recursion from grammar rules, we want to know how the work grows as the grammar gets larger. How much effort does it take to rewrite the rules without left recursion? Let's analyze the time complexity of the elimination process.
// Given a grammar rule: A -> A alpha | beta
// Eliminate the immediate left recursion:
A -> beta A'
A' -> alpha A' | epsilon
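For a concrete instance (a standard textbook example, not from the original), the expression grammar E -> E + T | T matches this pattern with alpha = "+ T" and beta = "T", so it rewrites to:

E -> T E'
E' -> + T E' | epsilon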
// Algorithm steps:
for each non-terminal A:
    for each production of A:
        if the production starts with A:
            remove left recursion by creating A'
        else:
            keep the production as is
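As a minimal sketch, the pseudocode above could be implemented in Python like this (the dict-of-production-lists grammar representation and the function name are assumptions for illustration, not from the original):

```python
def eliminate_left_recursion(grammar):
    """Remove immediate left recursion from a grammar.

    `grammar` maps each non-terminal to a list of productions,
    where each production is a list of symbols. Epsilon is
    represented as an empty list. Returns a new grammar.
    """
    result = {}
    for nt, productions in grammar.items():
        # Split productions into A -> A alpha (left-recursive)
        # and A -> beta (everything else).
        recursive = [p[1:] for p in productions if p and p[0] == nt]
        others = [p for p in productions if not p or p[0] != nt]
        if not recursive:
            # No left recursion on this non-terminal; keep it unchanged.
            result[nt] = productions
            continue
        new_nt = nt + "'"
        # A  -> beta A'
        result[nt] = [beta + [new_nt] for beta in others]
        # A' -> alpha A' | epsilon
        result[new_nt] = [alpha + [new_nt] for alpha in recursive] + [[]]
    return result


# Usage: E -> E + T | T  becomes  E -> T E'  and  E' -> + T E' | epsilon
grammar = {"E": [["E", "+", "T"], ["T"]]}
print(eliminate_left_recursion(grammar))
```

Each production is examined exactly once, which is the single linear scan the analysis below counts.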
This procedure rewrites grammar rules to remove left recursion. To estimate its cost, look for loops or repeated checks in the process.
- Primary operation: Checking each production of each non-terminal for left recursion.
- How many times: Once for every production in the grammar.
As the number of grammar rules and productions grows, the work grows roughly in proportion.
| Input Size (n productions) | Approx. Operations |
|---|---|
| 10 | About 10 checks |
| 100 | About 100 checks |
| 1000 | About 1000 checks |
Pattern observation: The work grows linearly as the number of productions increases.
Time Complexity: O(n)
This means the time to eliminate left recursion grows directly with the number of grammar productions, n.
[X] Wrong: "Eliminating left recursion takes exponential time because of recursive rules."
[OK] Correct: The process only scans each production once and rewrites it, so it grows linearly, not exponentially.
Understanding how left recursion elimination scales helps you explain grammar transformations clearly and confidently in interviews.
"What if the grammar has indirect left recursion? How would that affect the time complexity of elimination?"