Armstrong's axioms in DBMS Theory - Time & Space Complexity
We want to understand how the running time of deriving all implied functional dependencies with Armstrong's axioms grows.
How does the number of steps change as the number of attributes and dependencies increases?
```
-- Given a set of functional dependencies (FDs)
-- Apply Armstrong's axioms: reflexivity, augmentation, transitivity
-- (the loop below shows the transitivity step; the other axioms are
--  applied in the same repeat-until-stable fashion)
-- Repeat until no new FDs are found
WHILE new FDs are added DO
    FOR each FD X -> Y DO
        FOR each FD W -> Z DO
            IF W is a subset of Y AND X -> Z is not already present THEN
                Add FD X -> Z
            END IF
        END FOR
    END FOR
END WHILE
```
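The pseudocode above can be sketched as runnable Python. This is a minimal illustration, assuming each FD is stored as a `(frozenset(LHS), frozenset(RHS))` pair; like the pseudocode, it applies only the transitivity step.

```python
# A runnable sketch of the transitivity loop above. FDs are stored as
# (frozenset(LHS), frozenset(RHS)) pairs; only transitivity is applied,
# exactly as in the pseudocode.
def transitive_step_closure(fds):
    """Repeatedly apply transitivity until no new FD X -> Z appears."""
    result = set(fds)
    changed = True
    while changed:                      # repeat until no new FDs are added
        changed = False
        for x, y in list(result):       # each FD X -> Y
            for w, z in list(result):   # each FD W -> Z
                # W is a subset of Y, and X -> Z is not already present
                if w <= y and (x, z) not in result:
                    result.add((x, z))  # derive X -> Z by transitivity
                    changed = True
    return result

# Example: A -> B and B -> C imply A -> C.
fds = {(frozenset("A"), frozenset("B")), (frozenset("B"), frozenset("C"))}
closure = transitive_step_closure(fds)
```

The `list(result)` snapshots let the loop add new FDs to `result` safely while iterating; the `changed` flag mirrors the outer `WHILE` in the pseudocode.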
Together with reflexivity and augmentation, this repeated process derives every functional dependency implied by the initial set.
Look for loops and repeated checks in the process.
- Primary operation: Nested loops comparing every pair of functional dependencies.
- How many times: Repeated until no new dependency is added; in the worst case, roughly once per derived FD.
As the number of attributes and dependencies grows, the number of pairs to check grows quickly.
| Input Size (number of FDs) | Approx. Pair Checks per Iteration |
|---|---|
| 10 | ~100 |
| 100 | ~10,000 |
| 1,000 | ~1,000,000 |
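The per-iteration counts in the table are just n × n pairwise comparisons, which a quick loop confirms:

```python
# Each iteration compares every FD against every other FD: n * n pair checks.
for n in (10, 100, 1000):
    print(f"{n} FDs -> {n * n} checks per iteration")
```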
Pattern observation: The number of pair checks in a single iteration grows with the square of the number of dependencies (n × n).
Time Complexity: O(n^3)
Each iteration performs about n^2 pair checks, and the outer loop may repeat on the order of n times before no new FDs appear, so the total work grows roughly with the cube of the number of initial dependencies (n^2 checks × n iterations).
[X] Wrong: "Applying Armstrong's axioms takes linear time because we just check each dependency once."
[OK] Correct: The process repeats many times and compares pairs of dependencies, causing the time to grow much faster than linear.
Understanding how Armstrong's axioms scale helps you reason about the cost of database design and normalization tasks in real projects.
"What if we had a way to avoid rechecking all pairs every time? How would that affect the time complexity?"
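One standard answer to that question (a sketch here, not part of the lesson above) is the attribute-closure algorithm: instead of materializing every implied FD, compute the closure X+ of just the attribute set you care about. Each pass scans the FD list once, so the work per query is roughly linear in the number of FDs per pass rather than quadratic in pairs.

```python
# Attribute-closure sketch: compute X+, the set of all attributes determined
# by `attrs` under `fds`, without enumerating every implied FD.
def attribute_closure(attrs, fds):
    """Return X+ for the attribute set `attrs` under the FD list `fds`."""
    closure = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If X -> Y and X is already in the closure, add Y to the closure.
            if lhs <= closure and not rhs <= closure:
                closure |= rhs
                changed = True
    return closure

fds = [(frozenset("A"), frozenset("B")), (frozenset("B"), frozenset("C"))]
print(sorted(attribute_closure({"A"}, fds)))  # ['A', 'B', 'C']
```

Because each pass only re-scans the FD list (not all pairs of FDs), and each pass that continues must add at least one attribute, the loop terminates after at most one pass per attribute.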