Compile-time polymorphism in Java - Time & Space Complexity
Let's explore the time cost of using compile-time polymorphism in Java.
We want to see how method overloading affects the number of operations as the input changes.
Analyze the time complexity of the following code snippet.
```java
class Calculator {
    // Overload 1: two parameters
    int add(int a, int b) {
        return a + b;
    }

    // Overload 2: same method name, three parameters
    int add(int a, int b, int c) {
        return a + b + c;
    }
}

public class Main {
    public static void main(String[] args) {
        Calculator calc = new Calculator();
        int result = calc.add(5, 10);       // calls the two-argument overload
        int result2 = calc.add(5, 10, 15);  // calls the three-argument overload
    }
}
```
This code shows method overloading where the same method name handles different numbers of inputs.
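To make the dispatch visible, here is a hypothetical variant of the Calculator above (the labels in the return values are made up for illustration) in which each overload reports which version ran:

```java
// Hypothetical variant of the article's Calculator: each overload
// labels its result so the chosen method is visible at runtime.
class Calculator {
    String add(int a, int b) {
        return "two-arg overload: " + (a + b);
    }

    String add(int a, int b, int c) {
        return "three-arg overload: " + (a + b + c);
    }
}

public class Main {
    public static void main(String[] args) {
        Calculator calc = new Calculator();
        System.out.println(calc.add(5, 10));      // matches the two-argument version
        System.out.println(calc.add(5, 10, 15));  // matches the three-argument version
    }
}
```

Running this prints `two-arg overload: 15` followed by `three-arg overload: 30`: each call site was bound to exactly one overload when the code was compiled.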
Identify any loops, recursion, or array traversals that repeat work.
- Primary operation: Simple addition operations inside overloaded methods.
- How many times: Each method runs once per call; there are no loops or recursion.
Since each method performs a fixed number of additions, the amount of work stays constant regardless of the values passed in.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 1 or 2 additions per call |
| 100 | Still about 1 or 2 additions per call |
| 1000 | Still about 1 or 2 additions per call |
Pattern observation: The number of operations does not grow with input size; it stays constant.
Time Complexity: O(1)
This means the time to run these methods stays the same no matter how big the input is.
[X] Wrong: "More overloaded methods mean slower performance because the program checks all methods each time."
[OK] Correct: The compiler resolves which overload each call uses at compile time (static binding), so no extra lookup happens at runtime.
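A hedged illustration of that point: overload selection depends on the static (declared) type of the argument, fixed at compile time, not on its runtime type. The class and method names below are invented for this example:

```java
// Overload choice is fixed at compile time from the declared type,
// even when the runtime type of the argument is more specific.
class Greeter {
    String greet(Object o) { return "Object overload"; }
    String greet(String s) { return "String overload"; }
}

public class Main {
    public static void main(String[] args) {
        Greeter g = new Greeter();
        Object o = "hello";                 // runtime type String, static type Object
        System.out.println(g.greet(o));     // prints "Object overload"
        System.out.println(g.greet("hi"));  // prints "String overload"
    }
}
```

The first call picks `greet(Object)` even though `o` holds a `String` at runtime, because the compiler only sees the declared type; no per-call checking of "all methods" ever happens while the program runs.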
Understanding compile-time polymorphism lets you explain why overloaded method calls add no runtime overhead: Java resolves method selection before the program ever runs.
"What if the overloaded methods used loops inside? How would that change the time complexity?"
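One possible answer, sketched with a hypothetical array-summing overload: as soon as a method's work depends on the size of its input, the complexity grows with that size.

```java
// Hypothetical extra overload: summing an array takes one addition
// per element, so this version is O(n) in the array length, while
// the fixed-argument version stays O(1).
class Calculator {
    int add(int a, int b) {
        return a + b;          // O(1): a single addition
    }

    int add(int[] values) {
        int sum = 0;
        for (int v : values) { // loop body runs once per element
            sum += v;
        }
        return sum;            // O(n) overall
    }
}

public class Main {
    public static void main(String[] args) {
        Calculator calc = new Calculator();
        System.out.println(calc.add(5, 10));                 // prints 15
        System.out.println(calc.add(new int[]{5, 10, 15}));  // prints 30
    }
}
```

Overloading itself still costs nothing extra at runtime; only the loop inside the array version changes the complexity, from O(1) to O(n).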