Function declaration syntax in Swift - Time & Space Complexity
We want to understand how the time it takes to run a function changes as the input size changes.
Specifically, we ask: how does declaring and calling a simple function affect execution time?
Analyze the time complexity of the following code snippet.
```swift
func greet(name: String) {
    print("Hello, \(name)!")
}

greet(name: "Alice")
```
This code defines a function that prints a greeting and then calls it once.
Identify the operations that repeat: loops, recursion, or array traversals.
- Primary operation: A single function call that prints a message.
- How many times: Exactly once in this example.
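One way to check the "exactly once" claim is to instrument the function with a counter. This is a minimal sketch; the `callCount` variable is hypothetical, added here purely for illustration:

```swift
// Hypothetical instrumentation: count how many times the body runs.
var callCount = 0

func greet(name: String) {
    callCount += 1              // record each invocation
    print("Hello, \(name)!")
}

greet(name: "Alice")
// callCount is now 1: the body executed exactly once,
// doing a fixed amount of work.
```

Whatever the rest of the program's input size is, this call contributes one unit of work.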
Since the function runs once and does a fixed amount of work, the time does not grow with input size.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 1 |
| 100 | 1 |
| 1000 | 1 |
Pattern observation: The work stays the same no matter how big the input is.
Time Complexity: O(1)
This means the call takes a constant amount of time regardless of input size.
[X] Wrong: "Declaring a function makes the program slower as input grows."
[OK] Correct: Declaring a function itself does not repeat work or depend on input size; only what happens inside matters.
Understanding that simple function calls run in constant time helps you explain code efficiency clearly and confidently.
"What if the function contained a loop over an array of size n? How would the time complexity change?"
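One possible answer, sketched in Swift (the function name `greetAll` and the `operations` counter are illustrative, not from the original snippet): if the body loops over an array of size n, each call now performs n units of work, so the call becomes O(n).

```swift
// Hypothetical counter to make the work per call visible.
var operations = 0

func greetAll(names: [String]) {
    for name in names {          // loop body runs names.count times
        operations += 1
        print("Hello, \(name)!")
    }
}

greetAll(names: ["Alice", "Bob", "Carol"])
// operations == 3 for n = 3; doubling the array doubles the work,
// so the time complexity is O(n).
```

Declaring `greetAll` is still a one-time, constant-cost step; it is the loop inside the body that makes each call scale with n.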