Numeric literal formats in Swift - Time & Space Complexity
Let's see how the time it takes to read and use numeric literals changes as the numbers get longer.
We want to know how the program's work grows across the different numeric formats.
Analyze the time complexity of the following code snippet.
let decimal = 12345               // base-10 (decimal) literal
let binary = 0b11000000111001     // base-2 (binary) literal, 0b prefix
let octal = 0o30071               // base-8 (octal) literal, 0o prefix
let hexadecimal = 0x3039          // base-16 (hexadecimal) literal, 0x prefix
print(decimal)
print(binary)
print(octal)
print(hexadecimal)
This code defines the same value, 12345, in four different numeric formats and prints it four times.
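As a quick sanity check (a small sketch added here, not part of the original snippet), all four notations produce the identical Int value:

// All four notations spell the same Int value.
print(12345 == 0b11000000111001)   // true
print(12345 == 0o30071)            // true
print(12345 == 0x3039)             // true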
Identify the operations that repeat, such as loops, recursion, or array traversals.
- Primary operation: Converting each numeric literal from its format to a usable number.
- How many times: Once per literal, four times total here.
As the number of digits in the literal grows, the work to convert it grows too.
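The digit-by-digit work is easiest to picture with a hand-rolled reader. The sketch below is illustrative only (parseLiteral and its details are assumptions, not Swift's actual literal parser): it visits each character of the literal once, which is where the linear cost comes from.

// Hypothetical digit-by-digit reader: one loop iteration per character,
// so the work grows in proportion to the number of digits.
func parseLiteral(_ text: String, radix: Int) -> Int? {
    var value = 0
    for character in text {
        guard let digit = character.hexDigitValue, digit < radix else { return nil }
        value = value * radix + digit   // one constant-time step per digit
    }
    return value
}

print(parseLiteral("12345", radix: 10) ?? 0)          // 12345, read in 5 steps
print(parseLiteral("11000000111001", radix: 2) ?? 0)  // 12345, read in 14 steps
print(parseLiteral("3039", radix: 16) ?? 0)           // 12345, read in 4 steps

Notice that the binary form needs more digits than the hexadecimal form to express the same value, so it takes more steps to read.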
| Input Size (digits) | Approx. Operations |
|---|---|
| 5 | ~5 steps (one per digit) |
| 10 | ~10 steps |
| 20 | ~20 steps |
Pattern observation: The work grows in direct proportion to the number of digits in the literal.
Time Complexity: O(n)
This means the time to process a numeric literal grows in a straight line with the number of digits it has; here, n is the digit count.
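To mirror the table above, a rough (and admittedly simplistic) way to see the pattern is to tally one step per character scanned; the tally tracks the digit count exactly, which is what O(n) captures here.

// Count one "step" per digit read; the count equals the literal's length.
func stepsToRead(_ literalText: String) -> Int {
    var steps = 0
    for _ in literalText { steps += 1 }   // one step per digit
    return steps
}

print(stepsToRead("12345"))                  // 5 digits  -> 5 steps
print(stepsToRead("1234567890"))             // 10 digits -> 10 steps
print(stepsToRead("12345678901234567890"))   // 20 digits -> 20 steps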
[X] Wrong: "All numeric literals take the same time to process no matter their size or format."
[OK] Correct: A literal with more digits takes more steps to read, and some formats (binary, for example) need more digits than others to write the same value, so they take more time.
Understanding how numeric literals are processed helps you explain how programs handle data efficiently, a useful skill in many coding situations.
"What if we changed the numeric literals to include underscores for readability? How would the time complexity change?"