Consider the following Swift code snippet. What will be printed?
let number = 42
let message = "The answer is \(number)"
print(message)
Look at how Swift infers the type of number and uses string interpolation.
Swift infers number as an Int. The string interpolation inserts the value 42 into the string, so the output is "The answer is 42".
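A quick sketch of the same idea (the names `count` and `fruit` are illustrative): interpolation accepts any expression, not just a variable name.

```swift
let count = 3
let fruit = "apples"

// Any expression can appear inside \( ... ), including arithmetic.
print("I have \(count) \(fruit), or \(count * 2) after doubling.")
// prints "I have 3 apples, or 6 after doubling."
```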
Given this Swift code, what type does the compiler infer for value?
let value = 3.14
Swift defaults to a certain floating-point type when no explicit type is given.
Swift infers value as a Double by default for floating-point literals.
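To see the default, and how an explicit annotation overrides it, consider this small sketch (variable names are illustrative):

```swift
let inferredValue = 3.14           // no annotation: inferred as Double
let explicitValue: Float = 3.14    // annotation overrides the default

print(type(of: inferredValue))     // prints "Double"
print(type(of: explicitValue))     // prints "Float"
```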
Examine the code below. Why does the compiler produce an error?
let x = nil
print(x)
Think about what type nil represents and how Swift infers types.
Swift cannot infer a type for x because the literal nil has no type of its own; it only adopts an optional type from context. You must provide an explicit type annotation, such as an optional type.
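One way to fix the error, as a sketch (the choice of Int? here is an illustrative assumption; any optional type would satisfy the compiler):

```swift
// let x = nil          // error: 'nil' requires a contextual type

let x: Int? = nil       // annotation gives nil a type to adopt
print(x as Any)         // prints "nil" ('as Any' silences the optional-coercion warning)
```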
Consider this Swift code snippet. What is the inferred type of increment and what will be printed?
let increment = { (number: Int) in number + 1 }
print(increment(5))
Look at the closure parameter and return expression.
The closure's parameter is annotated as Int, and its body number + 1 evaluates to Int, so increment is inferred as (Int) -> Int. Calling increment(5) returns 6, so 6 is printed.
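The inferred type can be confirmed by writing it out explicitly; this equivalent sketch (the name `incrementExplicit` is illustrative) annotates the closure as (Int) -> Int:

```swift
// Same closure with its full type written out.
let incrementExplicit: (Int) -> Int = { number in number + 1 }

print(incrementExplicit(5))          // prints "6"
print(type(of: incrementExplicit))   // prints "(Int) -> Int"
```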
What is the inferred type of result in the following Swift code?
let result = 5 + 3.0
Integer literals in Swift have no fixed type; the compiler infers their type from the surrounding expression.
The literal 5 is untyped until the compiler infers a type for it. Because the other operand is 3.0, the compiler infers 5 as a Double, so result is inferred as Double (8.0). Note that Swift never implicitly converts a typed Int value to Double; this only works because 5 is a literal.
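The contrast between a literal and a typed variable can be sketched as follows (the names `five` and `ok` are illustrative):

```swift
let result = 5 + 3.0          // fine: the literal 5 is inferred as Double
print(result)                 // prints "8.0"

let five = 5                  // five is a typed Int
// let bad = five + 3.0       // error: no implicit Int -> Double conversion
let ok = Double(five) + 3.0   // an explicit conversion is required
print(ok)                     // prints "8.0"
```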