Which of the following operations typically runs in constant time, O(1)?
Think about operations that take the same amount of time regardless of input size.
Accessing an element by index in an array takes the same amount of time no matter how large the array is, so it is O(1). Searching for a value or traversing the array depends on the number of elements, so those operations are not constant time.
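To make this concrete, here is a short sketch (using Python lists as the array) showing that an index lookup is a single step regardless of size:

```python
small = list(range(10))
large = list(range(1_000_000))

# Indexing computes the element's location directly from the index,
# so both lookups take one step; neither depends on list length.
x = small[5]  # O(1)
y = large[5]  # O(1), even though the list is 100,000x larger
```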
Which scenario best represents an algorithm with linear time complexity O(n)?
Consider how the time grows as the input size increases.
Scanning each element to find a value means the time grows directly with the number of elements, which is linear O(n). Checking whether a number is even or odd, or accessing a fixed position, takes constant time.
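A minimal sketch of the linear scan described above (the function name `linear_search` is illustrative):

```python
def linear_search(items, target):
    """Examine each element once: worst case visits all n elements, O(n)."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1  # target not found after scanning everything
```

Doubling the list length doubles the worst-case number of comparisons, which is exactly the linear growth in the explanation.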
Which of the following algorithms typically runs in logarithmic time O(log n)?
Think about algorithms that repeatedly divide the problem size in half.
Binary search divides the search space in half each step, so it runs in O(log n). Linear search and bubble sort take longer as input grows (O(n) and O(n²) respectively), and computing a factorial takes one step per number, which is linear, not logarithmic.
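The halving behavior can be sketched as a standard iterative binary search (this assumes the input is already sorted):

```python
def binary_search(sorted_items, target):
    """Halve the search space each step: O(log n) comparisons."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1  # discard the lower half
        else:
            hi = mid - 1  # discard the upper half
    return -1
```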
Which example best illustrates an algorithm with quadratic time complexity O(n²)?
Consider how many times the inner operations run as input size grows.
Nested loops each running n times result in n × n = n² operations, which is quadratic time. Single loops are linear, recursive halving is logarithmic, and hash table access is constant time.
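A sketch of the nested-loop pattern, counting how many times the inner operation runs (the function `count_pairs` is illustrative):

```python
def count_pairs(items):
    """Nested loops over the same n items: n * n = n^2 inner operations."""
    count = 0
    for a in items:          # runs n times
        for b in items:      # runs n times per outer iteration
            count += 1       # executed n * n times in total
    return count
```

For a 3-element list the inner body runs 3 × 3 = 9 times; for 10 elements, 100 times, showing the quadratic growth.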
What is the time complexity of the following code snippet?
def example_function(arr):
    for i in range(len(arr)):
        j = len(arr)
        while j > 0:
            j //= 2
            print(arr[i])

Analyze how many times the inner while loop runs for each iteration of the outer for loop.
The outer loop runs n times. The inner while loop divides j by 2 each time, running about log n times. Multiplying gives O(n log n).
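The count can be checked empirically with an instrumented version of the snippet (the helper `count_steps` and the step counter are illustrative additions):

```python
def count_steps(n):
    """Count inner-loop iterations of the snippet's structure:
    the outer loop runs n times, and the inner halving loop runs
    about log2(n) + 1 times per outer iteration."""
    steps = 0
    for _ in range(n):       # outer loop: n iterations
        j = n
        while j > 0:         # inner loop: halves j until it hits 0
            j //= 2
            steps += 1
    return steps

# For n = 8 the inner loop halves 8 -> 4 -> 2 -> 1 -> 0 (4 iterations),
# so the total is 8 * 4 = 32 steps, matching n * (log2(n) + 1).
```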