O(1) vs O(n) vs O(log n) vs O(n^2): Key Differences and Usage
O(1) time complexity means an operation takes constant time regardless of input size. O(n) means the time grows linearly with input size, O(log n) grows logarithmically (much slower), and O(n^2) grows quadratically, becoming very slow for large inputs.

Quick Comparison
Here is a quick overview of how these time complexities compare in terms of speed, scalability, and example operations.
| Time Complexity | Growth Rate | Speed | Example Operation | Scalability |
|---|---|---|---|---|
| O(1) | Constant | Fastest | Accessing an array element by index | Excellent |
| O(log n) | Logarithmic | Very fast | Binary search in a sorted array | Very good |
| O(n) | Linear | Moderate | Finding an item in an unsorted list | Fair |
| O(n^2) | Quadratic | Slow | Nested loops over the same data | Poor for large n |
Key Differences
O(1) means the operation takes the same amount of time no matter how big the input is. For example, getting the first item in a list is always quick.
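As a minimal sketch of this idea (the function name is illustrative), indexing into a Python list is O(1) because it does not depend on how many items the list holds:

```python
# O(1): direct index access takes the same time for a list of 5 or 5 million items.
def first_item(items):
    return items[0]

print(first_item([5, 3, 8, 4, 2]))  # prints 5
```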
O(n) means the time grows directly with the input size. If you have twice as many items, it takes roughly twice as long. Searching an unsorted list is linear because, in the worst case, you must check every item.
O(log n) grows much slower than O(n). It means each step cuts the problem size roughly in half, like in binary search. This makes it very efficient for large data.
O(n^2) means the time grows with the square of the input size. This happens with nested loops, where for each item you do a full pass over all items again. It becomes very slow as data grows.
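A minimal sketch of the O(n^2) pattern (the duplicate-check task and function name are illustrative): for each item, the inner loop scans the whole list again, giving n * n comparisons.

```python
# O(n^2): nested loops over the same data compare every pair of items.
def has_duplicate(arr):
    for i in range(len(arr)):
        for j in range(len(arr)):
            if i != j and arr[i] == arr[j]:
                return True
    return False

print(has_duplicate([5, 3, 8, 3]))  # True
print(has_duplicate([5, 3, 8, 4]))  # False
```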
Code Comparison
Here is an example of O(n) time complexity: searching for a number in an unsorted list by checking each element.
```python
def linear_search(arr, target):
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

numbers = [5, 3, 8, 4, 2]
index = linear_search(numbers, 4)
print(index)  # 3
```
O(log n) Equivalent
Here is an example of O(log n) time complexity: binary search on a sorted list, which repeatedly halves the search space.
```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

sorted_numbers = [2, 3, 4, 5, 8]
index = binary_search(sorted_numbers, 4)
print(index)  # 2
```
When to Use Which
Choose O(1) operations whenever possible for the fastest performance, such as direct access in arrays or hash tables.
Use O(log n) algorithms like binary search when working with sorted data to get efficient lookups.
O(n) is acceptable for simple scans or when data is unsorted and small.
Avoid O(n^2) algorithms for large data sets because they become very slow; use them only for small inputs or when no better option exists.
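To make the gap between these classes concrete, here is a rough operation-count comparison for n = 1,000 (a sketch: real running times depend on constant factors, but the growth rates dominate for large n):

```python
import math

# Approximate step counts for each complexity class at n = 1,000.
n = 1_000
print(f"O(1):     {1} step")
print(f"O(log n): {math.ceil(math.log2(n))} steps")  # about 10
print(f"O(n):     {n} steps")
print(f"O(n^2):   {n * n} steps")                    # a million
```

At a thousand items, the quadratic algorithm already does a million steps where binary search does about ten, which is why O(n^2) should be reserved for small inputs.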