Array Insertion at Beginning in DSA Python - Time & Space Complexity
We want to understand how the time to insert an element at the start of an array changes as the array grows.
How does the work needed grow when the array gets bigger?
Analyze the time complexity of the following code snippet.
```python
def insert_at_beginning(arr, value):
    arr.insert(0, value)

# Example usage:
my_array = [1, 2, 3, 4]
insert_at_beginning(my_array, 0)
print(my_array)  # Output: [0, 1, 2, 3, 4]
```
This code inserts a new value at the start of the array, shifting all existing elements one position to the right.
Identify the work that repeats: loops, recursion, or array traversals.
- Primary operation: Shifting all existing elements one step to the right.
- How many times: For an array of size n, this shift happens n times.
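To make those n shifts visible, here is a hand-rolled version of the same operation. This is an illustrative sketch of what insertion at the front costs conceptually, not how Python's built-in `list.insert` is implemented internally; the function name `insert_at_beginning_manual` is made up for this example.

```python
def insert_at_beginning_manual(arr, value):
    # Grow the list by one slot at the end.
    arr.append(None)
    # Shift every existing element one position to the right.
    # For an array of n elements, this loop runs n times.
    for i in range(len(arr) - 1, 0, -1):
        arr[i] = arr[i - 1]
    # Place the new value in the freed first slot.
    arr[0] = value

my_array = [1, 2, 3, 4]
insert_at_beginning_manual(my_array, 0)
print(my_array)  # Output: [0, 1, 2, 3, 4]
```

The loop body executes once per existing element, which is exactly the linear cost being analyzed.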
When the array size grows, the number of elements to shift grows too, so the work grows roughly in direct proportion to the array size.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 shifts |
| 100 | 100 shifts |
| 1000 | 1000 shifts |
Pattern observation: Doubling the array size roughly doubles the work needed to insert at the beginning.
Time Complexity: O(n)
This means the time to insert at the start grows linearly with the size of the array.
[X] Wrong: "Inserting at the beginning is always fast and takes constant time like adding at the end."
[OK] Correct: Because all existing elements must move one step to make space, the time grows with the number of elements.
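You can see this difference empirically. The sketch below builds a list of n elements once by inserting at the front and once by appending at the end, then times both with `timeit`. The helper names `build_front` and `build_back` are made up for this demo, and the exact timings will vary by machine; the point is the growth trend, not the numbers.

```python
import timeit

def build_front(n):
    arr = []
    for i in range(n):
        arr.insert(0, i)   # O(n) per insert, so O(n^2) total
    return arr

def build_back(n):
    arr = []
    for i in range(n):
        arr.append(i)      # amortized O(1) per insert, so O(n) total
    return arr

for n in (1_000, 2_000, 4_000):
    t_front = timeit.timeit(lambda: build_front(n), number=3)
    t_back = timeit.timeit(lambda: build_back(n), number=3)
    print(f"n={n}: front={t_front:.4f}s  back={t_back:.4f}s")
```

As n doubles, the front-insertion time grows much faster than the append time, matching the table above.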
Understanding this helps you explain why some data structures handle front insertions better than arrays, and it demonstrates your grasp of efficiency trade-offs.
"What if we used a linked list instead of an array? How would the time complexity for insertion at the beginning change?"
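As a hint toward that question, here is a minimal singly linked list sketch. Inserting at the head only repoints one reference, so it takes O(1) time regardless of list length; the `Node` and `LinkedList` classes here are illustrative, not from any particular library.

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class LinkedList:
    def __init__(self):
        self.head = None

    def insert_at_beginning(self, value):
        # One pointer update, no shifting: O(1) regardless of length.
        self.head = Node(value, self.head)

    def to_list(self):
        # Walk the chain to collect values, for easy printing.
        out, node = [], self.head
        while node:
            out.append(node.value)
            node = node.next
        return out

ll = LinkedList()
for v in (4, 3, 2, 1):
    ll.insert_at_beginning(v)
print(ll.to_list())  # Output: [1, 2, 3, 4]
```

The trade-off: linked lists give O(1) front insertion but lose the O(1) indexed access that arrays provide.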