Insertion and Deletion in Data Structures: Time and Space Complexity
Insertion and deletion are fundamental operations in many data structures. Knowing how their time cost grows with input size helps us choose the right structure for a given task.
We want to understand how the time to insert or delete an element changes as the number of stored elements grows.
Analyze the time complexity of the following insertion and deletion operations in a simple array.
```c
// Insert newElement at position pos in array arr of current size n
// (assumes the array has capacity for at least n + 1 elements)
for (int i = n - 1; i >= pos; i--) {
    arr[i + 1] = arr[i];
}
arr[pos] = newElement;
```
```c
// Delete the element at position pos in array arr of size n
for (int i = pos; i < n - 1; i++) {
    arr[i] = arr[i + 1];
}
```
This code shifts elements to make space for insertion or to fill the gap after deletion.
- Primary operation: Loop shifting elements one by one.
- How many times: Up to n times, depending on pos; inserting or deleting at the front moves every element, while working at the end moves none.
When the array is small, shifting few elements is quick. As the array grows, shifting many elements takes more time.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | Up to 10 shifts |
| 100 | Up to 100 shifts |
| 1000 | Up to 1000 shifts |
Pattern observation: The number of shifts grows roughly in direct proportion to the size of the array.
Time Complexity: O(n)
This means the time to insert or delete grows linearly with the number of elements.
[X] Wrong: "Insertion or deletion in an array always takes constant time because we just add or remove one element."
[OK] Correct: Elements usually must be shifted to preserve order, so the cost depends on how many elements move, up to n in the worst case.
Understanding how insertion and deletion scale helps you explain trade-offs between data structures clearly. This skill shows that you can reason about efficiency in realistic situations.
"What if we used a linked list instead of an array? How would the time complexity for insertion and deletion change?"