Delete Node at End in DSA Python - Time & Space Complexity
When we delete a node at the end of a linked list, we want to know how long it takes as the list grows.
We ask: How does the time to delete the last node change when the list gets bigger?
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def delete_at_end(head):
    if head is None:          # empty list: nothing to delete
        return None
    if head.next is None:     # single node: deleting it leaves an empty list
        return None
    current = head
    while current.next.next:  # stop at the second-to-last node
        current = current.next
    current.next = None       # unlink the last node
    return head
```
This code deletes the last node from a singly linked list by walking through the list to find the second last node.
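To see the deletion in action, here is a short usage sketch. It repeats the definitions above for completeness, and the `build` and `to_list` helpers are additions for illustration:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def delete_at_end(head):
    if head is None or head.next is None:
        return None
    current = head
    while current.next.next:
        current = current.next
    current.next = None
    return head

def build(values):
    # Helper: build a singly linked list from a Python list, return the head.
    head = None
    for v in reversed(values):
        node = Node(v)
        node.next = head
        head = node
    return head

def to_list(head):
    # Helper: collect node values back into a Python list.
    out = []
    while head:
        out.append(head.data)
        head = head.next
    return out

head = build([1, 2, 3])
head = delete_at_end(head)
print(to_list(head))  # [1, 2]
```

Note that deleting the only node of a one-element list returns `None`, the empty list.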
Identify the repeated work: any loops, recursion, or traversals that run once per element.
- Primary operation: The while loop that moves through the list nodes.
- How many times: It runs once for each node until the second last node is found.
As the list gets longer, the loop runs more times, roughly one less than the number of nodes.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 9 steps through nodes |
| 100 | About 99 steps through nodes |
| 1000 | About 999 steps through nodes |
Pattern observation: The number of steps grows roughly the same as the list size.
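The pattern in the table can be checked empirically with a version of the same loop that tallies its iterations (the counter and the `build_range` helper are additions for illustration):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def delete_at_end_counted(head):
    # Same deletion logic as above, but also count the loop iterations.
    steps = 0
    if head is None or head.next is None:
        return None, steps
    current = head
    while current.next.next:
        current = current.next
        steps += 1
    current.next = None
    return head, steps

def build_range(n):
    # Helper: build an n-node list holding 0 .. n-1.
    head = None
    for v in range(n - 1, -1, -1):
        node = Node(v)
        node.next = head
        head = node
    return head

for n in (10, 100, 1000):
    _, steps = delete_at_end_counted(build_range(n))
    # steps is exactly n - 2, so the work grows linearly with n
    print(n, steps)
```

The exact count is n − 2 iterations (the walk stops at the second-to-last node), which matches the "roughly one less than the number of nodes" observation up to a constant.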
Time Complexity: O(n)
This means the time to delete the last node grows linearly with the list size.
[X] Wrong: "Deleting the last node is always quick and takes constant time."
[OK] Correct: Deleting the last node takes linear time, because we must walk the list to find the second-to-last node, and that walk gets longer as the list grows.
Understanding how linked list operations scale helps you explain your code clearly and shows you know how data size affects performance.
"What if we had a pointer to the tail node? How would the time complexity change when deleting the last node?"
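One way to explore that question: in a singly linked list, a tail pointer alone does not help, because there is still no way to reach the second-to-last node without walking from the head. In a doubly linked list with a tail pointer, however, `tail.prev` gives the second-to-last node directly, so deletion at the end becomes O(1). A minimal sketch (the `DoublyNode` and `DoublyLinkedList` classes here are illustrative, not part of the original code):

```python
class DoublyNode:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, data):
        node = DoublyNode(data)
        if self.tail is None:
            self.head = self.tail = node
        else:
            node.prev = self.tail
            self.tail.next = node
            self.tail = node

    def delete_at_end(self):
        # O(1): tail.prev reaches the second-to-last node with no traversal.
        if self.tail is None:
            return                      # empty list: nothing to delete
        if self.tail.prev is None:
            self.head = self.tail = None  # single node: list becomes empty
        else:
            self.tail = self.tail.prev
            self.tail.next = None

lst = DoublyLinkedList()
for v in (1, 2, 3):
    lst.append(v)
lst.delete_at_end()
print(lst.tail.data)  # 2
```

The trade-off is the extra `prev` pointer per node, which costs memory but turns the O(n) walk into a constant-time step.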