# Topological Sorting in Data Structures Theory: Time & Space Complexity
Topological sorting arranges tasks in a valid order when some tasks must come before others. We want to know how the time needed grows as the number of tasks and dependencies increases.
Analyze the time complexity of the following topological sort using Depth-First Search (DFS).
```python
def topo_sort(graph):
    visited = set()
    stack = []
    for node in graph:
        if node not in visited:
            dfs(node, visited, stack, graph)
    return stack[::-1]  # reverse post-order is a valid topological order

def dfs(node, visited, stack, graph):
    visited.add(node)
    for neighbor in graph[node]:
        if neighbor not in visited:
            dfs(neighbor, visited, stack, graph)
    stack.append(node)  # appended only after all tasks that depend on it
```
This code visits each task and its dependencies once to produce a valid order.
- Primary operation: DFS visits each node and explores all its edges once.
- How many times: Each node and edge is processed exactly once during the traversal.
As the number of tasks (nodes) and dependencies (edges) grow, the work grows proportionally.
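As a concrete check, here is a self-contained run on a tiny made-up task graph, where each edge points from a task to a task that depends on it:

```python
# DFS-based topological sort; the sample graph is illustrative.
def topo_sort(graph):
    visited = set()
    stack = []

    def dfs(node):
        visited.add(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                dfs(neighbor)
        stack.append(node)  # post-order: added after all dependents

    for node in graph:
        if node not in visited:
            dfs(node)
    return stack[::-1]  # reverse post-order is a valid topological order

# "wake" must come before "dress", which must come before "leave".
tasks = {"wake": ["dress"], "dress": ["leave"], "leave": []}
print(topo_sort(tasks))  # ['wake', 'dress', 'leave']
```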
| Input Size (n nodes, e edges) | Approx. Operations |
|---|---|
| 10 nodes, 15 edges | About 25 operations |
| 100 nodes, 200 edges | About 300 operations |
| 1000 nodes, 5000 edges | About 6000 operations |
Pattern observation: The total steps grow roughly in proportion to the sum of the node count and the edge count.
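You can confirm this pattern by instrumenting the DFS to count its basic operations; the sketch below (with a made-up chain graph) tallies one visit per node and one check per edge:

```python
# Count the basic operations of DFS-based topological sort:
# one "visit" per node and one "check" per edge.
def count_operations(graph):
    visited = set()
    counts = {"node_visits": 0, "edge_checks": 0}

    def dfs(node):
        visited.add(node)
        counts["node_visits"] += 1
        for neighbor in graph[node]:
            counts["edge_checks"] += 1
            if neighbor not in visited:
                dfs(neighbor)

    for node in graph:
        if node not in visited:
            dfs(node)
    return counts

# A simple chain a -> b -> c -> d: 4 nodes, 3 edges.
chain = {"a": ["b"], "b": ["c"], "c": ["d"], "d": []}
print(count_operations(chain))  # {'node_visits': 4, 'edge_checks': 3}
```

The totals always come out to exactly n visits plus e checks, which is where O(n + e) comes from.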
Time Complexity: O(n + e)
This means the time grows linearly with the number of tasks plus the number of dependencies. The extra space beyond the graph itself is O(n): the visited set, the output stack, and the recursion depth each hold at most n nodes.
[X] Wrong: "Topological sort takes quadratic time because it looks at all pairs of tasks."
[OK] Correct: The algorithm only visits each task and its direct dependencies once, not every pair, so it is much faster.
Understanding this time complexity helps you explain how efficiently you can order tasks with dependencies, a common real-world problem.
"What if the graph is represented using an adjacency matrix instead of adjacency lists? How would the time complexity change?"