Bellman-Ford Algorithm with Negative Weights in DSA (TypeScript) - Time & Space Complexity
We want to understand how the Bellman-Ford algorithm's running time changes as the graph grows.
Specifically, how does the number of edges and vertices affect the work done?
Analyze the time complexity of the following code snippet.
```typescript
function bellmanFord(edges: [number, number, number][], V: number, src: number): number[] {
  const dist = Array(V).fill(Infinity);
  dist[src] = 0;

  // Relax every edge (V - 1) times; each pass can extend a shortest path by one more edge.
  for (let i = 0; i < V - 1; i++) {
    for (const [u, v, w] of edges) {
      if (dist[u] !== Infinity && dist[u] + w < dist[v]) {
        dist[v] = dist[u] + w;
      }
    }
  }
  return dist;
}
```
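To see the relaxation in action, here is a small usage sketch. The function is repeated so the snippet is self-contained, and the three-edge graph (with one negative weight) is an illustrative example, not from the original text.

```typescript
// The function from above, repeated so this sketch runs on its own.
function bellmanFord(edges: [number, number, number][], V: number, src: number): number[] {
  const dist = Array(V).fill(Infinity);
  dist[src] = 0;
  for (let i = 0; i < V - 1; i++) {
    for (const [u, v, w] of edges) {
      if (dist[u] !== Infinity && dist[u] + w < dist[v]) {
        dist[v] = dist[u] + w;
      }
    }
  }
  return dist;
}

// Example graph: 0 -> 1 (weight 4), 0 -> 2 (weight 5), 1 -> 2 (weight -3).
// The direct edge to vertex 2 costs 5, but going through vertex 1 costs 4 + (-3) = 1.
const dist = bellmanFord([[0, 1, 4], [0, 2, 5], [1, 2, -3]], 3, 0);
console.log(dist); // [0, 4, 1]
```

Notice that the negative edge improves an already-found distance, which is exactly why the edges must be scanned more than once.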
This code finds shortest paths from a source to all vertices, even with negative weights, by relaxing edges repeatedly.
- Primary operation: The inner loop that checks and updates distances for all edges.
- How many times: This inner loop runs once for every edge, repeated (V - 1) times where V is the number of vertices.
As the number of vertices (V) and edges (E) grows, the total number of operations grows roughly in proportion to E times V.
| Input Size (V, E) | Approx. Operations |
|---|---|
| V=10, E=20 | About 10 * 20 = 200 |
| V=100, E=500 | About 100 * 500 = 50,000 |
| V=1000, E=2000 | About 1000 * 2000 = 2,000,000 |
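The pattern in the table can be checked directly. Below is a hypothetical instrumented variant (the counter is an addition for illustration) that tallies how many times the inner check runs; for this implementation the exact count is (V - 1) * E, which the table rounds to V * E.

```typescript
// Instrumented sketch: same algorithm, but counting inner-loop checks.
function countRelaxChecks(edges: [number, number, number][], V: number, src: number): number {
  const dist = Array(V).fill(Infinity);
  dist[src] = 0;
  let checks = 0;
  for (let i = 0; i < V - 1; i++) {
    for (const [u, v, w] of edges) {
      checks++; // one unit of work per edge, per outer pass
      if (dist[u] !== Infinity && dist[u] + w < dist[v]) {
        dist[v] = dist[u] + w;
      }
    }
  }
  return checks;
}

// V = 4 vertices, E = 5 edges: (4 - 1) * 5 = 15 checks.
const checks = countRelaxChecks(
  [[0, 1, 2], [0, 2, 7], [1, 2, 3], [1, 3, 8], [2, 3, 1]], 4, 0
);
console.log(checks); // 15
```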
Pattern observation: operations grow in proportion to the product of vertices and edges, V * E.
Time Complexity: O(V * E)
This means the running time grows proportionally to the number of vertices times the number of edges.
Space Complexity: O(V), since the only extra storage is the dist array with one entry per vertex.
[X] Wrong: "The algorithm runs in O(E) time because it just loops over edges."
[OK] Correct: The full scan over all E edges is repeated V - 1 times, so the total work is multiplied by a factor of roughly V.
Understanding this time complexity helps you explain why Bellman-Ford is slower than methods like Dijkstra's algorithm, but necessary for graphs with negative edge weights, which Dijkstra's algorithm cannot handle.
"What if we stopped the algorithm early when no distances change in an iteration? How would that affect the time complexity?"