Bellman-Ford Algorithm - Negative Weight Handling

The Key Difference from Dijkstra

While Dijkstra’s Algorithm - The Greedy Invariant assumes we can finalize nodes greedily, Bellman-Ford makes no such assumption. This allows it to handle negative edge weights correctly.

Why Negative Weights Break Greedy

Consider this graph:

A --(3)--> B --(-10)--> C
|                       ^
+---------(2)----------+

With Dijkstra starting from A:

  1. A is at distance 0; relaxing its edges gives B = 3 and C = 2
  2. C has the smallest tentative distance, so it is extracted and finalized at 2
  3. B is extracted next at distance 3
  4. Relaxing B -> C would give 3 + (-10) = -7, but C is already finalized

The greedy assumption failed: the better path A -> B -> C (cost -7) shows up only after C has been finalized, so Dijkstra reports 2.
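
A minimal sketch of a heap-based Dijkstra (a hypothetical helper, not part of the Bellman-Ford code below) makes the failure concrete on this graph:

import heapq

def dijkstra(graph, source):
    # Classic Dijkstra: once a node is popped it is finalized and never updated again
    distance = {v: float('inf') for v in graph}
    distance[source] = 0
    visited = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        for v, weight in graph[u].items():
            if v not in visited and d + weight < distance[v]:
                distance[v] = d + weight
                heapq.heappush(heap, (distance[v], v))
    return distance

graph = {'A': {'B': 3, 'C': 2}, 'B': {'C': -10}, 'C': {}}
print(dijkstra(graph, 'A'))  # {'A': 0, 'B': 3, 'C': 2} -- misses the A->B->C path of cost -7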

How Bellman-Ford Solves This

Bellman-Ford doesn’t finalize nodes early. Instead, it relaxes all edges repeatedly until distances stabilize:

def bellman_ford(graph, source):
    # graph: adjacency dict mapping each vertex u to {v: weight, ...}
    # Initialize distances: infinity everywhere except the source
    distance = {v: float('inf') for v in graph}
    distance[source] = 0
    predecessor = {v: None for v in graph}
    
    # Relax all edges V-1 times
    for _ in range(len(graph) - 1):
        for u in graph:
            for v, weight in graph[u].items():
                if distance[u] + weight < distance[v]:
                    distance[v] = distance[u] + weight
                    predecessor[v] = u
    
    # Check for negative cycles
    for u in graph:
        for v, weight in graph[u].items():
            if distance[u] + weight < distance[v]:
                raise ValueError("Negative cycle detected")
    
    return distance, predecessor
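
Running it on the example graph above (same adjacency-dict format, as a quick sanity check) picks up the cheaper path through B:

graph = {'A': {'B': 3, 'C': 2}, 'B': {'C': -10}, 'C': {}}
distance, predecessor = bellman_ford(graph, 'A')
print(distance)     # {'A': 0, 'B': 3, 'C': -7}
print(predecessor)  # {'A': None, 'B': 'A', 'C': 'B'}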

The V-1 Iteration Guarantee

Why V-1 iterations? Because the longest simple path (no repeated vertices) in a graph has at most V-1 edges.

After i iterations, Bellman-Ford has found every shortest path that uses at most i edges:

  • Iteration 1: All shortest paths with ≤ 1 edge
  • Iteration 2: All shortest paths with ≤ 2 edges
  • Iteration V-1: All shortest paths with ≤ V-1 edges (which is all simple paths)
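
A practical consequence of this bound: if a full pass over the edges relaxes nothing, the distances have already stabilized and the remaining passes can be skipped. A minimal early-exit variant (same adjacency-dict input as bellman_ford above; the negative-cycle check is omitted to keep it short):

def bellman_ford_early_exit(graph, source):
    distance = {v: float('inf') for v in graph}
    distance[source] = 0
    for _ in range(len(graph) - 1):
        changed = False
        for u in graph:
            for v, weight in graph[u].items():
                if distance[u] + weight < distance[v]:
                    distance[v] = distance[u] + weight
                    changed = True
        if not changed:
            # A full pass with no relaxation: distances have converged
            break
    return distance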

Negative Cycle Detection

If relaxation still succeeds in the V-th iteration, a negative cycle reachable from the source exists.

Why? Because:

  • After V-1 iterations, all simple paths are optimal
  • Any improvement in iteration V means we’re using ≥ V edges
  • This is only possible if there’s a cycle
  • If it improves distance, the cycle must be negative

# After V-1 iterations, this relaxation should not succeed
for u in graph:
    for v, weight in graph[u].items():
        if distance[u] + weight < distance[v]:
            raise ValueError("Negative cycle detected")

The Trade-off

Bellman-Ford sacrifices efficiency for correctness with negative weights:

Algorithm       Time              Negative Weights   Negative Cycle Detection
Dijkstra        O((V+E) log V)    ❌ Fails           N/A
Bellman-Ford    O(V × E)          ✅ Correct         ✅ Yes

When Negative Weights Actually Occur

Negative weights aren’t just theoretical:

  1. Currency arbitrage: Exchange rates can create “negative cost” cycles (see the sketch after this list)
  2. Cost modeling: Gaining resources can be modeled as negative cost
  3. Network flow: Certain transformations introduce negative weights
  4. Game theory: Payoffs can be negative
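
To make the arbitrage item concrete: with edge weight -log(rate), a cycle of trades whose rates multiply to more than 1 becomes a negative-weight cycle, which the check above detects. A sketch with made-up rates, reusing bellman_ford:

import math

# Hypothetical exchange rates: rates['USD']['EUR'] = euros received per dollar
rates = {
    'USD': {'EUR': 0.9, 'GBP': 0.8},
    'EUR': {'GBP': 0.95, 'USD': 1.2},
    'GBP': {'USD': 1.35, 'EUR': 1.1},
}
# weight = -log(rate): rates multiplying to > 1 around a cycle means negative total weight
graph = {c: {d: -math.log(r) for d, r in out.items()} for c, out in rates.items()}
try:
    bellman_ford(graph, 'USD')
    print("No arbitrage")
except ValueError:
    print("Arbitrage opportunity exists")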

Implementation Optimization: SPFA

The Shortest Path Faster Algorithm (SPFA) is a queue-based optimization:

from collections import deque

def spfa(graph, source):
    # Assumes no negative cycle is reachable from source (otherwise this loops forever)
    distance = {v: float('inf') for v in graph}
    distance[source] = 0
    queue = deque([source])
    in_queue = {source}
    
    while queue:
        u = queue.popleft()
        in_queue.remove(u)
        
        for v, weight in graph[u].items():
            if distance[u] + weight < distance[v]:
                distance[v] = distance[u] + weight
                if v not in in_queue:
                    queue.append(v)
                    in_queue.add(v)
    
    return distance

SPFA only relaxes edges from nodes whose distance changed, leading to better average-case performance (though worst-case is still O(V × E)).
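
On the three-node example from the Dijkstra discussion it agrees with bellman_ford:

graph = {'A': {'B': 3, 'C': 2}, 'B': {'C': -10}, 'C': {}}
print(spfa(graph, 'A'))  # {'A': 0, 'B': 3, 'C': -7}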

Key Insight

Bellman-Ford trades the greedy optimization of Dijkstra for robustness. It’s the “safe” choice when:

  • Negative weights might exist
  • You need to detect negative cycles
  • Graph is small enough that O(V × E) is acceptable

The algorithm’s beauty lies in its simplicity: just keep relaxing edges until things settle, and if they don’t settle by V-1 iterations, you have a problem.


Reconstruction: The V-1 bound comes from the longest simple path property. Negative cycle detection follows from the pigeonhole principle (paths longer than V-1 edges must contain cycles).