
Shortest Path Problems in Weighted Graphs

A systematic treatment of shortest path algorithms: BFS for unweighted graphs, Dijkstra's algorithm with a correctness proof, the Bellman--Ford algorithm with negative-cycle detection, the Floyd--Warshall algorithm, and shortest paths in DAGs.

Folio Official
March 1, 2026

1. Formulation of the Shortest Path Problem

Definition 1 (Weighted graph and path weight).
A weighted graph G = (V, E, w) consists of a graph G = (V, E) together with a weight function w : E → ℝ. The weight (or length) of a path P = v_0 v_1 ⋯ v_k is
w(P) = Σ_{i=0}^{k−1} w(v_i, v_{i+1}).
Definition 2 (Shortest path and shortest distance).
Let P(s,t) denote the set of all paths from s to t. The shortest distance is
δ(s,t) =
    min_{P ∈ P(s,t)} w(P)   if P(s,t) ≠ ∅ and no negative cycle is reachable,
    −∞                      if a negative cycle is reachable from s on a path to t,
    +∞                      if P(s,t) = ∅.
A path P with w(P)=δ(s,t) is called a shortest path.
Remark 3.
If a cycle of negative total weight (a negative cycle) is reachable, one can traverse it arbitrarily many times to make the path weight arbitrarily small, so no shortest path exists. We encode this situation as δ(s,t)=−∞.

A weighted graph with edge weights. The shortest path from s to t is s→a→b→t with total weight 1+2+3=6, not the direct edge s→t of weight 10:


2. BFS Shortest Paths: The Unweighted Case

When every edge has weight 1, BFS (breadth-first search) correctly computes shortest distances.

Theorem 4 (Correctness of BFS).
In an unweighted graph G=(V,E), run BFS from a source s and let d[v] be the layer at which v is discovered. Then d[v]=δ(s,v) for every v∈V.
Proof.
We proceed by induction on k = δ(s,v). For k = 0 we have v = s and d[s] = 0. Suppose k ≥ 1 and that the claim holds for all vertices at distance at most k − 1. Let s = u_0, u_1, …, u_k = v be a shortest path. By induction, d[u_{k−1}] = k − 1. When BFS processes u_{k−1}, it enqueues v (if not already visited), so d[v] ≤ k. On the other hand, d[v] < k would give an s–v path with fewer than k edges (follow the BFS tree edges back to s), so δ(s,v) < k, a contradiction. Hence d[v] = k. □

The running time of BFS is O(∣V∣+∣E∣).

BFS discovers vertices layer by layer. Starting from s, layer 0 contains s, layer 1 its neighbors, layer 2 the neighbors' neighbors, and so on:


Algorithm 1: Breadth-First Search (BFS)
Input: Graph G = (V, E), source s
Output: Shortest distances d[v]

for all v ∈ V
    d[v] ← +∞
end for
d[s] ← 0
Queue Q ← ∅
Enqueue(Q, s)
while Q ≠ ∅
    u ← Dequeue(Q)
    for all v ∈ Adj(u)
        if d[v] = +∞
            d[v] ← d[u] + 1
            Enqueue(Q, v)
        end if
    end for
end while
return d
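As a concrete sketch, Algorithm 1 can be rendered in Python as follows (the adjacency-list representation and the function name are illustrative choices, not part of the text):

```python
from collections import deque

def bfs_distances(adj, s):
    """Shortest distances from s in an unweighted graph.

    adj: dict mapping each vertex to a list of its neighbors.
    Returns d with d[v] = number of edges on a shortest s-v path,
    or float('inf') if v is unreachable.
    """
    d = {v: float('inf') for v in adj}
    d[s] = 0
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if d[v] == float('inf'):   # v not yet discovered
                d[v] = d[u] + 1
                q.append(v)
    return d
```

The queue guarantees that vertices are dequeued in nondecreasing order of d, which is exactly the layer structure used in the proof of Theorem 4.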

3. Dijkstra's Algorithm

Consider the following graph with nonnegative weights. Starting from s, Dijkstra's algorithm finalizes vertices in order of increasing distance: s (d=0), a (d=1), b (d=3), c (d=4), t (d=6):


When all edge weights are nonnegative, Dijkstra's algorithm efficiently solves the single-source shortest path problem. The algorithm maintains a set S of vertices whose shortest distances have been finalized. At each step, it selects the vertex u ∉ S with the smallest tentative distance d[u], adds u to S, and relaxes the edges leaving u.

Definition 5 (Relaxation).
Relaxation of an edge (u,v) is the operation: if d[v]>d[u]+w(u,v), update d[v]←d[u]+w(u,v).
Theorem 6 (Correctness of Dijkstra's algorithm).
When all edge weights are nonnegative, Dijkstra's algorithm correctly computes d[v]=δ(s,v) for every vertex v.
Proof.
We show by induction on the order in which vertices are added to S that d[u]=δ(s,u) at the moment u is added.

Base case. The source s is added first with d[s]=0=δ(s,s).

Inductive step. Let u be the next vertex added to S, and suppose for contradiction that d[u] > δ(s,u). Consider a shortest path P from s to u. Let (x,y) be the first edge on P with x ∈ S and y ∉ S. By induction, d[x] = δ(s,x). When x was added to S, the edge (x,y) was relaxed, so
d[y] ≤ d[x] + w(x,y) = δ(s,x) + w(x,y) = δ(s,y).
Since w ≥ 0 and y precedes u on P, we have δ(s,y) ≤ δ(s,u). Thus d[y] ≤ δ(s,u) < d[u]. But u was chosen as the vertex outside S with the smallest d-value, so d[u] ≤ d[y], a contradiction. □
Algorithm 2: Dijkstra's Algorithm
Input: Graph G = (V, E), nonnegative weights w, source s
Output: Shortest distances d[v]

for all v ∈ V
    d[v] ← +∞
end for
d[s] ← 0
Insert all vertices into priority queue Q with key d
while Q ≠ ∅
    u ← ExtractMin(Q)
    for all (u, v) ∈ E
        if d[u] + w(u,v) < d[v]
            d[v] ← d[u] + w(u,v)
            DecreaseKey(Q, v, d[v])
        end if
    end for
end while
return d
Remark 7.
With a binary heap, the running time is O((∣V∣+∣E∣)log∣V∣). With a Fibonacci heap, it improves to O(∣E∣+∣V∣log∣V∣).
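A Python sketch of Algorithm 2 follows. Python's heapq has no DecreaseKey, so this version uses the standard workaround of pushing duplicate entries and skipping stale ones when popped; the example graph is a reconstruction consistent with the finalization order s, a, b, c, t given above:

```python
import heapq

def dijkstra(adj, s):
    """Single-source shortest distances for nonnegative edge weights.

    adj: dict mapping u to a list of (v, w) pairs.
    Stale heap entries stand in for DecreaseKey: an entry whose key
    exceeds the current d-value is discarded when popped.
    """
    d = {v: float('inf') for v in adj}
    d[s] = 0
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)
        if du > d[u]:              # stale entry; u was already finalized
            continue
        for v, w in adj[u]:
            if du + w < d[v]:      # relax edge (u, v)
                d[v] = du + w
                heapq.heappush(pq, (d[v], v))
    return d

# Example with the finalization order s (0), a (1), b (3), c (4), t (6):
adj = {'s': [('a', 1), ('c', 5)], 'a': [('b', 2), ('c', 3)],
       'b': [('t', 4)], 'c': [('t', 2)], 't': []}
```

Each vertex can be pushed once per incoming relaxation, so this variant runs in O(|E| log |V|) time, matching the binary-heap bound up to the log factor.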

4. The Bellman–Ford Algorithm

When edges may have negative weights, Dijkstra's algorithm fails. The Bellman–Ford algorithm handles negative weights and can also detect negative cycles. It works by relaxing all edges ∣V∣−1 times.

Theorem 8 (Correctness of Bellman–Ford).
If no negative cycle is reachable from s, then after ∣V∣−1 iterations, d[v]=δ(s,v) for every vertex v.
Proof.
In the absence of negative cycles, every shortest path has at most |V| − 1 edges. Let s = v_0, v_1, …, v_k = v (with k ≤ |V| − 1) be a shortest path from s to v. We claim that after iteration i, d[v_i] = δ(s, v_i). For i = 0, d[s] = 0. Assume the claim for i − 1: after iteration i − 1, d[v_{i−1}] = δ(s, v_{i−1}). In iteration i, the edge (v_{i−1}, v_i) is relaxed, giving
d[v_i] ≤ d[v_{i−1}] + w(v_{i−1}, v_i) = δ(s, v_{i−1}) + w(v_{i−1}, v_i) = δ(s, v_i).
Since d[v_i] ≥ δ(s, v_i) always holds (relaxation never makes d smaller than the true distance), equality follows. □
Theorem 9 (Negative cycle detection).
After ∣V∣−1 iterations, perform one additional pass over all edges. If any d[v] is updated, then a negative cycle reachable from s exists.
Proof.
If no negative cycle exists, all d-values are finalized after ∣V∣−1 iterations and no update occurs in the extra pass. By contrapositive, an update implies a negative cycle. □

The running time of Bellman–Ford is O(∣V∣⋅∣E∣).

Algorithm 3: Bellman–Ford Algorithm
Input: Directed graph G = (V, A), weights w, source s
Output: Shortest distances d[v], negative cycle detection

for all v ∈ V
    d[v] ← +∞
end for
d[s] ← 0
for i = 1 to |V| − 1
    for all (u, v) ∈ A
        if d[u] + w(u,v) < d[v]
            d[v] ← d[u] + w(u,v)
        end if
    end for
end for
// Negative cycle detection
for all (u, v) ∈ A
    if d[u] + w(u,v) < d[v]
        return "Negative cycle exists"
    end if
end for
return d
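A direct Python rendering of Algorithm 3, with the graph given as an edge list (the representation and function name are illustrative):

```python
def bellman_ford(vertices, edges, s):
    """Shortest distances with possibly negative edge weights.

    edges: list of (u, v, w) triples.
    Returns (d, has_negative_cycle).
    """
    INF = float('inf')
    d = {v: INF for v in vertices}
    d[s] = 0
    for _ in range(len(vertices) - 1):       # |V| - 1 rounds of relaxation
        for u, v, w in edges:
            if d[u] != INF and d[u] + w < d[v]:
                d[v] = d[u] + w
    # Extra pass: any further improvement reveals a reachable negative cycle.
    for u, v, w in edges:
        if d[u] != INF and d[u] + w < d[v]:
            return d, True
    return d, False
```

The d[u] != INF guard skips relaxations out of vertices not yet reached, so only cycles actually reachable from s trigger detection.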

5. The Floyd–Warshall Algorithm

The Floyd–Warshall algorithm computes shortest distances between all pairs of vertices using dynamic programming.

Theorem 10 (Floyd–Warshall recurrence).
Let V = {1, 2, …, n}, and let d^(k)(i,j) be the weight of a shortest path from i to j using only vertices in {1, …, k} as intermediaries. Then
d^(k)(i,j) = min( d^(k−1)(i,j), d^(k−1)(i,k) + d^(k−1)(k,j) ),
and d^(n)(i,j) = δ(i,j).
Proof.
A shortest path from i to j that avoids vertex k as an intermediate has weight d^(k−1)(i,j). A shortest path that does pass through k decomposes as i → ⋯ → k → ⋯ → j; in the absence of negative cycles, k appears at most once, so its weight is d^(k−1)(i,k) + d^(k−1)(k,j). The minimum of these two cases gives d^(k)(i,j). □
Algorithm 4: Floyd–Warshall Algorithm
Input: Directed graph G = (V, A), weights w, V = {1, …, n}
Output: All-pairs shortest distances d[i][j]

for i = 1 to n
    for j = 1 to n
        if i = j
            d[i][j] ← 0
        else if (i, j) ∈ A
            d[i][j] ← w(i, j)
        else
            d[i][j] ← +∞
        end if
    end for
end for
for k = 1 to n
    for i = 1 to n
        for j = 1 to n
            d[i][j] ← min(d[i][j], d[i][k] + d[k][j])
        end for
    end for
end for
return d

The running time is O(|V|³). A negative cycle through vertex i is detected when d^(n)(i,i) < 0.
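Algorithm 4 translates almost line for line into Python (0-indexed vertices and the function name are illustrative):

```python
def floyd_warshall(n, edges):
    """All-pairs shortest distances; vertices are 0 .. n-1.

    edges: list of (i, j, w) triples.
    Returns the n x n distance matrix; a negative cycle exists
    iff some diagonal entry d[i][i] is negative afterwards.
    """
    INF = float('inf')
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, w in edges:
        d[i][j] = min(d[i][j], w)        # keep the lightest parallel edge
    for k in range(n):                   # allow k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

Note that k must be the outermost loop: the update for d[i][j] at stage k relies on d[i][k] and d[k][j] already being correct for intermediaries in {1, …, k−1}, exactly as in the recurrence of Theorem 10.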

6. Shortest Paths in DAGs

Theorem 11 (Linear-time shortest paths in DAGs).
In a DAG (directed acyclic graph), single-source shortest paths can be computed in O(∣V∣+∣E∣) time by processing vertices in topological order and relaxing each edge exactly once.
Proof.
In a topological ordering, whenever an edge (u,v) exists, u appears before v. Therefore, by the time vertex v is processed, every edge (u,v) entering v has already been relaxed with the correct value d[u]=δ(s,u). The relaxation of (u,v) then yields d[v]=δ(s,v). □
Remark 12.
This method works correctly even with negative edge weights, since a DAG has no cycles and hence no negative cycles. It is widely applied in project scheduling: computing the longest path (critical path) in a DAG is the basis of PERT (Program Evaluation and Review Technique).
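The DAG method of Theorem 11 can be sketched in Python as follows, computing a topological order with Kahn's algorithm and then relaxing each edge exactly once (representation and names are illustrative; the negative weight in the example exercises Remark 12):

```python
from collections import deque

def dag_shortest_paths(adj, s):
    """Single-source shortest paths in a DAG; negative weights allowed.

    adj: dict mapping u to a list of (v, w) pairs.
    """
    # Topological order via Kahn's algorithm (repeatedly remove indegree-0 vertices).
    indeg = {v: 0 for v in adj}
    for u in adj:
        for v, _ in adj[u]:
            indeg[v] += 1
    order, q = [], deque(v for v in adj if indeg[v] == 0)
    while q:
        u = q.popleft()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    # Relax each edge once, in topological order of its tail.
    INF = float('inf')
    d = {v: INF for v in adj}
    d[s] = 0
    for u in order:
        if d[u] == INF:          # u unreachable from s
            continue
        for v, w in adj[u]:
            if d[u] + w < d[v]:  # relax (u, v)
                d[v] = d[u] + w
    return d
```

Negating all weights in this routine computes longest paths in a DAG, which is the critical-path computation mentioned above.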
Graph Theory Textbook, Part 6 of 13 (previous: Eulerian and Hamiltonian Graphs: Two Classical Traversal Problems; next: Directed Graphs and Topological Sorting).

