Floyd-Warshall's All Pairs Shortest Path Algorithm
In the previous two posts, I explained two different algorithms to find shortest paths in a graph. In this post I will describe another simple yet important problem in graph theory, called the "All Pairs Shortest Path" problem.
Given a graph G, as a set of vertices V and directed weighted edges E, the aim is to find the cost of the shortest path between every pair of vertices. The cost of a path is defined as the sum of its edge weights.
At first sight this problem might look hard to approach. But the following theorem makes it one of the easiest problems to reason about, and especially to code.
If A represents the weight matrix of the graph (with 0 on the diagonal, edge weights for existing edges, and infinity for missing ones), then for an integer k <= |V|, the (i, j) entry of A^k, computed with the min-plus matrix product (ordinary matrix multiplication with "+" replaced by "min" and "×" replaced by "+"), gives the cost of the shortest path from i to j that uses at most k edges.
This above theorem can be easily proved using induction on k. I will leave that as a simple exercise.
From the theorem, all that is left is to compute the matrix A^n. In fact A^(n-1) suffices, since a shortest path never needs to repeat a vertex and therefore uses at most n-1 edges.
A trivial way to compute this would be to perform n min-plus matrix multiplications, each costing O(|V|^3), for an overall complexity of O(|V|^4).
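To make the theorem concrete, here is a minimal sketch of one min-plus matrix product in Java. The class name, the INF sentinel, and the example matrix are my own illustrative choices, not from the original post; INF is kept well below Integer.MAX_VALUE so that adding two "infinities" does not overflow.

```java
import java.util.Arrays;

public class MinPlus {
    static final int INF = Integer.MAX_VALUE / 2; // "no edge", safe to add twice

    // One min-plus (tropical) product: C[i][j] = min over k of A[i][k] + B[k][j].
    static int[][] multiply(int[][] a, int[][] b) {
        int n = a.length;
        int[][] c = new int[n][n];
        for (int[] row : c) Arrays.fill(row, INF);
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++)
                for (int j = 0; j < n; j++)
                    c[i][j] = Math.min(c[i][j], a[i][k] + b[k][j]);
        return c;
    }

    public static void main(String[] args) {
        // A has 0 on the diagonal, edge weights elsewhere, INF for missing edges.
        int[][] a = {
            {0,   4,   INF},
            {INF, 0,   1},
            {INF, INF, 0}
        };
        int[][] a2 = multiply(a, a); // shortest paths using at most 2 edges
        System.out.println(a2[0][2]); // path 0 -> 1 -> 2, cost 4 + 1 = 5
    }
}
```

Squaring repeatedly (A, A^2, A^4, ...) would bring the trivial approach down to O(|V|^3 log |V|), but Floyd-Warshall below does even better.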
Instead, Floyd and Warshall gave a Dynamic Programming based approach.
for k from 1 to n:
    for i from 1 to n:
        for j from 1 to n:
            A[i][j] = min(A[i][j], A[i][k] + A[k][j])
The above algorithm has a very simple intuition. The shortest path from i to j either avoids vertex k entirely (the current value of A[i][j]) or passes through k, in which case its cost is the shortest path from i to k plus the shortest path from k to j. After iteration k of the outer loop, A[i][j] holds the cost of the shortest path from i to j that uses only vertices 1..k as intermediates. Computing each such value once and storing it, instead of recalculating it repeatedly, is what makes this a dynamic programming approach.
The above algorithm has three nested loops, each running O(|V|) times. Hence the overall complexity is O(|V|^3).
A Java implementation of the above algorithm can be found here.
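For completeness, here is a self-contained Java sketch of the algorithm as described above. The class name and the example graph are my own; the INF sentinel is again chosen so that INF + INF does not overflow an int.

```java
import java.util.Arrays;

public class FloydWarshall {
    static final int INF = Integer.MAX_VALUE / 2; // "no edge", safe to add twice

    // Returns the all-pairs shortest path costs for the given weight matrix
    // (0 on the diagonal, edge weights for edges, INF where there is no edge).
    static int[][] shortestPaths(int[][] dist) {
        int n = dist.length;
        int[][] d = new int[n][];
        for (int i = 0; i < n; i++) d[i] = Arrays.copyOf(dist[i], n);
        for (int k = 0; k < n; k++)
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    if (d[i][k] + d[k][j] < d[i][j])
                        d[i][j] = d[i][k] + d[k][j];
        return d;
    }

    public static void main(String[] args) {
        int[][] g = {
            {0,   5,   INF, 10},
            {INF, 0,   3,   INF},
            {INF, INF, 0,   1},
            {INF, INF, INF, 0}
        };
        int[][] d = shortestPaths(g);
        System.out.println(d[0][3]); // 0 -> 1 -> 2 -> 3 costs 9, beating the direct edge of 10
    }
}
```

Note that this version assumes no negative cycles; if the graph may contain one, check for a negative entry on the diagonal of the result.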