# Algorithm 2: Find the number of paths between two vertices u,v in a Graph G = (V,E)

This Algorithm builds on a new way of looking at Graphs. In particular, we will use the Adjacency Matrix and the Diameter of a Graph to construct it. The Computational Complexity of the Algorithm is discussed after the pseudocode is presented.

Prerequisites :
1) Adjacency Matrix – The Adjacency Matrix $\textbf{A}$ of a graph G = (V, E), with $|V(G)| = n$, is the $n \times n$ matrix with entries:
– $a_{ij} = 1$ if $v_{i}v_{j} \in E(G)$
– $a_{ij} = 0$ otherwise
2) Diameter – The Diameter of a Graph G = (V, E), written $diam_{G}$, is the largest shortest-path distance over all pairs of vertices: $diam_{G} = \max_{v_{i}, v_{j} \in V(G)} d(v_{i}, v_{j})$, where $d(v_{i}, v_{j})$ is the length of a shortest path from $v_{i}$ to $v_{j}$.
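Both prerequisites can be illustrated concretely; the sketch below assumes a small example graph (the 4-cycle $v_{0}v_{1}v_{2}v_{3}$), and the function names are hypothetical:

```python
from collections import deque

# Adjacency matrix of an assumed example: the 4-cycle v0-v1-v2-v3-v0.
A = [
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]

def shortest_path_length(A, s, t):
    """BFS distance from vertex s to vertex t."""
    n = len(A)
    dist = [None] * n
    dist[s] = 0
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if A[u][v] == 1 and dist[v] is None:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist[t]

def diameter(A):
    """Largest shortest-path distance over all vertex pairs."""
    n = len(A)
    return max(shortest_path_length(A, s, t)
               for s in range(n) for t in range(n))

print(diameter(A))  # the 4-cycle has diameter 2
```

Here the two vertices opposite each other on the cycle are at distance 2, which is the largest distance in the graph.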

Algorithm :
The Algorithm is based on 2 ideas:
1) The shortest path between any two vertices has length at most the Diameter of the Graph.
2) In any Graph G = (V, E), the entry $(a_{ij})^{(k)}$ of $(\textbf{A})^{k}$ is the number of walks of length ‘k’ from $v_{i}$ to $v_{j}$ (a walk is a path that is allowed to revisit vertices; this is the sense of ‘paths’ used below).

Proof of 2):
We will use Induction.
a) For k = 1, by definition, the Adjacency Matrix $\textbf{A}$ tells us whether or not an edge exists between two vertices. In a Graph, an edge is the only path of length 1.
b) Inductive Hypothesis : Assume that, for some $k \in \mathbb{N}$, $(\textbf{A})^{k}$ gives the number of all possible paths of length ‘k’ from a vertex $v_{i}$ to another vertex $v_{j}$.
c) For the case (k + 1), we have that : $(\textbf{A})^{(k + 1)} = (\textbf{A})^{(k)}\textbf{A}$.
Now, by the Inductive Hypothesis, each entry $(a_{ij})^{(k)}$ of $(\textbf{A})^{(k)}$ is the number of paths of length ‘k’ from $v_{i}$ to $v_{j}$.
Now, every path that ends at a vertex must arrive through one of its adjacent vertices. By the definition of the Adjacency Matrix and of Matrix Multiplication: $(a_{ij})^{(k + 1)} = (a_{i1})^{(k)}a_{1j} + \dots + (a_{in})^{(k)}a_{nj} = \sum_{m=1}^{n} (a_{im})^{(k)}a_{mj}$
By definition, $a_{mj} = 1$ exactly when $v_{m}$ is adjacent to $v_{j}$, and $a_{mj} = 0$ otherwise, so only the terms for the neighbours of $v_{j}$ survive in the sum.
Therefore, the paths from a vertex $v_{i}$ to a vertex $v_{j}$ of length (k + 1), are the sum of the paths of length (k), from vertex $v_{i}$ to all the adjacent vertices of $v_{j}$.
Therefore, 2) is true.
QED
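The statement in 2) can also be checked numerically on a small example; a minimal sketch, assuming the path graph $v_{0} - v_{1} - v_{2}$ and a hypothetical helper `mat_mul`:

```python
def mat_mul(X, Y):
    """Naive matrix multiplication of two n x n matrices."""
    n = len(X)
    return [[sum(X[i][m] * Y[m][j] for m in range(n))
             for j in range(n)] for i in range(n)]

# Assumed example: the path graph v0 - v1 - v2.
A = [
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
]

A2 = mat_mul(A, A)
# (A^2)[0][2] = 1: the single length-2 walk v0 -> v1 -> v2
# (A^2)[0][0] = 1: the length-2 walk v0 -> v1 -> v0
print(A2[0][2], A2[0][0])
```

Note that $(\textbf{A}^{2})_{00} = 1$ even though $v_{0} = v_{0}$: the powers of $\textbf{A}$ count walks, which may return to their starting vertex.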

Pseudocode :

```
count = 0
A = adjacencyMatrix          // running power: A holds A^length
B = adjacencyMatrix
length = 1
while (length <= diam_G):    // O(diam_G) iterations
    count += a_ij            // entry (i, j) of A, i.e. paths of length ‘length’
    A = A * B                // O(n^3)
    length += 1
return count
```
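The pseudocode above can be sketched as a runnable Python function; `count_walks`, `mat_mul`, and the example graph are assumed names, and the diameter is taken as an input rather than computed:

```python
def mat_mul(X, Y):
    """Naive O(n^3) matrix multiplication."""
    n = len(X)
    return [[sum(X[i][m] * Y[m][j] for m in range(n))
             for j in range(n)] for i in range(n)]

def count_walks(adjacency_matrix, i, j, diam):
    """Sum the number of walks from v_i to v_j of every length 1..diam."""
    A = [row[:] for row in adjacency_matrix]  # running power, starts at A^1
    B = adjacency_matrix
    count = 0
    for _ in range(diam):       # O(diam) iterations
        count += A[i][j]        # walks of the current length
        A = mat_mul(A, B)       # O(n^3) step
    return count

# Assumed example: the 4-cycle v0-v1-v2-v3-v0, which has diameter 2.
C4 = [
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]
print(count_walks(C4, 0, 2, 2))  # one length-2 walk via v1 plus one via v3 -> 2
```

Copying the input into `A` keeps the caller's matrix unmodified while the loop repeatedly multiplies by `B`.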

Now, every shortest path has length at most $diam_{G} \leq n - 1$, and therefore every length from 1 up to $diam_{G}$ is covered: the paths of each such length between $v_{i}$ and $v_{j}$ are summed up and returned in the variable ‘count’.

Computational Complexity : $O(n^{3} \cdot diam_{G})$ . Now, $diam_{G}$ is between ‘1’ and ‘n − 1’. Therefore, the computational complexity of our Algorithm lies between $O(n^{3})$ and $O(n^{4})$.
The $O(n^{3})$ comes from the naive Matrix Multiplication Algorithm. It can be lowered asymptotically with faster algorithms such as Strassen’s ($O(n^{2.807})$), or sped up in practice with cache-friendly memory layouts or by distributing the work. However, for most purposes, each multiplication is cubic in time.

Space Complexity : $O(n^{2})$ . The quadratic Space Complexity arises from the need to store the Adjacency Matrix (and the running power) in memory.

The proof of correctness for this Algorithm follows directly from 2).