Tuesday, November 4, 2014

SRGs II (The linear algebra of strongly regular graphs)


A graph with vertex set V and edge set E can be encoded in a square array with entries 0s and 1s.  We label the rows and columns of this array by the vertices of the graph and place a 1 in the (i,j) entry if (and only if) the vertex with label i is adjacent to the vertex with label j.

[Figure: an example graph and its adjacency matrix]
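In place of the original figure, here is a small stand-in example (my choice of graph, not necessarily the one pictured in the post): the 4-cycle on vertices 1, 2, 3, 4, with edges 12, 23, 34, 41, has adjacency matrix

    A = [ 0 1 0 1 ]
        [ 1 0 1 0 ]
        [ 0 1 0 1 ]
        [ 1 0 1 0 ]

Note that A is symmetric (the graph is undirected) and has zero diagonal (no loops).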
Since the adjacency matrix of a graph is a square matrix, we can apply the linear algebra of square matrices and ask questions about eigenvectors, eigenvalues, eigenspaces, and minimal and characteristic polynomials; more generally, we can pull in many of the powerful tools of linear algebra!

It is natural, at first glance, to ask "Why?"

Why should linear algebra be useful here when all we have done is find a box-like way to encode a graph drawing?  Allen Schwenk, who developed many of the ideas I will cite here, asked that question early in his career:

Just because we find a "box" or "matrix" convenient for collecting the graph data -- why should we expect linear algebra to give useful information??

This is an example of  "mathematical serendipity" -- it turns out that matrix multiplication encodes the process of "walking" around the graph and so a "geometric" or graph-theoretic concept is simulated by an algebraic one.

Walks on a graph

We need a simple (but pretty) lemma.

Serendipity Lemma.  If A is the adjacency matrix of a graph, then the (i,j) entry of A^m is just the number of walks of length m from vertex i to vertex j.

The proof of this lemma will not be given here, but it is straightforward, relying on the definition of matrix multiplication and the fact that the matrix A only has entries zeroes and ones, so that any significant contribution in a product occurs when there is an edge.  If one is comfortable with induction, this is a nice short exercise.
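For readers who would rather check than prove, here is a minimal sketch (Python with numpy; the post itself contains no code) that verifies the lemma on the 4-cycle example above, comparing the entries of A^m against a brute-force walk count:

import numpy as np
from itertools import product

# Adjacency matrix of the 4-cycle from the example above.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

def count_walks(A, i, j, m):
    # Brute force: try every sequence of m - 1 intermediate vertices and
    # keep the sequences in which consecutive vertices are adjacent.
    n = len(A)
    total = 0
    for mids in product(range(n), repeat=m - 1):
        walk = (i,) + mids + (j,)
        if all(A[walk[t]][walk[t + 1]] for t in range(m)):
            total += 1
    return total

m = 3
Am = np.linalg.matrix_power(A, m)
assert all(Am[i, j] == count_walks(A, i, j, m)
           for i in range(4) for j in range(4))
print("Each (i,j) entry of A^3 equals the number of length-3 walks from i to j.")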

The effect of this lemma is profound.  Powers of the matrix A count walks; the adjacency algebra (linear combinations of powers of A) encodes graph-theoretic information; and so the eigenvalues of the matrix (zeroes of the minimal polynomial) carry significant graph-theoretic information.  In some cases, the eigenvalues completely describe the graph.

The eigenvalues of a strongly regular graph

In an earlier blog post we looked at strongly regular graphs.  A strongly regular graph with parameters (v, k, lambda, mu) has the property that the number of walks of length 2 from one vertex to another takes on one of two values, lambda or mu, depending on whether or not the two vertices are adjacent.  This fact gives us an explicit formula for the matrix A^2.  If vertices i, j are adjacent (so that the (i,j) entry of A is 1) then the (i,j) entry of A^2 is lambda.  If vertices i, j are not adjacent (so that the (i,j) entry of the complementary matrix J - A - I is 1, where J is the all-ones matrix and I the identity) then the (i,j) entry of A^2 is mu.  Finally, somewhat trivially, the diagonal entries of A^2 are all k, since there are k walks of length 2 from a vertex i back to itself.

Thus the matrix A^2 is equal to k times the identity matrix plus lambda times the adjacency matrix plus mu times the complementary matrix J - A - I:

A^2 = k I + lambda A + mu (J - A - I).

Or, after a bit of rewriting,

A^2 - (lambda - mu) A - (k - mu) I = mu J.
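As a quick numerical sanity check (a sketch, using the Petersen graph, which is strongly regular with parameters (10, 3, 0, 1)):

import numpy as np
from itertools import combinations

# Petersen graph: vertices are the 2-element subsets of {0,...,4},
# with two vertices adjacent exactly when the subsets are disjoint.
verts = list(combinations(range(5), 2))
A = np.array([[int(set(u).isdisjoint(w)) for w in verts] for u in verts])

v, k, lam, mu = 10, 3, 0, 1
I = np.eye(v, dtype=int)
J = np.ones((v, v), dtype=int)

# The identity derived above: A^2 = k I + lambda A + mu (J - A - I)
assert np.array_equal(A @ A, k * I + lam * A + mu * (J - A - I))
print("The identity holds for the Petersen graph.")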
This matrix equation means (and here come the most important steps of the linear algebra) that, other than the trivial eigenvalue k (with eigenvector the all-ones vector), every eigenvalue x of A satisfies the quadratic equation

x^2 - (lambda - mu) x - (k - mu) = 0.

(Any eigenvector with eigenvalue other than k is orthogonal to the all-ones vector and so lies in the nullspace of J; we may therefore treat the matrix J as zero when applying the matrix equation to such an eigenvector.)

From this equation we may find the eigenvalues of the adjacency matrix A.  They are the two roots of this quadratic (often called r and s), together with the "trivial" eigenvalue k.
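Continuing the Petersen sketch from above: with (v, k, lambda, mu) = (10, 3, 0, 1) the quadratic is x^2 + x - 2 = (x + 2)(x - 1), so the spectrum should consist of k = 3, r = 1, and s = -2, and a numerical check agrees:

import numpy as np
from itertools import combinations

verts = list(combinations(range(5), 2))
A = np.array([[int(set(u).isdisjoint(w)) for w in verts] for u in verts])

# Eigenvalues of the (symmetric) adjacency matrix, in ascending order:
# four -2's, five 1's, and a single 3.
print(np.round(np.linalg.eigvalsh(A)))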

In addition to computing the eigenvalues of the matrix A, we may compute the dimensions of the eigenspaces.  (These are the multiplicities of r and s as roots of the characteristic polynomial of A.)  These dimensions, of course, must be positive integers and that requirement places significant restrictions on the parameters of strongly regular graphs.
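Spelling that out (these are the standard formulas, which the post leaves implicit): the two roots are

r, s = [ (lambda - mu) ± sqrt( (lambda - mu)^2 + 4(k - mu) ) ] / 2,

and the multiplicities are

f, g = (1/2) [ (v - 1) ∓ ( 2k + (v - 1)(lambda - mu) ) / (r - s) ].

A short sketch that applies the integrality test (assuming mu <= k, and treating the irrational case separately):

from math import isqrt

def srg_multiplicities(v, k, lam, mu):
    # Multiplicities (f, g) of the eigenvalues r and s for a putative
    # SRG(v, k, lam, mu); returns None if they fail to be nonnegative
    # integers, in which case no such graph can exist.
    disc = (lam - mu) ** 2 + 4 * (k - mu)   # this is (r - s)^2
    num = 2 * k + (v - 1) * (lam - mu)
    d = isqrt(disc)
    if d * d != disc:
        # Irrational r, s force f = g (the "conference graph" case).
        if num != 0 or (v - 1) % 2:
            return None
        return ((v - 1) // 2, (v - 1) // 2)
    if d == 0 or num % d or ((v - 1) - num // d) % 2:
        return None
    f = ((v - 1) - num // d) // 2
    g = ((v - 1) + num // d) // 2
    return (f, g) if min(f, g) >= 0 else None

print(srg_multiplicities(35, 16, 6, 8))   # (20, 14): passes the test
print(srg_multiplicities(16, 6, 3, 2))    # None: ruled out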

With a short bit of programming, one can then generate a list of feasible strongly regular graph parameters and ask, "Does a graph with these parameters exist?"  For the first few entries on the list, the graph exists and is unique.  As we get deeper into the parameter list, the existence problem gets more difficult, and occasionally there are two or more distinct graphs with the same parameters.  But the existence requirements for SRGs seem very restrictive, and so it comes as a surprise to discover that the number of SRGs with a given parameter set can jump: there are 3854 different graphs with parameters (35, 16, 6, 8) and eigenvalues 16, 2 and -4!
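That short bit of programming might look something like this sketch (it reuses srg_multiplicities from above; both tests are necessary conditions only, so a parameter set that survives may still fail to correspond to any actual graph):

def feasible_parameters(vmax):
    # Each vertex has k neighbors and v - k - 1 non-neighbors; counting the
    # edges between those two sets forces k(k - lam - 1) = (v - k - 1) mu.
    for v in range(5, vmax + 1):
        for k in range(1, v - 1):
            for lam in range(k):
                rhs = k * (k - lam - 1)
                if rhs % (v - k - 1):
                    continue
                mu = rhs // (v - k - 1)
                if 0 < mu <= k and srg_multiplicities(v, k, lam, mu):
                    yield (v, k, lam, mu)

for params in feasible_parameters(13):
    print(params)   # begins (5, 2, 0, 1): the pentagon, the first SRG on the list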

Next in the list (see the webpage of SRG parameters maintained by Andries Brouwer) is a graph with parameters (36, 10, 4, 2) and eigenvalues 10, 4, -2.  That graph is unique!  There is only one SRG with parameters (36, 10, 4, 2).

Shortly after that in the list of SRGs is the parameter set (36, 15, 6, 6), with eigenvalues 15, 3, -3.  In that case there are (drumroll...) 32548 different SRGs with these parameters!

The eigenvalues 10, 4, -2 uniquely determine their graph, while the eigenvalues 15, 3, -3 allow more than 32 thousand graph possibilities.  What a strange world.

Further reading

The Wikipedia article on strongly regular graphs, like many Wikipedia articles on mathematics, is nicely done.
Peter Cameron has a nice article (in pdf) here and a blog post entry on SRGs and SRGs without triangles.
Anything by Norman Biggs is good; some of those papers are on Professor Biggs's website here or on the math arXiv.


Next time: Strongly Regular Cayley Graphs

[Mathematical prerequisites: This article requires an introduction to finite graphs and an elementary introduction to some concepts in matrix theory.]

Follow Ken W. Smith on Twitter @kenwsmith54