The beauty of the spectrum of a graph

Rohith Kambampati
3 min read · Sep 5, 2021

Spectral analysis of any signal is goosebump-worthy. When I first learned about Fourier analysis, I just fell in love with it. I kinda travelled to a different realm when I heard there is another way of looking at a function like this:

[Plot: f(x) vs x, the superimposed signal]

Any undergrad student with a basic understanding of Fourier analysis could now say that the above function is sin(x) + sin(3x) + sin(6x) + … + sin(99x), just by looking at its spectrum. This blew my mind. Knowing all the components of a function from nothing but the superimposed function is something I thought was super cool.
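
To see the trick in action, here is a minimal NumPy sketch (my own illustration, not from the post): it superimposes just the sine components named explicitly above (the terms hidden behind the "…" are left out) and recovers their frequencies from the spectrum.

```python
import numpy as np

fs = 512                                   # sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)                # 4 seconds of samples
freqs = [1, 3, 6, 99]                      # only the explicitly named terms
f = sum(np.sin(2 * np.pi * k * t) for k in freqs)

spectrum = np.abs(np.fft.rfft(f))          # magnitude spectrum of f
bins = np.fft.rfftfreq(len(t), 1 / fs)     # frequency of each FFT bin

# The spectrum peaks exactly at the component frequencies.
peaks = bins[spectrum > 0.5 * spectrum.max()]
print(peaks)                               # -> [ 1.  3.  6. 99.]
```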

Now, brace yourself for something that is gonna give you an adrenaline rush. What we are about to see is the spectrum of not an ordinary signal, but of a graph. Before we see anything else, let's build some machinery. The adjacency matrix (A) and the degree matrix (D) are things every geek knows. There is something else that goes by the name of the graph Laplacian: the difference of the degree and adjacency matrices, L = D - A. While by the looks of it there is nothing out of the ordinary, it is much cooler than any other matrix you have ever come across.
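
As a quick sketch, here is how L = D - A comes together for a made-up six-node graph, two triangles joined by a single edge; the graph is purely illustrative.

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2),           # triangle 1: nodes 0, 1, 2
         (3, 4), (3, 5), (4, 5),           # triangle 2: nodes 3, 4, 5
         (2, 3)]                           # bridge between the triangles

n = 6
A = np.zeros((n, n))                       # adjacency matrix
for i, j in edges:
    A[i, j] = A[j, i] = 1                  # undirected, so symmetric

D = np.diag(A.sum(axis=1))                 # degree matrix
L = D - A                                  # graph Laplacian
```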

Eigendecomposition is also something everyone is familiar with, right? Okay okay, here it comes: it is nothing but factoring a matrix into its eigenvalues and eigenvectors. The eigendecomposition of the Laplacian of a graph in matrix form looks like L · U = U · Λ, where the columns of U are the eigenvectors of the graph Laplacian and Λ is a diagonal matrix holding its eigenvalues. This equation is very popular in the graph signal processing community. Cool, right? The eigendecomposition of a graph Laplacian is awesome. Every ML enthusiast nowadays is obsessed with clustering problems. Humans like to see patterns everywhere. The network that runs our bodies likes to remember things by grouping stuff together. It is natural that we are obsessed with clustering. Many algorithms out there tell you how to do this.
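
Continuing the toy example, here is a hedged sketch of that decomposition; np.linalg.eigh is a natural choice since L is real and symmetric, and it returns the eigenvalues in ascending order.

```python
import numpy as np

# Same toy Laplacian as above: two triangles joined by a bridge edge.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

eigvals, U = np.linalg.eigh(L)             # eigenvalues in ascending order
Lambda = np.diag(eigvals)                  # the Lambda matrix in the text

print(np.allclose(L @ U, U @ Lambda))      # -> True
print(eigvals.round(3))                    # first eigenvalue is ~0
```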

Invoking the graph ideology along with eigendecomposition makes our lives much easier: just looking at the eigenvectors lets us do clustering in a jiffy. The eigenvector corresponding to the smallest non-zero eigenvalue (the second-smallest overall, for a connected graph) is called the Fiedler vector, and it is very, very popular in the community. Lots of jargon, right? Ughhh!! But every bit is worth it. Two clusters can be formed using the Fiedler vector alone. Consider the Fiedler vector [a1, a2, a3, a4, a5, a6]. Say a1 and a4 are > 0 while a2, a3, a5 and a6 are < 0. Then nodes 1 and 4 in the network form one group, nodes 2, 3, 5 and 6 form the other, and the clustering is done. The same trick works with multiple eigenvectors for more clusters. You get the point.
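
And here is how that sign-based split might look on the toy graph from before; note that the overall sign of an eigenvector is arbitrary, so which cluster ends up with the positive entries can flip from run to run.

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

eigvals, U = np.linalg.eigh(L)             # eigenvalues ascending
fiedler = U[:, 1]                          # smallest non-zero eigenvalue

group_a = np.where(fiedler >= 0)[0]        # split nodes by the sign
group_b = np.where(fiedler < 0)[0]         # of their Fiedler entry
print(group_a, group_b)                    # the two triangles separate,
                                           # e.g. [0 1 2] and [3 4 5]
```

On this graph the split cuts only the bridge edge (2, 3), which is exactly the kind of cut a good two-way partition should make.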

Clustering is just one of the marvels spectral analysis makes possible. Stay tuned to see what else you can do with the spectrum of a graph.
