Solution 1:

What does "orthogonal eigenvectors of eigenvalue zero" mean? Are those eigenvalues really equal to 0?

An eigenvector for eigenvalue zero is a vector $\mathbf{a}$ such that $L\mathbf{a}=\mathbf{0}$. In this case, writing the Laplacian in block form (with $0$ and $1$ denoting the all-zeros and all-ones blocks), $$\begin{bmatrix} L_{G_{1}} & 0\\ 0 & L_{G_{2}} \end{bmatrix} \begin{bmatrix} 0\\ 1 \end{bmatrix} = \begin{bmatrix} 0\\ 0 \end{bmatrix} \text{ and } \begin{bmatrix} L_{G_{1}} & 0\\ 0 & L_{G_{2}} \end{bmatrix} \begin{bmatrix} 1\\ 0 \end{bmatrix}= \begin{bmatrix} 0\\ 0 \end{bmatrix},$$ hence the vectors $\mathbf{a}_1=[1,\dots,1,0,\dots,0]^T$ and $\mathbf{a}_2=[0,\dots,0,1,\dots,1]^T$ are eigenvectors corresponding to the eigenvalue 0.

They are orthogonal, since $\mathbf{a}_1^T\mathbf{a}_2=0$. Orthogonal non-zero vectors are necessarily linearly independent.
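As a numerical sanity check (a sketch using numpy; the five-vertex graph, a triangle on $\{0,1,2\}$ plus an edge $\{3,4\}$, is my own example, not one from the original question):

```python
import numpy as np

# Laplacian of a graph with two components: a triangle {0,1,2} and an edge {3,4}.
# L = D - A is block-diagonal, since there are no edges between the components.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
])
L = np.diag(A.sum(axis=1)) - A

a1 = np.array([1, 1, 1, 0, 0])  # indicator vector of the first component
a2 = np.array([0, 0, 0, 1, 1])  # indicator vector of the second component

print(L @ a1)   # [0 0 0 0 0] -> eigenvector for eigenvalue 0
print(L @ a2)   # [0 0 0 0 0] -> eigenvector for eigenvalue 0
print(a1 @ a2)  # 0 -> the two eigenvectors are orthogonal
```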

How do we know that these eigenvalues correspond to $\lambda_2$?

Since the eigenvalues are listed in increasing order and we have two linearly independent eigenvectors for zero, i.e. at least two eigenvalues are zero, we know that $\lambda_1=\lambda_2=0$.

I suspect that all $\lambda$'s are ordered by multiplicity. But how can I prove that every $n\times n$ symmetric matrix has $n$ eigenvalues, counted with multiplicity?

Eigenvalues are the roots of the characteristic polynomial, which has degree $n$. So, counting multiplicity, there are $n$ of them. Here we are speaking about algebraic multiplicity; see Wikipedia.
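Numerically this is easy to observe (a sketch with numpy; the $4\times 4$ symmetric matrix below is an arbitrary example of mine): for a real symmetric matrix, `np.linalg.eigvalsh` always returns exactly $n$ real eigenvalues, repeated according to multiplicity.

```python
import numpy as np

# An arbitrary 4x4 symmetric matrix with a repeated eigenvalue.
# The bottom-right block [[3,1],[1,3]] has eigenvalues 2 and 4.
S = np.array([
    [2, 0, 0, 0],
    [0, 2, 0, 0],
    [0, 0, 3, 1],
    [0, 0, 1, 3],
])
evals = np.linalg.eigvalsh(S)  # returned in increasing order
print(evals)                   # 4 real eigenvalues: [2. 2. 2. 4.]
```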

Why is it actually true that if $u$ and $v$ are connected by an edge, we have $x(u)=x(v)$? It is not obvious to me, even if $x$ is an eigenvector. And what happens in the general case?

Since $\sum\limits_{(u,v)\in E} (x(u)-x(v))^{2} = 0$ is a sum of non-negative real numbers, each of them must be zero. So we have $(x(u)-x(v))^{2}=0$ and, consequently, $x(u)-x(v)=0$, for each $(u,v)\in E$.
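The identity behind this argument is $x^T L x = \sum_{(u,v)\in E}(x(u)-x(v))^{2}$, so $Lx=0$ forces every edge term to vanish. A quick numerical check (the path graph $0$–$1$–$2$ is my own example):

```python
import numpy as np

# Path graph 0-1-2, Laplacian L = D - A.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A
edges = [(0, 1), (1, 2)]

x = np.array([0.3, -1.2, 2.0])  # arbitrary test vector
quad = x @ L @ x
edge_sum = sum((x[u] - x[v]) ** 2 for u, v in edges)
print(np.isclose(quad, edge_sum))  # True: x^T L x equals the sum over edges

ones = np.ones(3)         # eigenvector for eigenvalue 0
print(ones @ L @ ones)    # 0.0 -> every edge term (x(u)-x(v))^2 is zero
```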

OK, by now we have proved that $\lambda_{1}=0$, but what about $\lambda_{2}$?

It was shown in the preceding part of the proof that (if $G$ is connected) every eigenvector for zero is a multiple of $[1,1,\dots,1]^T$. This shows that the eigenspace corresponding to the zero eigenvalue has dimension 1, i.e., the geometric multiplicity of this eigenvalue is 1.

EDIT: What I wrote here before was unnecessarily complicated. The Laplacian matrix is symmetric, hence it is diagonalizable. Thus the algebraic and geometric multiplicities are the same.

However, we need to show that the algebraic multiplicity of $\lambda_1=0$ is 1. Suppose this were not true. It would imply that there exists a vector $\mathbf a$ such that $L \mathbf a = [1,1,\dots,1]^T$, i.e., that the vector $[1,1,\dots,1]^T$ belongs to the column space of the matrix $L$. But directly from the definition of the Laplacian matrix we can see that this vector is orthogonal to each column of $L$, since every column of $L$ sums to zero. (At least for non-weighted graphs; I did not think about the weighted case.)
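The orthogonality claim is just the observation that every column of $L$ sums to zero: each degree on the diagonal is cancelled by the $-1$ entries of its incident edges. A small check (the path graph $0$–$1$–$2$–$3$ is my own example):

```python
import numpy as np

# Laplacian of the connected path graph 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A

ones = np.ones(4)
print(ones @ L)  # [0. 0. 0. 0.]: the all-ones vector is orthogonal to every
                 # column of L, so it cannot lie in the column space of L
```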

I agree with you that this last argument seems to be missing in the pdf-file you linked. (And probably different explanations are possible.)


Just a few words of explanation. (But it is better to read about this in more detail on Wikipedia or in a linear algebra textbook.)

If $\mathbf{a}$ is an eigenvector of a matrix $A$ for the eigenvalue $\lambda$, it means that $(A-\lambda I)\mathbf a=\mathbf 0$. If the algebraic multiplicity (= the multiplicity of $\lambda$ as a root of the characteristic polynomial) is greater than the geometric multiplicity (= the dimension of the corresponding eigenspace), it implies that there are an eigenvector $\mathbf a$ and a generalized eigenvector $\mathbf b$ such that $(A-\lambda I)\mathbf b=\mathbf a$.
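A minimal illustration (my own $2\times 2$ example, not the Laplacian itself): the nilpotent Jordan block has eigenvalue 0 with algebraic multiplicity 2 but geometric multiplicity 1, and the standard basis vector $e_2$ is a generalized eigenvector mapped onto the eigenvector $e_1$.

```python
import numpy as np

# Jordan block for eigenvalue 0: algebraic multiplicity 2, geometric multiplicity 1.
J = np.array([[0.0, 1.0],
              [0.0, 0.0]])
a = np.array([1.0, 0.0])  # eigenvector:             (J - 0*I) a = 0
b = np.array([0.0, 1.0])  # generalized eigenvector: (J - 0*I) b = a

print(J @ a)  # [0. 0.]
print(J @ b)  # [1. 0.] -> equals a
```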

This was the property which we used in the last paragraph.

The geometric multiplicity of $\lambda$ is precisely the number of Jordan blocks corresponding to $\lambda$ in the Jordan normal form.


It might be worth mentioning that in the book Bapat: Graphs and Matrices, Universitext, Springer 2010, the Laplacian matrix is defined as follows. The author first defines the (vertex-edge) incidence matrix of $G$, denoted by $Q(G)$, which is the $n\times m$ matrix with entries $0$ if the given vertex and edge are not incident, $1$ if the edge starts at this vertex and $-1$ if it ends there. It is relatively easy to show that the rank of $Q(G)$ is $n-k$, where $k$ is the number of connected components of the graph; see pp. 11-12 of this book.

Then the Laplacian matrix can be obtained as $L(G)=Q(G)Q(G)^T$; see p. 45.

Using this fact we get $\operatorname{rank}(L)=\operatorname{rank}(QQ^T)=\operatorname{rank}(Q)$, see Lemma 4.3, p.46.
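These facts are easy to verify on a small example (a numpy sketch; the graph and the orientation of its edges are an arbitrary choice of mine):

```python
import numpy as np

# Oriented incidence matrix Q for a graph with two components:
# a triangle on {0,1,2} (edges 0->1, 1->2, 2->0) and an edge 3->4.
Q = np.array([
    [ 1,  0, -1,  0],
    [-1,  1,  0,  0],
    [ 0, -1,  1,  0],
    [ 0,  0,  0,  1],
    [ 0,  0,  0, -1],
])
L = Q @ Q.T
print(L)  # the usual Laplacian D - A of this graph

n, k = 5, 2
print(np.linalg.matrix_rank(Q))  # 3 = n - k
print(np.linalg.matrix_rank(L))  # 3 = rank(Q)
```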

The fact that matrix $Q$ and its Gram matrix $QQ^T$ have the same rank is mentioned in the same book on p.10 as an exercise:

Let $A$ be an $m\times n$ matrix. Show that $A$ and $A^TA$ have the same null space. Hence conclude that $\operatorname{rank} A = \operatorname{rank}A^TA$.
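The exercise is also easy to check numerically (a random example of mine). The idea of the proof: over the reals, $A^TAx=0$ implies $x^TA^TAx=\|Ax\|^2=0$, hence $Ax=0$, so the null spaces coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
# A random 6x5 matrix of rank at most 4 (product of 6x4 and 4x5 factors).
A = rng.standard_normal((6, 4)) @ rng.standard_normal((4, 5))

r1 = np.linalg.matrix_rank(A)
r2 = np.linalg.matrix_rank(A.T @ A)
print(r1 == r2)  # True: A and A^T A have the same rank
```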

I.e., it was shown that $n-\operatorname{rank}(L(G))$ is the number of connected components. Since $L(G)$ is symmetric, it is diagonalizable, and thus $n-\operatorname{rank}(L(G))$ equals the multiplicity of the eigenvalue zero.
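Putting it all together (a numeric sketch; the disconnected example graph, two disjoint edges plus an isolated vertex, is mine): the multiplicity of the eigenvalue 0 of the Laplacian equals the number of connected components.

```python
import numpy as np

# Graph with k = 3 components: edges {0,1} and {2,3}, plus the isolated vertex 4.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
])
L = np.diag(A.sum(axis=1)) - A
evals = np.linalg.eigvalsh(L)
num_zero = int(np.sum(np.isclose(evals, 0.0)))
print(num_zero)  # 3 = number of connected components
```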

Again, I do not know whether something similar can be done in the weighted case.