Significance of eigenvalues
Here are some things you can learn from eigenvalues.
If a connected graph has $d$ distinct eigenvalues, then its diameter is at most $d-1$; equivalently, a connected graph of diameter $D$ has at least $D+1$ distinct eigenvalues.
If a graph is $k$-regular, then the multiplicity of $k$ as an eigenvalue of its adjacency matrix equals the number of connected components of the graph. More generally, if a graph is connected, then its largest eigenvalue has multiplicity 1 (Perron–Frobenius).
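A quick sanity check of the multiplicity claim (the graph, the `rank` helper, and the rank-based computation below are my own illustration, not from the text): since the adjacency matrix $A$ is symmetric, the multiplicity of the eigenvalue $k$ equals $n - \operatorname{rank}(A - kI)$, and the rank can be computed exactly over the rationals.

```python
from fractions import Fraction

def rank(M):
    """Gaussian elimination over the rationals; returns the rank of M."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Two disjoint triangles: a 2-regular graph with 2 connected components.
n, k = 6, 2
A = [[0] * n for _ in range(n)]
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[u][v] = A[v][u] = 1

# Multiplicity of the eigenvalue k = n - rank(A - k*I), since A is symmetric.
B = [[A[i][j] - (k if i == j else 0) for j in range(n)] for i in range(n)]
mult_k = n - rank(B)
print(mult_k)  # equals the number of components: 2
```

Running this on a 2-regular graph with two components (two disjoint triangles) reports multiplicity 2, as the claim predicts.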
For strongly regular graphs, you can use the eigenvalues and their multiplicities to rule out certain parameter values. Specifically, if we have a strongly regular graph with parameters $(n, k, a, c)$, then there is one eigenvalue (namely $k$) of multiplicity 1 and two other eigenvalues, $\theta$ and $\tau$. These eigenvalues have multiplicities:
$m_\theta = \frac{1}{2} \left( (n-1) - \frac{2k+(n-1)(a-c)}{\sqrt{\Delta}}\right)$
and
$m_\tau = \frac{1}{2} \left( (n-1) + \frac{2k+(n-1)(a-c)}{\sqrt{\Delta}}\right)$
where $\Delta = (a - c)^2 + 4(k - c)$. Since these are eigenvalue multiplicities, they must be non-negative integers. Therefore, any parameter values $(n, k, a, c)$ that do not give integer values for these multiplicities cannot occur.
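Here is a sketch of how one might automate this feasibility check (the function name and the parameter sets tested are my own choices, not from the text). One detail the formulas hide: when $\sqrt{\Delta}$ is irrational, integrality forces the numerator $2k+(n-1)(a-c)$ to vanish and $n-1$ to be even (the "conference graph" case).

```python
from math import isqrt

def srg_multiplicities(n, k, a, c):
    """Return (m_theta, m_tau) if the integrality condition holds, else None.
    Parameters (n, k, a, c) as in the text; Delta = (a-c)^2 + 4(k-c)."""
    delta = (a - c) ** 2 + 4 * (k - c)
    num = 2 * k + (n - 1) * (a - c)
    r = isqrt(delta)
    if r * r == delta:                            # sqrt(Delta) is an integer
        if num % r:                               # num / sqrt(Delta) must divide evenly
            return None
        q = num // r
        if (n - 1 - q) % 2:                       # and have the right parity
            return None
        return ((n - 1 - q) // 2, (n - 1 + q) // 2)
    # sqrt(Delta) irrational: need num == 0 and n - 1 even (conference case)
    if num == 0 and (n - 1) % 2 == 0:
        return ((n - 1) // 2, (n - 1) // 2)
    return None

print(srg_multiplicities(10, 3, 0, 1))   # Petersen graph -> (5, 4)
print(srg_multiplicities(16, 6, 3, 2))   # fails the test -> None
print(srg_multiplicities(5, 2, 0, 1))    # 5-cycle, conference case -> (2, 2)
```

The Petersen parameters $(10, 3, 0, 1)$ pass with multiplicities $(5, 4)$, while $(16, 6, 3, 2)$ fails and so cannot be realized by any strongly regular graph.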
See Algebraic Graph Theory by Godsil and Royle, chapter 10, for more on this.
If you are interested in learning more about the connection between eigenvalues and graphs (you already know the term spectral analysis; the field is called spectral graph theory), a good place to start is Spectra of Graphs:
http://homepages.cwi.nl/~aeb/math/ipm/ipm.pdf
Page 14 of that book gives the first result above, together with its proof.
The dominant (largest) eigenvalue can be used to tell you which node(s) are the best connected. Since the adjacency matrix $A$ is symmetric, it is diagonalizable (in fact, orthogonally diagonalizable, but diagonalizable is enough for what follows).
So all $n$-vectors can be written as a linear combination of eigenvectors. Say $$\vec{x}=\sum_{i=1}^nc_i\vec{v}_i$$ where $\vec{v}_i$ is an eigenvector for $\lambda_i$ and $\lambda_1$ is the dominant eigenvalue.
In particular, apply this to each column $\vec{a}_j$ of $A$: $$\vec{a}_j=\sum_{i=1}^nc_{ij}\vec{v}_i$$
I assume you understand that the $(i,j)$ entry of $A^n$ counts the walks of length $n$ from vertex $i$ to vertex $j$ (walks, not paths: vertices may repeat). But $$A^n=A^{n-1}A=\begin{bmatrix}A^{n-1}\vec{a}_j\end{bmatrix}=\begin{bmatrix}\sum_{i=1}^nc_{ij}\lambda_i^{n-1}\vec{v}_i\end{bmatrix}$$ For large $n$, provided the graph is connected and non-bipartite (so that $\lambda_1$ strictly exceeds every other eigenvalue in absolute value), $\lambda_1^{n-1}$ is relatively much larger than all other such terms. So $$A^n\approx\begin{bmatrix}c_{1j}\lambda_1^{n-1}\vec{v}_1\end{bmatrix}$$ where the $\approx$ comparison is made entry by entry and compares entries by looking for ratios close to $1$. (I'm being informal, for ease of reading.)
Thus the columns of $A^n$ are approximately proportional to $\vec{v}_1$. So for each column of $A^n$, the entries of $\vec{v}_1$ tell you which entries are largest, which in turn tells you which nodes have the most walks of length $n$ to each of the other nodes. As $n$ gets larger, this feature of $A^n$'s columns does not change. So in a nice sense, the entries of $\vec{v}_1$ tell you which nodes are the best connected to the other nodes through long-enough walks.
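To see this numerically, here is a small sketch (the graph and the helper functions are my own example, using exact integer arithmetic rather than a linear-algebra library): raise $A$ to a high power for a connected, non-bipartite graph and rescale each column by its largest entry.

```python
# Sketch: the columns of a high power of A all become proportional to the
# dominant eigenvector.  Graph (my own example): a triangle 0-1-2 with a
# pendant vertex 3 attached to 0 -- connected and non-bipartite, so the
# dominant eigenvalue strictly exceeds the others in absolute value.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][t] * Y[t][j] for t in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, p):
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

A = [[0, 1, 1, 1],
     [1, 0, 1, 0],
     [1, 1, 0, 0],
     [1, 0, 0, 0]]

P = mat_pow(A, 60)   # exact integer arithmetic; Python ints never overflow

# Rescale each column by its largest entry.
cols = []
for j in range(4):
    m = max(P[i][j] for i in range(4))
    cols.append([P[i][j] / m for i in range(4)])

# All rescaled columns agree (each is proportional to v_1) ...
assert all(abs(cols[0][i] - cols[j][i]) < 1e-6
           for j in range(4) for i in range(4))
# ... and every column peaks at vertex 0, the best-connected node.
assert all(max(range(4), key=lambda i: col[i]) == 0 for col in cols)
print([round(x, 3) for x in cols[0]])
```

Every rescaled column is (to within numerical noise) the same vector, and its largest entry sits at vertex 0, the vertex of highest degree here, matching the eigenvector-centrality reading of $\vec{v}_1$.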