What do the eigenvectors of an adjacency matrix tell us?
The second eigenvalue of a Markov chain is meaningful: it controls (for example) the convergence rate and stability of algorithms, such as power iteration, that compute the equilibrium distribution. See "The Second Eigenvalue of the Google Matrix" by Haveliwala and Kamvar for a discussion of how to compute its value.
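A minimal numerical illustration of the Haveliwala–Kamvar result (using a made-up 4-page link graph, not any real data): for a Google matrix G = αP + (1-α)/n · J built from any stochastic P, the second eigenvalue satisfies |λ₂| ≤ α, which bounds the convergence rate of power iteration.

```python
import numpy as np

# Hypothetical 4-page web graph; P is the row-stochastic link-following matrix.
P = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [1.0, 0.0, 0.0, 0.0],
])

alpha = 0.85  # damping factor
n = P.shape[0]
# Google matrix: damped link-following plus uniform teleportation.
G = alpha * P + (1 - alpha) * np.ones((n, n)) / n

# Eigenvalues sorted by decreasing magnitude.
eigvals = np.linalg.eigvals(G)
eigvals = eigvals[np.argsort(-np.abs(eigvals))]

print(abs(eigvals[0]))                   # Perron eigenvalue, 1.0
print(abs(eigvals[1]) <= alpha + 1e-9)   # True: |lambda_2| <= alpha
```

Power iteration converges at a rate governed by |λ₂|, so the teleportation term (1-α) directly buys a guaranteed convergence rate regardless of the link structure.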
The second eigenvector, however, is harder to interpret. The PageRank (Google) matrix is, by construction, an irreducible and aperiodic Markov chain, so it has a unique equilibrium distribution: the only probability vector v satisfying Pv = v is the first eigenvector. The second eigenvector therefore does not represent a probability distribution (its entries need not even be non-negative).

According to "Fluctuation Induced Almost Invariant Sets" by Schwartz and Billings, when a Markov chain arises from a stochastic dynamical system, the second eigenvector is a good way of finding almost invariant sets: subsets of the pages in which the perfect stochastic surfer would stay "trapped" for a long time before leaving for other parts of the web. The signs of the entries of the second eigenvector (and the third, and possibly higher ones) can be used to identify these almost invariant sets.
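A toy sketch of that sign trick (on a hand-built chain, not taken from the paper): two tightly connected 3-state clusters joined by rare transitions. The second eigenvector is roughly constant on each cluster with opposite signs, so its sign pattern recovers the two almost invariant sets.

```python
import numpy as np

# Hypothetical 6-state chain: two 3-state clusters, with a small
# probability eps of jumping from one cluster to the other.
eps = 0.01
block = np.array([
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
    [0.5, 0.5, 0.0],
])
P = np.zeros((6, 6))
P[:3, :3] = (1 - eps) * block   # stay inside cluster 1
P[3:, 3:] = (1 - eps) * block   # stay inside cluster 2
P[:3, 3:] = eps / 3             # rare escape to cluster 2
P[3:, :3] = eps / 3             # rare escape to cluster 1

# Eigen-decomposition; sort eigenvalues by decreasing magnitude.
vals, vecs = np.linalg.eig(P)
order = np.argsort(-np.abs(vals))
v2 = np.real(vecs[:, order[1]])  # second eigenvector

# States sharing a sign belong to the same almost invariant set.
print(np.sign(v2))  # three entries of one sign, three of the other
```

Here λ₂ = 1 - 2·eps ≈ 0.98 is close to 1, which is exactly the signature of almost invariant sets: the closer λ₂ is to 1, the longer the walker stays trapped in one cluster.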