Cool mathematics I can show to calculus students.

I am a TA for a theoretical linear algebra and calculus course this semester. It is an advanced course for strong freshmen.

In every discussion section I try to show my students (as a series of exercises that we work through on the blackboard) some serious mathematics that they can understand and appreciate. For example, when we were talking about cross products, I showed them the isomorphism between $\mathbb{R}^3$ with the cross product and the Lie algebra $so(3)$. Of course I didn't use fancy language, but we proved the Jacobi identity for the cross product, then picked a basis of the $3\times 3$ skew-symmetric matrices and checked that commuting these three basis vectors gives exactly the "right-hand rule" of the cross product. So these two things are the same.
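(For concreteness, the identification I have in mind sends a vector to the corresponding skew-symmetric matrix, $$v=(v_1,v_2,v_3)\ \longmapsto\ [v]_\times=\begin{pmatrix}0&-v_3&v_2\\ v_3&0&-v_1\\ -v_2&v_1&0\end{pmatrix},\qquad [v]_\times\, w=v\times w,$$ and the exercise is to check that $\big[[u]_\times,[v]_\times\big]=[u\times v]_\times$, which on the standard basis $e_1,e_2,e_3$ reproduces $e_1\times e_2=e_3$ and its cyclic permutations. The notation $[v]_\times$ is just one common choice, made only for this write-up.)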

For the next recitation the topics are 1) eigenvalues and eigenvectors; 2) real quadratic forms.

Could you please recommend some cool things we can discuss so that they learn about eigenvalues and quadratic forms without doing just boring calculations? By "cool things" I mean problems coming from serious mathematics that, when explained properly, make nice exercises for the students.

Moreover, do you know of a book that presents serious mathematics in a form accessible to freshmen (in the form of exercises would be absolutely great!)?

Thank you very much!


Solution 1:

If you plan on talking about matrix diagonalization when you discuss eigenvalues, you could show how eigenvalues can be used to give closed forms for linear recurrences. For example, if $F_n$ denotes the $n$th Fibonacci number, then $$\left( \begin{matrix} F_{n+1} \\ F_n \end{matrix} \right)=\left(\begin{matrix}1 & 1 \\1 & 0 \end{matrix}\right)\left(\begin{matrix} F_n \\ F_{n-1} \end{matrix}\right)=\left(\begin{matrix}1 & 1 \\1 & 0 \end{matrix}\right)^n \left(\begin{matrix} 1 \\ 0 \end{matrix}\right).$$ By diagonalizing, we can find a closed form for the $n$th Fibonacci number as a linear combination of the $n$th powers of the eigenvalues of the matrix $$\left(\begin{matrix}1 & 1 \\1 & 0 \end{matrix}\right).$$
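To spell out what the diagonalization yields (a standard computation, not worked out above): the characteristic polynomial of this matrix is $\lambda^2-\lambda-1$, so its eigenvalues are $\varphi=\frac{1+\sqrt5}{2}$ and $\psi=\frac{1-\sqrt5}{2}$, and carrying the computation through gives Binet's formula $$F_n=\frac{\varphi^n-\psi^n}{\sqrt5}.$$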

Solution 2:

I have not studied graph theory myself yet, but I do know that the eigenvalues and eigenvectors of the adjacency matrix of a graph carry a lot of interesting combinatorial information; this is the subject of spectral graph theory.
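One concrete fact in this direction (not mentioned above, but standard and in the same spirit as the Fibonacci example): if $A$ is the adjacency matrix of a graph, then the $(i,j)$ entry of $A^k$ counts the walks of length $k$ from vertex $i$ to vertex $j$, so diagonalizing $A$ gives closed-form expressions for these counts: $$\big(A^k\big)_{ij}=\#\{\text{walks of length }k\text{ from }i\text{ to }j\},\qquad A^k=QD^kQ^{-1}\ \text{ when }\ A=QDQ^{-1}.$$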

You can also show how to solve a system of ordinary differential equations of the form $$ {x_1}'(t) = a_{11}x_1(t) + a_{12}x_2(t) + a_{13}x_3(t)\\ {x_2}'(t) = a_{21}x_1(t) + a_{22}x_2(t) + a_{23}x_3(t)\\ {x_3}'(t) = a_{31}x_1(t) + a_{32}x_2(t) + a_{33}x_3(t) $$ using linear algebra. Let $$ x(t) = \begin{pmatrix} {x_1}(t)\\ {x_2}(t)\\ {x_3}(t) \end{pmatrix}, \qquad A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}. $$ Then the system can be written as $x'(t) = Ax(t)$. Assuming $A$ is diagonalizable, you can find an invertible matrix $Q$ and a diagonal matrix $D$ so that $A = QDQ^{-1}$. Then $x'(t) = QDQ^{-1}x(t)$, i.e. $Q^{-1}x'(t) = DQ^{-1}x(t)$. Let $y(t) = Q^{-1}x(t)$; since $Q^{-1}$ is a constant matrix, $y'(t) = Q^{-1}x'(t) = Dy(t)$. Since $D$ is diagonal, this is now a system of decoupled equations, each of which can be solved separately. I thought this was a nice application. It is in my linear algebra book by Friedberg, Insel and Spence.
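Spelling out the last step (immediate, but not written out above): if $D=\operatorname{diag}(\lambda_1,\lambda_2,\lambda_3)$, the decoupled system is $y_i'(t)=\lambda_i y_i(t)$, so $$y_i(t)=c_i e^{\lambda_i t},\qquad x(t)=Qy(t)=\sum_{i=1}^{3} c_i e^{\lambda_i t}\, q_i,$$ where the $q_i$ are the columns of $Q$ (eigenvectors of $A$) and the constants $c_i$ are determined by the initial condition $x(0)=Qc$.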