For a nice geometric interpretation, instead of just exponentiating a single matrix $A$, it is better to first multiply $A$ by a real scalar $t$ and then exponentiate, i.e. to look at the whole family $e^{tA}$.

Now if you look back at elementary linear algebra carefully, you will notice that it has no concept of a "gradual" linear transformation. For instance, while we had the idea of "gradually rotating a plane from $0$ to $90$ degrees" way back in secondary school, we don't have such a concept in linear algebra: instead we have a matrix that rotates the plane by $90$ degrees in a single instantaneous step. The map $f:\Bbb R\to GL(n,\Bbb R)$, $f(t)= e^{tA}$, provides a formal way to talk about gradual linear transformations.

As an example, consider the matrix
$$A=\begin{bmatrix} 0 & -1\\ 1 & 0\\ \end{bmatrix},$$ which has exponential $$e^{tA}= \begin{bmatrix} \cos t & -\sin t\\ \sin t & \cos t\\ \end{bmatrix}.$$ You should recognise this as the rotation matrix acting on the plane. As you let $t$ increase continuously, you can see that $e^{tA}$ is a gradual rotation acting on the plane.
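One way to verify this formula is to compute the power series directly: since $A^2=-I$, the even powers are $A^{2k}=(-1)^kI$ and the odd powers are $A^{2k+1}=(-1)^kA$, so the series splits into the cosine and sine series,
$$e^{tA}=\sum_{k=0}^{\infty}\frac{t^kA^k}{k!}=\Bigl(\sum_{k=0}^{\infty}\frac{(-1)^kt^{2k}}{(2k)!}\Bigr)I+\Bigl(\sum_{k=0}^{\infty}\frac{(-1)^kt^{2k+1}}{(2k+1)!}\Bigr)A=(\cos t)\,I+(\sin t)\,A,$$
which is exactly the rotation matrix above.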

When you define such a gradual linear transformation using a matrix $A$, the matrix $A$ is known as the infinitesimal generator of the gradual linear transformation. By generator, we mean that given any differentiable group homomorphism $g:\Bbb R\to GL(n,\Bbb R)$ (which you can interpret as a gradual linear transformation on $\Bbb R^n$, as above), there is a unique square matrix $B$ such that $g(t)=e^{tB}$; we say that $B$ generates the gradual linear transformation $g$. By infinitesimal, we mean that $B$ is recovered by differentiating $g$ at $0$: $g'(0)=B$.
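The "infinitesimal" part is easy to check in one direction: differentiating the series for $e^{tB}$ term by term and evaluating at $t=0$ kills every term except the linear one,
$$\frac{d}{dt}e^{tB}\Big|_{t=0}=\frac{d}{dt}\Bigl(I+tB+\frac{t^2B^2}{2!}+\cdots\Bigr)\Big|_{t=0}=\Bigl(B+tB^2+\cdots\Bigr)\Big|_{t=0}=B.$$
In the rotation example, this says that the generator $A$ is the velocity of the rotation at time $0$.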

The precise geometric meaning of $e^A$ of course depends on what properties $A$ has. I can't tell you what $e^A$ is for an arbitrary $A$, but there are some famous examples where a property of $A$ forces $e^A$ to belong to a well-known class of matrices (short proof sketches follow the list):

  1. if $A$ is skew-symmetric, then $e^A$ is an orthogonal matrix;

  2. if $A$ has zero trace, then $e^A$ has determinant $1$;

  3. if $A$ is partitioned into blocks, $$A=\begin{bmatrix} B & C\\ D & E\\ \end{bmatrix},$$ with $B$ a $p$ by $p$ matrix, $C$ a $p$ by $q$ matrix, $D$ a $q$ by $p$ matrix, and $E$ a $q$ by $q$ matrix, where $B$ and $E$ are skew-symmetric and $C^T=D$, then $e^A$ satisfies the equation $$(e^A)^T\begin{bmatrix} I_p & 0_{p\times q}\\ 0_{q\times p} & -I_q\end{bmatrix}e^A=\begin{bmatrix} I_p & 0_{p\times q}\\ 0_{q\times p} & -I_q\end{bmatrix},$$ where $0_{m\times n}$ is the $m$ by $n$ zero matrix and $I_n$ is the $n$ by $n$ identity matrix.
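
Quick sketches of why these hold, using the standard identities $(e^A)^T=e^{A^T}$, $\det e^A=e^{\operatorname{tr}A}$, and $e^{-A}e^A=I$. For the first: if $A^T=-A$, then $$(e^A)^Te^A=e^{A^T}e^A=e^{-A}e^A=I,$$ so $e^A$ is orthogonal. For the second: $\det e^A=e^{\operatorname{tr}A}=e^0=1$. The third works just like the first: writing $J$ for the block matrix appearing in the equation above, the conditions on the blocks say exactly that $A^TJ+JA=0$, i.e. $A^T=-JAJ$ (note $J^2=I$), hence $$(e^A)^T=e^{-JAJ}=Je^{-A}J \quad\text{and so}\quad (e^A)^TJe^A=Je^{-A}J\,J\,e^A=Je^{-A}e^A=J.$$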

You can see more of these relations between $A$ and $e^A$ if you study matrix Lie groups.