Linear Transformations and Matrices

I have been studying linear algebra for a while now, and I still can't understand the basic concept of a linear transformation and the easy "translation" of one into a matrix.

I understand that every proof we make can be done with either concept, that the two are closely connected, and that there is a basis which "moves" us from one concept to the other.

If anyone can write a brief explanation, or suggest some reading material, it would be a big step for me toward a further understanding of the subject.

Thanks,

Alan


Consider the linear transformation $T$ that takes a vector $(x,y)\in\mathbb{R}^2$ and maps it to the vector $(x+2y,3x+4y)\in\mathbb{R}^2$. That is,

$$ \begin{bmatrix} x\\ y \end{bmatrix} \overset{T}{\longmapsto} \begin{bmatrix} x+2y\\ 3x+4y \end{bmatrix}. \tag{1} $$

We want to express this transformation in the form $A\mathbf{x}$, where $A$ is a matrix and $\mathbf{x}=(x,y)$ is the input vector. Hopefully you can see that the matrix of the linear transformation described in $(1)$ is $$ A= \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} $$ since $$ A\mathbf{x}=\underbrace{\begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix}}_{\text{matrix of }T}\ \ \underbrace{\begin{bmatrix}x\\ y\end{bmatrix}}_\text{input vector} = \underbrace{\begin{bmatrix} x+2y\\ 3x+4y \end{bmatrix} }_\text{output vector} $$
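One concrete way to see where the entries of $A$ come from is to apply $T$ to the standard basis vectors: their images are exactly the columns of $A$,
$$ T\begin{bmatrix}1\\ 0\end{bmatrix}= \begin{bmatrix}1\\ 3\end{bmatrix}, \qquad T\begin{bmatrix}0\\ 1\end{bmatrix}= \begin{bmatrix}2\\ 4\end{bmatrix}, \qquad\text{so}\qquad A= \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix}. $$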

Note that $A$ is not the linear transformation per se; rather $A$ is a computationally efficient way to write down the action taking place in $(1)$ in the form of the matrix-vector multiplication $A\mathbf{x}$.
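If it helps to see that computation carried out numerically, here is a minimal NumPy sketch (the function name `T` and the sample vector are just illustrative choices, not part of the original problem) checking that $A\mathbf{x}$ agrees with the componentwise formula in $(1)$:

```python
import numpy as np

# Matrix of T from (1), with respect to the standard basis.
A = np.array([[1, 2],
              [3, 4]])

def T(x, y):
    """The transformation in (1), written out componentwise."""
    return np.array([x + 2 * y, 3 * x + 4 * y])

# Pick an arbitrary input vector and compare the two computations.
v = np.array([5, -7])
print(A @ v)   # [-9, -13]  via matrix-vector multiplication
print(T(*v))   # [-9, -13]  via the componentwise formula
```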

Implicit in this conversation is that we are writing down the matrix of the linear transformation in $(1)$ with respect to some fixed basis (here, the standard basis of $\mathbb{R}^2$). If changing the basis is your emphasis, rather than the general connection between a linear transformation and its associated matrix, then change of basis is the topic to study next: the same transformation gets a different matrix in a different basis.
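As a small illustration of that basis dependence (with a basis chosen purely for the sake of example), take $\mathcal{B}=\left\{\begin{bmatrix}1\\ 1\end{bmatrix},\begin{bmatrix}1\\ -1\end{bmatrix}\right\}$ and let $P$ be the matrix whose columns are these vectors. The matrix of the same transformation $T$ with respect to $\mathcal{B}$ is
$$ P^{-1}AP= \begin{bmatrix} 1 & 1\\ 1 & -1 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} \begin{bmatrix} 1 & 1\\ 1 & -1 \end{bmatrix} = \begin{bmatrix} 5 & -1\\ -2 & 0 \end{bmatrix}, $$
which looks different from $A$ but represents exactly the same action as $(1)$, just written in different coordinates.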