What is the solution to this matrix optimization problem $A^* = \operatorname{argmin}_{A} \sum_{i=1}^{r-1}\|Ax_i-x_{i+1}\|^2$?
Solution 1:
Consider the matrices $\mathbf{R}_1= \left[ \mathbf{x}_1,\ldots,\mathbf{x}_{r-1} \right]$ and $\mathbf{R}_2= \left[ \mathbf{x}_2,\ldots,\mathbf{x}_{r} \right]$.
The objective function can then be written as $\phi(\mathbf{A}) = \| \mathbf{A} \mathbf{R}_1 - \mathbf{R}_2 \|^2_F$. Assuming $\mathbf{R}_1 \mathbf{R}_1^T$ is invertible, the closed-form solution is $$ \mathbf{A} = \mathbf{R}_2 \mathbf{R}_1^T \left( \mathbf{R}_1 \mathbf{R}_1^T \right)^{-1} $$ (if it is singular, replace the inverse with the Moore–Penrose pseudoinverse).
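A quick numerical sanity check in NumPy (the variable names and random data are my own, for illustration): build $\mathbf{R}_1$ and $\mathbf{R}_2$ from random columns, evaluate the closed form, and compare against a generic least-squares solver applied to the transposed problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 20
X = rng.standard_normal((n, r))      # columns are x_1, ..., x_r

R1, R2 = X[:, :-1], X[:, 1:]         # R1 = [x_1..x_{r-1}], R2 = [x_2..x_r]

# Closed-form solution (assumes R1 R1^T is invertible)
A_closed = R2 @ R1.T @ np.linalg.inv(R1 @ R1.T)

# Cross-check: min_A ||A R1 - R2||_F^2 is equivalent to
# min_B ||R1^T B - R2^T||_F^2 with B = A^T, which lstsq solves directly.
A_lstsq = np.linalg.lstsq(R1.T, R2.T, rcond=None)[0].T

print(np.allclose(A_closed, A_lstsq))  # the two solutions agree
```

In practice `np.linalg.lstsq` (or `R2 @ np.linalg.pinv(R1)`) is numerically preferable to forming and inverting $\mathbf{R}_1 \mathbf{R}_1^T$ explicitly.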
Solution 2:
Isn't this separable by the rows of $A$? The squared norm is just a sum of squares. Assume $A$ is $n\times n$ and let $R_j$ denote row $j$ of $A$ (an $n$-vector). Then your objective is: $$\sum_{j=1}^n \sum_{i=1}^{r-1} \Big( R_j \cdot x_i -x_{i+1,j}\Big)^2.$$ So you can solve for each best row vector $R_j$ on its own, e.g. by ordinary linear regression of the $j$-th coordinates of $x_2,\ldots,x_r$ on the vectors $x_1,\ldots,x_{r-1}$.
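To illustrate the separability claim (again with made-up random data), here is a sketch that fits each row of $A$ as an independent least-squares regression and confirms it reproduces the full-matrix solution from Solution 1:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 3, 15
X = rng.standard_normal((n, r))
R1, R2 = X[:, :-1], X[:, 1:]         # columns x_1..x_{r-1} and x_2..x_r

# Full-matrix solution from Solution 1
A_full = R2 @ R1.T @ np.linalg.inv(R1 @ R1.T)

# Row-by-row: row j of A is the least-squares fit of targets
# x_{i+1,j} (the j-th row of R2) against regressors x_i (columns of R1)
A_rows = np.vstack([
    np.linalg.lstsq(R1.T, R2[j], rcond=None)[0]
    for j in range(n)
])

print(np.allclose(A_full, A_rows))  # row-wise regressions match
```

The design matrix $\mathbf{R}_1^T$ is shared across all $n$ regressions, so in practice solving them jointly (as in Solution 1) is cheaper than $n$ separate fits.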