Solving matrix equations of the form $X = AXA^T + C$
Your solution for $X$ is correct.
Here is another solution (it is cheating, though, because I used Octave). On the plus side, it shows another approach to solving the equation.
$A$ is diagonalizable, and so has a basis of eigenvectors, $u_1,u_2$. Define the linear operator $L(X) = X-A X A^*$. You want to solve $L(X) = C$. Note that the matrices $u_i u_j^*$ form a basis for the space of $2 \times 2$ matrices, and $L(u_i u_j^*) = (1-\lambda_i \overline{\lambda_j}) u_i u_j^*$, hence $L$ is diagonalizable.
To express $C$ in terms of $u_i u_j^*$, let $C = \sum_{i,j} [\Gamma]_{ij} u_i u_j^*$, and solve for $\Gamma$. Let $U$ be the matrix whose columns are $u_1 ,u_2$. Note that $\sum_i [\Gamma]_{ij} u_i = U \Gamma e_j$, so we have $C = \sum_{j} U \Gamma e_j u_j^* = \sum_{j} U \Gamma e_j (U e_j)^* = \sum_{j} U \Gamma e_j e_j^T U^* = U \Gamma (\sum_{j} e_j e_j^T) U^* = U \Gamma U^*$. It follows that $\Gamma = U^{-1} C (U^{-1})^*$, and if we solve $L(\sum_{i,j} [\tilde{X}]_{ij} u_i u_j^*) = \sum_{i,j} [\Gamma]_{ij} u_i u_j^*$, we get $[\tilde{X}]_{ij} = \frac{1}{1-\lambda_i \overline{\lambda_j}} [\Gamma]_{ij}$.
Reversing the procedure to obtain $X$ in the standard basis, we have $X = U \tilde{X} U^*$.
Using Octave we obtain the same result as above.
a = [1.5 1 ; -0.7 0 ]
c = [1 0.5 ; 0.5 0.25 ]
[u,d] = eig(a)            # columns of u are the eigenvectors u_1, u_2
l = [d(1,1) ; d(2,2) ]    # eigenvalues as a column vector
phi = ones(2,2)-l*l'      # phi(i,j) = 1 - lambda_i * conj(lambda_j)
gamma = inv(u)*c*inv(u')  # Gamma = U^{-1} C (U^{-1})^*
x_tilde = (1./phi).*gamma # elementwise division by phi
x = u*x_tilde*u'          # back to the standard basis: X = U X_tilde U^*
# check result: should be (numerically) zero
a*x*a'+c-x
Here is the output:
octave:44> a*x*a'+c-x
ans =
0.00000 - 0.00000i 0.00000 - 0.00000i
0.00000 - 0.00000i 0.00000 + 0.00000i
octave:45> x
x =
18.8802 + 0.0000i -11.3672 - 0.0000i
-11.3672 - 0.0000i 9.5013 + 0.0000i
The check $AXA^*+C-X$ is numerically zero (the tiny imaginary parts are rounding noise), and the computed solution is $\begin{bmatrix} 18.8802 & -11.3672 \\ -11.3672 & 9.5013 \end{bmatrix}$, which is $X$. So, your solution for $X$ is correct.
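For comparison, the same eigenbasis procedure can be sketched in NumPy (variable names chosen to mirror the Octave script above):

```python
import numpy as np

# NumPy translation of the eigenbasis procedure above (a sketch;
# variable names mirror the Octave script).
A = np.array([[ 1.5, 1.0],
              [-0.7, 0.0]])
C = np.array([[ 1.0, 0.5 ],
              [ 0.5, 0.25]])

lam, U = np.linalg.eig(A)               # columns of U are eigenvectors u_1, u_2
Uinv = np.linalg.inv(U)
Gamma = Uinv @ C @ Uinv.conj().T        # Gamma = U^{-1} C (U^{-1})^*
Phi = 1.0 - np.outer(lam, lam.conj())   # Phi[i, j] = 1 - lambda_i * conj(lambda_j)
X_tilde = Gamma / Phi                   # elementwise division by Phi
X = (U @ X_tilde @ U.conj().T).real     # X = U X_tilde U^*; imaginary part is noise

print(X)   # matches the matrix shown above
```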
Given $\mathrm A, \mathrm B \in \mathbb R^{n \times n}$, we have a linear matrix equation in $\mathrm X \in \mathbb R^{n \times n}$
$$\rm X = A X A^{\top} + B$$
Vectorizing both sides using the identity $\mbox{vec}(\mathrm A \mathrm X \mathrm A^{\top}) = (\mathrm A \otimes \mathrm A) \, \mbox{vec}(\mathrm X)$, we obtain a system of $n^2$ linear equations in $n^2$ unknowns
$$\left( \mathrm I_{n^2} - \left( \mathrm A \otimes \mathrm A \right) \right) \mbox{vec} (\mathrm X) = \mbox{vec} (\mathrm B)$$
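The identity behind this step, $\mbox{vec}(\mathrm A \mathrm X \mathrm A^{\top}) = (\mathrm A \otimes \mathrm A) \, \mbox{vec}(\mathrm X)$, can be sanity-checked numerically (random matrices chosen only for illustration):

```python
import numpy as np

# Numerical sanity check of vec(A X A^T) = (A kron A) vec(X),
# with random matrices (illustration only).
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

lhs = (A @ X @ A.T).reshape(n * n)      # vec(A X A^T)
rhs = np.kron(A, A) @ X.reshape(n * n)  # (A kron A) vec(X)
print(np.allclose(lhs, rhs))            # True
```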
Example
Let
$$\mathrm A := \begin{bmatrix} 1.5 & 1 \\ -0.7 & 0\end{bmatrix} \qquad \qquad \qquad \mathrm B := \begin{bmatrix} 1 & 0.5 \\ 0.5 & 0.25\end{bmatrix}$$
Using NumPy, we solve the linear system and then un-vectorize the solution:
>>> import numpy as np
>>> A = np.array([[ 1.5, 1.0],
...               [-0.7, 0.0]])
>>> B = np.array([[ 1.0, 0.5],
...               [ 0.5, 0.25]])
>>> I4 = np.identity(4)
>>> solution = np.linalg.solve(I4 - np.kron(A, A), np.reshape(B, 4))
>>> X = np.reshape(solution, (2, 2))
>>> X
array([[ 18.88020833, -11.3671875 ],
       [-11.3671875 ,   9.50130208]])
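As a sanity check (not part of the original transcript), substituting the computed $\mathrm X$ back into the equation confirms it is a solution:

```python
import numpy as np

# Recompute X as above, then verify X = A X A^T + B holds numerically.
A = np.array([[ 1.5, 1.0],
              [-0.7, 0.0]])
B = np.array([[ 1.0, 0.5 ],
              [ 0.5, 0.25]])
X = np.reshape(np.linalg.solve(np.identity(4) - np.kron(A, A),
                               np.reshape(B, 4)), (2, 2))

print(np.allclose(A @ X @ A.T + B, X))   # True
```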