Sum of squares of dependent Gaussian random variables
Solution 1:
Let's assume you have $X=(X_1, \dots, X_n)$, a random vector with a multivariate normal distribution with expectation vector $\mu$ and covariance matrix $\Sigma$. We are interested in the quadratic form $Q(X)= X^T A X = \sum_i \sum_j a_{ij} X_i X_j$, where we may take $A$ to be symmetric without loss of generality, since only the symmetric part $\tfrac{1}{2}(A+A^T)$ contributes to the quadratic form. Define $Y = \Sigma^{-1/2} X$, where we assume $\Sigma$ is invertible. Write also $Z = Y-\Sigma^{-1/2} \mu$, which has expectation zero and identity covariance matrix.
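For concreteness, here is a minimal R sketch (with made-up numbers) of the whitening step; $\Sigma^{-1/2}$ is built from the spectral decomposition of $\Sigma$, and the final line confirms that $Z = \Sigma^{-1/2}(X-\mu)$ has identity covariance:

```r
Sigma <- matrix(c(1, 0.5, 0.5, 1), 2, 2)    # a hypothetical covariance matrix
es <- eigen(Sigma, symmetric = TRUE)
Sigma.neg.half <- es$vectors %*% diag(1/sqrt(es$values)) %*% t(es$vectors)
round(Sigma.neg.half %*% Sigma %*% Sigma.neg.half, 12)   # the identity matrix
```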
Now $$ Q(X) = X^T A X= (Z+\Sigma^{-1/2} \mu)^T \Sigma^{1/2}A\Sigma^{1/2} (Z+\Sigma^{-1/2} \mu). $$ Since $\Sigma^{1/2}A \Sigma^{1/2}$ is symmetric, the spectral theorem lets us write $\Sigma^{1/2}A \Sigma^{1/2} = P^T \Lambda P$, where $P$ is an orthogonal matrix (so that $P P^T=P^T P=I$) and $\Lambda$ is diagonal with real diagonal elements $\lambda_1, \dotsc, \lambda_n$ (these are positive when $A$ is positive definite). Write $U = P Z$, so that $U$ is multivariate normal with expectation zero and identity covariance matrix.
We can compute $$ \begin{aligned} Q(X) &= (Z+\Sigma^{-1/2} \mu)^T \Sigma^{1/2}A\Sigma^{1/2} (Z+\Sigma^{-1/2} \mu) \\ &= (Z+\Sigma^{-1/2} \mu)^T P^T \Lambda P (Z+\Sigma^{-1/2} \mu) \\ &= (PZ+P\Sigma^{-1/2} \mu)^T \Lambda (PZ+P\Sigma^{-1/2} \mu) \\ &= (U+b)^T \Lambda (U+b), \end{aligned} $$ where $b = P \Sigma^{-1/2} \mu$. So: $$ Q(X) = X^T A X = \sum_{j=1}^n \lambda_j (U_j+b_j)^2. $$ In your case $\mu=0$, so $b=0$ and your quadratic form is a linear combination of independent chi-square variables, each with one degree of freedom. In the general case, we get a linear combination of independent noncentral chi-square variables.
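As a sanity check, here is a small Monte Carlo sketch in R (the mean vector, covariance matrix, $A$, and variable names are all my own made-up choices) comparing the distribution of $X^T A X$ with that of $\sum_j \lambda_j (U_j+b_j)^2$:

```r
set.seed(42)
n <- 3
mu <- c(0.5, 0, -0.5)                          # hypothetical mean vector
Sigma <- matrix(0.4, n, n); diag(Sigma) <- 1   # hypothetical covariance matrix
A <- diag(n)                                   # Q(X) = sum of squares

es <- eigen(Sigma, symmetric = TRUE)
Sigma.half     <- es$vectors %*% diag(sqrt(es$values))   %*% t(es$vectors)
Sigma.neg.half <- es$vectors %*% diag(1/sqrt(es$values)) %*% t(es$vectors)

em <- eigen(Sigma.half %*% A %*% Sigma.half, symmetric = TRUE)
lambda <- em$values            # the weights lambda_j
P <- t(em$vectors)             # so that Sigma^{1/2} A Sigma^{1/2} = P^T Lambda P
b <- as.vector(P %*% Sigma.neg.half %*% mu)

nsim <- 1e5
U <- matrix(rnorm(nsim * n), nsim, n)                   # rows are N(0, I)
Q.repr <- as.vector(sweep(U, 2, b, "+")^2 %*% lambda)   # sum_j lambda_j (U_j + b_j)^2

X <- sweep(matrix(rnorm(nsim * n), nsim, n) %*% Sigma.half, 2, mu, "+")  # rows are N(mu, Sigma)
Q.direct <- rowSums((X %*% A) * X)                      # X^T A X, row by row

## up to Monte Carlo error, the two sets of quantiles should agree
rbind(repr   = quantile(Q.repr,   c(0.1, 0.5, 0.9)),
      direct = quantile(Q.direct, c(0.1, 0.5, 0.9)))
```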
If you want to work numerically with that distribution, there is a CRAN package (that is, a package for R) implementing it, called CompQuadForm.
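A hedged sketch of how that package can be used (assuming it is installed): its `imhof()` function numerically inverts the characteristic function to evaluate the upper tail probability of $\sum_j \lambda_j \chi^2_1$, where the weights below are made up for illustration.

```r
## install.packages("CompQuadForm")   # if not already installed
library(CompQuadForm)
lambda <- c(2, 1, 0.5)    # hypothetical weights lambda_j from the eigen step
q <- 3
imhof(q, lambda)$Qq       # upper tail probability P(Q > q)
```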
If you want (much) more detail, there is a book dedicated to the topic: Mathai & Provost, "Quadratic Forms in Random Variables".