Condition such that $ \sum_{n \geq 1} a_n^2 X_n^2 = \infty$ for iid Gaussian $X_n$

Yes: if $\sum_{n\ge1}a_n^2=\infty$, then $\sum_{n\ge1}a_n^2X_n^2=\infty$ almost surely.

  1. We first make several simplifications.

    • W.l.o.g., we assume that the $X_n$'s are standard Gaussian variables.
    • W.l.o.g., we assume that $a_n>0$ for all $n\ge1$. Indeed, by taking expectations (Fubini–Tonelli), $$\sum_{n\ge1}\frac{X_n^2}{n^2}<\infty$$ holds almost surely, so we may replace $a_n$ by $\sqrt{a_n^2+\frac1{n^2}}$: this changes neither the hypothesis $\sum_{n\ge1}a_n^2=\infty$ nor whether $\sum_{n\ge1}a_n^2X_n^2$ diverges.
    • W.l.o.g., we assume that $(a_n)_{n\ge1}$ is eventually non-increasing. Indeed, if $a_n\not\to0$, then $a_{n_k}\ge\delta>0$ along a subsequence and $\sum_{n\ge1}a_n^2X_n^2\ge\delta^2\sum_{k\ge1}X_{n_k}^2=\infty$ almost surely (by the strong law of large numbers), so we may assume $a_n\to0$ and rearrange the sequence in non-increasing order: we have $$\sum_{n\ge1}a_{\sigma(n)}^2=\sum_{n\ge1}a_n^2\qquad\text{and}\qquad\sum_{n\ge1}a_n^2X_n^2\stackrel d=\sum_{n\ge1}a_{\sigma(n)}^2X_n^2$$ for any bijection $\sigma\colon\mathbb N\to\mathbb N$.
    • W.l.o.g., we assume that $a_n\le(2\log n)^{-1/2}=:\varepsilon_n$ holds for all $n\ge2$. Indeed, using that $(a_n)_{n\ge1}$ is eventually non-increasing, we still have $$\sum_{n\ge2}(a_n\wedge\varepsilon_n)^2=\infty$$ by the Cauchy condensation test: the condensed terms satisfy $2^k(a_{2^k}\wedge\varepsilon_{2^k})^2=\min\bigl(2^ka_{2^k}^2,2^k\varepsilon_{2^k}^2\bigr)\ge\min\bigl(2^ka_{2^k}^2,1\bigr)$ for all large $k$ (since $2^k\varepsilon_{2^k}^2=\frac{2^k}{2k\log2}\to\infty$), and the last sequence has divergent sum because $\sum_k2^ka_{2^k}^2=\infty$. Because $$\sum_{n\ge1}a_n^2X_n^2\ge\sum_{n\ge2}(a_n\wedge\varepsilon_n)^2X_n^2,$$ we may replace $a_n$ by $a_n\wedge\varepsilon_n$ (a numerical sanity check of this reduction follows the list).
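
As a quick numerical sanity check of the last reduction, take the purely illustrative choice $a_n=(\log n)^{-1/4}$ for $n\ge2$ (not from the question): it is non-increasing, satisfies $\sum_na_n^2=\infty$, and exceeds $\varepsilon_n$ for every $n\ge2$, so the truncation is really active. The partial sums of $\sum_n(a_n\wedge\varepsilon_n)^2$ should then keep growing, and indeed they do:

```python
import numpy as np

# Sanity check of the truncation step, with the purely illustrative choice
# a_n = (log n)^{-1/4} for n >= 2: here a_n > eps_n for every n >= 2, so the
# truncated sequence is just eps_n, and its squared sum should still diverge.
N = 10**6
n = np.arange(2, N)
a = np.log(n) ** -0.25                  # hypothetical example sequence, not from the question
eps = (2 * np.log(n)) ** -0.5           # eps_n = (2 log n)^{-1/2}
truncated_sq = np.minimum(a, eps) ** 2  # (a_n ∧ eps_n)^2

for k in (10**3, 10**4, 10**5, 10**6 - 2):
    print(k, truncated_sq[:k].sum())    # partial sums keep growing, no sign of convergence
```
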
  2. Now, the process $S_n:=\sum_{k=1}^na_k^2X_k^2,\,n\ge0,$ is a nonnegative submartingale for the natural filtration $\mathcal F_n:=\sigma(X_1,\ldots,X_n)$. Its compensator $$A_n:=\sum_{k=1}^n\mathbb E[S_k-S_{k-1}\mid\mathcal F_{k-1}]=\sum_{k=1}^na_k^2$$ is deterministic and increases to $A_\infty:=\lim_{n\to\infty}A_n=\sum_{n\ge1}a_n^2=\infty$ by assumption. If the condition $$\mathbb E\biggl[\sup_{n\ge1}\:\Bigl\lvert S_n-S_{n-1}\Bigr\rvert\biggr]=\mathbb E\biggl[ \sup_{n\ge1}\,a_n^2X_n^2\biggr]<\infty\tag{$\star$}$$ is fulfilled, then we know by a consequence of Doob's decomposition theorem (see e.g. the proof of Proposition VII.3.9 in Neveu's book, or Exercise 2.7.11.4.d) of my notes) that the event $$\Biggl\{\limsup_{n\to\infty}S_n=\infty\Biggr\}=\Biggl\{\sum_{n\ge1}a_n^2X_n^2=\infty\Biggr\}$$ coincides up to a $\mathbb P$-null set with the event $\{A_\infty=\infty\}$, and thus holds almost surely.
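
For intuition, here is a small simulation of this step, with the purely illustrative choice $a_n=\varepsilon_n$ for $n\ge2$ (consistent with the reductions of step 1): the simulated paths of $S_n$ grow without bound, roughly tracking the compensator $A_n$.

```python
import numpy as np

# Simulation of step 2 for the illustrative choice a_n = eps_n = (2 log n)^{-1/2}, n >= 2:
# the submartingale S_n should grow without bound, close to its compensator A_n = sum a_k^2.
rng = np.random.default_rng(0)
N = 10**6
n = np.arange(2, N)
a_sq = 1.0 / (2 * np.log(n))    # a_n^2
A = np.cumsum(a_sq)             # compensator A_n
for path in range(3):
    S = np.cumsum(a_sq * rng.standard_normal(len(n)) ** 2)
    print(f"path {path}:  S_N ~ {S[-1]:.0f},  A_N ~ {A[-1]:.0f}")
```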

  3. We prove ($\star$). Because the $X_n$'s are i.i.d., we have for every $t\ge1$, $$\mathbb P\Bigl(\sup_{n\ge1}\,a_n^2X_n^2\le t\Bigr) = \prod_{n\ge1}\Bigl(1-\mathbb P\bigl(a_n\lvert X_1\rvert>\sqrt t\bigr)\Bigr)\ge1-\sum_{n\ge1}\mathbb P\Bigl(a_n\lvert X_1\rvert>\sqrt t\Bigr).$$ Therefore \begin{align*} \mathbb E\biggl[\sup_{n\ge1}\,a_n^2X_n^2\biggr] &=\int_0^\infty\mathbb P\Bigl(\sup_{n\ge1}\,a_n^2X_n^2>t\Bigr)\,\mathrm dt\\[.4em] &\le1+\int_1^\infty\sum_{n\ge1}\mathbb P\Bigl(a_n\lvert X_1\rvert>\sqrt t\Bigr)\,\mathrm dt\\[.4em] &=1+2\sum_{n\ge1}a_n^2\int_{a_n^{-1}}^\infty\mathbb P(|X_1|>x)\,x\,\mathrm dx, \end{align*} by the change of variable $x\gets \sqrt t/a_n$.
    Next, we use (twice) the Gaussian tail bound $\mathbb P(|X_1|>x)\le\frac cx\,\mathrm e^{-\frac{x^2}2}$ for all $x>0$ and some constant $c>0$ to obtain \begin{align*} \mathbb E\biggl[\sup_{n\ge1}\,a_n^2X_n^2\biggr] &\le1+2c\sum_{n\ge1}a_n^2\int_{a_n^{-1}}^\infty\mathrm e^{-\frac{x^2}2}\,\mathrm dx\\[.4em] &\le1+2c^2\sqrt{2\pi}\sum_{n\ge1}a_n^3\exp\biggl(-\frac1{2a_n^2}\biggr)\\[.4em] &<\infty, \end{align*} thanks to our extra assumption that $a_n\le\varepsilon_n$, which gives $$a_n^3\exp\biggl(-\frac1{2a_n^2}\biggr)\le\varepsilon_n^3\exp\biggl(-\frac1{2\varepsilon_n^2}\biggr)=\frac1{n(2\log n)^{3/2}}$$ for all $n\ge2$ (the map $t\mapsto t^3\mathrm e^{-1/(2t^2)}$ is increasing on $(0,\infty)$), together with the convergence of the Bertrand series $\sum_{n\ge2}\frac1{n(2\log n)^{3/2}}$.
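
As a rough numerical counterpart of this estimate, again for the purely illustrative choice $a_n=\varepsilon_n$, $n\ge2$, and using that $c=\sqrt{2/\pi}$ is an admissible constant in the Gaussian tail bound, a Monte Carlo estimate of the (truncated) supremum can be compared with the analytic bound above:

```python
import numpy as np

# Numerical check of (*) for the illustrative choice a_n = eps_n = (2 log n)^{-1/2}, n >= 2.
rng = np.random.default_rng(1)
N = 10**5
n = np.arange(2, N)
a_sq = 1.0 / (2 * np.log(n))    # a_n^2

# Monte Carlo estimate of E[ sup_{2 <= n < N} a_n^2 X_n^2 ]  (a finite truncation of the sup).
sups = [(a_sq * rng.standard_normal(len(n)) ** 2).max() for _ in range(200)]
print("Monte Carlo estimate of E[sup]:", np.mean(sups))

# Analytic bound 1 + 2 c^2 sqrt(2 pi) * sum_n a_n^3 exp(-1/(2 a_n^2)), where c = sqrt(2/pi)
# is an admissible constant in the Gaussian tail bound.  For a_n = eps_n the series equals
# sum_n 1/(n (2 log n)^{3/2}); its tail beyond N is bounded by the integral
# int_{N-1}^inf dx / (x (2 log x)^{3/2}) = (2 log(N-1))^{-1/2}.
c = np.sqrt(2 / np.pi)
series = (1.0 / (n * (2 * np.log(n)) ** 1.5)).sum() + (2 * np.log(N - 1)) ** -0.5
print("analytic upper bound for E[sup]:", 1 + 2 * c**2 * np.sqrt(2 * np.pi) * series)
```

The Monte Carlo value should land well below the analytic bound, which is all this sketch is meant to illustrate; it is not part of the proof.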