You can use this idea as a start (it is actually more than a start!). Without loss of generality, normalize so that the subgaussian tail bound reads $\mathbb{P}(|X_i| > s) \le 2e^{-s^2}$, i.e., so that $c/K^2 = 1$, where $c$ is the absolute constant in the exponent of the subgaussian tail and $K$ is the largest subgaussian norm.
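To make that normalization explicit (a small sketch; the rescaled variables $Y_i$ are introduced here only for illustration): if $\mathbb{P}(|X_i|>s)\le 2\exp(-cs^2/K^2)$ for all $s\ge 0$, then $Y_i:=\frac{\sqrt{c}}{K}X_i$ satisfies $\mathbb{P}(|Y_i|>s)\le 2e^{-s^2}$, and
$$\mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} = \frac{K}{\sqrt{c}}\,\mathbb{E}\max_i \frac{|Y_i|}{\sqrt{1+\log i}},$$
so it is enough to bound the right-hand side, which is what the display below does (writing $X_i$ for the rescaled variables).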

\begin{eqnarray} \mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} &=& \int_0^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt\\ &\leq& \int_0^2 \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt + \int_2^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& 2 + \int_2^\infty \sum_{i=1}^N\mathbb{P}\left( \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& 2 + \int_2^\infty \sum_{i=1}^N 2 \exp\big(-t^2(1+\log i) \big)\, dt\\ &=& 2 + 2\sum_{i=1}^N \int_2^\infty e^{-t^2}\, i^{-t^2}\, dt \\ &\leq& 2 + 2\sum_{i=1}^N i^{-4} \int_2^\infty e^{-t^2}\, dt \\ &\leq& 2 + \sqrt{\pi}\sum_{i=1}^N \frac{1}{i^4} < \infty. \end{eqnarray} Here the key steps are a union bound, the normalized subgaussian tail, and the fact that $i^{-t^2}\le i^{-4}$ for $t\ge 2$ and $i\ge 1$. We know that the sum of $\frac{1}{i^4}$ is convergent.
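For completeness, the two standard facts used in the last two lines are
$$\int_2^\infty e^{-t^2}\, dt \le \int_0^\infty e^{-t^2}\, dt = \frac{\sqrt{\pi}}{2} \qquad \text{and} \qquad \sum_{i=1}^\infty \frac{1}{i^4} = \zeta(4) = \frac{\pi^4}{90} < \infty.$$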

I chose $2$ as the point at which to split the integral so that the resulting sum converges; other split points would work as well, as the sketch below shows.
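As a sketch of why the split point is flexible ($t_0$ below is a parameter I introduce only for illustration): splitting at any $t_0>1$ and repeating the same steps gives
\begin{eqnarray} \mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} &\leq& t_0 + 2\sum_{i=1}^N i^{-t_0^2} \int_{t_0}^\infty e^{-t^2}\, dt \\ &\leq& t_0 + \sqrt{\pi}\sum_{i=1}^N \frac{1}{i^{t_0^2}}, \end{eqnarray} and the last sum converges precisely because $t_0^2>1$.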


The above answer does not seem to be correct (although there are some interesting ideas in it). Here is how I solved it.

Consider $Z_i=\frac{|X_i|}{K\sqrt{1+\log i}}$, $i=1,2,\dots$. We want to show $\mathbb{E}[\max_i Z_i]<C$ for some constant $C$.

Then we look at the event $\Omega_i:=\{Z_i\ge a\}$. Show that $\mathbb{P}(\Omega_i)\le 2\left(\frac{1}{i}\right)^{a^2}$, using the fact that $|X_i|$ is subgaussian and that $K$ is the largest subgaussian norm; a derivation is sketched below.
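To spell out that bound (a sketch, assuming the tail-bound convention $\mathbb{P}(|X_i|>s)\le 2e^{-s^2/K^2}$ with $K$ the largest subgaussian norm; if your convention carries an extra absolute constant $c$ in the exponent, the bound becomes $2i^{-ca^2}$ and the rest of the argument is unchanged):
$$\mathbb{P}(\Omega_i) = \mathbb{P}\left(|X_i| \ge aK\sqrt{1+\log i}\right) \le 2\exp\left(-\frac{a^2K^2(1+\log i)}{K^2}\right) = 2e^{-a^2}\, i^{-a^2} \le 2\left(\frac{1}{i}\right)^{a^2}.$$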

Then, for a choice of $a$ large enough that $\sum_i 2\left(\frac{1}{i}\right)^{a^2}$ is summable (any $a>1$ works), Borel-Cantelli, which does NOT require the $\{\Omega_i\}$ to be independent events, gives $\mathbb{P}(\limsup_i \Omega_i)=0$. This means that, with probability 1, there exists an $N$ such that $Z_i<a$ for all $i>N$. Then
$$\mathbb{E}[\max_i Z_i]\le \mathbb{E}[\max_{i\le N} Z_i]+\mathbb{E}[\max_{i> N} Z_i]\le \mathbb{E}\Big[\sum_{i=1}^N Z_i\Big]+a= \sum_{i=1}^N\mathbb{E}[Z_i]+a\le N\cdot \max_{i\le N}\mathbb{E}[Z_i]+a<\infty.$$
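For concreteness, here is the summability check behind the Borel-Cantelli step (a sketch; $a=2$ is purely an illustrative choice):
$$\sum_{i=1}^\infty \mathbb{P}(\Omega_i) \le \sum_{i=1}^\infty 2\left(\frac{1}{i}\right)^{4} = 2\zeta(4) = \frac{\pi^4}{45} < \infty,$$
so the first Borel-Cantelli lemma applies and $\mathbb{P}(\limsup_i \Omega_i)=0$.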