Solution 1:

The convergence in probability to a constant $\mu$ is equivalent to weak convergence to $\mu$. And the latter, by Lévy's criterion, is equivalent to the pointwise convergence of characteristic functions to $e^{i\mu t}$.

Thus, writing $\varphi_X$ for the characteristic function of a single summand,
$$
S_n/n \overset{P}{\longrightarrow} \mu
\iff \varphi_{S_n/n}(t)\to e^{i\mu t},\ t\in\mathbb{R}
\iff \varphi_{X}(t/n)^n \to e^{i\mu t},\ t\in\mathbb{R}
\iff n\log \varphi_{X}(t/n)\to i\mu t,\ t\in\mathbb{R}
\iff \frac{n}{t}\Bigl(\varphi_{X}\Bigl(\frac{t}{n}\Bigr)-1\Bigr) \to i\mu,\ t\neq 0,
$$
where the step with the logarithm uses $\log(1+z)\sim z$, $z\to 0$ (valid here since $\varphi_X(t/n)\to 1$ for each fixed $t$), and the last step excludes the trivial point $t=0$.
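To see the criterion in action, here is a numerical sanity check with a distribution whose characteristic function is known in closed form — the standard exponential, with $\varphi(t) = 1/(1-it)$ and $\mu = 1$. This is purely an illustration of the equivalence $\varphi_X(t/n)^n \to e^{i\mu t}$, not the distribution from the question:

```python
import numpy as np

# Standard exponential: phi(t) = 1/(1 - i t), mean mu = 1.
# (Illustration only -- not the distribution from the question.)
mu = 1.0

def phi(t):
    return 1.0 / (1.0 - 1j * t)

t = 1.7       # arbitrary test point
n = 10**6     # number of summands

# phi_{S_n/n}(t) = phi(t/n)^n should be close to e^{i mu t} for large n
approx = phi(t / n) ** n
limit = np.exp(1j * mu * t)
err = abs(approx - limit)
```

Here the error is of order $t^2/(2n)$, so with $n = 10^6$ the two sides agree to about six digits.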

The last convergence says, roughly, that the derivative of $\varphi_X$ at $t=0$ equals $i\mu$; in particular, it is clearly implied by $\varphi_X'(0) = i\mu$. It turns out that the converse holds as well, so this convergence is equivalent to $\varphi_X'(0) = i\mu$.

Although this provides a way to identify $\mu$, the usefulness of this result is quite limited: e.g. in your case it is not at all clear how to compute the derivative of $\varphi_X$ at $t=0$ (maybe someone knows how, but I don't).

An alternative approach is to use Feller's weak law of large numbers: if the summands are i.i.d. and $x P(|X|>x)\to 0$, $x\to+\infty$, then $S_n/n -\mu_n \overset{P}{\longrightarrow} 0$, $n\to\infty$, where $\mu_n = E[X\mathbf{1}_{|X|\le n}]$.

In our case, $$ nP(|X|>n) = C n\sum_{k=n+1}^\infty \frac{1}{k^2 \log k}\sim \frac{C}{\log n},\quad n\to\infty, $$ where the asymptotic equivalence follows e.g. from the Stolz–Cesàro theorem. In particular, $nP(|X|>n)\to 0$, so Feller's condition is satisfied.
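This asymptotic can be sanity-checked numerically: if it is right, $n\log n\cdot\sum_{k>n} \frac{1}{k^2\log k}$ should approach $1$, though slowly (the error is of order $1/\log n$). A quick check with a truncated tail sum — the truncation point $10^7$ is an arbitrary choice, large enough that the discarded tail is negligible:

```python
import numpy as np

K = 10**7  # truncation point for the infinite sum (arbitrary, large)
k = np.arange(2, K + 1, dtype=np.float64)
terms = 1.0 / (k**2 * np.log(k))

def tail_ratio(n):
    # n * log(n) * sum_{k>n} 1/(k^2 log k); should tend to 1 as n grows
    tail = terms[n - 1:].sum()  # terms[j] corresponds to k = j + 2
    return n * np.log(n) * tail

r3 = tail_ratio(10**3)
r4 = tail_ratio(10**4)
```

Both ratios come out a bit below $1$ (around $0.88$–$0.91$), creeping upward with $n$, which is exactly the $1 - O(1/\log n)$ behaviour one expects.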

Therefore, $S_n/n -\mu_n \overset{P}{\longrightarrow} 0$, $n\to\infty$. But here $\mu_n = C\sum_{k=2}^n \frac{(-1)^k}{k\log k}$, which converges (as an alternating series with terms decreasing to zero) to $$ \mu = C\sum_{k=2}^\infty \frac{(-1)^k}{k\log k}, $$ whence $S_n/n\overset{P}{\longrightarrow} \mu $, $n\to\infty$.
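For a final sanity check, one can simulate. Assuming (as the formulas above suggest) that $P(X = (-1)^k k) = C/(k^2\log k)$ for $k\ge 2$, the sketch below samples from a truncated version of this distribution and compares the empirical mean with the truncated value of $\mu$; the truncation point and sample size are arbitrary choices. Since $X$ has infinite variance, convergence is slow and only rough agreement can be expected:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed distribution (consistent with the formulas above):
# P(X = (-1)^k k) = C / (k^2 log k), k >= 2, truncated at K for sampling.
K = 10**5
k = np.arange(2, K + 1)
w = 1.0 / (k**2 * np.log(k))
probs = w / w.sum()                 # truncated normalization: C = 1 / w.sum()
vals = np.where(k % 2 == 0, k, -k)  # the value (-1)^k k

# Truncated version of mu = C * sum_{k>=2} (-1)^k / (k log k)
mu = float((probs * vals).sum())

N = 10**6
sample = rng.choice(vals, size=N, p=probs)
empirical = float(sample.mean())
```

With these choices `empirical` lands within a few tenths of `mu` — the fluctuations of $S_n/n$ die out only logarithmically in $n$, so one should not expect many correct digits.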