Let $X_1,\ldots,X_n$ be independent random variables, each distributed uniformly on the interval $[0,1]$. Your question is then equivalent to $$ \lim_{n\to\infty}\mathbb E\frac{X_1^2+\cdots+X_n^2}{X_1+\cdots+X_n}=\frac{2}{3}. $$
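Before any proof, a quick Monte Carlo sanity check makes the value $2/3$ plausible. This is only an illustrative sketch (the sample sizes and values of $n$ below are arbitrary choices, not part of the argument):

```python
# Estimate E[(X_1^2+...+X_n^2)/(X_1+...+X_n)] for a few n and watch it
# approach 2/3.  Sample sizes are arbitrary; this is only a sanity check.
import numpy as np

rng = np.random.default_rng(0)

def estimate(n, samples=20_000):
    x = rng.random((samples, n))            # samples x n uniform [0,1] draws
    ratios = (x**2).sum(axis=1) / x.sum(axis=1)
    return ratios.mean()

for n in (1, 10, 100, 1000):
    print(n, estimate(n))                   # tends to 2/3 ~ 0.6667 as n grows
```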

We will deduce this from the Strong Law of Large Numbers and the Bounded Convergence Theorem. Consider an infinite i.i.d. sequence $(X_i)_{i=1}^{\infty}$ of uniform $[0,1]$ random variables on a probability space $\Omega$. By the Strong Law of Large Numbers, each of the events $$ \left\{\omega\in\Omega\colon\lim_{n\to\infty}\frac{X_1(\omega)+\cdots+X_n(\omega)}{n}=\frac{1}{2}\right\} $$ and $$\left\{\omega\in\Omega\colon\lim_{n\to\infty}\frac{X_1(\omega)^2+\cdots+X_n(\omega)^2}{n}=\frac{1}{3}\right\} $$ occurs with probability 1. Dividing the two limits (the denominator limit $\frac{1}{2}$ is positive), it therefore holds with probability 1 that $$ \lim_{n\to\infty}\frac{X_1^2+\cdots+X_n^2}{X_1+\cdots+X_n}=\frac{2}{3}. $$ Since this limit is almost surely the constant $\frac{2}{3}$, taking expectations gives $$ \mathbb E\lim_{n\to\infty}\frac{X_1^2+\cdots+X_n^2}{X_1+\cdots+X_n}=\frac{2}{3}. $$ Moreover, since each $X_i\in[0,1]$ we have $X_i^2\leq X_i$ for all $i$, so the ratio inside the limit is bounded above by $1$. Thus by the Bounded Convergence Theorem we may interchange the expectation with the limit, and therefore $$ \lim_{n\to\infty}\mathbb E\frac{X_1^2+\cdots+X_n^2}{X_1+\cdots+X_n}=\frac{2}{3}. $$
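For intuition about the almost-sure statement above, one can also watch the ratio along a single simulated path; a minimal sketch (the path length and checkpoints are arbitrary):

```python
# One realization of the sequence: the running ratio
# (X_1^2+...+X_n^2)/(X_1+...+X_n) settles near 2/3 as n grows.
import numpy as np

rng = np.random.default_rng(1)
x = rng.random(1_000_000)                   # one long i.i.d. uniform[0,1] path
num = np.cumsum(x**2)                       # running sums of X_i^2
den = np.cumsum(x)                          # running sums of X_i
for n in (10, 1_000, 100_000, 1_000_000):
    print(n, num[n - 1] / den[n - 1])       # approaches (1/3)/(1/2) = 2/3
```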


Suppose $X_1,X_2,X_3,\ldots$ are independent random variables, each uniformly distributed on the interval $[0,1].$ Then for each value of $n$ we have $\operatorname{E}(X_n^2) = 1/3.$ The weak law of large numbers says $$ \operatorname*{l.i.p.}_{n\to\infty} \frac{X_1^2 + \cdots + X_n^2} n = \frac 1 3 $$ where $\operatorname{l.i.p.}$ means "limit in probability", which is defined by saying $$ \text{for every } \varepsilon>0, \quad \lim_{n\to\infty} \Pr\left( \left| \frac{X_1^2+\cdots + X_n^2} n - \frac 1 3 \right| < \varepsilon \right) = 1. $$ Similarly $$ \operatorname*{l.i.p.}_{n\to\infty} \frac{X_1+\cdots + X_n} n = \frac 1 2. $$ In general, $\Pr(A\cap B) \ge \Pr(A) + \Pr(B) - 1.$ Thus \begin{align} & \Pr\left( \left| \frac{X_1^2+\cdots + X_n^2} n - \frac 1 3 \right| < \varepsilon \text{ and } \left| \frac{X_1+\cdots + X_n} n - \frac 1 2 \right| < \varepsilon \right) \\[10pt] \ge {} & \Pr\left( \left| \frac{X_1^2+\cdots + X_n^2} n - \frac 1 3 \right| < \varepsilon\right) + \Pr\left( \left| \frac{X_1+\cdots + X_n} n - \frac 1 2 \right| < \varepsilon \right) - 1. \end{align} Next you need to say that if one number is within $\varepsilon$ of $1/3$ and another is within $\varepsilon$ of $1/2,$ then their quotient is close to $2/3.$ Since the ratio is also bounded by $1,$ its convergence in probability to $2/3$ is enough to conclude that the expectations converge to $2/3$ as well.

This sketch of an argument leaves a lot of details to be filled in.
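To see the weak-law step numerically, one can estimate, for a fixed $\varepsilon$, the probability that both sample means land in their $\varepsilon$-windows; a rough sketch (the choices of $\varepsilon$, sample count, and $n$ are illustrative):

```python
# Empirical probability that both sample means are within eps of 1/3 and 1/2;
# it tends to 1 as n grows, as the weak law predicts.
import numpy as np

rng = np.random.default_rng(2)
eps, samples = 0.01, 5_000

for n in (100, 1_000, 5_000):
    x = rng.random((samples, n))
    m2 = (x**2).mean(axis=1)                # (X_1^2+...+X_n^2)/n
    m1 = x.mean(axis=1)                     # (X_1+...+X_n)/n
    both = (np.abs(m2 - 1/3) < eps) & (np.abs(m1 - 1/2) < eps)
    print(n, both.mean())                   # empirical probability -> 1
```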


My proof is based on computing the asymptotics of an integral. Consider, for large fixed $n$, the integral $$ \int_0^1 \cdots \int_0^1 dx_1 \cdots dx_n\, \frac{ x_1^2 + \cdots + x_n^2}{x_1 + \cdots + x_n} = n\int_0^1 \cdots \int_0^1 dx_1 \cdots dx_n\, \frac{ x_1^2}{x_1 + \cdots + x_n} = n\int_0^1 \cdots \int_0^1 dx_1 \cdots dx_n \int_0^\infty d\lambda\,\, x_1^2 e^{-\lambda(x_1 + \cdots + x_n)}, $$ where the first equality uses symmetry and the second the identity $\frac{1}{s}=\int_0^\infty e^{-\lambda s}\,d\lambda$ for $s>0$. Integrating over $x_2,\ldots,x_n$ we obtain $$ n\int_0^1 dx_1\, x_1^2 \int_0^\infty d\lambda\,\Bigl(\frac{1-e^{-\lambda}}{\lambda}\Bigr)^{n-1} e^{-\lambda x_1}. $$ Next we compute the asymptotics of the integral over $\lambda$, which can be written in the form $$ \int_0^\infty d\lambda\, e^{-n f(\lambda)} G(\lambda), $$ where $f(\lambda)=\log \lambda-\log(1-e^{-\lambda})$ and $G(\lambda)=\dfrac{\lambda e^{-\lambda x_1}}{1-e^{-\lambda}}.$ Since $f$ is monotonically increasing with $f(0)=0$, the integral is dominated by a neighbourhood of $\lambda=0$, and integration by parts gives $$ \int_0^\infty d\lambda\, e^{-n f(\lambda)} G(\lambda)=\frac{G(0)}{n f'(0)}+o\Bigl(\frac{1}{n}\Bigr). $$ Here $G(0)=1$ and $f'(0)=\frac{1}{2}$, so the inner integral is $\frac{2}{n}+o\bigl(\frac{1}{n}\bigr)$, and we obtain $$ n\int_0^1 dx_1\, x_1^2\cdot\frac{2}{n}\;\longrightarrow\;2\int_0^1 dx_1\, x_1^2=\frac{2}{3}. $$
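As a sanity check on the integral representation (not on the asymptotics themselves), one can evaluate the double integral numerically for moderate $n$ with SciPy's `quad` and compare against $2/3$; the values of $n$ below are arbitrary:

```python
# Evaluate n * \int_0^1 x^2 \int_0^\infty ((1-e^{-t})/t)^{n-1} e^{-t x} dt dx
# numerically and compare with 2/3.
import numpy as np
from scipy.integrate import quad

def inner(x, n):
    # integral over lambda for fixed x_1 = x; expm1 avoids cancellation near 0
    f = lambda t: (-np.expm1(-t) / t) ** (n - 1) * np.exp(-t * x)
    val, _ = quad(f, 0, np.inf, limit=200)
    return val

def expectation(n):
    val, _ = quad(lambda x: x**2 * inner(x, n), 0, 1)
    return n * val

for n in (3, 10, 30, 100):
    print(n, expectation(n))                # drifts toward 2/3 as n grows
```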