Meaning of non-existence of expectation?
Solution 1:
With the usual notation, decompose $X$ as $X=X^+ - X^-$ (note also that $|X|=X^+ + X^-$). $X$ is said to have finite expectation (or to be integrable) if both ${\rm E}(X^+)$ and ${\rm E}(X^-)$ are finite; in that case ${\rm E}(X) = {\rm E}(X^+) - {\rm E}(X^-)$. Moreover, if ${\rm E}(X^+) = +\infty$ and ${\rm E}(X^-) < \infty$, then ${\rm E}(X) = +\infty$; symmetrically, if ${\rm E}(X^-) = +\infty$ and ${\rm E}(X^+) < \infty$, then ${\rm E}(X) = -\infty$. So $X$ is allowed to have infinite expectation.
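To see the four cases concretely, here is a minimal sketch in Python using sympy (the `classify` helper and the chosen densities are my own illustration, not anything from the question) that classifies a distribution by evaluating ${\rm E}(X^+)$ and ${\rm E}(X^-)$ from its density:

```python
import sympy as sp

x = sp.symbols('x', real=True)

def classify(density):
    """Classify E(X) by integrating the positive and negative parts."""
    e_plus = sp.integrate(x * density, (x, 0, sp.oo))      # E(X^+)
    e_minus = sp.integrate(-x * density, (x, -sp.oo, 0))   # E(X^-)
    if e_plus.is_finite and e_minus.is_finite:
        return f"finite expectation: {sp.simplify(e_plus - e_minus)}"
    if e_minus.is_finite:
        return "E(X) = +oo"
    if e_plus.is_finite:
        return "E(X) = -oo"
    return "expectation does not exist"

# Standard normal: both parts finite, so E(X) = 0.
print(classify(sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)))
# Standard Cauchy: both parts infinite, so no expectation at all.
print(classify(1 / (sp.pi * (1 + x**2))))
```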
Whenever ${\rm E}(X)$ exists (finite or infinite), the strong law of large numbers holds. That is, if $X_1,X_2,\ldots$ is a sequence of i.i.d. random variables with finite or infinite expectation and $S_n = X_1+\cdots + X_n$, then $n^{-1}S_n \to {\rm E}(X_1)$ almost surely. The infinite-expectation case follows from the finite case by truncation and the monotone convergence theorem: apply the finite SLLN to the truncations $X_i \wedge M$ and let $M \to \infty$.
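To illustrate the infinite-mean case, here is a quick Monte Carlo sketch (numpy assumed; the variable $X = U^{-2}$ with $U \sim {\rm uniform}(0,1)$ is an illustrative choice, with ${\rm E}(X) = \int_0^1 u^{-2}\,{\rm d}u = +\infty$ and $X^- = 0$):

```python
import numpy as np

rng = np.random.default_rng(0)

# X = U^{-2} with U ~ uniform(0,1) has E(X^+) = +oo and E(X^-) = 0,
# so the SLLN gives S_n / n -> +oo almost surely.
n = 10**6
x = rng.uniform(size=n) ** -2.0
running_mean = np.cumsum(x) / np.arange(1, n + 1)

for k in [10**3, 10**4, 10**5, 10**6]:
    print(f"n = {k:>7}: S_n/n = {running_mean[k - 1]:.1f}")
# The running mean keeps growing without bound, never settling.
```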
If, on the other hand, ${\rm E}(X^+) = +\infty $ and ${\rm E}(X^-) = +\infty $, then $X$ does not admit an expectation. In this case, exactly one of the following must occur (a result by Kesten; see Theorem 1 in the paper *The strong law of large numbers when the mean is undefined* by K. Bruce Erickson): 1) almost surely, $n^{-1}S_n \to +\infty$; 2) almost surely, $n^{-1}S_n \to -\infty$; 3) almost surely, $\limsup_n n^{-1} S_n = + \infty$ and $\liminf_n n^{-1} S_n = - \infty$.
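The standard Cauchy distribution (see the EDIT below) is a classical instance of case 3. A small simulation sketch, assuming numpy, shows the running mean refusing to settle; the point is that $n^{-1}S_n$ is again standard Cauchy for every $n$:

```python
import numpy as np

rng = np.random.default_rng(1)

# For i.i.d. standard Cauchy X_i, S_n/n is itself standard Cauchy for
# every n, so the running mean oscillates forever:
# limsup S_n/n = +oo and liminf S_n/n = -oo almost surely.
n = 10**6
x = rng.standard_cauchy(size=n)
running_mean = np.cumsum(x) / np.arange(1, n + 1)

for k in [10**2, 10**3, 10**4, 10**5, 10**6]:
    print(f"n = {k:>7}: S_n/n = {running_mean[k - 1]:+.3f}")
# Unlike the finite-mean case, these values never converge.
```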
EDIT: Since you mentioned the recent post "Are there any random variables so that ${\rm E}[X]$ and ${\rm E}[Y]$ exist but ${\rm E}[XY]$ doesn't?", it is worth stressing the difference between "$X$ has expectation" and "$X$ is integrable". By definition, $X$ is integrable if $|X|$ has finite expectation (recall that $|X|=X^+ + X^-$). So, for example, the random variable $X=1/U$, where $U \sim {\rm uniform}(0,1)$, is not integrable, yet has (infinite) expectation (indeed, $\int_0^1 x^{-1}\,{\rm d}x = \infty $).

Further, it is worth noting the following. A random variable $X$ is integrable (i.e., ${\rm E}|X|<\infty$) if and only if
$$ \int_\Omega |X|\,{\rm d}P = \int_{ - \infty }^\infty |x|\,{\rm d}F(x) < \infty . $$
A random variable has expectation if and only if
$$ \int_\Omega X^+ \,{\rm d}P = \int_{ - \infty }^\infty \max \{ x,0\} \,{\rm d}F(x) = \int_0^\infty x\,{\rm d}F(x) < \infty $$
or
$$ \int_\Omega X^- \,{\rm d}P = \int_{ - \infty }^\infty -\min \{ x,0\} \,{\rm d}F(x) = \int_{ - \infty }^0 |x|\,{\rm d}F(x) < \infty. $$
In any of these cases, the expectation of $X$ is given by
$$ {\rm E}(X) = \int_0^\infty x\,{\rm d}F(x) - \int_{ - \infty }^0 |x|\,{\rm d}F(x) \in [-\infty,\infty]. $$
Finally, $X$ does not admit an expectation if and only if both $\int_\Omega X^+ \,{\rm d}P = \int_0^\infty x\,{\rm d}F(x)$ and $\int_\Omega X^- \,{\rm d}P = \int_{ - \infty }^0 |x|\,{\rm d}F(x)$ are infinite. Thus, for example, a Cauchy random variable with density function $f(x) = \frac{1}{{\pi (1 + x^2 )}}$, $x \in \mathbb{R}$, though symmetric, does not admit an expectation, since both $\int_0^\infty xf(x)\,{\rm d}x$ and $\int_{ - \infty }^0 |x|f(x)\,{\rm d}x$ are infinite.
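If it helps, these integral criteria can be checked with a computer algebra system. Here is a hedged sketch using sympy, evaluating the relevant integrals for the two examples above:

```python
import sympy as sp

x, u = sp.symbols('x u', positive=True)

# Example 1: X = 1/U with U ~ uniform(0,1). Here X >= 1, so X^- = 0 and
# E(X^+) = integral_0^1 u^{-1} du, which diverges: E(X) = +oo,
# and X is not integrable.
print(sp.integrate(1 / u, (u, 0, 1)))          # oo

# Example 2: standard Cauchy density f(x) = 1/(pi (1 + x^2)).
f = 1 / (sp.pi * (1 + x**2))
print(sp.integrate(x * f, (x, 0, sp.oo)))      # oo -> E(X^+) infinite
# By symmetry the negative-part integral is also infinite,
# so a Cauchy random variable admits no expectation at all.
```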
Solution 2:
As Numth says, "the expected case" looks like a typo. I suspect that the original draft meant to say "the excepted case", referring to the earlier "Unless both $E[X^{+}]$ and $E[X^{-}]$ are $+\infty, \ldots$".
So Kai Lai Chung is prepared to treat infinite expectations as existing, while the current Wikipedia definition does not. An example might be the St Petersburg game. In that article, Wikipedia says "the expected win for the player of this game, at least in its idealized form, in which the casino has unlimited resources, is an infinite amount of money".
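As a sketch of that claim (assuming the standard formulation: a fair coin is tossed until the first head, and the payoff is $2^k$ if the head appears on toss $k$, so the expected win is $\sum_k 2^{-k}\cdot 2^k = +\infty$), a quick simulation shows the average winnings growing without bound, roughly like $\log_2 n$:

```python
import numpy as np

rng = np.random.default_rng(2)

# St Petersburg payoff: first head on toss k (probability 2^{-k}) pays 2^k,
# so E = sum_k 2^{-k} * 2^k = 1 + 1 + ... = +oo.
n = 10**6
k = rng.geometric(p=0.5, size=n)     # toss index of the first head
payoff = 2.0 ** k
running_mean = np.cumsum(payoff) / np.arange(1, n + 1)

for m in [10**2, 10**3, 10**4, 10**5, 10**6]:
    print(f"n = {m:>7}: average win = {running_mean[m - 1]:.2f}")
# The average grows slowly (on the order of log2 n), but without bound.
```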
Chung's definition is the same as Rudin's. Any expectation that does not exist under their definition does not exist under Wikipedia's definition either; an example would be a discrete approximation to the Cauchy distribution. As Chung and Rudin do accept some infinite expectations, some other statements would need to be qualified, such as the expectation of the sum of a finite number of random variables being the sum of the expectations of the random variables; Wikipedia's definition avoids this issue.
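To see why that qualification is needed, a minimal sketch (the variables here are my own illustrative choice): take $X = 1/U$ and $Y = -X$ with $U \sim {\rm uniform}(0,1)$, so that ${\rm E}(X) = +\infty$ and ${\rm E}(Y) = -\infty$, yet $X + Y \equiv 0$ has expectation $0$, while ${\rm E}(X) + {\rm E}(Y)$ would be the undefined $\infty - \infty$:

```python
import numpy as np

rng = np.random.default_rng(3)

# X = 1/U has E(X) = +oo and Y = -X has E(Y) = -oo, so E(X) + E(Y) is the
# undefined expression oo - oo, even though X + Y = 0 has expectation 0.
n = 10**6
u = rng.uniform(size=n)
x, y = 1.0 / u, -1.0 / u

print(np.mean(x))        # grows without bound as n grows: E(X) = +oo
print(np.mean(y))        # diverges to -oo correspondingly: E(Y) = -oo
print(np.mean(x + y))    # exactly 0.0 for every n: E(X + Y) = 0
```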