Why does the Cauchy distribution have no mean if it's symmetric around 0?

The mathematical answer is what Lost1 said, so I won't repeat it.

Morally, the mathematical answer is the correct one because the interesting objects in probability (to me anyway) tend to be idealized versions of things one encounters in experiments. Before I tell you what I mean by that, take a second and ask yourself what you would do if a person who had never encountered any higher mathematics before asked you "what is a mean?"

I would tell them that a mean is an average. If you repeat an experiment many times and average the results, the mean is the number those averages home in on. By the law of large numbers, that answer is essentially correct, up to a little fuzziness, provided the distribution actually has a finite mean.

Surely any definition of "mean" has to agree with the one I just gave. The problem with the Cauchy distribution is that if you had a bunch of genuinely independent standard Cauchy random variables and averaged them, the average would not settle down near zero. It would remain a random number no matter how many samples you took: in fact, the average of $n$ independent standard Cauchy variables is itself standard Cauchy, for every $n$.

In essence, I think the reason the mean of a Cauchy distribution is undefined is that if you were to encounter a bunch of approximately independent, approximately Cauchy random variables, their empirical average probably would not be all that close to zero, and taking more samples would not help.
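You can watch this failure of the law of large numbers in a quick simulation (a sketch using NumPy; the seed and sample size are arbitrary choices). The running average of standard normal draws settles toward 0, while the running average of standard Cauchy draws keeps jumping around:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Running averages of i.i.d. standard Cauchy draws: they never settle down,
# because each partial average is itself standard Cauchy distributed.
cauchy = rng.standard_cauchy(n)
cauchy_avg = np.cumsum(cauchy) / np.arange(1, n + 1)

# For contrast, standard normal draws have mean 0, so the LLN applies
# and the running average converges to 0.
normal = rng.standard_normal(n)
normal_avg = np.cumsum(normal) / np.arange(1, n + 1)

print("Cauchy running average, last few values:", cauchy_avg[-3:])
print("Normal running average, last few values:", normal_avg[-3:])
```

Rerunning with different seeds, the normal average is always tiny by the end, while the Cauchy average lands on a different random value each time, and a single huge draw can yank it far away even late in the run.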


For the mean to exist, you need $\int^\infty_{-\infty}\frac{|x|}{\pi(1+x^2)}\,\text{d}x$ to be finite. This is just the general requirement that a function be integrable: a measurable function $f$ is integrable if $\int |f|\,\text{d}\mu<\infty$. As many people have pointed out, if the mean existed it would be $0$ by symmetry, but the integral above diverges, so the mean does not exist.
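To see the divergence explicitly, a short computation with the standard Cauchy density $f(x)=\frac{1}{\pi(1+x^2)}$: the integrand behaves like $\frac{1}{\pi |x|}$ for large $|x|$, so the tail contribution grows logarithmically,

$$\int^\infty_{-\infty}\frac{|x|}{\pi(1+x^2)}\,\text{d}x = \frac{2}{\pi}\int_0^\infty \frac{x}{1+x^2}\,\text{d}x = \frac{1}{\pi}\Big[\ln(1+x^2)\Big]_0^\infty = \infty.$$

This is also why the "obvious" symmetric cancellation argument fails: $E[X] = E[X^+] - E[X^-]$ would be $\infty - \infty$, which is undefined.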