Bar Mean vs Bracket Mean

I know the standard representation for the average of a data set: $$ \bar{x} = \frac{1}{N} \sum_{i=1}^N x_i $$
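
As a quick sanity check, here is that discrete mean in a few lines of Python (the data values are just an arbitrary example):

```python
# Minimal sketch of the discrete mean; the data values are an arbitrary example.
data = [2.0, 4.0, 6.0, 8.0]
x_bar = sum(data) / len(data)
print(x_bar)  # 5.0
```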

I have also run into an average denoted $\langle x \rangle$. This notation is frequently used in physics, and I understand it to be a continuous average over either time or space:

$$ \langle x \rangle = \frac{1}{b-a} \int_a^b f(x)\,dx $$

where $f$ can be a function of space and/or time coordinates.
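
To make the continuous version concrete, here is a hedged Python sketch that approximates this average with a plain Riemann sum; the choice $f(x) = x^2$ on $[0, 1]$ is arbitrary, and its exact average is $1/3$:

```python
# Hedged sketch: approximate (1/(b-a)) * integral of f over [a, b] with a left
# Riemann sum. f(x) = x**2 on [0, 1] is an arbitrary choice; exact average is 1/3.
def f(x):
    return x ** 2

a, b, n = 0.0, 1.0, 100_000
dx = (b - a) / n
integral = sum(f(a + i * dx) for i in range(n)) * dx
print(integral / (b - a))  # ~0.33333
```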

From this, there should also be a relation as $N \to \infty$ and the spacing between consecutive sample points goes to zero (taking $x_i$ to be the value of $f$ at the $i$-th of $N$ evenly spaced points in $[a, b]$):

$$ \bar{x} = \lim_{N \to \infty} \frac{1}{N}\sum_{i=1}^N x_i = \frac{1}{b-a} \int_a^b f(x)\,dx = \langle x \rangle $$

This suggests that, given enough samples, $\bar{x} = \langle x \rangle$.
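
The limit relation above can be checked numerically. The sketch below (with the arbitrary choice $f(x) = \sin x$ on $[0, \pi]$, whose exact continuous average is $2/\pi$) shows the discrete mean of evenly spaced samples converging to the continuous average as $N$ grows:

```python
# Hedged sketch: the discrete mean of N evenly spaced samples of f converges to
# the continuous average as N grows. f(x) = sin(x) on [0, pi] is an arbitrary
# choice; its exact continuous average is 2/pi.
import math

def f(x):
    return math.sin(x)

a, b = 0.0, math.pi
exact = 2 / math.pi

for n in (10, 100, 1_000, 10_000):
    # Midpoint sampling, so the discrete mean is an honest Riemann-sum approximation.
    samples = [f(a + (i + 0.5) * (b - a) / n) for i in range(n)]
    x_bar = sum(samples) / n
    print(n, x_bar, abs(x_bar - exact))
```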

Am I correct in this understanding that $\bar{x}$ represents a discrete average and $\langle x \rangle$ represents a continuous average, or is there something else to this?


Solution 1:

ISO 80000-2:2009 defines $\bar{x}$ as the "mean value of $x$" and $\langle x \rangle$ as the "arithmetic mean of $x$", so the two notations denote the same object only when the mean in question is arithmetic. However, non-arithmetic means are usually denoted explicitly (for instance, $\bar{x}_G$, $\tilde{x}$, and $G(x)$ are all used for the geometric mean), which means $\bar{x} = \langle x \rangle$ in most practical cases.
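
For a concrete illustration of why the distinction matters, here is a small Python example (with arbitrary data) showing that the arithmetic and geometric means of the same values generally differ:

```python
# Hedged sketch: arithmetic vs. geometric mean of the same (arbitrary) data.
import math

data = [1.0, 2.0, 8.0]
arithmetic = sum(data) / len(data)              # (1 + 2 + 8) / 3 ~ 3.67
geometric = math.prod(data) ** (1 / len(data))  # (1 * 2 * 8)^(1/3) ~ 2.52
print(arithmetic, geometric)
```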

All the statistics books I have use the bar notation for means, but Wolfram MathWorld notes that the angle-bracket notation is sometimes also used.