Why is an infinite series not considered an infinite sum of terms?

Solution 1:

The operation of addition is a binary operation: it is an operation defined on pairs of real (or complex) numbers. When we write something like $a+b+c$, apparently adding three numbers, we’re really doing repeated addition of two numbers, either $(a+b)+c$ or $a+(b+c)$ (assuming that we don’t change the order of the terms); one of the basic properties of this operation is that it doesn’t actually matter in which order we do these repeated binary additions, because they all yield the same result.
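
A tiny sketch (my own illustration; `Fraction` just keeps the arithmetic exact) making the two bracketings explicit:

```python
from fractions import Fraction

a, b, c = Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)

left = (a + b) + c    # first a+b, then add c
right = a + (b + c)   # first b+c, then add a

assert left == right == Fraction(1)   # the repeated binary additions agree
```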

It’s easy enough to understand what it means to do two successive additions to get $a+b+c$, or $200$ of them to get $a_0+a_1+\ldots+a_{200}$; it’s not so clear what it means to do infinitely many of them to get $\sum_{k\ge 0}a_k$. The best way that’s been found to give this concept meaning is to define this sum to be the limit of the finite partial sums:

$$\sum_{k\ge 0}a_k=\lim_{n\to\infty}\sum_{k=0}^na_k\tag{1}$$

provided that the limit exists. For each $n$ the summation inside the limit on the right-hand side of $(1)$ is an ordinary finite sum, the result of performing $n$ ordinary binary additions. This is always a meaningful object. The limit may or may not exist; when it does, it’s a meaningful object, too, but it’s the outcome of a new kind of operation. It is not the result of an infinite string of binary additions; we don’t even try to define such a thing directly. Rather, we look at finite sums, which we can define directly from the ordinary binary operation of addition, and then take their limit. In doing this we combine an algebraic notion, addition, with an analytic notion, that of taking a limit.
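
To make $(1)$ concrete, here is a small sketch (my own illustration) that computes the finite partial sums of the geometric series $\sum_{k\ge 0}2^{-k}$ and watches them approach the limit $2$:

```python
def partial_sums(a, n_max):
    # s_n = sum_{k=0}^{n} a(k), built from ordinary binary additions
    s = 0.0
    for n in range(n_max + 1):
        s += a(n)
        yield n, s

for n, s in partial_sums(lambda k: 2.0 ** -k, 20):
    if n % 5 == 0:
        print(f"n = {n:2d}   s_n = {s:.10f}")   # s_n -> 2 as n grows
```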

Finite sums like $a_0+\ldots+a_{200}$ all behave the same way: they always exist, and we can shuffle the terms as we like without changing the sum. Infinite series do not behave the same way: $\sum_{n\ge 0}a_n$ does not always exist, and shuffling the order of the terms can in some cases change the value. This really is a new operation, with different properties.

Solution 2:

Series do not enjoy all the nice properties that usual finite sums do. For example:

Altering parentheses may alter the sum. We know that $$(1-1)+(1-1)+\cdots=0+0+\cdots=0$$ but $$1+(-1+1)+(-1+1)+\cdots=1+0+0+\cdots=1$$

Thus, introducing parentheses alters the sum. And deleting them:

$$1-1+1-1+1-1+\cdots$$

gives us a divergent sum!
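
What "divergent" means here is easy to see numerically; the sketch below (my own illustration) prints the partial sums of the three versions:

```python
# Partial sums of the ungrouped series 1 - 1 + 1 - 1 + ...
s, sums = 0, []
for k in range(10):
    s += (-1) ** k                        # terms 1, -1, 1, -1, ...
    sums.append(s)
print(sums)                               # [1, 0, 1, 0, ...]: no limit

groups_a = [(1 - 1) for _ in range(5)]          # (1-1)+(1-1)+...: every group is 0
groups_b = [1] + [(-1 + 1) for _ in range(5)]   # 1+(-1+1)+(-1+1)+...
print(sum(groups_a), sum(groups_b))             # 0 and 1: the two groupings disagree
```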

Two convergent series may produce a divergent series upon multiplication. Take $$S=\sum_{n=1}^\infty\frac{(-1)^{n-1}}{\sqrt n}$$

One can show this converges by use of Leibniz's alternating series test. But the Cauchy product of $S$ with itself diverges: its $n$-th term is $c_n=(-1)^n\sum_{k=1}^{n-1}\frac{1}{\sqrt{k(n-k)}}$, and since $\sqrt{k(n-k)}\le n/2$, we get $|c_n|\ge \frac{2(n-1)}{n}\to 2$, so the terms of the product do not even tend to zero.
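
A quick numerical check (my own sketch, using the standard Cauchy product indexing $c_n=\sum_{k=1}^{n-1}a_k a_{n-k}$) confirms that $|c_n|$ stays bounded away from zero:

```python
import math

def a(n):
    # terms of S: (-1)^(n-1) / sqrt(n)
    return (-1) ** (n - 1) / math.sqrt(n)

def cauchy_term(n):
    # n-th term of the Cauchy product of S with itself
    return sum(a(k) * a(n - k) for k in range(1, n))

for n in (10, 100, 1000, 10000):
    print(n, abs(cauchy_term(n)))   # bounded below by 2(n-1)/n, never near 0
```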

The sum of rational numbers may yield an irrational number. Take, as examples, $$\sum_{n\geq 1} \frac{1}{n^2}=\frac{\pi^2}6$$ and $$\sum_{n\geq 1} \frac{(-1)^{n-1}}n=\log 2$$
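
Every partial sum here is a rational number, yet the limits are irrational; a small check (my own sketch) against the closed forms:

```python
import math

N = 100_000
basel = sum(1 / k**2 for k in range(1, N + 1))                   # rational partial sums
alt_harmonic = sum((-1) ** (k - 1) / k for k in range(1, N + 1))

print(basel, math.pi ** 2 / 6)      # both ~ 1.6449...
print(alt_harmonic, math.log(2))    # both ~ 0.6931...
```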

The order in which we sum the terms matters. For example, in the alternating harmonic series we can pick positive and negative terms to our liking and make the sum equal to any number we want. This is a particular case of Riemann's rearrangement theorem.
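
The greedy procedure from Riemann's proof is easy to run; here is a sketch (my own illustration) that rearranges the terms $\pm 1/n$ of the alternating harmonic series so that the partial sums approach an arbitrary target, say $\pi$:

```python
import math

target = math.pi
s = 0.0
next_pos, next_neg = 1, 2        # next unused odd / even denominator

for _ in range(100_000):
    # Riemann's greedy rule: add positive terms while below the target,
    # negative terms while above it. Both tails diverge, so this never stalls.
    if s < target:
        s += 1.0 / next_pos
        next_pos += 2
    else:
        s -= 1.0 / next_neg
        next_neg += 2

print(s)   # ~ 3.14159...: same terms as sum (-1)^(n-1)/n, different sum
```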

Solution 3:

The issue we need to discuss here is that for some series, different arrangements of the terms can have different values, despite the fact that they use exactly the same terms.

That is known as the Riemann series theorem for semi-convergent (i.e., conditionally convergent) series; but if you look at a series as "just an infinite sum of terms", this should not happen (the terms are the same, so the sum should be the same).

Solution 4:

Addition, to begin with, is defined for a pair of numbers. From this you can consistently define the sum of a finite number of numbers. Notice that this requires a definition.

Definition: $a_1+a_2+\cdots+a_n := a_1+(a_2+(a_3+\cdots+(a_{n-1}+a_n)\cdots))$.

Then, for this definition, you need to prove a lot of things. Among them, you prove that the newly defined symbol $a_1+\ldots+a_n$ agrees with other 'natural' ways of defining it: for example, that it equals what you get by placing the brackets in other ways or by listing the numbers in another order, and that multiplication distributes over it. All of this needs to be proved. Luckily you manage to prove it, and things work very similarly to the sum of two numbers.
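
In code, the definition above is a right fold over the list of terms; the sketch below (my own, using Python's `functools.reduce` to place the brackets) checks that a different bracketing gives the same value, as the finite theory promises:

```python
from functools import reduce

def sum_right(xs):
    # a_1 + (a_2 + (a_3 + (... + a_n)...)): brackets nested to the right
    return reduce(lambda acc, x: x + acc, reversed(xs))

def sum_left(xs):
    # ((a_1 + a_2) + a_3) + ... + a_n: brackets nested to the left
    return reduce(lambda acc, x: acc + x, xs)

xs = [3, 1, 4, 1, 5, 9]
assert sum_right(xs) == sum_left(xs) == sum(xs)   # all bracketings agree
```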

Now you have a new symbol $a_1+a_2+\ldots$ that you want to define. Like any definition, it can be made in any way you like; there are many, many ways to define it, not only as the limit of the partial sums. As before, you would then like it to have properties that, as with the sum of $n$ numbers, make it look like a sum of two numbers.

The problem at this third step is that things don't work out as nicely. Different ways of defining it turn out not to be equivalent (some 'summation procedures' converge while others don't, and some converge to different sums), and associativity fails.
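
For instance (my own sketch, reusing Grandi's series $1-1+1-\cdots$ from Solution 2): the partial-sum definition assigns it no value at all, while Cesàro's procedure, which averages the partial sums, assigns it $1/2$; two reasonable 'summation procedures' that disagree:

```python
N = 10_000
partial, s = [], 0
for k in range(N):
    s += (-1) ** k               # Grandi's series 1 - 1 + 1 - 1 + ...
    partial.append(s)

print(partial[-4:])              # [1, 0, 1, 0]: no limit, so no ordinary sum
print(sum(partial) / N)          # 0.5: the Cesaro average of the partial sums
```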

Summarizing: 1) $a_1+a_2$, 2) $a_1+a_2+\ldots+a_n$, and 3) $a_1+a_2+\ldots$

are three different definitions. It just happens that 2) is very similar to 1), and 3) is a bit similar to 1), but not as much as 2).

Solution 5:

Even when one uses the same word "sum" for finite and infinite sums (of real numbers), it is important to remember that the two notions are conceptually very different.

(1) Finite sums can be defined purely algebraically, just by repeated use of the primitive operation of adding two numbers. Infinite sums require for their definition the notion of limit.

(2) Finite sums always exist; you can add up any finite list of real numbers. Infinite sums need not exist; some infinite series fail to converge.

(3) As Clement C. pointed out, even when infinite series do converge, they need not satisfy all the algebraic laws that apply to finite sums.