Can a number have infinitely many digits before the decimal point?

I asked my teacher if a number can have infinitely many digits before the decimal point. He said that this isn't possible, even though there are numbers with infinitely many digits after the decimal point. I asked why, and he said that if you keep adding digits to the left of a number it will eventually approach infinity, which is not a number, and you could no longer distinguish two numbers from one another. Now this is the part of his reasoning I don't understand: why can we distinguish two numbers with infinitely many digits after the point but not before it? Or is there a simpler explanation for this?


Solution 1:

The formal way to understand this is, of course, via the definition of the real numbers. A real number is "allowed" to have infinitely many digits after the decimal point, but only finitely many before. (http://en.wikipedia.org/wiki/Real_number)

(If it interests you: there are number systems in which a number can have infinitely many digits before the decimal point and only finitely many after. Take a look at http://en.wikipedia.org/wiki/P-adic_number . Just for fun, the $10$-adic expansion of $-1$ is $\color{red}{\dots 99999}$.)

If you want some intuition about this, first note that, as your teacher said, such a number would grow without bound toward infinity, which is not a real number. This alone is reason enough.

About the comparing-two-numbers part: if I give you $$1234.983...$$ and $$1234.981...$$ you know which one is bigger; it does not matter what the remaining digits are.

But with $$\dots321.99$$ and $$\dots221.99$$ you don't, because the deciding information lies in the "first" digit. Of course nobody knows what the first digit is, since there is no first digit.
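If it helps to see this mechanically, here is a small Python sketch (the function `compare_fractional` is just invented for illustration, and I ignore the usual caveat about trailing $9$s): it compares two numbers with the same integer part by scanning the fractional digits starting from the first one, which is exactly the step that has no starting point when the digits run off to the left.

```python
from itertools import zip_longest

def compare_fractional(digits_a, digits_b):
    """Compare two fractional digit sequences (most significant digit first).

    Returns -1, 0, or 1.  The scan starts at the FIRST digit after the
    point; for digits running off to the left there is no first digit,
    so no such scan could even begin.
    """
    for a, b in zip_longest(digits_a, digits_b, fillvalue=0):
        if a != b:
            return -1 if a < b else 1
    return 0

# 1234.983... vs 1234.981...: decided at the third fractional digit,
# no matter what digits follow.
print(compare_fractional([9, 8, 3], [9, 8, 1]))   # 1, so 1234.983... is bigger
```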

But as I said before, this is only to build intuition; the correct way to think about it is through the definition (which is not trivial).

Solution 2:

The short answer to your question is that by definition we only allow real numbers to have finitely many digits before the decimal point. There are very good reasons for this:

Formally, we can think of a number as a finite sequence of digits $x_0,\ x_1, \ \ldots , x_N$, where the number $x$ is equal to $$x=\sum_{n=0}^Nx_n10^n$$

For example, the number $126 = 6 + 2\times 10 + 1 \times 100$.
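As a quick sanity check, here is a tiny Python sketch of that formula (the function name `digits_to_number` is just made up for illustration):

```python
def digits_to_number(digits):
    """digits[n] is the coefficient of 10**n, i.e. digits[0] is the ones digit."""
    return sum(d * 10**n for n, d in enumerate(digits))

print(digits_to_number([6, 2, 1]))  # 126 = 6 + 2*10 + 1*100
```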

Inherently there is no reason to restrict ourselves to finite sequences of digits. Given an infinite sequence, we could define a "number" as above and we could differentiate between two numbers by saying that they are different if they differ in at least one digit.

The problems start coming in when we try to do arithmetic. Consider the "number" $\ldots 99999$ made up of infinitely many $9$'s. If this is a number, we should be able to add and subtract as normal. But what happens when we try to add $1$ to this number?

$$\begin{array}[t]{r} \ldots 9\ 9\ 9\ 9\ 9\ 9\ 9\\ + \qquad\qquad \ \ \ \ 1 \\ \hline \ldots0\ 0 \ 0\ 0\ 0\ 0\ 0 \end{array}$$

At each stage we get $10$, so we write down a $0$ and carry a $1$; but this happens at every stage and never stops. In other words, adding $1$ to this number gives us $0$ - not especially coherent with what we would expect from our number system.
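If you like, you can watch the runaway carry numerically. The following Python sketch (names invented for illustration) keeps only the last $k$ digits, i.e. works modulo $10^k$: however many nines you keep, adding $1$ turns them all into zeros, and the carry never lands anywhere.

```python
for k in range(1, 8):
    nines = 10**k - 1               # the number written with k nines: 9, 99, 999, ...
    result = (nines + 1) % 10**k    # add 1, then keep only the last k digits
    print(f"{nines} + 1 -> last {k} digits are {result:0{k}d}")
# Every line ends in all zeros: the carry escapes past any finite number of digits.
```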

Or consider the "number" $\ldots 11111.1111\ldots$ made up of infinitely many $1$'s before and after the decimal point. What happens when we multiply this number by $10$? We get exactly the same number - meaning that this number and others like it would become solutions to the equation $10x = x$. But in our number system, we would like $0$ to be the only solution to this equation.
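(For contrast, in the real numbers this equation is quickly dealt with: subtracting $x$ from both sides of $10x = x$ gives $$9x = 0 \quad\Longrightarrow\quad x = 0,$$ so no nonzero real number can behave like this.)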

It is possible to create a coherent number system where we allow arithmetic like this, but it would be completely different from our own number system.

Solution 3:

For ordinary integers, there can only be infinitely many digits to the left of the decimal point if there are only finitely many nonzero digits among them; in other words, if there is a point to the left of which there are only zero digits. Usually one does not write those digits, for reasons that are easy to imagine.

The reason this is so is that numbers are not strings of digits; they are merely denoted by strings of digits. This is a lot like my name not being the same thing as I am, but just being used to designate me. It is a bit harder to picture, since while you can see me (or you could if you were in the same room as I am) and see the difference with my name, you cannot see the number designated by $421.94$, as it is an abstract quantity. You might imagine, however, two line segments of which one is exactly $421.94$ times as long as the other, and say that $421.94$ is the (abstract) ratio of those two lengths, and you might agree that this has a meaning quite independent of, and different from, the glyphs in "421.94". (By the way, such ratios are how the ancient Greeks thought about the nature of numbers.) Numbers are things that exist in our minds independently of how we represent them (or of the question whether we can represent them at all, if you are sufficiently imaginative).

One can very well imagine digits going on forever to the left as well as to the right (and for that matter I can imagine them being stacked in infinitely many rows in the plane as well) but we only say that a string of digits represents a number if it corresponds according to a definite convention to a number. As long as there are only finitely many digits, that convention (for the decimal number system) is that according to its position each digit is multiplied by a certain power of ten, and all those values are added together to get our number. Adding finitely many numbers together is something we imagine always possible, and always returning some number.

However, adding infinitely many numbers together is not something we can imagine always being possible, at least not if we care about preserving the beautiful properties that numbers have. Imagine adding together infinitely many copies of the number $1$. If that were to give a number as a result (you might want to call it "$\infty$"), then one could also ask to compute $1+\infty$. On one hand that would still be adding up infinitely many copies of the number $1$, so it should give $\infty$; but on the other hand, adding $1$ to a number should always produce a different number (the difference between the new and the old number should be $1$, not $0$). As a consequence of such simple observations, one cannot consider that in general adding up infinitely many numbers gives a result that is a number. (Saying the result "is infinitely large" is just a manner of speech; "infinitely large" is not a number.)

Now in certain special cases one can agree that adding up infinitely many terms gives a definite number as result. A somewhat boring instance of this is when all but finitely many terms are zero; in this case one can agree that the (finite) sum of the nonzero terms also gives the sum of all the terms (by definition of the latter). This is why I said one can allow infinitely many digits $0$ going off to the left (or to the right, for that matter).

Another case that is usually considered to give a well defined result is when the terms are all positive (or zero) but get smaller and smaller, in such a way that there is a threshold that no sum of finitely many terms ever passes. In this case we imagine there is a smallest possible such threshold, one that is never passed, but such that any smaller threshold will ultimately be passed by some finite sum of terms. I said "imagine" here, because one needs to invent the real numbers specifically for this purpose: if for instance all our terms are rational numbers, ones we can express precisely as fractions, then all finite sums will also be rational numbers, but the smallest threshold might not be equal to any rational number. However, the real numbers are, by design, such that whenever any threshold exists for the finite sums, there exists a (unique) real number that is the smallest threshold; this number is then defined to be the sum of the infinite sequence of terms.

If negative terms are allowed, things get a bit more complicated, but still there are certain cases where the finite sums ultimately get confined to smaller and smaller intervals, and are said to tend to a limit; this condition defines "convergent" infinite sums of terms, and these are the only infinite summations that conventionally designate a number (namely that limit). All other summations are said to be "divergent", and no value at all is attributed to adding up their terms. $1+1+1+\cdots$ is a typical divergent summation.

Now you can imagine that adding up ever higher powers of $10$, each multiplied by a nonzero digit, is even worse than doing $1+1+1+\cdots$; the summation diverges and produces no number. However, when taking ever smaller powers of $10$, it is easy to find a threshold for the finite sums: indeed, stopping at any term and taking the finite sum obtained after adding $1$ to the final digit always produces a threshold valid for all finite sums of the original terms. This is why infinite progressions of decimals after the decimal point always designate a real number. (They do not define a real number, as it existed independently, and even less is it true that they are real numbers; in particular it can happen that two different progressions of decimals designate the same real number, which greatly confuses people who think of numbers as being progressions of decimals.)
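To make that threshold construction concrete, here is a small Python sketch using exact fractions (the digit list is just the start of the expansion of $\pi-3$, chosen for illustration): bumping the last kept digit by one always gives a bound that no longer finite sum ever reaches.

```python
from fractions import Fraction

digits = [1, 4, 1, 5, 9, 2, 6, 5, 3, 5]   # first fractional digits of pi - 3

def partial_sum(k):
    """Sum of the first k fractional digits, as an exact rational number."""
    return sum(Fraction(d, 10**(n + 1)) for n, d in enumerate(digits[:k]))

for k in range(1, len(digits)):
    # Stop after k digits and add 1 to the final digit: a valid threshold.
    threshold = partial_sum(k) + Fraction(1, 10**k)
    assert all(partial_sum(m) < threshold for m in range(k, len(digits) + 1))
print("every longer finite sum stays below each such threshold")
```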

I might add, if this interests you, that number systems other than the real numbers exist, and that other notions of converging sums can be proposed (not based on getting "ever closer" to a limit in the usual sense). In one such number system, the $10$-adic integers, infinite progressions of decimals to the left (but not to the right!) do define a convergent sum, and therefore designate a number. These numbers, however, do not have all the usual properties of numbers; notably there is no notion of "less than" and "more than" among the $10$-adic integers. Can you see why that would be a difficult thing to define for numbers whose digits run off to the left?

Solution 4:

You can't have infinitely many (nonzero) digits before the decimal point because the geometric series $$\sum_{n=0}^\infty ar^n$$ only converges when $|r| < 1$. You can have infinitely many digits after the decimal point, because that amounts to adding something of the form $$\sum_{n=1}^\infty a_n\left(\frac{1}{10}\right)^n \le \sum_{n=1}^\infty 9\left(\frac{1}{10}\right)^n,$$ which converges because $\left|\frac{1}{10}\right| < 1$. But having infinitely many nonzero digits before the decimal point gives something of the form $$\sum_{n=0}^\infty a_n(10)^n \ge \sum_{n=0}^\infty 10^n,$$ which doesn't converge (even if some $a_n = 0$, you can compare each $10^n$ with the next nonzero term, which is bigger).
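A quick numerical illustration of that contrast (just a sketch, with invented names): the partial sums with digits after the point creep up toward a limit, while the partial sums with digits before the point grow without bound.

```python
from fractions import Fraction

after, before = Fraction(0), 0
for n in range(1, 8):
    after += 9 * Fraction(1, 10)**n    # digits 9 after the point: 0.9, 0.99, 0.999, ...
    before += 10**(n - 1)              # digits 1 before the point: 1, 11, 111, ...
    print(f"n={n}: sum after point = {float(after):.7f}, sum before point = {before}")
# The first column approaches 1 and never exceeds it;
# the second column grows without bound.
```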