For every real number $a$ there exists a sequence $r_n$ of rational numbers such that $r_n$ approaches $a$.

Solution 1:

Without loss of generality we may assume the real number $a$ is $\gt 0$. (If $a \lt 0$, we can apply the argument below to $|a|$ and then switch signs; if $a=0$, the constant sequence $r_n=0$ works.) We sketch a fairly formal proof, based on the fact that the reals are a complete ordered field. In one of the remarks at the end, we give an easy but incomplete informal "proof."

Let $n$ be a natural number. Let $m=m(n)$ be the largest non-negative integer such that $\frac{m}{n}\lt a$. Then $\frac{m+1}{n}\ge a$, and therefore $0\lt a-\frac{m}{n}\le \frac{1}{n}$.

Let $r_n=m/n$. It is easy to show from the definition of limit that the sequence $(r_n)$ has limit $a$.
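As a sanity check (not part of the proof), here is a small Python sketch of this construction; a double-precision float stands in for the real number $a$, the target $\sqrt 2$ is just an illustrative choice, and the helper name `rational_approx` is mine.

```python
from fractions import Fraction
from math import floor, sqrt

def rational_approx(a: float, n: int) -> Fraction:
    """Return r_n = m/n where m is the largest integer with m/n < a.

    For irrational a this is floor(n*a); if n*a happens to be an
    integer we step back by 1 to keep the inequality strict.
    (A float is only a stand-in for the real number a.)
    """
    m = floor(n * a)
    if m == n * a:            # n*a landed exactly on an integer
        m -= 1
    return Fraction(m, n)

a = sqrt(2)                    # illustrative target value
for n in (1, 10, 100, 1000, 10000):
    r = rational_approx(a, n)
    print(n, r, abs(a - float(r)))   # error is at most 1/n
```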

Remarks: $1.$ One really requires a proof that there is a positive integer $m$ such that $\frac{m}{n}\ge a$; without it, the largest $m$ used above need not exist. It is enough to show that there is a positive integer $k$ such that $k \ge a$, for then we can take $m=kn$. The fact that there is always an integer $\gt a$ is called the Archimedean property of the reals. We proceed to prove that the reals do have this property.

Suppose to the contrary that all positive integers are $\lt a$. Then the set $\mathbb{N}$ of positive integers is bounded above, so it has a least upper bound $b$. That means that for any $\epsilon \gt 0$ there is a positive integer $k$ such that $b-\epsilon \lt k\le b$ (otherwise $b-\epsilon$ would be a smaller upper bound). Pick $\epsilon=1/2$. Then $k+1\gt b+\frac12 \gt b$, contradicting the assumption that $b$ is an upper bound for $\mathbb{N}$.
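In symbols, assuming only the least upper bound property, the contradiction runs:

$$b=\sup\mathbb{N}\ \Longrightarrow\ \exists\,k\in\mathbb{N}\ \text{with}\ b-\tfrac12\lt k\le b\ \Longrightarrow\ k+1\gt b+\tfrac12\gt b,$$

and $k+1$ is a positive integer exceeding the supposed upper bound $b$.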

$2.$ One can also give a very quick but not fully persuasive "proof" of the approximation result. Assume as before that $a\gt 0$. The numbers obtained by truncating the decimal expansion of $a$ at the $n$-th place are rational, and clearly have limit $a$. The problem is that we are then assuming that every real number has a decimal expansion.
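For illustration only, here is a sketch of the truncation idea in Python; again a float stands in for the real number, $\pi$ is an arbitrary example, and `truncate` is a made-up helper name. Truncating after $n$ decimal digits gives the rational $\lfloor 10^n a\rfloor/10^n$, which is within $10^{-n}$ of $a$.

```python
from fractions import Fraction
from math import floor, pi

def truncate(a: float, n: int) -> Fraction:
    """Rational obtained by cutting the decimal expansion of a (> 0) after n digits."""
    return Fraction(floor(a * 10**n), 10**n)

a = pi                          # illustrative target value
for n in range(1, 6):
    t = truncate(a, n)
    print(n, t, abs(a - float(t)))   # error is below 10**(-n)
```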

Solution 2:

We can prove this easily from the density of the rationals in the reals; the density property can itself be derived from the Archimedean property, if one does not simply take it as given.

Let $a$ be a real number. For any $n \in\mathbb N$ we have $a - \frac 1n < a + \frac 1n$, so by density there is a rational $r_n$ lying between them, i.e. $a-\frac 1n < r_n < a+\frac 1n$. This implies that $|r_n - a|< \frac 1n$, so $r_n$ converges to $a$.
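One concrete (and by no means unique) way to pick such an $r_n$, sketched in Python with a float standing in for the real number: the nearest fraction with denominator $n$, namely $\operatorname{round}(na)/n$, is within $\frac 1{2n}$ of $a$ and so lies strictly inside $(a-\frac 1n,\, a+\frac 1n)$. The negative target value is chosen only to emphasize that this solution needs no sign assumption; the helper name `dense_pick` is mine.

```python
from fractions import Fraction

def dense_pick(a: float, n: int) -> Fraction:
    """A rational strictly between a - 1/n and a + 1/n.

    round(n*a)/n is within 1/(2n) of a, so it lies in the open interval.
    (As before, a float stands in for the real number a.)
    """
    return Fraction(round(n * a), n)

a = 2 ** 0.5 - 3                # negative illustrative target value
for n in (1, 10, 100, 1000):
    r = dense_pick(a, n)
    assert abs(a - float(r)) < 1 / n
    print(n, r, abs(a - float(r)))
```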