Different sizes of infinity
Solution 1:
The notation $\displaystyle \lim_{x\rightarrow\infty}f(x)=\infty$ where $f$ is any real-valued function of a real variable means exactly that as $x$ gets arbitrarily large, so does $f(x)$. That's all it means. This usage has no relation to any metaphysical ideas about infinity.
It's also the case that in the mathematical field of set theory, there is an entire elaborate theory of transfinite numbers. This is a very interesting area of math and the basics are accessible at an elementary level so do take a look at this if you're interested. This usage of infinity is not related to the infinity of the first paragraph.
The observation you made about $x$ and $x^2$ happens to lead to yet another interesting area of math: namely the study of the rate at which functions grow. For example as you noted, the functions $x$ and $x^2$ each go to infinity (meaning they grow without bound) as $x$ gets large. And yet, $x^2$ grows "faster" than $x$ in some way.
Growth rates of functions have been studied since the late nineteenth century. They've always been important in some areas of pure math; and today they are of extreme interest in computer science.
Suppose we have two algorithms whose running time increases as a function of the length of their input (sorting a list, say). Then as the input gets larger, the algorithm that grows faster starts to take an impractical amount of time. Computer scientists are always interested in finding algorithms that grow slowly.
The growth rate of a given function is usually described with Big-O notation, and the broader study of how the resources an algorithm needs grow with the size of its input is the subject of computational complexity theory.
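To make this concrete, here is a minimal Python sketch (the function names and step counts below are purely illustrative, not part of any standard library) comparing a linear-time pass over the input with a quadratic-time pairwise comparison:

```python
def linear_steps(n):
    # one pass over the input: roughly n basic steps, i.e. O(n)
    return n

def quadratic_steps(n):
    # compare every pair of elements: roughly n * n basic steps, i.e. O(n^2)
    return n * n

for n in (10, 100, 1000, 10000):
    print(n, linear_steps(n), quadratic_steps(n))
# by n = 10000 the quadratic count is already 10000 times the linear one,
# which is why slowly growing algorithms win on large inputs
```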
Solution 2:
I like this question a lot. The pedantic answer (and the right answer on an exam) is, of course, "They are the same. As limits, they are both equal to the formal symbol $+\infty$." That's pretty boring, though. There is a clear intuition that the behavior of $x^2$ at infinity is "bigger" than the behavior of $x$; the challenge is to try to figure out how to clarify that difference. In math, sometimes these intuitions go nowhere: when you try to understand the intuition well enough to express it clearly, it evaporates. In this case, you're actually really onto something.
As I see it, there are three ways to talk about what's going on. First, there's the pedantic way. There is quite a good reason why, if we want to call what we're doing a "limit," we shouldn't distinguish between the behavior of the two functions. Next, there is the idea of "asymptotic growth rate": even though both of these functions tend to $+\infty$, we can compare how fast that tendency is. Computer scientists like that view, because it is useful when analyzing the performance of algorithms on large inputs. Finally, and my personal favorite, is the idea of comparing "germs at infinity." That method allows us to make much finer comparisons.
Limits
First of all, limits are about continuity. When we say, for example, $$\lim_{x \to 0} \frac{\sin(x)}{x} = 1 \text{,}$$ what we really mean is that, if we want to consider $\frac{\sin(x)}{x}$ as a continuous function, its value at 0 would have to be 1. $\frac{\sin(x)}{x}$ isn't actually defined at 0 (since we'd have to divide by 0), so this is about extending the domain of the function. So what does that mean for limits involving infinity?
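Before answering that, here is a quick numerical look at the finite example above; this is just a sketch using Python's standard math module:

```python
import math

for x in (0.1, 0.01, 0.001, 0.0001):
    # sin(x)/x is undefined at x = 0, but its values approach 1,
    # so 1 is the only value that would extend it continuously
    print(x, math.sin(x) / x)
```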
Imagine for a moment that on the real line, sitting out past all the positive real numbers, there was a single extra point called "$+\infty$." Similarly, past all the negative real numbers might lie a point called "$-\infty$." The real line doesn't actually look like this (those points don't exist); what I've described instead is something else we call the Extended Real Line, sometimes denoted $\overline{\mathbb{R}}$.
Now, our original functions $x$ and $x^2$ go from $\mathbb{R}$ to $\mathbb{R}$. As it turns out, it makes sense to talk about what continuity would mean for functions involving $\overline{\mathbb{R}}$, not just $\mathbb{R}$. So we can ask: if we wanted to extend the domain of these functions to $\overline{\mathbb{R}}$ (and maybe extend the range to $\overline{\mathbb{R}}$, too, if we have to), what should the value of these functions be at $\pm \infty$? Our limits tell us the answer.
When talking about the behavior of our functions at infinity in this sense, it doesn't make sense to distinguish between them: we're only adding one point on either end of the real number line, and both $x$ and $x^2$ simply take the value $+\infty$ there.
Asymptotic growth rate
As you know, both $x$ and $x^2$ approach $+\infty$ as $x$ approaches $+\infty$. That statement, however, just tells us where they're going; we can still ask about how quickly they do so. As your intuition tells you, $x^2$ is heading there a lot faster than $x$ is. We can quantify that, if we want:
Definition. Given two functions $f(x)$ and $g(x)$ from $\mathbb{R}$ to $\mathbb{R}^{>0}$, say $f$ grows asymptotically faster than $g$ at $+\infty$ if $$\lim_{x \to +\infty} \frac{f(x)}{g(x)} = +\infty\text{.}$$ In that case, write $$f(x) \gg g(x)\text{.}$$
In your specific example, we clearly have that $x^2 \gg x$ in this sense.
This notion is useful when trying to analyze the qualitative behavior of a compound system. Roughly, we know that if $f \gg g$, the contribution of the parts that look like $f$ will eventually dominate the contribution of the parts that look like $g$.
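If you have SymPy available, you can check this definition symbolically; the following is just a sketch, assuming a standard SymPy installation:

```python
from sympy import symbols, limit, oo

x = symbols('x')

# x^2 grows asymptotically faster than x: the ratio tends to +infinity
print(limit(x**2 / x, x, oo))      # oo

# by contrast, x + 1 does not grow asymptotically faster than x
print(limit((x + 1) / x, x, oo))   # 1
```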
Germs at infinity
Germs are a precise way to talk about "how a function behaves at infinity." Unlike asymptotic growth rates, germs capture all of a function's eventual behavior at infinity while ignoring everything it does elsewhere.
Definition. Functions $f(x)$ and $g(x)$ have the same germ at $+\infty$ if there is some number $a$ such that $f(x) = g(x)$ for all $x > a$.
Similarly, say (the germ of) $f$ is greater than (the germ of) $g$ at $+\infty$ if there is some number $a$ such that $f(x) > g(x)$ for all $x > a$.
For example, consider the functions $|x|$, $|x-1|+1$ and $|x-2|+2$. These are all different functions as a whole, but they are identical once we get past $x = 2$. Consequently, we say they have the same germ at $+\infty$.
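As a quick sanity check in plain Python (nothing here beyond the three functions just named):

```python
def f(x): return abs(x)
def g(x): return abs(x - 1) + 1
def h(x): return abs(x - 2) + 2

# the three functions differ for small x ...
print(f(0), g(0), h(0))              # 0 2 4
# ... but coincide for every x > 2, so they share the same germ at +infinity
for x in (3, 10, 100.5):
    print(f(x) == g(x) == h(x))      # True, each time
```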
In this view, $x^2 > x$ at $+\infty$ merely because $x^2$ is eventually greater than $x$. Note that this notion is much finer than asymptotic growth rates, which involve the limits of quotients: $x + 1 > x$ at $+\infty$, even though neither grows asymptotically faster than the other ($\lim_{x \to +\infty} \frac{x+1}{x} = 1$).
There are a lot of neat things we can do with these germs at infinity. We can add, subtract and multiply them. We can sometimes take their derivatives and divide by them. How they work, and when they behave especially nicely, is an active area of research math. See, for example, Hardy fields at Wikipedia.
Solution 3:
If that seems puzzling to you, this will surely blow your mind:
$$\lim_{x \to \infty} \frac{1}{x} = \lim_{x \to \infty} \frac{1}{x^2} = 0$$
even though $\dfrac{1}{x} > \dfrac{1}{x^2}$ for all $x > 1$. Are there different sizes of $0$, too?
Actually, there's nothing special about $\infty$ or $0$ in this respect. It's perfectly possible for two functions to approach the same limit (any limit!) from the same direction, but for one function to be consistently closer to the limit than the other. This doesn't make the limits any different — it just means that one of the functions approaches the limit faster than the other.
(Of course, $\infty$ as a limit is somewhat special in other ways: when we write $\lim_{x \to a} f(x) = \infty$, what we really mean is that $f(x)$ diverges without bound as $x$ approaches $a$ — or as $x$ grows without bound, if $a$ itself is $\infty$. But all these special definitions are really only necessary because the real number line with the usual metric we use on it is not topologically compact, and does not even include $\pm \infty$ as proper points. Instead, we can easily define a metric on the extended real number line, e.g. by homeomorphically mapping it to a closed interval, such that limits at (and to) infinity work just like normal limits.)
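To make that parenthetical concrete, here is a sketch of one such metric; the map $\varphi(x) = x/(1+|x|)$ below is one common choice of homeomorphism onto $[-1,1]$, not the only one:

```python
import math

def phi(x):
    # homeomorphism from the extended real line onto [-1, 1]:
    # finite x goes to x / (1 + |x|), and +/-infinity go to +/-1
    if math.isinf(x):
        return math.copysign(1.0, x)
    return x / (1 + abs(x))

def dist(a, b):
    # a metric on the extended real line induced by phi
    return abs(phi(a) - phi(b))

# under this metric, 1/x approaches 0 and x^2 approaches +infinity
# in the ordinary "distance tends to 0" sense
for x in (10.0, 100.0, 1000.0):
    print(dist(1 / x, 0), dist(x**2, math.inf))
```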
Solution 4:
The notation $\lim _{x\to \infty}f(x)=\infty$ is just an abbreviation for $$\forall y\in \mathbb R\;\exists z\in \mathbb R\;\forall x\in \mathbb R\; (x>z\implies f(x)>y).$$
Literally. Nothing more. It does not assume the existence of a "number" $\infty.$ It is useful, in pure math as well as in applied math, to intuit $x$ as an object in motion, with $f(x)$ varying over time as $x$ does. But the logic of it is that numbers don't move or change. The (abbreviated) sentence $\lim_{x\to \infty}f(x)=\infty$ is about a property that is shared by every set of the form $\{(x,f(x)):x>r\}$.
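As a concrete instance of that sentence, take $f(x) = x^2$: given any $y$, the choice $z = \sqrt{\max(y,0)}$ serves as a witness. The following plain-Python sketch (the witness function is specific to this particular $f$) just spells that out:

```python
import math

def f(x):
    return x**2

def z_witness(y):
    # for f(x) = x^2, any x > sqrt(max(y, 0)) satisfies f(x) > y,
    # which is exactly the "exists z" clause of the definition
    return math.sqrt(max(y, 0))

for y in (5.0, 1e6, 1e12):
    z = z_witness(y)
    x = z + 1          # any x > z will do
    print(f(x) > y)    # True
```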
Solution 5:
You misremember how limits interact with inequalities:
Theorem: If
- $f(x) < g(x)$ for all $x$ sufficiently near $a$
- $\lim_{x \to a} f(x)$ exists
- $\lim_{x \to a} g(x)$ exists
Then
- $\lim_{x \to a} f(x) \leq \lim_{x \to a} g(x)$
Note that "less than" becomes "less than or equal to" after taking the limits; the point being that it is possible for $f(x)$ and $g(x)$ to both converge to the same limit, but with $g(x)$ doing so more quickly.
For a finite example, observe that $x^2 < x$ for all $x \in (0,1)$; however, $$\lim_{x \to 1^-} x^2 = \lim_{x \to 1^-} x = 1\text{.}$$ Or, if you want to see it for limits at $+\infty$: $0 < \frac{1}{x}$ for all positive $x$, however $$\lim_{x \to +\infty} 0 = \lim_{x \to +\infty} \frac{1}{x} = 0\text{.}$$
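If you have SymPy available, both examples can be checked directly; this is just a sketch assuming a standard SymPy installation:

```python
from sympy import Integer, limit, oo, symbols

x = symbols('x')

# x^2 < x on (0, 1), yet the one-sided limits at 1 agree
print(limit(x**2, x, 1, dir='-'), limit(x, x, 1, dir='-'))   # 1 1

# 0 < 1/x for all positive x, yet the limits at +infinity agree
print(limit(Integer(0), x, oo), limit(1 / x, x, oo))         # 0 0
```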
One way to think about how $+\infty$ and $-\infty$ work is to observe that the number line is an interval; however it is an interval without endpoints, like $(0,1)$. The extended number line is formed by adding the two endpoints; so the number line can be written as $(-\infty, +\infty)$, and the extended number line is $[-\infty, +\infty]$.
As $x \to +\infty$, both $x$ and $x^2$ approach the right endpoint $+\infty$ without reaching it; it's just that $x^2$ approaches "faster".
Sometimes, we do want to talk about the rate at which a function approaches $+\infty$ (or other values), in which case we would point out that $x^2$ grows asymptotically faster than $x$ as $x \to +\infty$. Things like this are the subject of asymptotic analysis.