Why can there be an infinite difference between two functions as $x$ grows large, but a ratio of $1$?

I learned in grade school that the closer $a$ and $b$ are to one another, the closer $\frac{a}{b}$ is going to be to $1$. For example, $\frac{3}{\pi}$ is pretty close to $1$, and $\frac{10^{100}}{42}$ isn't even close to $1$.

So, why is:

$$\lim_{x\to\infty} \frac{x^{2}}{x^{2}+x} = 1$$

But:

$$\lim_{x\to\infty}\left[(x^2+x)-(x^2)\right] = \infty\,?$$

Seems pretty counterintuitive. What's going on here?


Vadim's answer handles the math (and I've upvoted it), so I will try to provide intuition.

The idea is that the word "closer" is relative. That is, in some sense, $100{,}000$ is closer to $100{,}010$ than $1$ is to $0$. Of course, in an absolute sense, $1$ is $1$ away from $0$ while $100{,}000$ is $10$ away from $100{,}010$.

But say you have $\$1$ and I take away $\$1$, and contrast that with the scenario where you have $\$100{,}010$ and I take $\$10$. In the second scenario, I take more money. But in the first scenario, you care a lot more about the theft.

In your particular example, something similar happens. Yes, $x^2 + x$ and $x^2$ have a very large difference when $x$ is large, but the relative difference is tiny. For example, when $x = 1{,}000$, we have $x^2 + x = 1{,}001{,}000$ while $x^2 = 1{,}000{,}000$. So yes, the difference is large (it's $1{,}000$), but the relative difference is not that big: these two numbers are separated by a mere $0.1\%$. And this percentage shrinks as $x$ gets even bigger.
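To see why that $0.1\%$ figure keeps shrinking, one way is to compute the relative difference in general rather than at a single value of $x$:

$$\frac{(x^2+x)-x^2}{x^2}=\frac{x}{x^2}=\frac{1}{x},$$

which is $\frac{1}{1{,}000} = 0.1\%$ at $x = 1{,}000$ and tends to $0$ as $x\to\infty$.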


The reason is that $$\frac{x^2}{x^2+x}=\frac{x^2}{x^2+x}\cdot\frac{1/x^2}{1/x^2}=\frac{1}{1+1/x}.$$ This fraction is always less than $1$ for positive $x$, since its denominator $1+1/x$ is bigger than $1$. However, as $x\to\infty$ we have $\frac{1}{x}\to 0$, so the fraction gets closer and closer to $1$.
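For the other limit in the question, a similar simplification shows why the difference diverges even though the ratio tends to $1$: the difference collapses to just $x$,

$$(x^2+x)-x^2 = x \to \infty \quad\text{as } x\to\infty,$$

so the absolute gap grows without bound while the relative gap $\frac{1}{x}$ vanishes.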