Can the difference of 2 undefined limits be defined?
Is this limit defined or undefined? $$\lim\limits_{x \to 0+} \left(\sqrt{\frac{1}{x}+2}-\sqrt{\frac{1}{x}}\right)$$ When I apply the rule of the difference of limits, it is undefined. But when I manipulate the expression, it gives me zero, and the graph of the function suggests the limit exists from the right.
By multiplying by $\frac{\sqrt{\frac{1}{x}+2}+\sqrt{\frac{1}{x}}}{\sqrt{\frac{1}{x}+2}+\sqrt{\frac{1}{x}}}$: $$\lim\limits_{x \to 0+} \frac{\left( \sqrt{\frac{1}{x}+2}-\sqrt{\frac{1}{x}} \, \right) \left(\sqrt{\frac{1}{x}+2}+\sqrt{\frac{1}{x}} \, \right)}{\sqrt{\frac{1}{x}+2}+\sqrt{\frac{1}{x}}}$$
$$=\lim\limits_{x \to 0+} \frac{\frac{1}{x}+2-\frac{1}{x}}{\sqrt{\frac{1}{x}+2}+\sqrt{\frac{1}{x}}}$$ $$=\lim\limits_{x \to 0+} \frac{2}{\sqrt{\frac{1}{x}+2}+\sqrt{\frac{1}{x}}}$$ Then, we multiply by $\frac{\sqrt{x}}{\sqrt{x}}$: $$=\lim\limits_{x \to 0+} \frac{2\sqrt{x}}{\sqrt{1+2x}+1}$$ And we substitute: $$=\frac{2\sqrt{0}}{\sqrt{1+2\times0}+1} = 0$$ So, is this limit defined or not? And what's my error, if any?
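(For reference, a quick numerical sanity check, sketched here in Python, is consistent with the value $0$: it simply evaluates the original expression at small positive $x$.)

```python
import math

# Evaluate sqrt(1/x + 2) - sqrt(1/x) at small positive x.
# If the one-sided limit is 0, these values should shrink toward 0.
for x in [1e-2, 1e-4, 1e-6, 1e-8]:
    value = math.sqrt(1 / x + 2) - math.sqrt(1 / x)
    print(f"x = {x:.0e}:  {value:.8f}")
```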
Remember that the rule that you referred to, "the rule of difference of limits", is not just the equation $$ \lim_{x\to a}(f(x)-g(x))=\lim_{x\to a}f(x)-\lim_{x\to a}g(x) $$ but rather the statement that, if both of the limits on the right side of this equation are real numbers, then the limit on the left side (is also a real number and) is given by this equation. So this rule does not apply to the limit in your question.
More generally, when learning rules (or theorems or principles or whatever they may be called), don't just learn formulas, but pay attention also to the words around them. The words are not just decoration but are essential for the correctness of the rule.
What you did is correct. The point is that, in its initial form, your problem was of indeterminate form. If a limit "looks like" $\infty-\infty$ or something similar, its value can be essentially anything under the sun. It can be illuminating to see the process in reverse: $$ 0=\lim_{x\to\infty} 0=\lim_{x\to\infty}(x-x)\ne \lim_{x\to\infty} x-\lim_{x\to\infty}x \;\text{“}=\text{”}\; \text{nonsense}.$$
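To make "anything under the sun" concrete, here is the usual family of examples (my addition, not part of the answer above): for any constant $c$, $$\lim_{x\to\infty}\bigl((x+c)-x\bigr)=c, \qquad\text{even though}\qquad \lim_{x\to\infty}(x+c) \ \text{ and } \ \lim_{x\to\infty}x \ \text{ are both infinite}.$$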
Just to add to what has been said: a limit expression whose value is undefined (or infinite) cannot be treated as an ordinary real expression, so it is technically invalid (senseless) to manipulate it as if it were a real number. For example, $\sin(n)-\sin(n) \to 0$ as $n \to \infty$, but "$\lim_{n\to\infty} \sin(n)$" itself is simply undefined, and it is technically invalid even to write "$\lim_{n\to\infty} \sin(n) - \lim_{n\to\infty} \sin(n)$", let alone ask for its value (unless you want to have propagation of undefined values...).
So the literal answer to your question is:
The difference of 2 undefined limits cannot be defined, by definition. (Even if you wish to permit writing potentially undefined expressions, it would not make a difference, since any expression with an undefined subexpression will itself be undefined.)
The correct statement is that it is possible for the difference of two expressions to have a limit even though neither of the expressions has a limit (under the same limiting conditions).
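In the notation of the original question: $$\lim_{x \to 0+}\sqrt{\tfrac{1}{x}+2} \ \text{ and } \ \lim_{x \to 0+}\sqrt{\tfrac{1}{x}} \ \text{ are both undefined, yet } \ \lim_{x \to 0+}\left(\sqrt{\tfrac{1}{x}+2}-\sqrt{\tfrac{1}{x}}\right)=0.$$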
I have the same take as user21820. Your error is treating “undefined” like it's a value when it's really just a predicate.
When we write $\lim_{x\to a} f(x) = L$, it's not really an equation so much as a statement. It's a statement about $f$, $a$, and $L$ all in one. When we say that $\lim_{x\to a} f(x)$ is undefined, we mean that $\lim_{x\to a} f(x) = L$ is not true for any number $L$.
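(Spelled out, that statement is the usual $\varepsilon$–$\delta$ condition: $\lim_{x\to a} f(x) = L$ means $$\forall \varepsilon > 0 \ \exists \delta > 0 \ \forall x:\ 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon,$$ which is either true or false for a given $f$, $a$, and $L$.)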
It's tempting to think of “undefined” as some magic quantity which nullifies real numbers. Sometimes this leads to true statements. For instance, if $\lim_{x\to a} f(x) = L$ and $\lim_{x\to a} g(x)$ is undefined, then $\lim_{x\to a} (f(x) + g(x))$ is undefined. You might want to think of this succinctly as “finite plus undefined equals undefined.”
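To see why that particular statement is true (a quick sketch of the standard argument, added here for completeness): if $\lim_{x\to a}(f(x)+g(x)) = M$ were a real number, then the difference rule would apply to the two defined limits, giving $$\lim_{x\to a} g(x) = \lim_{x\to a}\bigl((f(x)+g(x)) - f(x)\bigr) = M - L,$$ contradicting the assumption that $\lim_{x\to a} g(x)$ is undefined.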
But it also leads to false statements, as you have discovered. It's just not true that if $\lim_{x\to a}f(x)$ is undefined and $\lim_{x\to a}g(x)$ is undefined, then $\lim_{x\to a} (f(x) + g(x))$ is undefined. The simplest counterexample would be if $g(x) = -f(x)$. So even though you might want to think to yourself “undefined plus undefined equals undefined,” this is specious reasoning.
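A concrete instance of that counterexample (my own choice of $f$, in the spirit of the answer): take $f(x) = \frac{1}{x}$ and $g(x) = -\frac{1}{x}$ with $a = 0$. Then $\lim_{x\to 0} f(x)$ and $\lim_{x\to 0} g(x)$ are both undefined, but $$\lim_{x\to 0}\bigl(f(x)+g(x)\bigr) = \lim_{x\to 0} 0 = 0.$$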
In many programming languages, you can have an undefined or null value, and in some cases it can combine with other values. For instance, in Excel, when a formula in a cell evaluates to #N/A, any formula that uses that cell will also evaluate to #N/A. But it doesn't work that way with limits in math.
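As a rough analogue (my own illustration, using IEEE floating-point NaN rather than Excel's #N/A), this propagation behavior looks like the following; the point is that limits do not obey such a rule.

```python
import math

nan = float("nan")  # IEEE 754 "not a number": a stand-in for an undefined value

# NaN propagates through arithmetic, much as #N/A propagates through Excel formulas:
print(nan - nan)              # nan
print(1.0 + nan)              # nan
print(math.isnan(nan - nan))  # True

# Limits do not work this way: "undefined minus undefined" can still be a
# perfectly well-defined limit, as the original question demonstrates.
```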