When are algebraic expressions equivalent?

Solution 1:

You've come across a good point, one that many students miss: when dealing with compositions of functions, the domains are very important (and must be maintained in the final answer).

For the problem of $f\circ f$, since $f$ is not defined at $x=-1$, you must avoid both the situation where $x=-1$ and the situation where $f(x)=-1$. Since there are no solutions to $f(x)=-1$ (if there were, such $x$'s would also need to be removed from the domain), the domain need only exclude $x=-1$: it is all real numbers except $x=-1$. After simplification one finds that $f\circ f(x)=x$, but simplifying does not change the domain, so we would say that $f\circ f(x)=x$ for all real numbers except $x=-1$ (the domain is part of the answer).
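
To see why $f(x)=-1$ has no solutions, suppose there were one; then
$$\frac{1-x}{1+x}=-1\quad\Longrightarrow\quad 1-x=-(1+x)\quad\Longrightarrow\quad 1=-1,$$
which is impossible, so no further points need to be removed from the domain.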

In some situations one would replace $f\circ f$ by $x$, extending the domain; this is because the singularity at $x=-1$ is removable: there is another function that agrees with $f\circ f$ wherever $f\circ f$ is defined and has a larger domain. In analysis this idea appears in several guises, for example in equivalence classes of measurable functions (since $f\circ f$ and $x$ agree except on a set of measure zero) or in analytic continuation in complex analysis (an analytic continuation extends the domain of a function).

For your second example, $x=x+\frac{1}{x}-\frac{1}{x}$, we treat the two sides as algebraic expressions (not as functions). The rules are different when the objects are just algebraic expressions, because we are not plugging in values for $x$. In general, we define two fractions $\frac{a}{b}$ and $\frac{c}{d}$ to be equal when $ad=bc$ (by cross multiplication). With algebraic expressions we don't substitute for $x$ (as we would in a formula for a function), so we don't worry about where the denominator vanishes.
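
For example, under this definition the algebraic identity
$$\frac{x^2-1}{x-1}=\frac{x+1}{1}$$
holds, because $(x^2-1)\cdot 1=(x-1)(x+1)$, even though as functions the left-hand side is undefined at $x=1$ while the right-hand side is not.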

In conclusion:

  • Algebraically: $x=x+\frac{1}{x}-\frac{1}{x}$.
  • In functions: $g(x)=x$ and $h(x)=x+\frac{1}{x}-\frac{1}{x}$, but $g\not=h$ because the domains are different (see the sketch after this list).
  • In functions: let $f(x)=\frac{1-x}{1+x}$; then $f\circ f(x)=x$ when $x\not=-1$.
  • Algebraically: $f\circ f(x)=x$.
  • Analytically: $f\circ f$ can be extended to a larger domain by the function $x$.
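
To make the middle bullets concrete, here is a minimal Python sketch (the names $g$, $h$, and $f$ just mirror the functions above; `Fraction` is used only to keep the arithmetic exact):

```python
from fractions import Fraction

def g(x):
    return x

def h(x):
    # x + 1/x - 1/x: algebraically the same as x, but it needs x != 0
    return x + 1/x - 1/x

def f(x):
    # f(x) = (1 - x)/(1 + x): it needs x != -1
    return (1 - x) / (1 + x)

two = Fraction(2)
print(g(two), h(two))   # 2 2  -- the formulas agree wherever both are defined
print(f(f(two)))        # 2    -- f(f(x)) = x away from x = -1

# At the excluded points the "equal" expressions are simply undefined:
try:
    h(Fraction(0))
except ZeroDivisionError:
    print("h is undefined at x = 0")
try:
    f(f(Fraction(-1)))
except ZeroDivisionError:
    print("f(f(x)) is undefined at x = -1")
```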

Solution 2:

It is important to be aware that Wolfram Alpha is a machine, not a source of ultimate truth. You can usually trust its calculations, but for concepts and for understanding what you're doing, you're better off relying on your own reasoning. In the particular case of $x+\frac1x-\frac1x$, Wolfram Alpha guesses that you're probably not interested in what happens at the isolated points where the original expression is not defined, so it presents a simplified expression that is equivalent to the original one wherever the original is defined. That doesn't represent any deeper mathematical truth; it is just what the programmers felt would probably be most useful for most users.
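
For what it's worth, other computer algebra systems make the same kind of choice. The snippet below uses SymPy as an illustrative stand-in for Wolfram Alpha (not something from the question); the `evaluate=False` is only there to stop SymPy from cancelling the $\frac1x$ terms the instant the expression is built:

```python
import sympy as sp

x = sp.symbols('x')

# SymPy is even more aggressive than Wolfram Alpha: the 1/x terms are
# cancelled the moment the expression is constructed, with no mention of x = 0.
print(x + 1/x - 1/x)                      # x

# Even if we force SymPy to keep the three terms separate, the simplifier
# still returns plain x and forgets the excluded point.
expr = sp.Add(x, 1/x, -1/x, evaluate=False)
print(sp.simplify(expr))                  # x
```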

It is a fact that the left-hand side of $x+\frac1x-\frac1x=x$ cannot be evaluated when $x=0$, and so the equation is not true when $x=0$. The question is then, should you care? Only you can decide whether it's worth caring about, based on what use you're going to make of the equation and the result. In many cases the "removable singularity" at $x=0$ is completely irrelevant to what you're actually doing, and it is then reasonable to forget about it as quickly as possible. In other cases, it may be a sign that things are going to be hinky around $x=0$ -- perhaps some effects that you thought you could neglect when you came up with $x+\frac1x-\frac1x$ are actually not negligible for small $x$, and you need to go back to the drawing board and do a more careful analysis. It all depends on where the expression $x+\frac1x-\frac1x$ comes from.

If you're in a classroom setting and have been given the expression $x+\frac1x-\frac1x$ without any context and asked to simplify it, you will usually be expected to pretend to care about the trouble at $x=0$, in the absence of any particular reason not to care. But note that this is a highly artificial situation -- when you actually use what you've learned, there will always be some context that you should keep in mind to determine whether the fact that the expression is undefined at $x=0$ is relevant for you or not.

Solution 3:

The key is that $\frac{1-x}{1+x}$ doesn't make sense when $x=-1$.

So $f(f(-1))=f(\mathrm{undefined})=\mathrm{undefined}$.

When simplifying, $$f(f(x))=\frac{1-\frac{1-x}{1+x}}{1+\frac{1-x}{1+x}}=\frac{1-\frac{1-x}{1+x}}{1+\frac{1-x}{1+x}}\cdot\frac{1+x}{1+x}=\frac{1+x-1+x}{1+x+1-x}=\frac{2x}{2}=x$$

But multiplying by $\frac{1+x}{1+x}$ is only valid when $x\neq-1$. So we can only simplify it if $x\neq-1$.
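
For completeness, the same cancellation can be done mechanically; the sketch below uses SymPy as an illustrative stand-in (not something from the original answer). `cancel` clears the compound fraction much as the computation above does, and in doing so it likewise drops the point $x=-1$:

```python
import sympy as sp

x = sp.symbols('x')
f = (1 - x) / (1 + x)

# Literally substitute f into itself: (1 - f(x)) / (1 + f(x)).
ff = f.subs(x, f)

print(sp.cancel(ff))     # x    -- the hand computation above, done mechanically
print(ff.subs(x, -1))    # nan  -- the unsimplified composite is undefined at x = -1
```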