Is there a(n elementary) function whose derivative we cannot integrate?

Say, for example, I take a reasonably complicated function $f(x)=\tanh[\ln(x^x)]$, and differentiate it to get $$f'(x)=\frac{4x^{2x} [1+\ln(x)]}{(x^{2x}+1)^2}.$$
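(Aside: if SymPy happens to be available, the derivative above is easy to spot-check numerically; this is just a quick sanity-check sketch, not part of the question itself.)

```python
import sympy as sp

x = sp.symbols('x', positive=True)

f = sp.tanh(sp.log(x**x))
fprime = sp.diff(f, x)

claimed = 4*x**(2*x)*(1 + sp.log(x)) / (x**(2*x) + 1)**2

# Each printed difference should be (approximately) zero, confirming the
# computed derivative matches the claimed closed form at these sample points.
for val in (0.5, 2, 5):
    print((fprime - claimed).subs(x, val).evalf())
```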

Now, to integrate this, I imagine, would be very difficult and time-consuming.

My question is: does there exist a function $f$ whose derivative $f'$ we can't integrate (without differentiating $f$ in the first place), using substitution, parts and/or partial fractions (or other integration methods)?

My motivation for this is that it's very easy to differentiate even a ridiculously complicated function (using the chain and/or product rule), but I've often wondered whether we could get back to the original function by integrating this derivative.

If I'm not articulating myself clearly enough, please ask me to explain further.

Thanks!

Edit

I know, from the Fundamental Theorem of Calculus, that such a function can be integrated; but, setting aside the fact that we already know it is the derivative of a suitable function, could it be impossible to reverse-engineer the problem and get from $f'$ back to $f$ using only methods of integration?


The answer is no.

While the methods you learn in calculus, like partial fractions and integration by parts, are limited, there is a complete algorithm, called the Risch algorithm, for integrating elementary functions. Complete means that for any elementary function, the algorithm will either find an elementary antiderivative or prove that none exists. (Note that an antiderivative always exists, but it is not always an elementary function; for instance, it's well known that $\int e^{-x^2}\, dx$ is not an elementary function.)
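To make this concrete, SymPy ships a partial implementation of the Risch algorithm (covering the transcendental case) as `risch_integrate`; here is a minimal sketch of both possible outcomes, assuming a reasonably recent SymPy:

```python
from sympy import symbols, exp
from sympy.integrals.risch import risch_integrate, NonElementaryIntegral

x = symbols('x')

# An elementary antiderivative exists, and the algorithm finds it: exp(x**2).
print(risch_integrate(2*x*exp(x**2), x))

# No elementary antiderivative exists; the algorithm proves this, and SymPy
# signals it by returning a NonElementaryIntegral object.
result = risch_integrate(exp(-x**2), x)
print(result, isinstance(result, NonElementaryIntegral))
```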

Since the derivative of an elementary function is always an elementary function, the Risch algorithm applied to any derivative of an elementary function will always work. It may not produce exactly the same function back, since the antiderivative is only defined up to an additive constant (i.e., you might get your original function back plus $C$).
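To illustrate the "up to an additive constant" point, here is a small SymPy sketch (deliberately using a simpler function than the one in the question, just to keep the round trip fast):

```python
from sympy import symbols, exp, sin, diff, integrate, simplify

x = symbols('x')

f = exp(x) * sin(x) + 3        # the constant 3 will be lost on the round trip
F = integrate(diff(f, x), x)   # integrate the derivative back

print(F)                       # exp(x)*sin(x)
print(simplify(f - F))         # 3: f and F differ only by a constant
```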

Note that the Risch algorithm itself is very complicated: it involves some deep algebra and a great many technical cases. It's much more complicated than the tricks you learn in calculus.

Many computer algebra systems implement some subset of the Risch algorithm. For instance, Mathematica (or Wolfram Alpha) should be able to integrate just about any derivative of an elementary function that you throw at it.

If you're interested in a high-level view of the Risch algorithm, I recommend Manuel Bronstein's Symbolic Integration Tutorial. If you want a more detailed view (with pseudocode to implement in a computer algebra system), I recommend his book, Symbolic Integration I: Transcendental Functions.


The question as posed has no answer. Unlike differentiation, integration by hand is not a mechanical operation: there is no fixed set of rules that always applies (as there is for differentiating elementary functions). So whether an antiderivative can be found depends on a person's ability and ingenuity.

The antiderivative that looks impossible to you might be obvious to someone else.