Prove that $x^5+\frac{x^4}{2}+\frac{x^3}{3}+\frac{x^2}{4}+\frac{x}{24}+\frac{1}{120}$ has at least one nonreal complex root
Solution 1:
By Rolle's theorem, between any two real roots of a polynomial there is at least one real root of its derivative. So a polynomial has at most one more real root than its derivative has.
In this case, we have $f(x)=x^5+\frac{x^4}{2}+\frac{x^3}{3}+\frac{x^2}{4}+\frac{x}{24}+\frac{1}{120}$, and $$ f'''(x)=60x^2+12x+2, $$ which has negative discriminant ($12^2-4\cdot 60\cdot 2=-336$) and hence no real roots. So $f''$ has at most one real root, $f'$ has at most two real roots, and finally $f$ has at most three real roots. Since $f$ has degree five, it must have at least two nonreal roots.
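For anyone who wants to check this chain of derivatives mechanically, here is a minimal sketch using SymPy (my addition, assuming SymPy is installed; the argument above does not depend on it):

```python
# Sketch of Solution 1's derivative chain, assuming SymPy is available.
from sympy import symbols, Rational, diff, real_roots

x = symbols('x')
f = (x**5 + Rational(1, 2)*x**4 + Rational(1, 3)*x**3
     + Rational(1, 4)*x**2 + Rational(1, 24)*x + Rational(1, 120))

f3 = diff(f, x, 3)
print(f3)              # 60*x**2 + 12*x + 2
print(real_roots(f3))  # [] -- no real roots, as claimed

# Rolle's theorem then caps f at three real roots; in fact it has one.
print(len(real_roots(f)))  # 1
```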
Solution 2:
Here is a way to do it without resorting to calculus:
Eliminate the quartic term (making it into a "depressed quintic") by making the substitution $x=z-1/10$ (as $1/10$ is one fifth of the coefficient of $x^4$), which turns the polynomial into $$z^5+\frac{7 z^3}{30}+\frac{17 z^2}{100}+\frac{z}{6000}+\frac{239}{37500}\text{.}$$ By Descartes' Rule of Signs, this has no positive roots and either three or one negative roots. Since $0$ certainly isn't a root, the polynomial has at most three real roots, and hence at least two nonreal roots.
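The substitution and the sign pattern are easy to verify with a computer algebra system; here is a minimal SymPy sketch (again my addition, assuming SymPy):

```python
# Sketch of Solution 2's depression and sign count, assuming SymPy.
from sympy import symbols, Rational, expand, Poly

x, z = symbols('x z')
f = (x**5 + Rational(1, 2)*x**4 + Rational(1, 3)*x**3
     + Rational(1, 4)*x**2 + Rational(1, 24)*x + Rational(1, 120))

# Substitute x = z - 1/10 to kill the quartic term.
g = expand(f.subs(x, z - Rational(1, 10)))
print(g)  # z**5 + 7*z**3/30 + 17*z**2/100 + z/6000 + 239/37500

# Descartes' Rule of Signs: count sign changes (ignoring zeros) in
# the coefficient lists of g(z) and of g(-z).
print(Poly(g, z).all_coeffs())              # all nonnegative: no positive roots
print(Poly(g.subs(z, -z), z).all_coeffs())  # 3 changes: 3 or 1 negative roots
```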
As an aside, you could use the discriminant to show further that the polynomial has exactly one real root: the discriminant turns out to be positive, and a quintic with positive discriminant has either five real roots or one, while the sign count above already rules out five. But the discriminant is $2258539/17915904000$, which is not something you would want to compute by hand, given that the formula for the discriminant of a monic quintic is pretty terrible (you can verify the value with Wolfram|Alpha).
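If you'd rather not reach for Wolfram|Alpha, a short SymPy sketch (same assumptions as the snippets above) reproduces the discriminant:

```python
# Checking the discriminant mechanically, assuming SymPy.
from sympy import symbols, Rational, discriminant

x = symbols('x')
f = (x**5 + Rational(1, 2)*x**4 + Rational(1, 3)*x**3
     + Rational(1, 4)*x**2 + Rational(1, 24)*x + Rational(1, 120))

d = discriminant(f, x)
print(d)      # expected: 2258539/17915904000, the value quoted above
print(d > 0)  # True -- positive discriminant: five real roots or one
```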