Compute the recursion directly: Define
$$\Gamma(\alpha;x) = \int_{z = x}^\infty \frac{z^{\alpha-1} e^{-z}}{\Gamma(\alpha)} \, dz$$
(which is the regularized upper incomplete gamma function). For integer $\alpha > 1$, integration by parts with the choice
$$u = \frac{z^{\alpha-1}}{\Gamma(\alpha)}, \quad du = \frac{z^{\alpha-2}}{\Gamma(\alpha-1)} \, dz, \qquad dv = e^{-z} \, dz, \quad v = -e^{-z},$$
yields
$$\begin{align*} \Gamma(\alpha;x) &= \left[-\frac{z^{\alpha-1}e^{-z}}{\Gamma(\alpha)} \right]_{z=x}^\infty + \Gamma(\alpha-1;x) \\ &= \frac{x^{\alpha-1} e^{-x}}{\Gamma(\alpha)} + \Gamma(\alpha-1; x) \\ &= \frac{x^{\alpha-1} e^{-x}}{(\alpha-1)!} + \Gamma(\alpha-1;x). \end{align*}$$
Unfolding the recursion and observing $\Gamma(1;x) = e^{-x}$, we get
$$\Gamma(\alpha;x) = \sum_{k=0}^{\alpha-1} e^{-x} \frac{x^k}{k!},$$
as claimed.
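As a quick numerical sanity check of this identity (a sketch, assuming SciPy is available; `scipy.special.gammaincc` computes exactly the regularized $\Gamma(\alpha;x)$ defined above):

```python
import math
from scipy.special import gammaincc

# Verify Gamma(alpha; x) == sum_{k=0}^{alpha-1} e^{-x} x^k / k!
# for a few integer alpha and arbitrary x.
for alpha in (1, 2, 5, 10):
    for x in (0.5, 3.0, 12.0):
        lhs = gammaincc(alpha, x)  # regularized upper incomplete gamma
        rhs = sum(math.exp(-x) * x**k / math.factorial(k) for k in range(alpha))
        assert math.isclose(lhs, rhs, rel_tol=1e-12), (alpha, x)
print("identity holds at all test points")
```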


Intuitive explanation & proof

First I'll try to explain this in words. Suppose you are fishing, and your catches occur randomly, but at a fixed average rate. Then the number of fish you will have caught by any given moment is Poisson distributed (https://en.wikipedia.org/wiki/Poisson_distribution):

Suppose that, according to the statistics, you should have caught $\mu$ fish by now. Then the probability that you have caught exactly $k$ fish by now is:

$$ \displaystyle P(X=k) = \frac{\mu^k e^{-\mu}}{k!}$$
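For example, if you expect $\mu = 2$ fish by now, the probability of having caught exactly $k = 3$ is $\frac{2^3 e^{-2}}{3!} = \frac{4}{3}e^{-2} \approx 0.180$.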

So the sum in your exercise expresses the probability of having caught fewer than $k$ fish:

$$\sum_{x=0}^{k-1}\frac{\mu^x e^{-\mu}}{x!}$$
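In code, this partial sum is exactly the Poisson CDF evaluated at $k - 1$ (a minimal sketch, assuming SciPy; the values of $\mu$ and $k$ are arbitrary):

```python
import math
from scipy.stats import poisson

mu, k = 4.2, 6
# Probability of fewer than k catches, summed term by term...
partial_sum = sum(mu**x * math.exp(-mu) / math.factorial(x) for x in range(k))
print(partial_sum)
# ...and the same value via the Poisson CDF.
print(poisson.cdf(k - 1, mu))
```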

Well: the probability that you have caught fewer than $k$ fish is the same as the probability that, either now or at some point in the future, there is a moment at which you have caught exactly $k-1$ fish.

Why? Because at the given catch rate, the moment at which you hook fish number $k-1$ will eventually come; conversely, once you have caught $k$ fish or more, your count can never drop back to $k-1$.

So how can we express this with the Poisson distribution? Well, we keep our focus on fish number $k-1$, but let the expected value run from $\mu$ all the way to infinity:

$$\sum_{x=0}^{k-1}\frac{\mu^x e^{-\mu}}{x!} = \int_{\mu}^{\infty} \frac{z^{k-1}e^{-z}}{(k-1)!} \, dz $$

How? By simply waiting till the end of time...
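Viewed as a function of $z$, the integrand above is the Erlang/Gamma density of the arrival time of catch number $k$, so the claim can be checked numerically (a sketch, assuming SciPy; values chosen arbitrarily):

```python
import math
import numpy as np
from scipy.integrate import quad
from scipy.stats import poisson

mu, k = 4.2, 6
# Tail integral of the Erlang(k) density: P(k-th catch arrives after mu).
erlang_tail, _ = quad(lambda z: z**(k - 1) * math.exp(-z) / math.factorial(k - 1),
                      mu, np.inf)
print(erlang_tail)             # P(k-th catch happens "after" mu)
print(poisson.cdf(k - 1, mu))  # P(fewer than k catches), same value
```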

Proof

(from: http://qr.ae/TYDZVK)

Consider the following upper incomplete Gamma function [1],

$$ \Gamma(s, \mu) = \int_{\mu}^{\infty} z^{s - 1} e^{-z}dz $$

Using integration by parts,

$$ \Gamma(s, \mu) = [-z^{s - 1}e^{-z}]_{\mu}^{\infty} - \int_{\mu}^{\infty}(s - 1)z^{s - 2}(-e^{-z})dz $$

This gives,

$$ \Gamma(s, \mu) = (s - 1)\Gamma(s - 1, \mu) + \mu^{s-1}e^{-\mu}$$
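Note that this is a recurrence relation. Before solving it, we can iterate it directly as a quick check (a sketch, assuming SciPy, where the unregularized $\Gamma(s, \mu)$ is `gammaincc(s, mu) * gamma(s)`):

```python
import math
from scipy.special import gamma, gammaincc

def upper_gamma(k, mu):
    """Gamma(k, mu) for integer k >= 1, via the recurrence above."""
    if k == 1:
        return math.exp(-mu)  # Gamma(1, mu) = integral of e^{-z} from mu to infinity
    return (k - 1) * upper_gamma(k - 1, mu) + mu**(k - 1) * math.exp(-mu)

mu, k = 4.2, 6  # arbitrary test values
print(upper_gamma(k, mu))           # via the recurrence
print(gammaincc(k, mu) * gamma(k))  # same value via SciPy
```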

If $s$ is a positive integer (say $k$), we can solve the recurrence exactly, in the following manner:

$$ \Gamma(k, \mu) = (k - 1)\Gamma(k - 1, \mu) + \mu^{k-1}e^{-\mu} \;\;\; \cdots \;\;\; (1)$$

Write the recurrence relation for $k-1$ and multiply by $k-1$ to get,

$$ (k - 1)\Gamma(k - 1, \mu) = (k - 1)(k - 2)\Gamma(k - 2, \mu) + (k - 1)\mu^{k-2}e^{-\mu} \;\;\; \cdots \;\;\; (2) $$

Write the recurrence relation for $k-2$ and multiply by $(k-1)(k-2)$ to get,

$$ (k - 1)(k - 2)\Gamma(k - 2, \mu) = (k - 1)(k - 2)(k - 3)\Gamma(k - 3, \mu) + (k - 1)(k - 2)\mu^{k-3}e^{-\mu} \;\;\; \cdots \;\;\; (3) $$

We can continue in the same way, all the way down to

$$ (k - 1)!\Gamma(1, \mu) = 0 \times \Gamma(0, \mu) + (k - 1)!\mu^{0}e^{-\mu} \;\;\; \cdots \;\;\; (k) $$

Adding all $k$ equations, the $\Gamma$ terms on the left cancel against the matching terms on the right (the sum telescopes), and we get

$$ \Gamma(k, \mu) = (k - 1)! \sum_{x = 0}^{k - 1} \frac{\mu^{x}e^{-\mu}}{x!}$$

That is,

$$ \Gamma(k, \mu) = \Gamma(k) \sum_{x = 0}^{k - 1} \frac{\mu^{x}e^{-\mu}}{x!}$$

$$ \frac{\Gamma(k, \mu)}{\Gamma(k)} = \sum_{x = 0}^{k - 1} \frac{\mu^{x}e^{-\mu}}{x!} $$

Hence proved.
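The final identity can also be confirmed symbolically, e.g. with SymPy (a sketch; `uppergamma` is SymPy's unregularized $\Gamma(s, \mu)$, and $k = 4$ is an arbitrary test value):

```python
import sympy as sp

k = 4
mu = sp.symbols('mu', positive=True)
lhs = sp.uppergamma(k, mu) / sp.gamma(k)
rhs = sum(mu**x * sp.exp(-mu) / sp.factorial(x) for x in range(k))
print(sp.simplify(lhs - rhs))  # prints 0
```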

[1] Incomplete Gamma Function (http://mathworld.wolfram.com/IncompleteGammaFunction.html)

See also: Poisson and Erlang distribution: relation between their CDFs (https://goo.gl/i8eo7h)