Partial fraction expansion for non-rational functions

I felt the answer to the original question posed had to be "No": since $\Gamma(s)$ has simple poles at the nonpositive integers, $\Gamma(s)^2$ has double poles, yet your series was suspiciously convergent, suggesting that the right-hand side only had simple poles. Thankfully, this proved to be correct.

So, on to answering the modified question. I'll run through a simpler example first ($\Gamma(s)$ itself) to work out what's going on, then tackle $\Gamma(s)^2$. The answer to the broader question, "Is there a generalisation of partial fractions to analytic functions?", is yes: this is provided by Mittag-Leffler's Theorem:

Let $D$ be an open set in $\mathbb{C}$ and $E \subset D$ a closed discrete subset. For each $a$ in $E$, let $p_a(z)$ be a polynomial in $1/(z-a)$. There is a meromorphic function $f$ on $D$ such that for each $a \in E$, the function $f(z)-p_a(z)$ is holomorphic at $a$. In particular, the principal part of $f$ at $a$ is $p_a(z)$.
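
(A classical illustration of the theorem in action: taking $D=\mathbb{C}$, $E=\mathbb{Z}$ and $p_n(z)=1/(z-n)^2$, one meromorphic function with exactly these principal parts is $$ \frac{\pi^2}{\sin^2 \pi z} = \sum_{n=-\infty}^{\infty} \frac{1}{(z-n)^2}, $$ which is itself a "partial fractions" expansion of a non-rational function.)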

This is an existence theorem, but we can in principle extend it to build functions we are actually interested in. For example, the Gamma function is given by the Mellin-type integral $$ \Gamma(s) = \int_0^{\infty} x^{s-1} e^{-x} \, dx, \quad (\Re(s)>0). $$ We can produce a partial fractions expansion for $\Gamma$ in the following way: notice that the problems with this integral converging all stem from the possible singularity at zero, so we expect that the part of the integral near zero will tell us about the poles of $\Gamma$. Therefore, write $$ \Gamma(s) = \int_0^1 x^{s-1} e^{-x} \, dx + \Gamma_1(s), \quad (\Re(s)>0). $$ $\Gamma_1(s)$ is the upper incomplete Gamma function evaluated at $1$; by trivial bounding arguments, the integral converges for all $s$ and hence defines an entire function (with a convergent power series [...]), so we can safely ignore it from the point of view of partial fractions, residues and so on.
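
(If you want to sanity-check this split numerically, here's a minimal sketch using mpmath; the choice of $s$ and the working precision are arbitrary.)

```python
# Numerical sanity check of the split Gamma(s) = int_0^1 + Gamma_1(s);
# a minimal sketch using mpmath, with s and precision chosen arbitrarily.
from mpmath import mp, mpf, gamma, gammainc, inf

mp.dps = 30
s = mpf('2.5')
lower = gammainc(s, 0, 1)     # int_0^1 x^(s-1) e^(-x) dx  (lower incomplete Gamma)
upper = gammainc(s, 1, inf)   # Gamma_1(s), the entire piece
print(abs(lower + upper - gamma(s)))   # ~ 0 to working precision
```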

The lower integral's integrand is integrable for $\Re(s)>0$, so we expand $e^{-x}$ in its power series and interchange the order of integration and summation as follows: $$ \begin{align*} \int_0^1 x^{s-1} e^{-x} \, dx &= \int_0^1 x^{s-1} \left( \sum_{k=0}^{\infty} (-1)^k \frac{x^k}{k!} \right) \, dx \\ &= \sum_{k=0}^{\infty} \frac{(-1)^k}{k!} \int_0^1 x^{s+k-1} \, dx \\ &= \sum_{k=0}^{\infty} \frac{(-1)^k}{k!} \frac{1}{s+k} \end{align*}$$ Ah-ha! This is a convergent sum for any complex $s$ that is not a nonpositive integer (just lop off the finite number of terms with $\Re(s)+k<0$ and compare with the exponential sum, for example). It has exactly the same poles as $\Gamma(s)$ (as we hoped for, given that it is $\Gamma(s)$ with an analytic bit subtracted off), and, even better, we got all of $\Gamma(s)$'s residues for free, too (and inspection tells us we were right about what they were all along).
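
(For the sceptical, here's a quick numerical check of this expansion, again a sketch with mpmath; the particular $s$ and the truncation at 60 terms are arbitrary choices.)

```python
# Check Gamma(s) = sum_k (-1)^k / (k! (s+k)) + Gamma_1(s) at a complex point;
# the choice of s and the truncation at 60 terms are arbitrary.
from mpmath import mp, mpc, gamma, gammainc, inf, factorial

mp.dps = 30
s = mpc('0.3', '1.7')   # any s away from the nonpositive integers
series = sum((-1)**k / (factorial(k) * (s + k)) for k in range(60))
entire_part = gammainc(s, 1, inf)    # Gamma_1(s)
print(abs(series + entire_part - gamma(s)))   # ~ 0
```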


Okay, so we know about $\Gamma(s)$ now, and we can move on to your question: what about $\Gamma(s)^2$? I could do this by squaring the expansion I just found, but that looks like a seriously unpleasant idea. Thankfully you have provided us with a better way, using the integral representation $$ \int_0^{\infty} x^{s-1} \left( 2K_0 ( 2 \sqrt{x} ) \right) \, dx = \Gamma(s)^2. $$ First we write $K_0$ in a more useful form using the identity (mentioned in your original answer, and also in the DLMF) $$ 2K_0(2\sqrt{x}) = \sum_{k=0}^{\infty} \frac{2H_k}{(k!)^2} x^k - \left( 2\gamma+\log{x} \right) I_0(2\sqrt{x}) = \sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2H_k-2\gamma- \log{x} \right) x^k . $$
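
(Both the integral representation and the series for $2K_0(2\sqrt{x})$ are easy to check numerically; here's a sketch, with the evaluation points $x$ and $s$ chosen arbitrarily.)

```python
# Check the K_0 series identity at a point, and the Mellin integral against Gamma(s)^2;
# the evaluation points x and s are arbitrary.
from mpmath import mp, mpf, besselk, harmonic, euler, log, factorial, sqrt, gamma, quad, inf

mp.dps = 25
x = mpf('0.4')
lhs = 2*besselk(0, 2*sqrt(x))
rhs = sum((2*harmonic(k) - 2*euler - log(x)) * x**k / factorial(k)**2 for k in range(30))
print(abs(lhs - rhs))                # ~ 0

s = mpf('1.75')                      # any s with Re(s) > 0
integral = quad(lambda t: t**(s-1) * 2*besselk(0, 2*sqrt(t)), [0, 1, inf])
print(abs(integral - gamma(s)**2))   # ~ 0
```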

Now we do as we did before: splitting at $x=1$ and ignoring the analytic bit of the integral, the singular part is $$ \begin{align*} \int_0^1 x^{s-1} 2K_0(2\sqrt{x}) \, dx &= \int_0^1 x^{s-1} \left( \sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2H_k-2\gamma- \log{x} \right) x^k \right) \, dx \\ &=\sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2(H_k-\gamma) \int_0^1 x^{s+k-1} \, dx -\int_0^1 x^{s+k-1}\log{x} \, dx \right) \end{align*}$$

As every schoolchild knows, $$ \int_0^{1} x^{n}\log{x} \, dx = \left. x^{n+1} \left( \frac{\log{x}}{n+1} - \frac{1}{(n+1)^2} \right) \right|_0^1 = -\frac{1}{(n+1)^2}, $$ and hence we obtain the expansion $$ \Gamma(s)^2 = \sum_{k=0}^{\infty} \left( \frac{1}{(k!)^2} \frac{1}{(s+k)^2} + 2\frac{H_k-\gamma}{(k!)^2} \frac{1}{s+k} \right) + \int_1^{\infty} x^{s-1} 2K_0(2\sqrt{x}) \, dx $$
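
(And numerically, this checks out; here is a sketch, again with arbitrary $s$, truncation, and precision.)

```python
# Check Gamma(s)^2 = sum_k [ 1/((k!)^2 (s+k)^2) + 2(H_k - gamma)/((k!)^2 (s+k)) ]
#                    + int_1^inf x^(s-1) 2 K_0(2 sqrt(x)) dx
# at an arbitrary complex s; the truncation at 40 terms is arbitrary.
from mpmath import mp, mpc, gamma, besselk, harmonic, euler, factorial, quad, sqrt, inf

mp.dps = 25
s = mpc('0.6', '0.9')
poles = sum((1/(s+k)**2 + 2*(harmonic(k) - euler)/(s+k)) / factorial(k)**2
            for k in range(40))
tail = quad(lambda x: x**(s-1) * 2*besselk(0, 2*sqrt(x)), [1, inf])
print(abs(poles + tail - gamma(s)**2))   # ~ 0
```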

This has all the properties we could hope for: double poles, the right residues, and the coefficients of the double poles are what we'd expect naïvely.


Final important remark: the alert reader may have noticed that I swept something under the rug: I chose to split the integral at $x=1$. Won't this choice cause problems for the values of the residues? In particular, why are the residues well-defined when I pulled them out of a hat with this procedure?

Suppose instead I chopped the gamma integral at $x=a$, for some $a>0$. Then the $k$th term in the expansion of the finite interval's integral is proportional to $$ \int_0^a x^{s+k-1} \, dx = \frac{a^{s+k}}{s+k}, $$ whose residue at $s=-k$ is $a^{s+k}\big|_{s=-k}=a^0=1$, exactly as before: the factor $a^{s+k}$ is entire in $s$ and equals $1$ at the pole, so it cannot change the residue. (Only the choice $a=1$ makes each term a genuine partial fraction $c/(s+k)$, which is why it is the natural one; $a=0$ and $a=\infty$ give us nothing at all.) Hence the choice is no choice at all, and the residues are what I stated they would be.
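
(If you'd like to see this in action: splitting at $a=2$ instead still reconstructs $\Gamma(s)$, with each term's residue at $s=-k$ still $(-1)^k/k!$. A sketch, with $a$ and $s$ chosen arbitrarily:)

```python
# Split the Gamma integral at an arbitrary a > 0 instead of 1 and check that
# the series-plus-tail decomposition still reproduces Gamma(s).
from mpmath import mp, mpf, gamma, gammainc, inf, factorial

mp.dps = 30
s, a = mpf('0.7'), mpf('2')
series = sum((-1)**k * a**(s+k) / (factorial(k) * (s + k)) for k in range(80))
tail = gammainc(s, a, inf)   # int_a^inf x^(s-1) e^(-x) dx, entire in s
print(abs(series + tail - gamma(s)))   # ~ 0 for any a > 0
```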