Can one differentiate an infinite sum?
One has to be careful when asking about the general conditions under which an infinite series can be differentiated.
There's both an algebraic aspect and an analytic aspect. The set of all formal power series with coefficients in a field (either real or complex) forms a ring under term-by-term addition and Cauchy-product multiplication. These are purely algebraic operations, completely independent of whether or not the series converges. We can treat a formal power series as a sort of infinite polynomial and add, multiply, and differentiate it term by term, where differentiation is defined as a derivation on this ring.
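Here is a minimal sketch in Python of this purely algebraic viewpoint, representing a formal series by its coefficient list; the names `formal_derivative` and `cauchy_product` are just illustrative, not from any particular library:

```python
import math

# A formal power series over a field can be represented by its coefficient
# sequence [a_0, a_1, a_2, ...]; the ring operations and the derivation
# below are purely algebraic and never ask whether anything converges.

def formal_derivative(coeffs):
    """Derivation: sum a_n x^n  ->  sum n*a_n x^(n-1)."""
    return [n * a for n, a in enumerate(coeffs)][1:]

def cauchy_product(a, b):
    """Ring multiplication (Cauchy product), truncated to the shorter input."""
    m = min(len(a), len(b))
    return [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(m)]

# Even the everywhere-divergent series sum n! x^n has a formal derivative:
coeffs = [math.factorial(n) for n in range(6)]  # [1, 1, 2, 6, 24, 120]
print(formal_derivative(coeffs))                # [1, 4, 18, 96, 600]
```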
When we deal with analysis, where the convergence of the series matters, things get more delicate. The central theorem that sets the limits of differentiation is as follows:
12.17 Theorem (Differentiation theorem.) Let $\sum a_nz^n$ be a power series and let $D\left(\sum a_nz^n\right) = \sum na_nz^{n-1}$ denote its formal derivative. Then $D\left(\sum a_nz^n\right)$ and $\sum a_nz^n$ have the same radius of convergence. The function $f$ associated with $\sum a_nz^n$ is differentiable in the disc of convergence, and the function represented by $D\left(\sum a_nz^n\right)$ agrees with $f'$ on the disc of convergence.
The proof is rather tedious and delicate, but it boils down to two facts: (a) the term-by-term derivative of a power series has the same radius of convergence as the original series, and (b) inside that disc of convergence it converges to the derivative of the function the original series represents. Bottom line: it's always possible to formally differentiate a power series in the ring-theoretic sense, but that operation need not carry any analytic meaning in the sense of calculus; inside the disc of convergence the theorem above guarantees that both viewpoints agree.
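As a quick numerical sanity check of the theorem (a sketch, using the geometric series as a concrete instance; `deriv_partial_sum` is just an illustrative name): for $f(x) = 1/(1-x) = \sum x^n$, the differentiated series $\sum nx^{n-1}$ should converge to $f'(x) = 1/(1-x)^2$ for $|x| < 1$.

```python
# Partial sums of the term-by-term derivative of sum x^n (radius 1)
# approach 1/(1-x)^2 inside the disc of convergence.

def deriv_partial_sum(x, N):
    return sum(n * x ** (n - 1) for n in range(1, N + 1))

for x in (0.3, 0.9):
    print(f"x={x}: partial sum = {deriv_partial_sum(x, 500):.6f}, "
          f"1/(1-x)^2 = {1.0 / (1.0 - x) ** 2:.6f}")
```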
That help?
Mathemagician1234's answer gives a general theorem for power series, but there is also a more general statement about sequences of functions. Since the question in the title asks about an "infinite sum" in general, it may be worth mentioning.
Theorem Suppose $\{f_n\}$ is a sequence of functions, differentiable on $[a, b]$ and such that $\{f_n(x_0)\}$ converges for some point $x_0$ on $[a,b]$. If $\{f_n^\prime\}$ converges uniformly on $[a,b]$, then $\{f_n\}$ converges uniformly on $[a, b]$, to a function $f$, and $$ f^\prime(x) = \lim_{n\to\infty} f_n^\prime(x) \qquad (a \leq x \leq b) $$
(quoted from Rudin's Principles of Mathematical Analysis)
This theorem, together with the facts that the formal derivative of a power series has the same radius of convergence and that a power series converges uniformly on compact subsets inside its radius of convergence, yields the special case mentioned by Mathemagician1234.
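To see the theorem in action, here is a small sketch (assuming nothing beyond the standard library) with $f_n$ the partial sums of $e^x$: the derivative sequence $f_n'$ is just $f_{n-1}$, which converges uniformly on any bounded interval, so the theorem guarantees $f' = \lim f_n'$.

```python
import math

# f_n = n-th partial sum of exp(x); its term-by-term derivative f_n' equals
# f_{n-1}, which converges uniformly on [-R, R], so the limit f = exp is
# differentiable with f' = lim f_n' = exp.

def f_n_prime(x, n):
    # term-by-term derivative of sum_{k=0}^{n} x^k / k!
    return sum(x ** (k - 1) / math.factorial(k - 1) for k in range(1, n + 1))

x = 0.5
print(abs(f_n_prime(x, 20) - math.exp(x)))  # ~1e-16: lim f_n' = exp = (e^x)'
```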
Recall the finite sum evaluation $$ 1+x+x^2+\cdots+x^n=\frac{1-x^{n+1}}{1-x}, \quad x \neq 1. \tag1 $$ Then by differentiating $(1)$ you get $$ 1+2x+3x^2+\cdots+nx^{n-1}=\frac{1-x^{n+1}}{(1-x)^2}-\frac{(n+1)x^{n}}{1-x}, \quad x \neq 1, \tag2 $$ and by making $n \to +\infty$ with $|x|<1$, so that both $x^{n+1} \to 0$ and $(n+1)x^n \to 0$, the right-hand side of $(2)$ gives the desired result $$ \sum_{n=1}^{\infty} nx^{n-1} = \frac{1}{(1-x)^2}, \quad |x|<1. $$
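If you want to check this computation symbolically, here is a short sketch using sympy (the calls below are the standard sympy API; $n=7$ and $x=1/2$ are arbitrary test values):

```python
import sympy as sp

x, n = sp.symbols('x n')

# Derivative of the closed form in (1), i.e. the right-hand side of (2):
rhs2 = sp.diff((1 - x ** (n + 1)) / (1 - x), x)

# Check (2) against the term-by-term derivative for a concrete n = 7:
lhs2 = sum(j * x ** (j - 1) for j in range(1, 8))
print(sp.simplify(lhs2 - rhs2.subs(n, 7)))  # 0

# For a fixed |x| < 1, the n -> oo limit of (2) is 1/(1-x)^2:
print(sp.limit(rhs2.subs(x, sp.Rational(1, 2)), n, sp.oo))  # 4 = 1/(1-1/2)^2
```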
This is a stronger result (using a uniform integrability condition) than what you need:
If a sequence of absolutely continuous functions $\{f_n\}$ converges pointwise to some $f$, if the sequence of derivatives $\{f_n'\}$ converges almost everywhere to some $g$, and if $\{f_n'\}$ is uniformly integrable, then $\lim\limits_{n\to \infty} f_n' = g = f'$ almost everywhere. If moreover the convergence is pointwise and $g$ is continuous, then $f' = g$ everywhere.
Proof: by the fundamental theorem of calculus for absolutely continuous functions, $f_n(x) - f_n(a) = \int_a^x f_n'(t)\, dt$.
By the Vitali convergence theorem, $\lim\limits_{n\to \infty}\int_a^x f_n'(t)\, dt = \int_a^x g(t)\, dt$.
Therefore $\lim\limits_{n\to \infty}\big(f_n(x) - f_n(a)\big) = \int_a^x g(t)\, dt$, i.e.
$$f(x)-f(a) = \int_a^x g(t)\, dt,$$
and differentiating both sides (Lebesgue differentiation theorem) gives $f' = g$ almost everywhere.