Is calculating the summation of derivatives "mathematically sound"?
I have just discovered that if you take the following series: $$1 + x + x^2 + x^3 + x^4 + \cdots = \sum_{n = 0}^\infty x^n$$ and replace each term in the series with its derivative, you get: $$1 + 2x + 3x^2 + 4x^3 + 5x^4 + \cdots$$ which I think could be written as: $$\sum_{n = 0}^\infty \frac {d}{dx}x^n$$ The question is: is it [mathematically] sound to compute a summation of derivatives (or differentials)? I'm asking because it looks sound in this case, since we are adding up the derivatives of $x^n$ for every $n \ge 0$. So, is it sound to compute sums of derivatives?
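As a quick numerical sketch of the series in the question (using plain Python; the closed form $1/(1-x)^2$ for the differentiated geometric series on $|x|<1$ is standard):

```python
# Numerical sanity check: partial sums of the term-by-term differentiated
# geometric series, sum of n*x^(n-1), approach 1/(1-x)^2 for |x| < 1.
def diff_series_partial_sum(x, terms):
    # sum of d/dx x^n = n*x^(n-1) for n = 1..terms
    return sum(n * x**(n - 1) for n in range(1, terms + 1))

x = 0.5
approx = diff_series_partial_sum(x, 60)
exact = 1 / (1 - x)**2
print(approx, exact)  # both close to 4.0
```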
Related questions
I have seen a question related to this: infinite summation of derivatives of a convergent function, but it didn't get me where I was aiming. I have also seen Calculus Summations and Help with derivative inside a summation, but they don't answer my question.
This sort of thing usually comes up in the context of trying to interchange the derivative with the sum: that is, you would like to have
$$\frac{d}{dx} \sum_{n=0}^\infty f_n(x) = \sum_{n=0}^\infty f'_n(x)$$
by analogy with the case of finite summation. This interchange can sometimes fail. The most basic criterion that I have heard of is one of the "advanced calculus criteria" (so called because it is taught in undergraduate real analysis and has no famous name). It requires that $\sum_{n=0}^\infty f'_n(x)$ converges uniformly in $x$ (over the domain on which you have the equality). Milder criteria exist; for instance there is a variant based on the dominated convergence theorem from measure theory.
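A small numerical sketch of the interchange in the "good" case (plain Python, using the geometric series as the example; uniform convergence of the differentiated series on $|x| \le 0.9$ is what licenses the swap here): we compare a finite-difference derivative of the partial sum with the term-by-term differentiated partial sum.

```python
# For the geometric series on |x| <= 0.9, the differentiated series
# converges uniformly, so d/dx of the sum equals the sum of d/dx.
# Compare a central-difference derivative of the partial sum with the
# term-by-term differentiated partial sum at several points.
def partial_sum(x, N):
    return sum(x**n for n in range(N + 1))

def diff_partial_sum(x, N):
    return sum(n * x**(n - 1) for n in range(1, N + 1))

N, h = 200, 1e-6
for x in (-0.5, 0.0, 0.5, 0.9):
    fd = (partial_sum(x + h, N) - partial_sum(x - h, N)) / (2 * h)
    print(x, fd, diff_partial_sum(x, N))  # the last two columns agree
```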
I'm not sure if this answers your question because I'm not sure what "sound" means in this context.
Notice that $$ \sum_{n=2}^N \left( \frac {\sin((n+1)x)}{n+1} - \frac{\sin(nx)} n \right) = \frac{\sin((N+1)x)}{N+1} - \frac{\sin(2x)} 2 \longrightarrow \frac{-\sin(2x)} 2 \text{ as } N\to\infty $$ and so $$ \frac d {dx} \sum_{n=2}^\infty \left( \frac {\sin((n+1)x)}{n+1} - \frac{\sin(nx)} n \right) = \frac d {dx} \frac{-\sin(2x)} 2 = -\cos(2x). $$ But \begin{align} & \sum_{n=2}^N \frac d {dx} \left( \frac {\sin((n+1)x)}{n+1} - \frac{\sin(nx)} n \right) \\[8pt] = {} & \sum_{n=2}^N \left( \cos((n+1)x) - \cos(nx) \right) \\[8pt] = {} & \cos((N+1)x) - \cos(2x) \end{align} and for most values of $x$ this does not converge as $N\to\infty$. Hence $\dfrac d{dx} \sum\limits_n\cdots$ is not in all cases equal to $\sum\limits_n\dfrac d{dx}\cdots$.
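The divergence above is easy to see numerically (a sketch in plain Python; by the telescoping identity, the derivative partial sum equals $\cos((N+1)x) - \cos(2x)$, which oscillates in $N$ for generic $x$):

```python
import math

# Partial sums of the differentiated series from the counterexample:
# sum over n = 2..N of cos((n+1)x) - cos(nx), which telescopes to
# cos((N+1)x) - cos(2x) and therefore does not settle down as N grows.
def derivative_partial_sum(x, N):
    return sum(math.cos((n + 1) * x) - math.cos(n * x) for n in range(2, N + 1))

x = 1.0
values = [derivative_partial_sum(x, N) for N in (100, 101, 102, 103)]
print(values)  # jumps around; no limit as N -> infinity
```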
However, if a power series $$ \sum_{n=0}^\infty a_n x^n \tag 1 $$ converges in an interval $(-R,R)$, then it can be validly differentiated term-by-term in that interval. This follows in part from the fact that the convergence of $(1)$ is uniform, not necessarily in the interval $(-R,R)$, but in every interval $(-R+a,R-a)$, no matter how small $a>0$ is.
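To illustrate with a power series other than the geometric one (a sketch in plain Python, using $e^x = \sum_{n \ge 0} x^n/n!$, which has $R = \infty$): differentiating term by term reindexes the series back to itself, and the numerics agree.

```python
import math

# A convergent power series may be differentiated term by term inside
# its interval of convergence. Here: exp(x) = sum of x^n / n!, whose
# term-by-term derivative, sum of n*x^(n-1)/n! = sum of x^(n-1)/(n-1)!,
# is again the series for exp(x).
def exp_series(x, N):
    return sum(x**n / math.factorial(n) for n in range(N + 1))

def exp_series_diff(x, N):
    return sum(n * x**(n - 1) / math.factorial(n) for n in range(1, N + 1))

x, N = 1.3, 30
print(exp_series_diff(x, N), math.exp(x))  # agree to many digits
```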
(I might have considered attempting to include a proof in this answer, but you've already accepted another answer$\,\ldots$)