Basic Taylor expansion question
There is something that the notation does not make explicit, and which you need to keep straight (this may be the source of your confusion). $R_n(x)$ is the "remainder" at $x$: the error between the actual value of the function and the value you get by evaluating the Taylor polynomial of degree $n$ instead. That is, the $n$th remainder is equal to $$f(x) - \sum_{k=0}^n \frac{f^{(k)}(0)}{k!}x^k$$ (I'm assuming the polynomial is being expanded around $a=0$).
So the remainder really needs three pieces of information to be precise: the value of $n$, the value of $x$, and the function $f$ in question. It would be most precise to write it as $R(n,x,f)$, but since the function $f$ is usually clear from context, it is usually left out of the notation; I will write $R(n,x,f)$ in what follows, though. So notice that in your question, what you have in (1) is $R(n,x,xe^x)$, while what you have in (2) is $xR(n-1,x,e^x)$. The question, then, is what the relation between the two expressions is.
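If it helps to see those three pieces of data explicitly, here is a quick numerical sketch (the helper names `R` and `taylor_poly_exp` are mine, just for illustration): the remainder is literally "function value minus Taylor polynomial value", and it depends on $n$, $x$, and $f$.

```python
import math

def taylor_poly_exp(n, x):
    # Degree-n Taylor polynomial of e^x around 0: sum_{k=0}^n x^k / k!
    return sum(x**k / math.factorial(k) for k in range(n + 1))

def R(n, x, f, taylor_poly):
    # R(n, x, f): the function value minus the degree-n Taylor polynomial value.
    return f(x) - taylor_poly(n, x)

# Same f and x, different n: the remainder changes.
print(R(3, 1.0, math.exp, taylor_poly_exp))  # about 0.0516
print(R(5, 1.0, math.exp, taylor_poly_exp))  # about 0.0016
```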
Before addressing this in more detail, let me take a slight detour to Taylor series.
Say we have the Taylor series for $f$, $$\sum_{k=0}^{\infty}\frac{f^{(k)}(0)}{k!}x^k.$$ We know this power series converges inside the radius of convergence (which can be found using, say, the Ratio Test); for $f(x)=e^x$, the radius of convergence is infinite, so the series converges for all values of $x$.
However, there is still the question of whether the value of the series at a given $x$ equals the value of the function at that point. This is where the remainders come in. The Taylor series for $f$ converges to $f(x_0)$ at the point $x_0$ if and only if $R(n,x_0,f)\to 0$ as $n\to\infty$.
In the case of $f(x)=e^x$, one can show that the Taylor series not only converges everywhere, but also that it converges to $e^x$. For example, this can be done using the Cauchy estimate for the remainder: given $r\gt 0$, and $x$ in $(-r,r)$, if $M_n$ is a number such that $|f^{(n+1)}(x)|\leq M_n$ for all $x$ in $(-r,r)$, then $$|R(n,x,f)|\leq \frac{M_nr^{n+1}}{(n+1)!}.$$ For $f(x) = e^x$, you can take $M_n=e^r$ for every $n$ (or even $3^r$), so the bound is an exponential divided by a factorial, and that goes to $0$ as $n\to\infty$. This holds for any $x$ (by changing the $r$), so that $R(n,x,e^x)\to 0$ as $n\to\infty$ for all $x$. So the Taylor series for $e^x$ converges to $e^x$ at every $x$. That is, for each $x$, $$e^x = \sum_{k=0}^{\infty}\frac{x^k}{k!} = 1 + x + \frac{x^2}{2!} + \cdots + \frac{x^n}{n!}+\cdots$$ in the sense that the value of the limit on the right hand side is exactly equal to $e^x$.
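If you want to see the estimate in action, here is a quick numerical check (a sketch only, with $x$ and $r$ chosen arbitrarily): both the actual remainder $|R(n,x,e^x)|$ and the Cauchy bound $e^r r^{n+1}/(n+1)!$ shrink rapidly as $n$ grows.

```python
import math

def remainder_exp(n, x):
    # R(n, x, e^x): error of the degree-n Taylor polynomial of e^x at x.
    return math.exp(x) - sum(x**k / math.factorial(k) for k in range(n + 1))

x, r = 2.0, 3.0  # any x in (-r, r) works; these values are arbitrary
for n in (2, 5, 10, 15):
    bound = math.exp(r) * r**(n + 1) / math.factorial(n + 1)  # Cauchy estimate with M_n = e^r
    print(n, abs(remainder_exp(n, x)), bound)  # remainder <= bound, and both go to 0
```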
The "Taylor series" for $x$ is also very easy: it's just $x$ itself (not even a real series). It also has infinite radius of convergence, and $R(n,x,x) = 0$ for any $n\gt 1$.
It is a theorem that if the Taylor series for $f$ converges to $f$ in $(-r,r)$, and the Taylor series for $g$ converges to $g$ in $(-R,R)$, then the product of the series will converge to $fg$ in $(-\min(r,R),\min(r,R))$ (that is, on the interval where they both converge). For $e^x$ and $x$, this tells you that indeed you have that $$xe^x = x\left(\sum_{k=0}^{\infty}\frac{x^k}{k!}\right) = \sum_{k=0}^{\infty}\frac{x^{k+1}}{k!} = \sum_{k=1}^{\infty}\frac{x^k}{(k-1)!}.$$ Because the series converges, and converges to $xe^x$, that means that if you write $$xe^x = \sum_{k=1}^n \frac{x^k}{(k-1)!} + R(n,x,xe^x)$$ then you have that $$R(n,x,xe^x) = \sum_{k=n+1}^{\infty} \frac{x^k}{(k-1)!}$$ (in the sense that the value of $R(n,x,xe^x)$ is the limit of the partial sums of that series) and moreover that $$\sum_{k=n+1}^{\infty}\frac{x^k}{(k-1)!} \to 0\text{ as }n\to\infty.$$
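As a quick sanity check of that last statement (again just a sketch, with an arbitrary test point): the partial sums $\sum_{k=1}^n x^k/(k-1)!$ do approach $xe^x$, so the leftover tail, which is $R(n,x,xe^x)$, goes to $0$.

```python
import math

def taylor_poly_xexp(n, x):
    # Degree-n Taylor polynomial of x*e^x: sum_{k=1}^n x^k / (k-1)!
    return sum(x**k / math.factorial(k - 1) for k in range(1, n + 1))

x = 1.5  # arbitrary test point
for n in (3, 6, 10, 14):
    # The tail R(n, x, x*e^x) = x*e^x minus the degree-n polynomial shrinks to 0.
    print(n, x * math.exp(x) - taylor_poly_xexp(n, x))
```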
What you are doing in (2) is essentially working with the Taylor polynomials, instead of with the series as I did above. We have that the Taylor series for $e^x$ converges to $e^x$ at every $x$, so that if we write $$e^x = \sum_{k=0}^{n}\frac{x^k}{k!} + R(n,x,e^x)$$ then $$R(n,x,e^x) = \sum_{k=n+1}^{\infty}\frac{x^k}{k!}$$ and $R(n,x,e^x)\to 0$ as $n\to \infty$.
Multiplying the expression for $e^x$ by $x$, we get $$xe^x = x\left(\sum_{k=0}^n\frac{x^k}{k!} + R(n,x,e^x)\right) = \sum_{k=0}^n\frac{x^{k+1}}{k!}+xR(n,x,e^x).$$ Note, however, that this time $xR(n,x,e^x)$ is giving the $(n+1)$st remainder, since the polynomial we now have is of degree $n+1$. So, equating this with the expression we had before, this suggests that we should have $$R(n+1,x,xe^x) = xR(n,x,e^x).$$ And indeed, this is what we find when we express them as series and limits: \begin{align} xR(n,x,e^x) &= x\left(\sum_{k=n+1}^{\infty}\frac{x^k}{k!}\right)\\ &=x\left(\lim_{m\to\infty}\sum_{k=n+1}^m\frac{x^k}{k!}\right)\\ &= \lim_{m\to\infty}\left(x\sum_{k=n+1}^m \frac{x^k}{k!}\right)\\ &= \lim_{m\to\infty}\sum_{k=n+1}^m \frac{x^{k+1}}{k!}\\ &=\lim_{m\to\infty}\sum_{k=n+2}^{m+1}\frac{x^k}{(k-1)!}\\ &= \lim_{m\to\infty}\sum_{k=n+2}^{m}\frac{x^k}{(k-1)!}\\ &= R(n+1,x,xe^x). \end{align}
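You can also confirm the identity $R(n+1,x,xe^x) = xR(n,x,e^x)$ numerically; here is a small sketch (the helper names are mine, chosen for illustration), in which the two sides agree up to floating-point rounding.

```python
import math

def R_exp(n, x):
    # R(n, x, e^x)
    return math.exp(x) - sum(x**k / math.factorial(k) for k in range(n + 1))

def R_xexp(n, x):
    # R(n, x, x*e^x), using the degree-n polynomial sum_{k=1}^n x^k/(k-1)!
    return x * math.exp(x) - sum(x**k / math.factorial(k - 1) for k in range(1, n + 1))

x = 0.7  # arbitrary test point
for n in (2, 5, 9):
    # x * R(n, x, e^x) should match R(n+1, x, x*e^x).
    print(x * R_exp(n, x), R_xexp(n + 1, x))
```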
What this tells you is that the $(n+1)$st remainder at $x$ for the function $xe^x$ equals $x$ times the $n$th remainder at $x$ for the function $e^x$. So your equations both make sense, and they are telling you correct things, once you put the necessary information back into the remainder.