Problem involving Rolle's Theorem

Prove that, between any $2$ roots of $e^x\sin x = 1$, there exists a root of $e^x\cos x + 1=0$.

I am able to solve the problem in this way:

Let $a,b$ be two roots of $f(x)=e^x\sin x - 1$. Define $g(x)=e^{-x}f(x) = \sin x-e^{-x}$. Then $a,b$ are roots of $g(x)$ too (as $e^{-a} = \sin a$ and $e^{-b} = \sin b$), so $g(a)=g(b)=0$. Applying Rolle's Theorem to $g$ on $[a,b]$ gives a $c \in (a,b)$ such that $g'(c) = e^{-c}+\cos c=0$, or equivalently $1+e^{c}\cos c=0$. Q.E.D.
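
For what it's worth, here is a quick numerical sanity check of this argument (a sketch using SciPy's `brentq`; the bracketing intervals are my own choice, read off the graph):

```python
# Numerical check: between the first two roots of f(x) = e^x sin x - 1
# there is indeed a root of e^x cos x + 1.
import numpy as np
from scipy.optimize import brentq

g = lambda x: np.sin(x) - np.exp(-x)     # g(x) = e^{-x} f(x), same roots as f
h = lambda x: np.exp(x) * np.cos(x) + 1  # the function we want a root of

a = brentq(g, 0.1, 1.0)  # first root of f,  ~0.5885
b = brentq(g, 2.0, 3.5)  # second root of f, ~3.0964
c = brentq(h, a, b)      # root promised by Rolle, ~1.7461
print(a, b, c, h(c))     # h(c) ~ 0
```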

But when I first set out to solve the problem, I was thinking in a different way. Notice that applying Rolle's Theorem to $f$ gives a $c \in (a,b)$ with $$e^c\sin c+ e^c\cos c=0.\tag1$$ Now if we want to conclude that $$e^c\cos c + 1=0,\tag2$$ then subtracting $(1)$ from $(2)$ shows we need $e^c\sin c =1$, i.e. $c$ itself must be a root of $f(x)$. So this approach requires finding a root of $f(x)$ inside $(a,b)$, and proving that such a root exists for every pair of roots $a$ and $b$ of $f(x)$.

So here's the graph of $e^{-x}$ and $\sin x$ plotted in Desmos:

[graph of $e^{-x}$ and $\sin x$, intersecting at the roots of $g$]

Clearly it is not true that between any two consecutive points of intersection there lies another one (otherwise the intersection points would pile up into a continuous curve, if you get what I mean).
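
To make this concrete without the picture, here are the first few intersection points, computed numerically (a sketch; the brackets around each hump of $\sin x$ are chosen by eye):

```python
# First few intersections of e^{-x} and sin x, i.e. the roots of f.
import numpy as np
from scipy.optimize import brentq

g = lambda x: np.sin(x) - np.exp(-x)
brackets = [(0.1, 1.0), (2.0, 3.5), (6.0, 6.5), (9.0, 9.6)]  # hand-picked
print([round(brentq(g, lo, hi), 4) for lo, hi in brackets])
# [0.5885, 3.0964, 6.2851, 9.4247] -- no further intersection lies between
# consecutive ones, so f has no root strictly between two consecutive roots.
```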

So my question is, where in this method have I messed up?


Your error lies in assuming that the $x$ satisfying $e^x\cos(x)+1=0$ is the same $x$ satisfying $e^x\cos(x)+e^x\sin(x)=0$. It doesn't have to be. The function $f$ has a zero at $x\approx r_1=0.588533$, and the next zero after that is at $x\approx r_2=3.09636$. But in $[r_1,r_2]$ the only solution of $e^x\cos(x)+1=0$ is $x\approx1.74614$, while the only solution of $e^x\cos(x)+e^x\sin(x)=0$ is $x\approx2.35619$ $(=3\pi/4)$.
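
These values are easy to confirm numerically (a sketch; the brackets are chosen by hand around each claimed root):

```python
# Verify the roots quoted above.
import numpy as np
from scipy.optimize import brentq

f  = lambda x: np.exp(x) * np.sin(x) - 1            # zeros r1, r2
h  = lambda x: np.exp(x) * np.cos(x) + 1            # equation (2)
fp = lambda x: np.exp(x) * (np.sin(x) + np.cos(x))  # f', equation (1)

r1 = brentq(f, 0.1, 1.0)   # ~0.588533
r2 = brentq(f, 2.0, 3.5)   # ~3.09636
print(r1, r2)
print(brentq(h, r1, r2))   # ~1.74614, the only root of (2) in [r1, r2]
print(brentq(fp, r1, r2))  # ~2.35619 = 3*pi/4, the only root of (1)
```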