Limitations of approximating $\sin(x) \approx x$
As you know, $\sin x$ is only approximately equal to $x$, and $\cos x$ is only approximately $1 - \frac12 x^2$. How do you know when the approximation is good enough? One way is to keep track of how big the terms we've thrown away are. If you look at the Taylor series of $\sin$ and $\cos$, you find that $\sin x = x + O(x^3)$ and $\cos x = 1 - \frac12 x^2 + O(x^4)$, where $O(x^n)$ means something on the order of $x^n$ whose exact value we don't care about. So your limit is

$$\begin{align} \lim_{x\to 0} \frac {\cos x \sin x - x}{\sin^3 x} &= \lim_{x\to 0} \frac{\big(1 - \frac12 x^2 + O(x^4)\big)\big(x + O(x^3)\big) - x}{\big(x + O(x^3)\big)^3} \\ &= \lim_{x\to 0} \frac{\big(x - \frac12 x^3 + O(x^3)\big) - x}{x^3 + O(x^5)} \\ &= \lim_{x\to 0} \frac{-\frac12 x^3 + O(x^3)}{x^3 + O(x^5)} \\ &= \lim_{x\to 0} -\frac12 + O(1) \end{align}$$

As you can see, one of the terms we ignored produces an error that doesn't go away as we approach $0$, so our solution is no good. If you trace backward to where the error came from, you'll find that it's the $O(x^3)$ term in the approximation of $\sin$. Then you would be wise to replace it with its true value, giving $\sin x = x - \frac16 x^3 + O(x^5)$, and evaluate the limit again. This time you should get the right answer, with an extra term that goes to zero as $x \to 0$.
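For completeness, here is that redo sketched out, with the same $O$ bookkeeping as above and nothing new beyond the extra Taylor term:

$$\begin{align} \lim_{x\to 0} \frac {\cos x \sin x - x}{\sin^3 x} &= \lim_{x\to 0} \frac{\big(1 - \frac12 x^2 + O(x^4)\big)\big(x - \frac16 x^3 + O(x^5)\big) - x}{\big(x - \frac16 x^3 + O(x^5)\big)^3} \\ &= \lim_{x\to 0} \frac{\big(x - \frac12 x^3 - \frac16 x^3 + O(x^5)\big) - x}{x^3 + O(x^5)} \\ &= \lim_{x\to 0} \frac{-\frac23 x^3 + O(x^5)}{x^3 + O(x^5)} \\ &= -\frac23 \end{align}$$

Now every discarded term is $O(x^5)$ against a denominator of size $x^3$, so the leftover error is $O(x^2)$ and genuinely vanishes as $x \to 0$.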
You've lost a term involving $x^3$ by approximating $\cos x\sin x$ as $\left(1-\frac{x^2}{2}\right)x$ instead of $\left(1-\frac{x^2}{2}\right)\left(x-\frac{x^3}{6}\right)$.
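Spelled out, that product is

$$\left(1-\frac{x^2}{2}\right)\left(x-\frac{x^3}{6}\right) = x - \frac{x^3}{2} - \frac{x^3}{6} + \frac{x^5}{12} = x - \frac{2}{3} x^3 + \frac{x^5}{12},$$

versus $\left(1-\frac{x^2}{2}\right)x = x - \frac{x^3}{2}$. The dropped $-\frac{x^3}{6}$ is the same order as the surviving terms in the numerator, and it is exactly the difference between the wrong answer $-\frac12$ and the right one $-\frac23$.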
I'm not going to prove it here (the people above have already provided good solutions); I'm just explaining why you can't always apply limits directly to the pieces inside a limit:
The point is that $\sin x$ and $\cos x$ approach their approximations at different rates: the error in $\sin x \approx x$ is of order $x^3$, while the error in $\cos x \approx 1 - \frac12 x^2$ is of order $x^4$. After the near-total cancellation in the numerator, an order-$x^3$ error is exactly as large as the terms that decide the limit, so the difference of the approximations does not behave like the difference of the functions.
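To make "different rates" concrete, here is a minimal numerical sketch in Python (plain `math`, nothing fancy); the ratios settle near the next Taylor coefficients, $-\frac16$ and $\frac{1}{24}$:

```python
import math

# The error of sin x ~ x shrinks like x^3 (coefficient -1/6), while the
# error of cos x ~ 1 - x^2/2 shrinks like x^4 (coefficient +1/24).
for x in (0.5, 0.1, 0.01):
    sin_err = (math.sin(x) - x) / x**3
    cos_err = (math.cos(x) - (1 - x**2 / 2)) / x**4
    print(f"x={x:<5}  sin error/x^3 = {sin_err:+.5f}  cos error/x^4 = {cos_err:+.5f}")
```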
As a general rule of thumb, you can do:
$\lim (ab)=\lim a \cdot \lim b$,
$\lim \frac{a}{b}=\frac{\lim a}{\lim b}$,
and $\lim (a+b)=\lim a+\lim b$,
but not $\lim \frac{a+b}{c}=\frac{\lim a+\lim b}{\lim c}$
even though the last limit appears to be a combination of the second two. The flaw is that we can only apply $\lim \frac{a}{b}=\frac{\lim a}{\lim b}$ when $\lim a$ and $\lim b$ are both defined and $\lim b \neq 0$. So $\lim \frac{a+b}{c}=\frac{\lim a+\lim b}{\lim c}$ only works when $\lim c \neq 0$, and in the situations where you're tempted to use it (like this one, where $\lim_{x\to 0} \sin^3 x = 0$) that is almost never the case.
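And if you'd rather let a CAS check the bookkeeping, here is a minimal SymPy sketch (assuming SymPy is available) showing both the true value and what the naive substitution would have given:

```python
import sympy as sp

x = sp.symbols('x')

# The actual limit: -2/3.
print(sp.limit((sp.cos(x) * sp.sin(x) - x) / sp.sin(x)**3, x, 0))

# Naively replacing sin x -> x and cos x -> 1 - x^2/2 first gives -1/2 instead.
naive = ((1 - x**2 / 2) * x - x) / x**3
print(sp.limit(naive, x, 0))
```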