Proof of rule for multiplying numbers with uncertainties
Suppose you have a value $A$ with an absolute uncertainty $\pm a$ and another value $B$ with an absolute uncertainty $\pm b$. It is easy to prove the rule for dealing with the uncertainties when adding these values: $$(A\pm a)+(B\pm b)=(A+B)\pm (a+b)$$ However, the rule for dealing with uncertainties when multiplying the values is: $$(A\pm a)\times(B\pm b)=(A\times B)\pm \left[\left(\frac{a}{A}\cdot100\right)+\left(\frac{b}{B}\cdot100\right)\right]\%$$ that is, the percentage uncertainties add. My question is: how would one go about proving this fact, or is it just a useful convention with no logical proof? If it is the latter, why is it useful to deal with uncertainties this way? Thanks in advance :)
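For concreteness, here is a minimal numerical sketch of the addition rule, using made-up values $A = 10 \pm 0.2$ and $B = 5 \pm 0.1$, which brackets the sum by its worst cases:

```python
# Made-up values: A = 10 ± 0.2, B = 5 ± 0.1.
A, a = 10.0, 0.2
B, b = 5.0, 0.1

# Worst-case bounds of the sum.
low  = (A - a) + (B - b)
high = (A + a) + (B + b)
print(low, high)                              # 14.7 15.3 (up to float rounding)

# The rule (A + B) ± (a + b) gives the same interval.
print((A + B) - (a + b), (A + B) + (a + b))   # 14.7 15.3
```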
The rule is $(A \pm a) \times (B \pm b)=AB \pm Ab \pm Ba \pm ab$, which you can verify by the distributive law. We often write this as $(A \pm a) \times (B \pm b)=AB(1 \pm \frac bB \pm \frac aA \pm \frac {ab}{AB})$, which contains the idea you are remembering: the fractional errors add. Then, if the fractional errors are small, $\frac {ab}{AB}$ is second-order small (the product of two small quantities) and can be ignored, giving the more familiar $$(A \pm a) \times (B \pm b)=AB\left(1 \pm \frac aA \pm \frac bB\right)$$
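As a quick check of why dropping $\frac{ab}{AB}$ is harmless, here is a sketch with made-up values $A = 10 \pm 0.2$ and $B = 5 \pm 0.1$ (2% fractional error each):

```python
# Made-up values: A = 10 ± 0.2 (2%), B = 5 ± 0.1 (2%).
A, a = 10.0, 0.2
B, b = 5.0, 0.1

frac_A = a / A              # 0.02
frac_B = b / B              # 0.02
cross  = (a * b) / (A * B)  # 0.0004, the second-order term that gets dropped

print(frac_A + frac_B)                 # 0.04   (4% fractional error kept)
print(cross)                           # 0.0004 (0.04%, negligible by comparison)
print((A + a) * (B + b))               # 52.02  exact worst case
print(A * B * (1 + frac_A + frac_B))   # 52.0   first-order approximation
```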
Suppose $A, B$ are the variables and $a, b$ are their absolute uncertainties respectively. We could then calculate the absolute uncertainty of $AB$ as follows:
$$\max(AB)= (A+a)(B+b)=AB + Ab + Ba + ab$$ $$\min(AB)= (A-a)(B-b)=AB - Ab - Ba + ab$$
So the absolute uncertainty of $AB$ is $$\frac{\max(AB) - \min(AB)}{2} = \frac{2Ab + 2Ba}{2} = Ab + Ba.$$
The fractional uncertainty of $AB$ is the absolute uncertainty divided by $AB$, i.e. $\frac{Ab + Ba}{AB} = \frac{b}{B} + \frac{a}{A}$, so the % uncertainty of $AB$ = % uncertainty of $A$ + % uncertainty of $B$.
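The same max/min calculation can be reproduced numerically; a short sketch with made-up values $A = 10 \pm 0.2$ and $B = 5 \pm 0.1$:

```python
# Made-up values: A = 10 ± 0.2, B = 5 ± 0.1.
A, a = 10.0, 0.2
B, b = 5.0, 0.1

max_AB = (A + a) * (B + b)   # AB + Ab + Ba + ab = 52.02
min_AB = (A - a) * (B - b)   # AB - Ab - Ba + ab = 48.02

abs_unc = (max_AB - min_AB) / 2
print(abs_unc)               # 2.0, matches A*b + B*a
print(A * b + B * a)         # 2.0

# Percentage uncertainty of AB equals the sum of the individual ones.
print(abs_unc / (A * B) * 100)   # 4.0 (%)
print((a / A + b / B) * 100)     # 4.0 (%)
```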