Distributive property on fractions

I'm in seventh grade and my teacher wasn't able to explain this to me.

Why is $\frac{1}{a+b}$ not equal to $\frac{1}{a} + \frac{1}{b}$?

I'm sorry if this is obvious.

EDIT: Thank you to everyone who responded. I think I understand fractions a lot more now. It was good to get both intuitive and algebraic answers... that really nailed the point home for me.


Check for yourself by trying some numbers! For example, if $a = b = 1$, then $1/(a+b) = 1/2$, while $1/a + 1/b = 2$. Since $1/2 \neq 2$, we have that $1/(a+b) \neq 1/a + 1/b$ in this case.
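
Here is one more check, with values I picked arbitrarily: if $a = 2$ and $b = 3$, then $$\frac{1}{a+b} = \frac{1}{5}, \qquad \frac{1}{a} + \frac{1}{b} = \frac{1}{2} + \frac{1}{3} = \frac{5}{6},$$ and $\frac{1}{5} \neq \frac{5}{6}$.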

So clearly $a/(b+c) \neq a/b + a/c$ in general (your question is the case where the numerator is just $1$). Why, on the other hand, does $(a+b)/c = a/c + b/c$? The answer is that this really is just the usual distributive property! I can do the following algebraic steps: $$\frac{a+b}{c} = (a+b) \cdot \frac{1}{c} = a \cdot \frac{1}{c} + b \cdot \frac{1}{c} = \frac{a}{c} + \frac{b}{c}$$ So really, all we've done here is distribute the factor of $1/c$ over the sum $(a+b)$.
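
For a concrete instance of that distribution (numbers chosen arbitrarily), take $a = 2$, $b = 3$, $c = 5$: both sides come out to $1$, $$\frac{2+3}{5} = \frac{5}{5} = 1 \qquad \text{and} \qquad \frac{2}{5} + \frac{3}{5} = \frac{5}{5} = 1.$$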

The sum $(b+c)$ in $a/(b+c)$ isn't multiplying anything in this expression; in fact, it's what we're dividing by! So distributing the division over this sum would be a brand-new "distributive property," and as we observed above, that property simply does not hold.
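
To see that failure with concrete numbers (again, my own arbitrary picks), try $a = 12$, $b = 2$, $c = 4$: $$\frac{12}{2+4} = \frac{12}{6} = 2, \qquad \text{but} \qquad \frac{12}{2} + \frac{12}{4} = 6 + 3 = 9.$$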


Since you know algebra, here is a proof that may satisfy you. Consider: $$\frac{1}{a+b} \stackrel{?}{=} \frac{1}{a} + \frac{1}{b}$$

If they are equal, then multiplying them by the same thing should keep them equal. Similarly, if they are unequal, then multiplying them by the same thing (except 0) should keep them unequal.

Now, multiply both sides by $ab$ (which is not zero, since $1/a$ and $1/b$ only make sense when $a$ and $b$ are both nonzero) to get $$\frac{ab}{a+b} \stackrel{?}{=} \frac{ab}{a} + \frac{ab}{b}$$ $$\frac{ab}{a+b} \stackrel{?}{=} b + a$$
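
In case those last simplifications aren't obvious, the common factors cancel like this: $$\frac{ab}{a} = \frac{a}{a} \cdot b = 1 \cdot b = b, \qquad \frac{ab}{b} = a \cdot \frac{b}{b} = a \cdot 1 = a$$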

Now, multiply both sides by $a + b$ (also nonzero, since $1/(a+b)$ has to make sense too) to get $$\frac{ab(a+b)}{a+b} \stackrel{?}{=} (b + a)(a+b)$$ $$ab \stackrel{?}{=} (a + b)^2$$

You already know how to expand the right-hand side: it is $a^2 + 2ab + b^2$, so $$ab \stackrel{?}{=} a^2 + 2ab + b^2$$
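
If you want to see that expansion spelled out, it is just the distributive property applied twice: $$(a+b)^2 = (a+b)(a+b) = a^2 + ab + ba + b^2 = a^2 + 2ab + b^2$$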

Now subtract $ab$ from both sides: $$0 \stackrel{?}{=} a^2 + ab + b^2$$

Here's the key:

Your claim was that this equality holds for all $a$ and $b$. (That is, $a^2 + ab + b^2 = 0$ for all $a$ and $b$.)
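
For instance (an arbitrary choice, just to test the claim), take $a = b = 1$: $$a^2 + ab + b^2 = 1 + 1 + 1 = 3 \neq 0$$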

Since this is obviously false (you had complete freedom to choose $a$ and $b$), your claim cannot be true. Does this make sense?


Why would it be equal?

Think of $1/3$ or $1/4$ as a single third or a single fourth. (I would use $1/c$ here, but then there isn't a convenient word to go with it.) We have $a/3 + b/3 = (a+b)/3$, since $a$ thirds plus $b$ thirds is $a+b$ thirds. However, we don't have $1/3 + 1/4 = 1/7$: a seventh is smaller than either a third or a fourth, and adding two amounts can never give you less than you started with.
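
In fact, if you add a third and a fourth the usual way, over a common denominator, you get $$\frac{1}{3} + \frac{1}{4} = \frac{4}{12} + \frac{3}{12} = \frac{7}{12},$$ which is bigger than either a third or a fourth, and nowhere near a seventh.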