When can we not treat differentials as fractions? And when is it perfectly OK?

Background

I am a first-year calculus student, so I would prefer that answers remain in layman's terms.

It seems to be common knowledge, a mantra I keep hearing over and over again: "do not treat differentials/derivatives as fractions".

I am of course, in particular, referring to Leibniz notation.

However, aside from a quick response such as "oh, it's because it's not a fraction but rather a type of operator", I never really got a full answer as to why we can't treat it as such. It just kind of sits at the edge of taboo in my mind, where the notation sometimes gets used that way and sometimes doesn't.

Confusion is further compounded by the fact that a lot of things seem to just work out if we treat them as fractions (e.g. $u$-substitution and related rates).


Example

Air is being pumped into a balloon at a rate of $100\ \mathrm{cm}^3/\mathrm{s}$. We want the rate of change of the radius when the radius is $25\ \mathrm{cm}$.

$$\text{we are given}\ \frac{dv}{dt}=100\ \mathrm{cm}^3/\mathrm{s}$$ $$\text{we want}\ \frac{dr}{dt}\ \text{when}\ r=25\ \mathrm{cm}$$ Thus we will solve this by using the relation $v=\frac{4}{3}\pi r^3$, which gives $\frac{dv}{dr}=4\pi r^2$: $$\frac{dv}{dt}=\frac{dv}{dr}\frac{dr}{dt}$$ $$\frac{dv}{dt}\frac{dr}{dv}=\frac{dr}{dt}$$ $$100\cdot\frac{1}{4\pi (25)^2}=\frac{100}{2500\pi}=\frac{1}{25\pi}$$ So the answer is $\frac{dr}{dt}=\frac{1}{25\pi}\ \mathrm{cm/s}$ when $r=25\ \mathrm{cm}$.

*Note how the derivatives were manipulated using algebra just as if they were common fractions.
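
As a quick numerical sanity check of that answer (not part of the standard solution, and using variable names of my own invention), one can invert $v=\frac{4}{3}\pi r^3$ to get $r$ as a function of volume and estimate $dr/dt$ with a finite difference:

```python
import math

RATE = 100.0  # given dv/dt, in cm^3/s
R = 25.0      # radius of interest, in cm

def radius(v):
    """Invert v = (4/3) * pi * r^3 to get r as a function of volume."""
    return (3.0 * v / (4.0 * math.pi)) ** (1.0 / 3.0)

v0 = (4.0 / 3.0) * math.pi * R**3  # volume when r = 25 cm
dt = 1e-6                          # a tiny time step, in seconds

# Finite-difference estimate of dr/dt: the volume grows by RATE * dt.
dr_dt_numeric = (radius(v0 + RATE * dt) - radius(v0)) / dt
dr_dt_exact = 1.0 / (25.0 * math.pi)

print(dr_dt_numeric)  # ~0.0127324
print(dr_dt_exact)    # 0.012732395...
```

The two printed values agree to several decimal places, consistent with $\frac{1}{25\pi}\ \mathrm{cm/s}$.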


Question

When exactly can I treat differentials/derivatives as fractions and when can I not?

Please keep in mind that, at the end of the day, I am a first-year college student. An answer that is easy to understand is preferred over one that is more mathematically rigorous but less friendly to a beginner such as me.


I'll just make two extended comments.

First, if you'd like to treat $dy/dx$ as a fraction, then you need to do two things:

  • (1) Have a clear, precise mathematical definition of what $dy$ and $dx$ are, and
  • (2) Have a way of dividing the quantities $dy$ and $dx$.

There are a few ways of answering (1), but the most common answer among mathematicians to the question "what are $dy$ and $dx$, really?" is somewhat technical: $dy$ and $dx$ are "differential forms," which are objects more advanced than a typical calculus course allows.

More problematic, though, is (2): differential forms are not things which can be divided. You might protest that surely every mathematical object you can think of can be added, subtracted, multiplied, and divided, but of course that's not true: you cannot (for example) divide a square by a triangle, or $\sqrt{2}$ by an integral sign $\int$.

Second, every single instance in which expressions like $dy/dx$ are treated like fractions -- like, as you say, $u$-substitution and related rates -- is just the chain rule or the linearity of derivatives (i.e., $(f+g)' = f' + g'$ and $(cf)' = cf'$) in disguise. Every single instance.

So, yes, $dy/dx$ can be treated like a fraction in the sense (and to the extent) that the Chain Rule $dy/dx = (dy/du)(du/dx)$ is a thing that is true, but that's essentially as far as the fraction analogy goes. (In fact, in multivariable calculus, pushing the fraction analogy too far can lead to real issues, but let's not get into this.)
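
If it helps to see that identity in action, here is a small symbolic sketch using sympy; the particular functions $y = \sin u$ and $u = x^2$ are just arbitrary choices of mine:

```python
import sympy as sp

x, u = sp.symbols('x u')
u_of_x = x**2        # an arbitrary choice of inner function
y_of_u = sp.sin(u)   # an arbitrary choice of outer function

dy_du = sp.diff(y_of_u, u).subs(u, u_of_x)   # dy/du, evaluated at u = x^2
du_dx = sp.diff(u_of_x, x)                   # du/dx = 2x

direct = sp.diff(y_of_u.subs(u, u_of_x), x)  # dy/dx computed directly
via_chain = dy_du * du_dx                    # the "fraction-like" product

print(sp.simplify(direct - via_chain))       # 0: the cancellation checks out
```

The difference simplifies to $0$, which is exactly the statement $dy/dx = (dy/du)(du/dx)$ for this example; the "cancellation" is the chain rule, nothing more.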

Edit: On the OP's request, here are examples of fraction-like manipulations which are not valid: $$\left( \frac{dy}{dx} \right)^2 = \frac{(dy)^2}{(dx)^2} \ \ \text{ or } \ \ 2^{dy/dx} = \sqrt[dx]{2^{dy}}.$$ Because these manipulations are nonsensical, students are often warned not to treat derivatives like fractions.


Suppose $\Delta x$ is a tiny (but finite and nonzero) real number and $\Delta f$ is the amount that a function $f$ changes when its input changes from $x$ to $x + \Delta x$. Then it's not true that $\Delta f = f'(x) \Delta x$ (with exact equality), but it is true that $\Delta f \approx f'(x) \Delta x$. You are free to manipulate $\Delta x$ and $\Delta f$ however you like, just as you would with any real numbers, so long as you remember that the equations you derive are only approximately true. You can hope that "in the limit" you will obtain exactly true equations (as long as you are careful).
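
To see concretely how good this approximation is, here is a small Python sketch; the choice $f(x)=e^x$ at $x=1$ is arbitrary on my part:

```python
import math

# f(x) = e^x, so f'(x) = e^x as well; look at x = 1 (an arbitrary choice).
x = 1.0
f = math.exp
fprime = math.exp

for dx in (0.1, 0.01, 0.001):
    delta_f = f(x + dx) - f(x)   # the actual change in f
    approx = fprime(x) * dx      # the linear approximation f'(x) * dx
    print(dx, delta_f, approx, delta_f - approx)
# The error delta_f - approx shrinks roughly like dx**2, which is why
# the approximate equations become exact "in the limit".
```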

For example, suppose that $f(x) = g(h(x))$. Then \begin{align} f(x + \Delta x) &= g(h(x+\Delta x)) \\ &\approx g(h(x) + h'(x) \Delta x) \\ &\approx g(h(x)) + g'(h(x)) h'(x) \Delta x, \end{align} which tells us that \begin{equation} \frac{f(x+\Delta x) - f(x)}{\Delta x} \approx g'(h(x)) h'(x). \end{equation} And it certainly seems plausible that if we take the limit as $\Delta x$ approaches $0$ we will obtain exact equality: \begin{equation} f'(x) = g'(h(x)) h'(x). \end{equation}
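
If you want to check that limit numerically, here is a sketch along the same lines, again with functions of my own choosing ($g = \sin$, $h(x) = x^2$):

```python
import math

g, gprime = math.sin, math.cos   # g and its derivative
h = lambda t: t**2               # h and its derivative
hprime = lambda t: 2 * t
f = lambda t: g(h(t))            # f = g composed with h

x = 1.0
target = gprime(h(x)) * hprime(x)   # the claimed limit g'(h(x)) h'(x)

for dx in (0.1, 0.01, 0.001):
    quotient = (f(x + dx) - f(x)) / dx
    print(dx, quotient, quotient - target)
# The difference quotient approaches g'(h(x)) h'(x) as dx -> 0,
# matching the chain rule derived above.
```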

These kinds of arguments, introducing tiny changes in $x$ and making linear approximations using the derivative, are the essential intuition behind calculus.

Often, arguments like this can be made into rigorous proofs just by keeping track of the errors in the approximations and bounding them somehow.