Integrating with respect to different variables
I have started reading a book on differential equations and it says something like:
$$\frac{dx}{x} = k \, dt$$
Integrating both sides gives
$$\log x = kt + c$$
How is it that I can 'integrate both sides here' when I am integrating one side with respect to $x$ yet I am integrating the other side with respect to $t$?
What is going on there is what is called an abuse of notation. What you really have is an equation in $t$: $x=x(t)$ is a function of $t$. So what we're doing is the following (I presume this is the original equation):
$$\frac{dx}{dt}=k x $$
This is the same as
$$x'(t)=k x(t) $$
$$\frac{x'(t)}{x(t)}=k $$
Now we integrate with respect to $t$:
$$\int\frac{x'(t)}{x(t)}dt=kt+C $$
But note that the substitution $X=x(t)$, with $dX = x'(t)\, dt$, gives
$$\int\frac{dX}{X}=kt+C $$
$$\log X = kt+C$$
So switching back,
$$\log x(t) = kt+C$$
and exponentiating (renaming the constant $e^C$ as $C$),
$$x(t)=C e^{kt}$$
What we actually do, in some sense, is integrate with respect to $t$ on one side and with respect to $x(t)$ on the other (that substitution is done implicitly). The notation is very useful and suggestive, so we use it, understanding that what we're really doing is the above.
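If it helps to see this confirmed symbolically, here is a minimal sketch using SymPy (the symbol names and the use of `dsolve` are my own choices, not anything from the book):

```python
import sympy as sp

t, k = sp.symbols('t k')
x = sp.Function('x')

# The original equation dx/dt = k*x, before any separation of variables.
ode = sp.Eq(x(t).diff(t), k * x(t))

# SymPy returns x(t) = C1*exp(k*t), matching the result of the
# substitution argument above.
sol = sp.dsolve(ode, x(t))
print(sol)

# Substituting the solution back into the ODE should give (True, 0).
print(sp.checkodesol(ode, sol))
```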
Good point! We have $\frac{dx}{dt}=kx$. Thus $x$ is a function of $t$. Crossing our fingers about possible division by $0$, we have $$\frac{1}{x}\frac{dx}{dt}=k.\tag{$1$}$$ Now let $F(x)$ be any antiderivative of $\frac{1}{x}$ with respect to $x$. Then by the Chain Rule, the left-hand side of $(1)$ is the derivative of $F(x)$ with respect to $t$. So we have $$\frac{d}{dt}(F(x))=k,$$ and therefore $F(x)=kt+C.$ In our case, $\log(|x|)$ is an antiderivative of $\frac{1}{x}$ with respect to $x$, and we get the general solution of the DE.
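As a quick symbolic check of the Chain Rule step (a sketch in SymPy; the names are mine, not part of the argument): differentiating $F(x(t))=\log(x(t))$ with respect to $t$ does reproduce the left-hand side of $(1)$.

```python
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')

# F(x) = log(x) is an antiderivative of 1/x; differentiate F(x(t)) in t.
lhs = sp.diff(sp.log(x(t)), t)

# The Chain Rule gives x'(t)/x(t), exactly the left-hand side of (1).
rhs = x(t).diff(t) / x(t)

print(sp.simplify(lhs - rhs))  # 0
```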
Note that the mysterious process in which we separate $\frac{dx}{dt}$, which is not a ratio, into $2$ parts, "$dx$" and "$dt$," gives us exactly the same final answer. We can think of it as a symbolic manipulation that gets us to the right answer.
Remarks: $1.$ Note that $x=0$ is a solution of the DE. If we use $\log(|x|)=kt+C$, then $x=\pm e^C e^{kt}$, and $e^C$ can never be $0$. However, it is traditional to replace the constant $\pm e^C$ by a new constant $D$, so we get $x=De^{kt}$. The case $D=0$ covers the solution $x=0$ that we had lost by dividing. A nice case of two mistakes cancelling.
$2.$ At a (much) later stage, the "differentials" that we used can be given precise meaning.
Letting $y= \log x$, we have $$ \frac{dy}{dx} = \frac 1 x. $$ If we regard $dy$ and $dx$ as corresponding infinitely small increments of $y$ and $x$, and multiply both sides by $dx$, we find that the infinitely small increment of $y$ is $$ dy = \frac{dx}{x}. $$ If $y=kt+c$, and we increment $y$ and $t$ by infinitely small amounts while letting $k$ and $c$ remain constant, we get $$ dy = k\,dt. $$ Thus $$ \frac{dx}{x} = k\,dt,\text{ and }\log x = kt+c. $$
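To see the increment picture numerically (just a sketch; the constants and step size below are arbitrary choices of mine, not from the answer): along a solution $x=e^{kt+c}$, a small increment $dt$ produces an increment $dx$, and $dx/x$ agrees with $k\,dt$ up to terms that vanish as $dt$ shrinks.

```python
import math

# Arbitrary illustrative constants.
k, c = 0.7, 0.3
t0, dt = 1.0, 1e-6

x = lambda t: math.exp(k * t + c)   # a solution of dx/dt = k*x

dx = x(t0 + dt) - x(t0)             # small increment of x
lhs = dx / x(t0)                    # "dx / x"
rhs = k * dt                        # "k dt"

print(lhs, rhs)                     # agree to first order in dt
```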
One side is integrated with respect to $x$ and the other with respect to $t$ because $dx$ appears as a factor (i.e. as something one multiplies by) on one side and $dt$ on the other.
If anyone finds deficiencies in the extent to which this has been made logically rigorous in the existing literature, I think that should be construed as a challenge to work further on the logic and the conventions so that the above will be fully logically rigorous. What appears above should not be viewed simply as an abuse of notation. It's an abuse of notation only within the context of certain conventions, which may represent the state of the science at some particular point in history. The current state of the science is never the last word.