How does calculus without Euler's number (e) look? [closed]

I'm starting to study calculus and I've become very interested in Euler's number ($e$). I understand that $e^x$ being its own derivative makes $e$ the "natural" base to work in when studying rates of change.

However, I was wondering what would happen if we pretended not to know about the existence of $e$. Would trying to find the derivative of something like $a^x$ lead us into finding the definition of $e$ or is it possible to avoid $e$ altogether?

In this video it says that not using $e$ in calculus leads to some pretty crazy math. What does that math look like?


Solution 1:

What was meant in the video is that if you do calculus with $2^x$ for example, you get that

$$ \begin{aligned} \frac{d}{dx} 2^x = (\ln 2)2^x, \quad \int2^x dx = \frac{1}{\ln 2}2^x + C \end{aligned}$$

And of course, if you didn't know about $e$ and logarithms, $\ln 2$ would just be some crazy constant you have to figure out somehow. So the formulas for differentiation and integration are "not very nice". Compare this with $e^x$:
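To make that concrete, here is a small numeric sketch (the function name and step size are my own choices, not from the answer) that estimates the mystery constant for base $2$ directly from a difference quotient, without ever invoking logarithms:

```python
import math

def base_constant(b, h=1e-8):
    """Estimate the constant in d/dx b**x = b**x * constant
    via the difference quotient of b**x at x = 0."""
    return (b**h - 1) / h

print(base_constant(2))  # ~0.6931, which turns out to be ln 2
print(math.log(2))       # the closed form, for comparison
```

Running the same estimate with base $3$ gives $\approx 1.0986$, another "crazy constant" that only later reveals itself as $\ln 3$.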

$$ \begin{aligned} \frac{d}{dx} e^x = e^x, \quad \int e^x dx = e^x + C \end{aligned}$$

Suddenly there's nothing to remember, no crazy constants, and differentiation and integration are as simple as they could possibly be.

Finally, if no one had discovered $e$, I think that once calculus had been invented, people would find it pretty quickly. Once you know what differentiation is, it's natural to ask "is there a function that is its own derivative?", and that question naturally leads to the function $e^x$.

Solution 2:

The usual way to postpone introducing $e$ is to do everything else first, including integrals and the fundamental theorem of calculus. Then define a new function for $x>0$ by $$ f(x) = \int_1^x \; \frac{1}{t} \; dt. $$ The exponential function is the inverse function of $f$; call it $\exp$. The constant $e$ is then simply $\exp 1$.
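A rough numerical sketch of this construction (the midpoint rule, bisection bounds, and names below are my own assumptions, nothing canonical): approximate $f$ by quadrature, invert it by bisection, and the constant $\exp 1$ pops out:

```python
def f(x, n=20_000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x."""
    h = (x - 1) / n
    return sum(h / (1 + (k + 0.5) * h) for k in range(n))

def exp_of(y, lo=1e-9, hi=100.0):
    """Invert f by bisection (f is increasing), i.e. solve f(x) = y."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(exp_of(1.0))  # ~2.71828, the constant the answer calls exp 1
```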

Solution 3:

\begin{align} \frac d {dx} 2^x & = \lim_{\Delta x\to0} \frac{2^{x+\Delta x} - 2^x}{\Delta x} = \lim_{\Delta x\to 0} \left( 2^x \frac {2^{\Delta x} - 1}{\Delta x} \right) \\[10pt] & = 2^x \lim_{\Delta x\to0} \frac {2^{\Delta x} - 1}{\Delta x} \quad \text{This step is possible because $2^x$ is} \\ & \quad \text{ “constant” in the sense that it does not change as $\Delta x$ approaches $0.$} \\[10pt] & = \Big(2^x \times \text{a constant}\Big) \text{ where this time “constant” means} \\ & \qquad \text{ not changing as $x$ changes.} \end{align} Similarly $$ \frac d {dx} 3^x = \Big(3^x \times \text{a constant} \Big) $$ but it's a different constant.

Now we have the problem of ascertaining what these "constants" are.

And instead of $2$ or $3$ as the base, for which number as base would the "constant" be equal to $1\text{?}$

Answering that last question is how the number $e$ would be discovered if we didn't already know about it.
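Here is one way that discovery could play out numerically (a sketch; the names and the bracketing interval $[2,3]$ are my own assumptions): estimate the "constant" as a difference quotient and bisect for the base that makes it equal to $1$:

```python
def growth_constant(b, h=1e-8):
    """The 'constant' in d/dx b**x = b**x * constant, as a difference quotient."""
    return (b**h - 1) / h

# growth_constant is increasing in b; it is below 1 at b = 2
# and above 1 at b = 3, so bisect between them.
lo, hi = 2.0, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    if growth_constant(mid) < 1:
        lo = mid
    else:
        hi = mid

print((lo + hi) / 2)  # ~2.71828, the number we would then name e
```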

And once we've done that, the laws of exponents plus the chain rule tell us that the two "constants" mentioned above are $\log_e 2$ and $\log_e 3.$

Solution 4:

Just to be contrarian, imagine that $e$ has not been discovered but it is known that $\arcsin$ can be analytically continued to some subset of $\mathbb C$ containing the positive imaginary axis. Then using the substitution $u=t-1/t$, we can evaluate the integral $$ \int\frac{dt}t=\int\frac{du}{\sqrt{4+u^2}}=-i\arcsin(iu/2)+C=-i\arcsin(i(t-1/t)/2)+C. $$ In other words, you can (awkwardly) get by without the exponential function using trigonometric functions, which have been known longer.
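One can sanity-check this numerically (a quick sketch using Python's standard `cmath`; the function name is mine): for $t>0$ the expression $-i\arcsin(i(t-1/t)/2)$ is real and equals what we would call $\ln t$:

```python
import cmath, math

def arcsin_antiderivative(t):
    """-i * arcsin(i*(t - 1/t)/2), the antiderivative from the substitution."""
    return -1j * cmath.asin(1j * (t - 1/t) / 2)

for t in (0.5, 2.0, 10.0):
    print(t, arcsin_antiderivative(t).real, math.log(t))  # last two columns agree
```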

Solution 5:

Nothing would change really, because $e$ isn't that useful anyway.

That may seem a bit of a crazy statement, but bear with me. $x\mapsto e^x$ is of course an extremely useful function; it just doesn't actually hinge much on the number $e$ itself.

Starting from algebra, it doesn't make that much sense to compute the derivative $\frac{\partial}{\partial x} b^x$ in the first place, for any base $b\neq1$, because $b^x$ simply isn't defined for all $x\in\mathbb{R}$, and you need real arguments in order for derivatives to make sense.

You can, however, leave “iterated-multiplication exponentiation” completely aside and just define the function $\exp$ with the specific goal of fulfilling $\frac{\partial}{\partial x} (\exp x) = \exp x$. After all, that is a Lipschitz-continuous ordinary differential equation; hence if we choose $\exp 0 = 1$ as the initial condition, the Picard-Lindelöf theorem tells us that this uniquely determines the function $\exp$ for all real arguments.
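A minimal numerical sketch of that definition (forward Euler with a step count of my own choosing, nothing canonical): integrate $y' = y$ from the initial condition $y(0)=1$, and $\exp 1$ appears with no prior knowledge of $e$:

```python
def exp_via_ode(x, steps=1_000_000):
    """Forward Euler for y' = y, y(0) = 1, integrated from 0 to x."""
    y, h = 1.0, x / steps
    for _ in range(steps):
        y += h * y  # follow the slope field of y' = y
    return y

print(exp_via_ode(1.0))  # ~2.71828 (converges to exp(1) as steps grows)
```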

You can then (using Taylor expansion or integral formulas) go on to observe that $\exp$ behaves in every regard as if it were a function of the form $x\mapsto b^x$. Namely, you have $$ \exp(p+q) = \exp p \cdot \exp q $$ and can (using the initial condition) therefore, at least for rational $n$, always write $$ \exp n = (\exp 1)^n $$ (which we know as $\exp n = e^n$, but that's just a trivial shorthand definition of $e = \exp 1$).
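The addition law can be checked directly from the power series of $\exp$ (a sketch; the truncation at 40 terms and the sample values are my own choices):

```python
def exp_series(x, terms=40):
    """Truncated power series: sum of x**n / n! for n < terms."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)
    return total

p, q = 0.3, 1.1
print(exp_series(p + q), exp_series(p) * exp_series(q))  # the two agree closely
```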

By also introducing logarithms as the inverse of $\exp$, you can then write any power function in terms of $\exp$, which readily allows you to differentiate any such function. But none of that requires ever introducing the convention of what $e^x$ means. So you'd just get slightly longer formulas, because you'd occasionally need to write out $\exp 1$, but the maths as such wouldn't change at all.


This isn't in fact that much of a problem, because $b^x$ is readily defined on all rationals, which are dense in $\mathbb{R}$, and you can show that $x\mapsto b^x$ maps Cauchy sequences to Cauchy sequences. That can be used to define arbitrary-base real exponential functions as the unique continuous extension of the rational exponentials. The problem is that the limits of the resulting Cauchy sequences are not in $\mathbb{Q}$, and computing them isn't very practical without using $\exp$ as a proxy.