Fit exponential with constant

I have data which should fit an exponential function with a constant:

$$y=a\cdot\exp(b\cdot t) + c.$$

Now I can fit an exponential without a constant using least squares by taking the log of $y$, which makes the whole equation linear. Is it possible to use least squares to solve it with a constant too (I can't seem to convert the above to linear form; maybe I am missing something here), or do I have to use a nonlinear fitting function like nlm in R?


Solution 1:

A direct method of fitting (no guessed initial values required, no iterative process) exists. For the theory, see the paper (pp. 16-17): https://fr.scribd.com/doc/14674814/Regressions-et-equations-integrales

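A minimal numerical sketch of the integral-equation idea from the linked paper, as I understand it (variable names are my own, and the data are synthetic with $a=2$, $b=0.5$, $c=1$): integrating $y=ae^{bt}+c$ gives $y(t)-y(t_1)=bS(t)-bc(t-t_1)$, where $S$ is the running integral of $y$, so one linear regression yields $b$ with no initial guess, and a second one then yields $a$ and $c$.

```python
import numpy as np

# Synthetic, noise-free data: y = 2*exp(0.5*t) + 1 (assumed for illustration)
t = np.linspace(0.0, 4.0, 81)
y = 2.0 * np.exp(0.5 * t) + 1.0

# Running integral S of y via the trapezoidal rule, with S[0] = 0
S = np.concatenate([[0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))])

# Regress y - y1 on (t - t1, S): the coefficients are (-b*c, b)
A = np.column_stack([t - t[0], S])
(B1, B2), *_ = np.linalg.lstsq(A, y - y[0], rcond=None)
b = B2  # estimate of b (error comes only from the trapezoidal approximation)

# With b fixed, y = a*z + c with z = exp(b*t) is linear: recover a and c
z = np.exp(b * t)
(a, c), *_ = np.linalg.lstsq(np.column_stack([z, np.ones_like(z)]), y,
                             rcond=None)
```

On this synthetic data the recovered parameters land very close to $(2,\,0.5,\,1)$; with noisy data they serve as starting values for a nonlinear refinement.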

Solution 2:

The problem is intrinsically nonlinear since you want to minimize $$SSQ(a,b,c)=\sum_{i=1}^N\big(ae^{bt_i}+c-y_i\big)^2$$ and the nonlinear regression will require good (or at least reasonable and consistent) estimates for the three parameters.

But suppose that you assign a value to $b$; then, defining $z_i=e^{bt_i}$, the problem becomes linear $(y=az+c)$ and a linear regression gives the values of $a,c$ for the given $b$, as well as the sum of squares. Try a few values of $b$ until you see a minimum of $SSQ(b)$. For this approximate value of $b$, you already have from the linear regression the corresponding $a$ and $c$, and you are ready to go with the nonlinear regression.
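A sketch of this $b$-scan (not from the answer itself; synthetic data with $a=2$, $b=0.5$, $c=1$ plus small noise are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 40)
y = 2.0 * np.exp(0.5 * t) + 1.0 + rng.normal(0.0, 0.01, t.size)

def ssq_for_b(b):
    # For a fixed b, y = a*z + c with z = exp(b*t) is an ordinary
    # linear least-squares problem in (a, c).
    z = np.exp(b * t)
    A = np.column_stack([z, np.ones_like(z)])
    (a, c), *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sum((a * z + c - y) ** 2), a, c

# Scan a grid of b values and keep the one with the smallest SSQ(b)
bs = np.linspace(0.1, 1.0, 91)
best_b = min(bs, key=lambda b: ssq_for_b(b)[0])
ssq, a, c = ssq_for_b(best_b)
```

The grid minimum lands near the true $b$, and the accompanying $(a, c)$ make good starting values for the nonlinear regression.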

Another approach: assume a value of $c$ and rewrite the model as $$y-c=a e^{bt}$$ $$\log(y-c)=\alpha + bt$$ which means that, defining $w_i=\log(y_i-c)$, the model is just $w=\alpha + bt$ and a linear regression gives $\alpha$ and $b$. From these, recompute $y_i^*=c+e^{ \alpha + bt_i}$ and the corresponding sum of squares $SSQ(c)$. Again, trying different values of $c$ will show a minimum, and for the best value of $c$ you know the corresponding $b$ and $a=e^{\alpha}$ and are ready to go with the nonlinear regression.
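The $c$-scan can be sketched the same way (again my own illustration on synthetic data with $a=2$, $b=0.5$, $c=1$; note the trial $c$ must stay below $\min_i y_i$ so the logarithm exists):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 40)
y = 2.0 * np.exp(0.5 * t) + 1.0 + rng.normal(0.0, 0.01, t.size)

def ssq_for_c(c):
    # For a fixed c, log(y - c) = alpha + b*t is linear in t.
    w = np.log(y - c)              # requires y_i > c for every point
    b, alpha = np.polyfit(t, w, 1)  # polyfit returns [slope, intercept]
    y_star = c + np.exp(alpha + b * t)
    # SSQ is evaluated in the original y-space, as in the answer
    return np.sum((y_star - y) ** 2), np.exp(alpha), b

cs = np.linspace(0.0, 1.5, 151)    # grid of trial c values, all < min(y)
best_c = min(cs, key=lambda c: ssq_for_c(c)[0])
ssq, a, b = ssq_for_c(best_c)
```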

There is, to be sure, a trial-and-error phase, but it is very fast.

Edit

You can even get an immediate estimate of parameter $b$ if you take three equally spaced points $t_1$, $t_2$, $t_3$ such that $t_2=\frac{t_1+t_3}2$, and the corresponding $y_i$'s. After simplification, $$\frac{ y_3-y_2}{ y_3-y_1}=\frac{1}{1+e^{\frac{1}{2} b (t_1-t_3)}}$$ from which you get the estimate $$b=\frac{2}{t_1-t_3} \log \left(\frac{y_1-y_2}{y_2-y_3}\right)$$ and then $$a=\frac{y_1-y_3}{e^{b t_1}-e^{b t_3}}$$ $$c=y_1-ae^{bt_1}$$ Using the data given on page 18 of JJacquelin's paper, let us take (approximate values for the $x$'s) the three points $(-1,0.418)$, $(0,0.911)$, $(1,3.544)$. This immediately gives $b\approx 1.675$, $a\approx 0.606$, $c\approx 0.304$, which are extremely close to JJacquelin's end results for this specific example.
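The three closed-form expressions above, applied directly to those approximate points:

```python
import math

# The three (approximate) equally spaced points from the example
t1, y1 = -1.0, 0.418
t2, y2 = 0.0, 0.911
t3, y3 = 1.0, 3.544

# Closed-form estimates of b, then a, then c
b = 2.0 / (t1 - t3) * math.log((y1 - y2) / (y2 - y3))
a = (y1 - y3) / (math.exp(b * t1) - math.exp(b * t3))
c = y1 - a * math.exp(b * t1)
print(b, a, c)  # roughly 1.675, 0.606, 0.304
```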

Having these estimates, just run the nonlinear regression.
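For instance, assuming SciPy is available, `scipy.optimize.curve_fit` is one such nonlinear routine; the `p0` starting values below stand in for the rough estimates the procedures above produce (data again synthetic, $a=2$, $b=0.5$, $c=1$):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 40)
y = 2.0 * np.exp(0.5 * t) + 1.0 + rng.normal(0.0, 0.02, t.size)

def model(t, a, b, c):
    return a * np.exp(b * t) + c

# p0: rough estimates such as those from the scans above (here hand-picked)
popt, pcov = curve_fit(model, t, y, p0=[1.8, 0.45, 0.9])
a, b, c = popt
```

In R, the same refinement would be done with `nls` or `nlm`, as the question suggests.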

This was a trick proposed by Yves Daoust here.