Can anyone explain why reducing the step size h used in Euler's Method improves the approximation of a function at a point?
Let $y'=t^{3}y^{2}$ where $y(0)=1$. Approximate $y(1)$ using Euler's method with $h=0.25$. I learnt online that reducing the step size $h$ reduces the error of the approximation. Can anyone please explain why?
Solution 1:
More precisely, we have bounds for the error which are increasing functions of $h > 0$: $E(h) \le C h$ for some constant $C$. Intuitively, each Euler step introduces a local error of order $h^2$, and reaching a fixed endpoint requires on the order of $1/h$ steps, so the accumulated (global) error is of order $h$. In most cases the actual error is likely to be an increasing function of $h$ as well, at least for small $h$, but examples can be concocted for which e.g. the error happens to be $0$ for some arbitrarily small values of $h$ and nonzero for others.
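You can see this behaviour numerically on the very IVP in the question. Separating variables gives the exact solution $y(t) = 1/(1 - t^4/4)$, so $y(1) = 4/3$, which we can compare against. Here is a minimal sketch (the function name `euler` is my own choice, not from the question):

```python
def euler(f, t0, y0, t_end, h):
    """Approximate y(t_end) for y' = f(t, y), y(t0) = y0, with step size h."""
    t, y = t0, y0
    n = round((t_end - t0) / h)  # number of steps to reach t_end
    for _ in range(n):
        y += h * f(t, y)  # Euler update: follow the tangent line for one step
        t += h
    return y

f = lambda t, y: t**3 * y**2  # the ODE y' = t^3 y^2
exact = 4.0 / 3.0             # y(t) = 1/(1 - t^4/4) by separation of variables, so y(1) = 4/3

# Halving h roughly halves the error, consistent with E(h) <= C h.
for h in (0.25, 0.125, 0.0625):
    approx = euler(f, 0.0, 1.0, 1.0, h)
    print(f"h = {h:7.4f}   y(1) ~ {approx:.6f}   error = {abs(approx - exact):.6f}")
```

With $h = 0.25$ this gives $y(1) \approx 1.1485$, and the error column shrinks roughly in proportion to $h$, which is exactly the first-order convergence the bound describes.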