Santino Bautista

2022-06-25

Is there a rule that characterizes when Euler's method over-estimates or under-estimates?
For example, "if $f\left(x\right)$ is increasing, then Euler's method underestimates," or something similar?

Odin Jacobson

Expert

Generally speaking, Euler's method will overestimate when the second derivative of $f$ is negative (the solution is concave down), and underestimate when it is positive. This comes from the Taylor series of the function about ${x}_{0}$:
$f\left(x\right)=f\left({x}_{0}\right)+\left(x-{x}_{0}\right){f}^{\prime }\left({x}_{0}\right)+\frac{1}{2}\left(x-{x}_{0}{\right)}^{2}{f}^{″}\left({x}_{0}\right)+\dots$
Euler's method accounts for the ${f}^{\prime }$ term but nothing beyond it, so the sign of the first neglected term, the one involving ${f}^{″}$, determines the direction of the error. The rule is not absolute, though: for a given step size, the third derivative could be large enough to swamp the effect of the second. Higher-order methods account for more terms of the Taylor series.
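A quick sketch illustrating the rule (the example equations $y'=2x$ and $y'=\cos x$ are my own choices, not from the question): a concave-up solution (${f}^{″}>0$) gets underestimated, and a concave-down one (${f}^{″}<0$) gets overestimated.

```python
import math

def euler(f, x0, y0, h, n):
    """Euler's method for y' = f(x): step with the slope at the left endpoint."""
    x, y = x0, y0
    for _ in range(n):
        y += h * f(x)
        x += h
    return y

# Concave up: y' = 2x, y(0) = 0, exact y = x^2 (y'' = 2 > 0).
# Euler underestimates y(1) = 1.
under = euler(lambda x: 2 * x, 0.0, 0.0, 0.1, 10)

# Concave down: y' = cos(x), y(0) = 0, exact y = sin(x) (y'' = -sin x < 0 on (0, pi/2)).
# Euler overestimates y(pi/2) = 1.
over = euler(math.cos, 0.0, 0.0, math.pi / 20, 10)

print(under, over)
```

Geometrically, each Euler step follows the tangent line at the left endpoint; when the curve bends up the tangent falls below it, and when the curve bends down the tangent rises above it.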
