In particular, if Euler's method is implemented on a computer, what's the minimum step size that can be used before rounding errors cause the Euler approximations to become completely unreliable? I presume it's when the step size reaches the machine epsilon? E.g. if machine epsilon is about 10⁻¹⁶, then once the step size is roughly 10⁻¹⁶, the Euler approximations become unreliable.

kominis3q

Answered question

2022-07-23

Answer & Explanation

escobamesmo

Beginner · 2022-07-24 · Added 18 answers

It depends on the problem. Basically, you have two sources of error in each step: the rounding error, which is some multiple of the machine epsilon μ ≈ 2·10⁻¹⁶ (scaled by the magnitude of the functions involved), and the local truncation error, which depends on derivatives of the ODE function (thus also involving the function scale) and on the square h² of the step size. You want the truncation error to dominate the rounding error, which in the simplest case demands h² > μ, i.e. h > √μ ≈ 10⁻⁸. So the practical floor is around 10⁻⁸, well above the machine epsilon itself. This changes if the ODE function is in some sense "strongly curved".
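
Here is a minimal sketch of that trade-off. The test problem y' = y, y(0) = 1 on [0, 1] and the helper euler_error are illustrative assumptions, not part of the question; single precision is used deliberately, because its larger machine epsilon (μ ≈ 1.2·10⁻⁷, so optimal h near √μ ≈ 3·10⁻⁴) makes the rounding floor appear at step sizes that are cheap to reach:

```python
import numpy as np

def euler_error(h, dtype=np.float32):
    """Integrate y' = y, y(0) = 1 on [0, 1] with Euler's method at the
    given floating-point precision; return the error against exp(1)."""
    n = int(round(1.0 / h))
    step = dtype(1.0) / dtype(n)   # step size in the working precision
    y = dtype(1.0)
    for _ in range(n):
        y = y + step * y           # Euler step: y_{k+1} = y_k + h * f(y_k)
    return abs(float(y) - np.e)

# The error first shrinks like O(h), bottoms out near h ~ sqrt(mu),
# then grows again once accumulated rounding errors dominate.
for k in range(1, 7):
    print(f"h = 1e-{k}   error = {euler_error(10.0 ** (-k)):.3e}")
```

In double precision the same bottoming-out occurs near h ≈ 10⁻⁸, the √μ estimate above, but demonstrating it directly takes on the order of 10⁸ steps, which is why the sketch switches to single precision.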
