Since we know that in a good linear approximation, $L(x) = f(a) + f'(a)(x - a)$

ziphumulegn

Answered question

2022-07-09

Since we know that in a good linear approximation, $L(x) = f(a) + f'(a)(x - a)$. But what if $f'(a)$ does not exist? How can we prove that if a function has a good linear approximation, then it must be differentiable?

Answer & Explanation

Dobermann82

Beginner · 2022-07-10 · Added 15 answers

One definition of $f$ having a good linear approximation at $a$ is
$$f(x) = f(a) + c\,(x - a) + \epsilon(x - a),$$
where the error term $\epsilon(x - a)$ is a function satisfying
$$\lim_{x \to a} \frac{\epsilon(x - a)}{x - a} = 0.$$
Then the fact that $f$ is differentiable follows almost immediately: subtract $f(a)$, divide by $x - a$, and take the limit as $x \to a$, which shows that $f$ is differentiable at $a$ with derivative equal to $c$.
$$f(x) = f(a) + c\,(x - a) + \epsilon(x - a)$$
$$\implies f(x) - f(a) = c\,(x - a) + \epsilon(x - a)$$
$$\implies \frac{f(x) - f(a)}{x - a} = c + \frac{\epsilon(x - a)}{x - a},$$
and thus, taking the limit,
$$\lim_{x \to a} \frac{f(x) - f(a)}{x - a} = c + \lim_{x \to a} \frac{\epsilon(x - a)}{x - a} = c, \qquad \text{so } f'(a) = c.$$
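As a sanity check (not part of the original answer), here is a minimal numerical sketch in Python, assuming the illustrative choices $f(x) = \sin x$ and $a = 0.5$: it shows the error term $\epsilon(x - a) = f(x) - f(a) - c\,(x - a)$ shrinking faster than $x - a$, so the difference quotient approaches $c = f'(a)$.

```python
import numpy as np

# Hypothetical example: f(x) = sin(x) near a = 0.5, where f'(a) = cos(a).
f = np.sin
a = 0.5
c = np.cos(a)  # candidate slope of the linear approximation

# The error term eps(x - a) = f(x) - f(a) - c*(x - a) should vanish
# faster than (x - a), i.e. the ratio eps/(x - a) should tend to 0.
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    x = a + h
    eps = f(x) - f(a) - c * (x - a)
    print(f"h = {h:.0e}   eps/(x - a) = {eps / (x - a): .6e}")

# Equivalently, the difference quotient (f(x) - f(a))/(x - a) = c + eps/(x - a)
# approaches c, which is exactly the statement that f'(a) = c.
```

The printed ratios decrease roughly in proportion to $h$, which is what the defining limit $\lim_{x\to a}\epsilon(x-a)/(x-a)=0$ predicts for this smooth example.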
