Prove an inequality involving a root of a quadratic equation

sempteim245

Answered question

2022-03-30

Prove an inequality involving a root of a quadratic equation
If x = ρ is a solution of x² + bx + c = 0,
prove that |ρ| − 1 < |b| + |c|.
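A quick numerical illustration of the claim for one sample quadratic (the coefficients b = 3, c = 2 are an arbitrary choice for illustration, not part of the question):

```python
# Numerical illustration (not a proof): for x^2 + bx + c = 0 with the
# arbitrary sample b = 3, c = 2, both roots rho satisfy |rho| - 1 < |b| + |c|.
import numpy as np

b, c = 3.0, 2.0
roots = np.roots([1.0, b, c])   # roots of x^2 + 3x + 2, i.e. -1 and -2
for rho in roots:
    assert abs(rho) - 1 < abs(b) + abs(c)
print("max |rho| - 1 =", max(abs(r) - 1 for r in roots),
      "< |b| + |c| =", abs(b) + abs(c))
```

Here the larger root modulus is 2, so |ρ| − 1 = 1, comfortably below |b| + |c| = 5.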

Answer & Explanation

Tristatex9tw

Beginner · 2022-03-31 · Added 18 answers

Step 1
Let ρ₁ and ρ₂ be the solutions of x² + bx + c = 0.
Assume, without loss of generality, that |ρ₁| ≥ |ρ₂|.
We consider two cases:
a) If |ρ₁| < 1, then
|ρ₂| − 1 ≤ |ρ₁| − 1 < 0 ≤ |b| + |c|.
b) If |ρ₁| ≥ 1, it follows from Vieta's formulas (ρ₁ + ρ₂ = −b, ρ₁ρ₂ = c) and the reverse triangle inequality that
|b| + |c| = |ρ₁ + ρ₂| + |ρ₁ρ₂| ≥ ||ρ₁| − |ρ₂|| + |ρ₁||ρ₂|
= |ρ₁| − |ρ₂| + |ρ₁||ρ₂|
= (|ρ₁| − 1)(|ρ₂| + 1) + 1 > (|ρ₁| − 1)(|ρ₂| + 1) ≥ |ρ₁| − 1 ≥ |ρ₂| − 1,
where the last two inequalities use |ρ₁| − 1 ≥ 0, |ρ₂| + 1 ≥ 1, and |ρ₁| ≥ |ρ₂|.
Therefore, in both cases every root ρ satisfies |ρ| − 1 < |b| + |c|.
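The case analysis above can be sanity-checked numerically. This is a sketch, not a proof: it draws random real coefficient pairs (b, c) and verifies the bound for every root, real or complex:

```python
# Randomized sanity check (a sketch, not a proof): for many random real
# coefficient pairs (b, c), every root of x^2 + bx + c -- real or
# complex -- satisfies |rho| - 1 < |b| + |c|.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(10_000):
    b, c = rng.uniform(-10.0, 10.0, size=2)
    for rho in np.roots([1.0, b, c]):
        # abs(rho) is the complex modulus when the discriminant is negative
        assert abs(rho) - 1 < abs(b) + abs(c)
print("bound verified on 10_000 random quadratics")
```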
microsgopx6z7

Beginner · 2022-04-01 · Added 14 answers

Step 1
Consider the companion matrix
A = [ 0   1 ]
    [ −c  −b ]
Then x² + bx + c is the characteristic polynomial of A, so every root ρ is an eigenvalue of A.
By Gershgorin's theorem, ρ lies in one of the two discs: |ρ| ≤ 1 or |ρ + b| ≤ |c|.
By the reverse triangle inequality, the second case gives |ρ| ≤ |b| + |c|, so in either case
|ρ| ≤ max(1, |b| + |c|) ≤ 1 + |b| + |c|.
If b = c = 0 the only root is ρ = 0; otherwise max(1, |b| + |c|) < 1 + |b| + |c|, so in every case |ρ| − 1 < |b| + |c|.
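The companion-matrix argument can be sketched numerically (the sample coefficients b = 3, c = 2 are an arbitrary choice): the roots of the quadratic coincide with the eigenvalues of A, and each eigenvalue lies in one of the two Gershgorin discs.

```python
# Sketch of the companion-matrix view: the roots of x^2 + bx + c are the
# eigenvalues of A = [[0, 1], [-c, -b]], and Gershgorin's theorem places
# each eigenvalue in the disc |z| <= 1 (row 1) or |z + b| <= |c| (row 2).
# The sample coefficients below are an arbitrary choice for illustration.
import numpy as np

b, c = 3.0, 2.0
A = np.array([[0.0, 1.0],
              [-c, -b]])
eigs = np.linalg.eigvals(A)

# the eigenvalues of A coincide with the roots of x^2 + bx + c
assert np.allclose(sorted(eigs.real), sorted(np.roots([1.0, b, c]).real))

# every eigenvalue lies in the union of the two Gershgorin discs
for rho in eigs:
    assert abs(rho) <= 1 + 1e-9 or abs(rho + b) <= abs(c) + 1e-9
print("eigenvalues:", sorted(eigs.real))
```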
