Consider an NLP min{ f(x) : g(x) ≤ 0 }. There are no equality constraints, and the problem remains feasible for small steps t > 0. I have to prove that g(x + td) ≤ 0 if g(x) < 0, where t is the step length and d is the direction of the line search (gradient descent).

Addyson Bright, 2022-09-24

My attempt: since t is positive and the direction d cannot be negative (I am not too sure about this fact), their product td is positive. The only way for g(x + td) to be zero or negative would then be for g(x) to be negative.
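For clarity, here is one way to state precisely what has to be shown, written out in LaTeX. Note that the question never says g is continuous, but some such assumption is needed (a g that jumps from −1 at x to +1 everywhere else defeats the claim), so continuity is assumed below:

```latex
% Hedged restatement of the claim, assuming g is continuous:
% a strictly feasible point stays feasible under a short move along any d.
\[
g(x) < 0 \;\Longrightarrow\; \exists\, \tau > 0 \ \text{such that}\quad
g(x + t d) \le 0 \quad \text{for all } t \in [0, \tau].
\]
```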

Answer & Explanation

Anna Juarez, 2022-09-25

If I understood correctly, you are given g(x) < 0 and you have to show that for all sufficiently small t > 0 and any d you will still have g(x + td) ≤ 0.
So if g(x) < 0 then you move along d. If g(r) < 0 for all points r of the ray R from x in the direction of d, the condition is fulfilled for every step length.

So assume there is some point r on R where g(r) = 0. (If there are many such points, pick the one closest to x; this closest point exists because g is continuous, so its zero set on R is closed.) Since r is on R, there must exist a T > 0 such that x + Td = r, and so you have g(x) < 0 and g(x + Td) = 0 for some fixed T. Because g does not vanish strictly between x and r, continuity together with the intermediate value theorem forces g(x + td) < 0 for all 0 ≤ t < T, as desired.
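To make the continuity argument concrete, here is a minimal numerical sketch in Python. The problem data (f, g, the point x, and the constant c) are hypothetical choices, not from the question; the point is only that, starting from a strictly feasible x with the gradient-descent direction d = −∇f(x), halving t finitely many times always lands back in the feasible region, exactly as the argument above predicts.

```python
import numpy as np

# Hypothetical problem data (not from the question): a smooth objective
# whose minimizer lies on the far side of the constraint boundary, so the
# descent direction actually pushes toward the boundary.
#   f(x) = ||x - c||^2,   g(x) = x[0] + x[1] - 0.6   (g is continuous)
c = np.array([2.0, 2.0])

def f(x):
    return float((x - c) @ (x - c))

def grad_f(x):
    return 2.0 * (x - c)

def g(x):
    return float(x[0] + x[1] - 0.6)

x = np.array([0.2, 0.3])   # strictly feasible starting point: g(x) = -0.1 < 0
d = -grad_f(x)             # gradient-descent direction d = -grad f(x)

# Continuity of g guarantees g(x + t*d) < 0 for all small enough t > 0,
# so this backtracking loop must terminate after finitely many halvings.
t = 1.0
while g(x + t * d) > 0.0:
    t *= 0.5

print(f"g(x)        = {g(x):.4f}   (strictly feasible)")
print(f"accepted t  = {t}")
print(f"g(x + t*d)  = {g(x + t * d):.4f}   (still feasible)")
print(f"f decreased: {f(x):.4f} -> {f(x + t * d):.4f}")
```

One small design note: the loop guard tests g(x + t*d) > 0 strictly, so a point that merely touches the boundary (g = 0) is accepted; this mirrors the weak inequality g(x + td) ≤ 0 in the claim being proved.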
