I have a multivariable function for which I have defined the cost function and its gradient with respect to the variable vector

Gaaljh

Answered question

2022-06-25

I have a multivariable function for which I have defined the cost function and its gradient with respect to the variable vector. Let's call the cost function $f(x)$ and the variable vector $x$. I have been using the nonlinear conjugate gradient descent method for minimization. The algorithm is as follows:
$$
\begin{aligned}
& k = 0,\quad x_0 = 0,\quad g_0 = \nabla_x f(x_0),\quad \Delta x_0 = -g_0 \\
& x_{k+1} \leftarrow x_k + t\,\Delta x_k \\
& g_{k+1} \leftarrow \nabla f(x_{k+1}) \\
& \Delta x_{k+1} \leftarrow -g_{k+1} + \gamma\,\Delta x_k
\end{aligned}
$$
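For concreteness, the loop above can be sketched in a few lines of Python. This is only an illustrative sketch: the Fletcher–Reeves formula for $\gamma$ and a backtracking Armijo rule for the step size $t$ are assumptions, since neither choice is fixed here.

```python
import numpy as np

def nonlinear_cg_descent(f, grad, x0, iters=200, tol=1e-8):
    """Minimize f with nonlinear conjugate gradient.
    gamma: Fletcher-Reeves; t: backtracking Armijo (both are assumed choices)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # Delta x_0 = -g_0
    for _ in range(iters):
        t = 1.0                              # backtracking line search for t
        while t > 1e-12 and f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5
        x = x + t * d                        # x_{k+1} = x_k + t * Delta x_k
        g_new = grad(x)                      # g_{k+1} = grad f(x_{k+1})
        if np.linalg.norm(g_new) < tol:
            break
        gamma = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + gamma * d               # Delta x_{k+1} = -g_{k+1} + gamma * Delta x_k
        g = g_new
    return x

# Toy check: minimize ||x - 1||^2, whose minimizer is the all-ones vector.
print(nonlinear_cg_descent(lambda x: np.sum((x - 1) ** 2),
                           lambda x: 2 * (x - 1),
                           np.zeros(3)))
```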
I know that I need to update the formulas so the solution "ascends" instead of descends. How can I update the algorithm? Also, I wonder how the Wolfe conditions change for maximization problems. Thank you and have a nice day.

Answer & Explanation

Myla Pierce

Beginner · 2022-06-26 · Added 20 answers

Convert your function from a maximization problem to a minimization problem by setting $f_{\text{new}}(x) = -f(x)$ and run the default conjugate gradient method for minimization.

Since you specifically asked how the formulas and Wolfe conditions update, you can plug in $-f(x)$ for the function values and $-\nabla f(x)$ for the gradients and see for yourself.
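To make the negation trick concrete, here is a short sketch using SciPy's built-in CG minimizer; the concave quadratic objective and the target vector `a` are made up purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize

a = np.array([2.0, -1.0, 0.5])       # made-up maximizer, for illustration only

def f(x):                            # concave function we want to MAXIMIZE
    return -np.sum((x - a) ** 2)

def grad_f(x):
    return -2.0 * (x - a)

# Maximize f by minimizing f_new(x) = -f(x), with gradient -grad f(x);
# the CG routine handles the line search internally.
res = minimize(lambda x: -f(x), x0=np.zeros(3), jac=lambda x: -grad_f(x),
               method="CG")
print(res.x)                         # approximately equal to a
```

Carrying out the substitution by hand: applying the standard Wolfe conditions to $-f$ and $-\nabla f$ and multiplying through by $-1$ turns sufficient decrease into sufficient increase, so for an ascent direction $\Delta x_k$ you require $f(x_k + t\,\Delta x_k) \ge f(x_k) + c_1 t\,\nabla f(x_k)^{\top}\Delta x_k$ and $\nabla f(x_k + t\,\Delta x_k)^{\top}\Delta x_k \le c_2\,\nabla f(x_k)^{\top}\Delta x_k$.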
