the vector-to-scalar objective function
whether to use an exact (e.g., GoldenSectionLS) or inexact (e.g., WolfeLS) line search
The objective function f plus a weighted penalty based on the constraint function g. Override for constrained optimization and ignore for unconstrained optimization.
the coordinate values of the current point
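As a sketch of this idea, the combined objective adds a penalty to f whenever the constraint g is violated. The names (fg, weight) and the quadratic penalty form are illustrative assumptions; the actual weighting scheme is not specified here:

```python
def fg(x, f, g, weight=1000.0):
    """Objective f(x) plus a weighted penalty for violating g(x) <= 0.
    Hypothetical signature for illustration: only positive (violating)
    values of g contribute, via a quadratic penalty term."""
    penalty = max(g(x), 0.0) ** 2            # zero when the constraint holds
    return f(x) + weight * penalty

# For unconstrained optimization the penalty is simply zero and fg reduces to f.
```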
Show the flaw by printing the error message.
the method where the error occurred
the error message
Perform an exact (GoldenSectionLS) or inexact (WolfeLS) line search. Search in direction 'dir', returning the distance 'z' to move in that direction.
the current point
the direction to move in
the initial step size
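The exact variant can be sketched as a golden-section search along the ray x + z * dir over the bracket [0, step]. This is a minimal illustration under the assumption that f is unimodal along that ray, not the library's implementation:

```python
import math

def line_search(f, x, dir_, step=1.0, tol=1e-6):
    """Golden-section search for the distance z >= 0 minimizing
    f(x + z * dir_) over [0, step] (sketch; assumes a unimodal slice)."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0               # golden ratio conjugate
    fz = lambda z: f([xi + z * di for xi, di in zip(x, dir_)])
    lo, hi = 0.0, step
    c, d = hi - phi * (hi - lo), lo + phi * (hi - lo)
    while hi - lo > tol:                             # shrink bracket each pass
        if fz(c) < fz(d):
            hi, d = d, c
            c = hi - phi * (hi - lo)
        else:
            lo, c = c, d
            d = lo + phi * (hi - lo)
    return (lo + hi) / 2.0
```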
Set the partial derivative functions. When these functions are supplied, evaluating them is both faster and more accurate than estimating the partial derivatives with difference quotients (the default approach).
the array of partial derivative functions
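A minimal sketch contrasting the two approaches. The helper names are hypothetical, and the class presumably stores the supplied functions rather than taking them as arguments:

```python
def numerical_grad(f, x, h=1e-6):
    """Central difference quotient estimate of the gradient
    (the default when no partial derivative functions are given)."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

def exact_grad(partials, x):
    """Gradient from user-supplied partial derivative functions:
    one exact evaluation per coordinate instead of two of f."""
    return [df(x) for df in partials]
```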
Solve the Non-Linear Programming (NLP) problem using the Steepest Descent algorithm.
the starting point
the initial step size
the tolerance
This class solves unconstrained Non-Linear Programming (NLP) problems using the Steepest Descent algorithm. Given a function 'f' and a starting point 'x', the algorithm computes the gradient and takes steps in the opposite direction, iterating until it converges. The class assumes that partial derivative functions are not available unless explicitly given via the setDerivatives method.
dir_k = -gradient(x)
minimize f(x)
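The loop described above can be sketched as follows. A simple backtracking step stands in for the exact/inexact line search, and all names are illustrative, not the class's actual API:

```python
def steepest_descent(f, grad, x0, step=1.0, toler=1e-8, max_iter=1000):
    """Sketch of steepest descent: move opposite the gradient
    (dir_k = -gradient(x)), shrinking the step until f improves,
    and stop when the improvement falls below the tolerance."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        dir_ = [-gi for gi in grad(x)]                # steepest descent direction
        z = step
        while z > toler:                              # backtrack until f improves
            xn = [xi + z * di for xi, di in zip(x, dir_)]
            if f(xn) < fx:
                break
            z *= 0.5
        else:
            break                                     # no improving step found
        fn = f(xn)
        converged = fx - fn < toler
        x, fx = xn, fn
        if converged:
            break
    return x
```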