where f is the objective function to be minimized and g is the constraint function to be satisfied, if any.
Classes mixing in this trait must implement (1) a function fg that rolls the constraints into the objective function as penalties for constraint violation, (2) a one-dimensional Line Search (LS) algorithm lineSearch, and (3) an iterative method solve that searches for improved solutions, i.e., x-vectors with lower objective function values f(x), as sketched below.
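As a rough illustration, the contract might look like the following minimal Scala sketch; the vector representation (Array [Double]) and the parameter defaults are assumptions for this example, not the library's actual signatures:

```scala
/** Minimal sketch of the trait's contract, assuming vectors are plain
 *  Array [Double]; actual library types and defaults may differ.
 */
trait MinimizerSketch:
  /** Objective f plus a weighted penalty from constraint g (just f when unconstrained). */
  def fg (x: Array [Double]): Double

  /** 1-D line search from x along direction dir; return the distance z to move. */
  def lineSearch (x: Array [Double], dir: Array [Double]): Double

  /** Iteratively improve on starting point x0; return (f(x), x) at the minimal point found. */
  def solve (x0: Array [Double], toler: Double = 1e-6): (Double, Array [Double])
```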
Perform an exact (e.g., GoldenSectionLS) or inexact (e.g., WolfeLS) line search. Search in direction dir, returning the distance z to move in that direction.
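For instance, an exact golden-section search over the one-dimensional function phi(z) = fg(x + dir * z) might look like this sketch; the bracket [0, zmax] and the parameter names are assumptions for the example:

```scala
/** Sketch of an exact golden-section line search: minimize phi (z) over
 *  z in [0, zmax], where phi (z) = fg (x + dir * z) for some point x and
 *  direction dir. zmax and toler are assumed example parameters.
 */
def goldenSectionLS (phi: Double => Double, zmax: Double = 4.0, toler: Double = 1e-6): Double =
  val gr = (math.sqrt (5.0) - 1.0) / 2.0            // inverse golden ratio, about 0.618
  var a = 0.0                                       // lower end of the bracket
  var b = zmax                                      // upper end of the bracket
  var c = b - gr * (b - a)                          // interior probe nearer a
  var d = a + gr * (b - a)                          // interior probe nearer b
  while b - a > toler do
    if phi (c) < phi (d) then { b = d; d = c; c = b - gr * (b - a) }  // min lies in [a, d]
    else { a = c; c = d; d = a + gr * (b - a) }                       // min lies in [c, b]
  (a + b) / 2.0                                     // distance z to move along dir
```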
Solve the Non-Linear Programming (NLP) problem by starting at x0 and iteratively moving down in the search space to a minimal point. Return the optimal point/vector x and its objective function value.
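As an illustration of such an iterative descent, here is a sketch of a steepest-descent style solve; the finite-difference gradient helper gradFD, the parameter names, and the stopping rule are assumptions for this example, not the library's actual implementation:

```scala
/** Assumed central-difference gradient helper for the sketch below. */
def gradFD (f: Array [Double] => Double, x: Array [Double], h: Double = 1e-6): Array [Double] =
  Array.tabulate (x.length) { i =>
    val xp = x.clone; xp (i) += h                   // perturb coordinate i up
    val xm = x.clone; xm (i) -= h                   // perturb coordinate i down
    (f (xp) - f (xm)) / (2.0 * h)                   // central difference
  }

/** Sketch of solve: step downhill from x0 until improvement in fg stalls. */
def solve (fg: Array [Double] => Double,
           lineSearch: (Array [Double], Array [Double]) => Double,
           x0: Array [Double], toler: Double = 1e-6, maxIter: Int = 200): (Double, Array [Double]) =
  var x  = x0.clone                                 // current point
  var fx = fg (x)                                   // current objective value
  var it = 0
  var improving = true
  while improving && it < maxIter do
    val dir = gradFD (fg, x).map (d => -d)          // steepest descent direction
    val z   = lineSearch (x, dir)                   // distance to move along dir
    val x2  = Array.tabulate (x.length) (i => x (i) + z * dir (i))
    val fx2 = fg (x2)
    if fx - fx2 > toler then { x = x2; fx = fx2 }   // accept the improved point
    else improving = false                          // stop when improvement stalls
    it += 1
  (fx, x)                                           // objective value and point
```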
The objective function f plus a weighted penalty based on the constraint function g. Override for constrained optimization and ignore for unconstrained optimization.
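One common way to realize this is a quadratic penalty, sketched below under the assumption of a single scalar constraint g and an example weight wp:

```scala
/** Minimal sketch of a penalized objective for min { f(x) | g(x) <= 0 },
 *  assuming one scalar constraint g and an example penalty weight wp.
 */
class PenaltySketch (f: Array [Double] => Double,
                     g: Array [Double] => Double,
                     wp: Double = 100.0):
  /** f(x) plus a weighted quadratic penalty for violating g(x) <= 0. */
  def fg (x: Array [Double]): Double =
    val violation = math.max (0.0, g (x))           // positive only when infeasible
    f (x) + wp * violation * violation              // penalize constraint violation
```

For unconstrained problems the penalty term vanishes, so fg simply reduces to f.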
Solve the following Non-Linear Programming (NLP) problem: min { f(x) | g(x) <= 0 }. To use explicit functions for the gradient, replace gradient (fg, x._1 + s) with gradientD (df, x._1 + s). This method uses multiple random restarts, as sketched below.
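The multiple-random-restarts strategy can be sketched as follows, assuming an example restart count and a [-5, 5] sampling box for starting points:

```scala
import scala.util.Random

/** Sketch of multiple random restarts: run a solver from several random
 *  starting points in n dimensions and keep the best result. The restart
 *  count and the [-5, 5] sampling box are assumed example values.
 */
def solveWithRestarts (solve: Array [Double] => (Double, Array [Double]),
                       n: Int, restarts: Int = 10, seed: Int = 0): (Double, Array [Double]) =
  val rng = new Random (seed)
  def randomStart = Array.fill (n) (rng.between (-5.0, 5.0))
  (1 to restarts).map (_ => solve (randomStart)).minBy (_._1)   // keep lowest f(x)
```

Restarting from scattered points hedges against the descent getting trapped in a poor local minimum of a non-convex f.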