NelderMeadSimplex2
The NelderMeadSimplex2 class solves Non-Linear Programming (NLP) problems using the Nelder-Mead Simplex algorithm. Given an objective function f and its dimension n, the algorithm moves a simplex defined by n + 1 points toward an optimal solution. The algorithm is derivative-free.
minimize f(x)
Value parameters
- f
-
the vector-to-scalar objective function
- n
-
the dimension of the search space
Attributes
- Supertypes
  - trait MonitorEpochs
  - trait BoundsConstraint
  - trait Minimizer
  - class Object
  - trait Matchable
  - class Any
Members list
Value members
Concrete methods
Improve the simplex by replacing the worst/highest vertex (x_h) with a better one found on the line containing x_h and the centroid (x_c). Try the reflection, expansion, outer contraction, and inner contraction points, in that order. If none succeeds, shrink the simplex and iterate. Return both the distance and the difference between x_h (worst) and x_l (best).
Value parameters
- toler
-
the tolerance used for termination
Attributes
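The improvement step described above can be sketched on a 1-D simplex as follows. This is an illustration of the standard Nelder-Mead trial points, not the actual NelderMeadSimplex2 code; the coefficient names (alpha, gamma, beta) and the helper `improve` are assumptions, and the outer-contraction and shrink fall-backs are omitted for brevity:

```scala
// A minimal 1-D sketch of the improvement step: replace the worst vertex x_h
// with a trial point on the line through x_h and the centroid x_c.
object ImproveSketch {
  val alpha = 1.0   // reflection coefficient
  val gamma = 2.0   // expansion coefficient
  val beta  = 0.5   // contraction coefficient

  def improve(f: Double => Double, simplex: Array[Double]): Array[Double] = {
    val sorted = simplex.sortBy(f)                 // best ... worst
    val xh = sorted.last                           // worst/highest vertex x_h
    val xc = sorted.init.sum / (sorted.length - 1) // centroid x_c excluding x_h
    val xr = xc + alpha * (xc - xh)                // reflection point
    val cand =
      if (f(xr) < f(sorted.head)) {                // reflection beat the best:
        val xe = xc + gamma * (xc - xh)            // try expansion
        if (f(xe) < f(xr)) xe else xr
      }
      else if (f(xr) < f(xh)) xr                   // accept plain reflection
      else xc + beta * (xh - xc)                   // inner contraction
    sorted.init :+ cand                            // worst vertex replaced
  }
}
```

Iterating this step on, say, f(x) = (x - 3)^2 with vertices {0, 6} repeatedly contracts the simplex toward the minimizer x = 3.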
Initialize the search simplex by setting n + 1 vertices and computing their functional values.
Value parameters
- step
-
the step size
- x0
-
the given starting point
Attributes
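One common way to build such a simplex is to take x0 as vertex 0 and offset one coordinate per additional vertex; a hedged sketch (the helper name `initSimplex` is an assumption, not necessarily how this class lays out its vertices):

```scala
// Hypothetical simplex initialization: vertex 0 is the starting point x0,
// and vertex i (for i = 1..n) offsets coordinate i-1 of x0 by the step size.
def initSimplex(x0: Array[Double], step: Double): Array[Array[Double]] = {
  val n = x0.length
  Array.tabulate(n + 1) { i =>
    val v = x0.clone()
    if (i > 0) v(i - 1) += step
    v
  }
}
```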
Perform an exact (e.g., GoldenSectionLS) or inexact (e.g., WolfeLS) line search. Search in direction dir, returning the distance z to move in that direction. Currently NOT USED, but may be used to find a better point to add to the simplex.
Value parameters
- dir
-
the direction to move in
- step
-
the initial step size
- x
-
the current point
Attributes
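To illustrate the idea (since this method is currently unused), a simple backtracking search can stand in for GoldenSectionLS or WolfeLS; the name `lineSearch` and the halving rule are assumptions, not the class's actual strategy:

```scala
// Illustrative backtracking line search: halve the step z until f decreases
// when moving distance z from x along direction dir.
def lineSearch(f: Array[Double] => Double, x: Array[Double],
               dir: Array[Double], step: Double = 1.0): Double = {
  def at(z: Double) = x.zip(dir).map { case (xi, di) => xi + z * di }
  val fx = f(x)
  var z = step
  var k = 0
  while (k < 50 && f(at(z)) >= fx) { z /= 2; k += 1 }
  z                                  // the distance to move in direction dir
}
```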
Solve the Non-Linear Programming (NLP) problem using the Nelder-Mead Simplex algorithm.
Value parameters
- step
-
the initial step size
- toler
-
the tolerance used for termination
- x0
-
the given starting point
Attributes
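End to end, the solve loop can be sketched as below. This is a self-contained illustration of the algorithm with standard Nelder-Mead coefficients, not the actual NelderMeadSimplex2 API; the names `SolveSketch`, `solve`, and `along` are assumptions:

```scala
import scala.math.abs

object SolveSketch {
  type Vec = Array[Double]

  // Sketch of the solve loop: sort vertices, try reflection / expansion /
  // inner contraction, shrink if nothing helps, and stop once the function
  // values at the best and worst vertices agree to within toler.
  def solve(f: Vec => Double, x0: Vec, step: Double = 1.0,
            toler: Double = 1e-10, maxIt: Int = 1000): Vec = {
    val n = x0.length
    var sx = Array.tabulate(n + 1) { i =>
      val v = x0.clone(); if (i > 0) v(i - 1) += step; v
    }
    // point a + w * (b - a) on the line through a and b
    def along(a: Vec, b: Vec, w: Double): Vec =
      a.zip(b).map { case (ai, bi) => ai + w * (bi - ai) }
    sx = sx.sortBy(f)
    var it = 0
    while (it < maxIt && abs(f(sx(n)) - f(sx(0))) > toler) {
      val xh = sx(n)                                // worst vertex x_h
      val xc = sx.init.transpose.map(_.sum / n)     // centroid excluding x_h
      val xr = along(xc, xh, -1.0)                  // reflection
      val cand =
        if (f(xr) < f(sx(0))) {
          val xe = along(xc, xh, -2.0)              // expansion
          if (f(xe) < f(xr)) xe else xr
        }
        else if (f(xr) < f(xh)) xr
        else along(xc, xh, 0.5)                     // inner contraction
      if (f(cand) < f(xh)) sx(n) = cand
      else for (i <- 1 to n) sx(i) = along(sx(0), sx(i), 0.5)  // shrink
      sx = sx.sortBy(f)
      it += 1
    }
    sx(0)                                           // best vertex found
  }
}
```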
Inherited methods
Constrain the current point x, so that lower <= x <= upper, by bouncing back from a violated constraint.
Value parameters
- x
-
the current point
Attributes
- Inherited from:
- BoundsConstraint
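The bounce-back rule can be sketched as follows; this is an illustration of the idea, not the actual BoundsConstraint code, and the in-place `constrain` helper is an assumption:

```scala
// Illustrative bounce-back bounds constraint: a coordinate that falls outside
// [lower(i), upper(i)] is reflected back across the violated bound.
def constrain(x: Array[Double], lower: Array[Double], upper: Array[Double]): Unit = {
  for (i <- x.indices) {
    if (x(i) < lower(i))      x(i) = lower(i) + (lower(i) - x(i))
    else if (x(i) > upper(i)) x(i) = upper(i) - (x(i) - upper(i))
  }
}
```

Note that for a violation larger than the interval width, the bounced coordinate could overshoot the opposite bound; a production version would clamp or iterate in that case.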
The objective function f plus a weighted penalty based on the constraint function g. Override for constrained optimization and ignore for unconstrained optimization.
Value parameters
- x
-
the coordinate values of the current point
Attributes
- Inherited from:
- Minimizer
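A weighted penalty of this kind can be sketched as below; the curried `fg` helper and the quadratic penalty form are assumptions for illustration, not the Minimizer trait's actual formulation:

```scala
// Illustrative penalized objective for constraints of the form g(x) <= 0:
// add a weighted quadratic penalty whenever g(x) is positive (violated).
def fg(f: Array[Double] => Double, g: Array[Double] => Double,
       weight: Double)(x: Array[Double]): Double = {
  val viol = math.max(g(x), 0.0)     // amount of constraint violation
  f(x) + weight * viol * viol        // feasible points: viol = 0, so fg = f
}
```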
Return the loss function for each epoch.
Attributes
- Inherited from:
- MonitorEpochs
Solve the following Non-Linear Programming (NLP) problem: min { f(x) | g(x) <= 0 }. To use explicit functions for gradient, replace gradient (fg, x._1 + s) with gradientD (df, x._1 + s). This method uses multiple random restarts.
Value parameters
- n
-
the dimensionality of the search space
- step_
-
the initial step size
- toler
-
the tolerance
Attributes
- Inherited from:
- Minimizer
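The multiple-random-restarts strategy can be sketched as below; the `resolve` signature, the sampling range, and passing the solver as a function are assumptions, not the Minimizer trait's actual interface:

```scala
import scala.util.Random

// Illustrative random restarts: run the given solver from several random
// starting points in the n-dimensional search space and keep the best result.
def resolve(solveFrom: Array[Double] => Array[Double],
            f: Array[Double] => Double,
            n: Int, restarts: Int = 5, seed: Long = 0L): Array[Double] = {
  val rng = new Random(seed)
  val starts = Seq.fill(restarts)(Array.fill(n)(rng.nextDouble() * 10.0 - 5.0))
  starts.map(solveFrom).minBy(f)    // best of the locally optimal solutions
}
```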
Inherited fields
Attributes
- Inherited from:
- MonitorEpochs