NelderMeadSimplex2

scalation.optimization.NelderMeadSimplex2
class NelderMeadSimplex2(f: FunctionV2S, n: Int, checkCon: Boolean, lower: VectorD, upper: VectorD) extends Minimizer, BoundsConstraint, MonitorEpochs

The NelderMeadSimplex2 class solves Non-Linear Programming (NLP) problems using the Nelder-Mead Simplex algorithm. Given a function f and the dimension n of its search space, the algorithm moves a simplex defined by n + 1 points toward an optimal solution. The algorithm is derivative-free.

minimize f(x)
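A minimal usage sketch (the package paths, the shape of the result pair, and the test function are assumptions based on the signatures shown on this page, not confirmed details):

import scalation.mathstat.VectorD
import scalation.optimization.NelderMeadSimplex2

// convex test function: f(x) = (x_0 - 3)^2 + (x_1 + 1)^2
val f = (x: VectorD) => (x(0) - 3.0) * (x(0) - 3.0) + (x(1) + 1.0) * (x(1) + 1.0)

val lower = VectorD (-10.0, -10.0)                    // lower bounds on x
val upper = VectorD ( 10.0,  10.0)                    // upper bounds on x
val opt   = new NelderMeadSimplex2 (f, 2, true, lower, upper)

val (fBest, xBest) = opt.solve (VectorD (0.0, 0.0), 0.5, 1e-8)
println (s"minimum $fBest at $xBest")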

Value parameters

checkCon

whether to check the bounds constraints

f

the vector-to-scalar objective function

lower

the lower bounds vector

n

the dimension of the search space

upper

the upper bounds vector

Attributes

Supertypes
trait MonitorEpochs
trait BoundsConstraint
trait Minimizer
class Object
trait Matchable
class Any

Members list

Value members

Concrete methods

def improveSimplex(toler: Double): (Double, Double)

Improve the simplex by replacing the worst/highest vertex (x_h) with a better one found on the line containing x_h and the centroid (x_c). Try the reflection, expansion, outer contraction and inner contraction points, in that order. If none succeeds, shrink the simplex and iterate. Return both the distance and the difference between x_h (worst) and x_l (best).

Value parameters

toler

the tolerance used for termination

Attributes
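As a sketch of the four candidate points the description lists, using the standard Nelder-Mead coefficients (reflection 1, expansion 2, contractions 0.5; the coefficients and the helper are assumptions, not the class's internals):

import scalation.mathstat.VectorD

// candidate points on the line through the worst vertex x_h and the
// centroid x_c of the remaining vertices, tried in this order
def candidates (x_c: VectorD, x_h: VectorD): List [(String, VectorD)] =
    val d = x_c - x_h                                 // direction away from x_h
    List ("reflection"        -> (x_c + d),
          "expansion"         -> (x_c + d * 2.0),
          "outer contraction" -> (x_c + d * 0.5),
          "inner contraction" -> (x_c - d * 0.5))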

def initSimplex(x0: VectorD, step: Double): Unit

Initialize the search simplex by setting n + 1 vertices and computing their functional values.

Value parameters

step

the step size

x0

the given starting point

Attributes
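A common axis-aligned initialization, sketched in plain Scala (this is the usual scheme, assumed rather than confirmed for this class): vertex 0 is x0 itself and vertex i + 1 moves x0 by step along coordinate i, giving n + 1 affinely independent vertices.

def initVertices (x0: Array [Double], step: Double): Array [Array [Double]] =
    Array.tabulate (x0.length + 1) { i =>
        val v = x0.clone                              // start from x0
        if i > 0 then v(i - 1) += step                // perturb coordinate i - 1
        v
    }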

def lineSearch(x: VectorD, dir: VectorD, step: Double): Double

Perform an exact (e.g., GoldenSectionLS) or inexact (e.g., WolfeLS) line search. Search in direction dir, returning the distance z to move in that direction. Currently NOT USED, but may be used to find a better point to add to the simplex.

Value parameters

dir

the direction to move in

step

the initial step size

x

the current point

Attributes
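For intuition, a minimal backtracking line-search sketch (illustrative only; as noted above, an exact GoldenSectionLS or inexact WolfeLS search would be used instead, and this helper is hypothetical):

// halve the step along dir until the objective improves; return distance z
def backtrack (f: Array [Double] => Double, x: Array [Double],
               dir: Array [Double], step: Double): Double =
    val f0 = f(x)
    def at (z: Double) = f(Array.tabulate (x.length)(i => x(i) + z * dir(i)))
    var z = step
    while z > 1e-12 && at (z) >= f0 do z /= 2.0
    z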

def solve(x0: VectorD, step: Double, toler: Double): FuncVec

Solve the Non-Linear Programming (NLP) problem using the Nelder-Mead Simplex algorithm.

Value parameters

step

the initial step size

toler

the tolerance used for termination

x0

the given starting point

Attributes
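A control-flow sketch of how solve plausibly combines the methods above (written as if inside the class; an assumption assembled from the documented pieces, not the actual implementation):

def solveSketch (x0: VectorD, step: Double, toler: Double): Unit =
    initSimplex (x0, step)                            // build the n + 1 vertices
    var diff = Double.MaxValue
    while diff > toler do
        val (_, df) = improveSimplex (toler)          // (distance, difference)
        diff = df                                     // stop when x_h is close to x_l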

Inherited methods

def constrain(x: VectorD): Unit

Constrain the current point x, so that lower <= x <= upper, by bouncing back from a violated constraint.

Value parameters

x

the current point

Attributes

Inherited from:
BoundsConstraint
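A bounce-back sketch of the described behavior (illustrative, not the inherited code): a coordinate that crosses a bound is reflected back inside by the amount of the violation.

def bounce (x: Array [Double], lower: Array [Double], upper: Array [Double]): Unit =
    for i <- x.indices do
        if x(i) < lower(i) then x(i) = 2.0 * lower(i) - x(i)       // reflect off lower bound
        else if x(i) > upper(i) then x(i) = 2.0 * upper(i) - x(i)  // reflect off upper bound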
def fg(x: VectorD): Double

The objective function f plus a weighted penalty based on the constraint function g. Override for constrained optimization and ignore for unconstrained optimization.

Value parameters

x

the coordinate values of the current point

Attributes

Inherited from:
Minimizer
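The idea can be sketched with a quadratic penalty (the penalty form and the weight are assumptions; the page only says "a weighted penalty based on the constraint function g"):

import scalation.mathstat.VectorD

// add weight * g(x)^2 whenever the constraint g(x) <= 0 is violated
def fgSketch (f: VectorD => Double, g: VectorD => Double, weight: Double)
             (x: VectorD): Double =
    val gx = g(x)
    if gx > 0.0 then f(x) + weight * gx * gx else f(x)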
def lossPerEpoch(): ArrayBuffer[Double]

Return the loss function value recorded for each epoch.

Attributes

Inherited from:
MonitorEpochs
def plotLoss(): Unit

Attributes

Inherited from:
MonitorEpochs
def resolve(n: Int, step_: Double, toler: Double): FuncVec

Solve the following Non-Linear Programming (NLP) problem: min { f(x) | g(x) <= 0 }. To use explicit functions for gradient, replace gradient (fg, x._1 + s) with gradientD (df, x._1 + s). This method uses multiple random restarts.

Value parameters

n

the dimensionality of the search space

step_

the initial step size

toler

the tolerance

Attributes

Inherited from:
Minimizer
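A random-restart sketch of what resolve describes (the restart count, the randVec helper, and the assumption that the result pairs the best objective value with its point are all illustrative):

import scalation.mathstat.VectorD

def resolveSketch (opt: NelderMeadSimplex2, randVec: () => VectorD,
                   step: Double, toler: Double, restarts: Int = 10) =
    var best = opt.solve (randVec (), step, toler)    // first restart
    for _ <- 2 to restarts do
        val cand = opt.solve (randVec (), step, toler)
        if cand._1 < best._1 then best = cand         // keep the lowest objective
    best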

Inherited fields

protected val epochLoss: ArrayBuffer[Double]

Attributes

Inherited from:
MonitorEpochs