LBFGS_B

scalation.optimization.quasi_newton.LBFGS_B
class LBFGS_B(f: FunctionV2S, g: FunctionV2S, ineq: Boolean, exactLS: Boolean, var l: VectorD, var u: VectorD) extends Minimizer

The LBFGS_B class implements the Limited-memory Broyden–Fletcher–Goldfarb–Shanno algorithm for Bound-constrained optimization (L-BFGS-B), a Quasi-Newton algorithm for solving Non-Linear Programming (NLP) problems. L-BFGS-B determines a search direction by deflecting the steepest-descent direction vector (opposite the gradient) by multiplying it by a matrix that approximates the inverse Hessian. Furthermore, only a few vectors represent the approximation of the Hessian matrix (limited memory). The estimated parameters are also bounded within user-specified lower and upper bounds.

minimize f(x) subject to g(x) <= 0 [ optionally g(x) == 0 ]
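The deflection described above is typically computed with the L-BFGS "two-loop recursion", which applies the implicit inverse-Hessian approximation stored as the last few curvature pairs (s_i, y_i). The following is a minimal, self-contained sketch of that recursion (illustrative only, not ScalaTion's implementation; plain `Array[Double]` stands in for ScalaTion's `VectorD`):

```scala
// Minimal sketch of the L-BFGS two-loop recursion: given recent curvature
// pairs s_i = x_{i+1} - x_i and y_i = grad_{i+1} - grad_i (newest first),
// compute the search direction -H * grad without forming H explicitly.
type Vec = Array[Double]

def dot (a: Vec, b: Vec): Double = a.zip(b).map { case (x, y) => x * y }.sum
def axpy (a: Double, x: Vec, y: Vec): Vec = x.zip(y).map { case (xi, yi) => a * xi + yi }
def scale (a: Double, x: Vec): Vec = x.map (a * _)

def twoLoop (grad: Vec, ss: List[Vec], ys: List[Vec]): Vec = {
  var q = grad.clone
  val rho = ss.zip(ys).map { case (s, y) => 1.0 / dot(s, y) }
  val alpha = Array.ofDim[Double](ss.length)
  for (i <- ss.indices) {                         // first loop: newest pair first
    alpha(i) = rho(i) * dot(ss(i), q)
    q = axpy(-alpha(i), ys(i), q)
  }
  val gamma = if (ys.nonEmpty) dot(ss.head, ys.head) / dot(ys.head, ys.head) else 1.0
  var r = scale(gamma, q)                         // initial Hessian scaling
  for (i <- ss.indices.reverse) {                 // second loop: oldest pair first
    val beta = rho(i) * dot(ys(i), r)
    r = axpy(alpha(i) - beta, ss(i), r)
  }
  scale(-1.0, r)                                  // descent direction -H * grad
}
```

With an empty history the recursion reduces to steepest descent, which is why the method degrades gracefully when few curvature pairs are stored.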

Value parameters

exactLS

whether to use exact (e.g., GoldenLS) or inexact (e.g., WolfeLS) Line Search

f

the objective function to be minimized

g

the constraint function to be satisfied, if any

ineq

whether the constraint is treated as inequality (default) or equality

l

vector of lower bounds for all input parameters

u

vector of upper bounds for all input parameters

Attributes

Supertypes
trait Minimizer
class Object
trait Matchable
class Any

Members list

Value members

Concrete methods

override def fg(x: VectorD): Double

The objective function f plus a weighted penalty based on the constraint function g.

Value parameters

x

the coordinate values of the current point

Attributes

Definition Classes
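A weighted penalty turns the constrained problem into an unconstrained one: violations of g contribute a positive term to the objective. The sketch below shows the general shape of such a penalty-augmented objective; the function name, the squared-violation form, and the `weight` parameter are assumptions for illustration, not ScalaTion's actual `fg` source:

```scala
// Hypothetical penalty-augmented objective: for an inequality constraint
// g(x) <= 0 only the violated part max(g(x), 0) is penalized; for an
// equality constraint g(x) == 0 any deviation |g(x)| is penalized.
type Vec = Array[Double]

def fgSketch (f: Vec => Double, g: Vec => Double, ineq: Boolean, weight: Double)
             (x: Vec): Double = {
  val viol = if (ineq) math.max (g(x), 0.0) else math.abs (g(x))
  f(x) + weight * viol * viol                     // objective plus weighted penalty
}
```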
def lineSearch(x: VectorD, dir: VectorD, step: Double): Double

Perform an exact GoldenSectionLS or inexact WolfeLS Line Search. Search in direction dir, returning the distance z to move in that direction.

Value parameters

dir

the direction to move in

step

the initial step size

x

the current point

Attributes
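For intuition, an exact golden-section line search repeatedly shrinks a bracketing interval by the golden ratio until the minimizer along the search direction is located. A minimal sketch, assuming phi(z) = f(x + z * dir) is unimodal on [0, step] (illustrative only, not ScalaTion's GoldenSectionLS):

```scala
// Golden-section search over [0, step]: keep the subinterval whose interior
// probe has the smaller value; the interval shrinks by ~0.618 per iteration.
def goldenLS (phi: Double => Double, step: Double, tol: Double = 1e-8): Double = {
  val invPhi = (math.sqrt (5.0) - 1.0) / 2.0      // 1/golden ratio ~ 0.618
  var a = 0.0
  var b = step
  while (b - a > tol) {
    val c = b - invPhi * (b - a)
    val d = a + invPhi * (b - a)
    if (phi(c) < phi(d)) b = d else a = c
  }
  (a + b) / 2.0                                   // distance z to move along dir
}
```

An inexact Wolfe search trades this precision for far fewer function evaluations, which is usually the better deal inside a quasi-Newton iteration.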

def setHistorySize(hs_: Int): Unit

Modify the number of historical vectors to store.

Value parameters

hs_

the new history size

Attributes

def solve(x0: VectorD, alphaInit: Double, toler: Double): FuncVec

Solve the following Non-Linear Programming (NLP) problem using L-BFGS-B: min { f(x) | g(x) <= 0 }.

Value parameters

alphaInit

the initial step size

toler

the tolerance

x0

the starting point

Attributes
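The bounds l and u confine each iterate to a box. The simplest way to see how such bounds interact with a descent step is projection: clamp the iterate back into [l, u] after each update. This is a deliberate simplification for illustration (L-BFGS-B proper uses a more elaborate gradient-projection step), and all names here are hypothetical:

```scala
// Projected gradient descent on a box [l, u]: take a descent step, then
// clamp each coordinate back into its bounds.
type Vec = Array[Double]

def clampToBounds (x: Vec, l: Vec, u: Vec): Vec =
  x.indices.map (i => math.max (l(i), math.min (u(i), x(i)))).toArray

def projectedDescent (gradF: Vec => Vec, x0: Vec, l: Vec, u: Vec,
                      alpha: Double = 0.1, maxIt: Int = 1000): Vec = {
  var x = clampToBounds (x0, l, u)
  for (_ <- 1 to maxIt) {
    val g = gradF (x)
    x = clampToBounds (x.zip(g).map { case (xi, gi) => xi - alpha * gi }, l, u)
  }
  x
}
```

When the unconstrained minimizer lies outside the box, the iterates settle on the boundary, which is exactly the behavior bound-constrained solvers must handle.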

Inherited methods

def resolve(n: Int, step_: Double, toler: Double): FuncVec

Solve the following Non-Linear Programming (NLP) problem: min { f(x) | g(x) <= 0 }. To use explicit functions for gradient, replace gradient (fg, x._1 + s) with gradientD (df, x._1 + s). This method uses multiple random restarts.

Value parameters

n

the dimensionality of the search space

step_

the initial step size

toler

the tolerance

Attributes

Inherited from:
Minimizer
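The multi-start strategy that resolve describes can be sketched as follows: run a local solver from several random starting points and keep the best (f, x) pair. The shape below is an assumption for illustration, not ScalaTion's resolve implementation:

```scala
import scala.util.Random

// Multi-start wrapper around any local solver: sample restart points
// uniformly from a box, solve locally from each, and keep the minimum.
type Vec = Array[Double]

def multiStart (solveFrom: Vec => (Double, Vec), n: Int, restarts: Int,
                lo: Double, hi: Double, seed: Long = 0L): (Double, Vec) = {
  val rng = new Random (seed)
  val results = for (_ <- 1 to restarts) yield {
    val x0 = Array.fill (n)(lo + (hi - lo) * rng.nextDouble ())
    solveFrom (x0)                                // local (f_min, x_min) from x0
  }
  results.minBy (_._1)                            // best result over all restarts
}
```

Random restarts do not guarantee a global optimum, but they substantially reduce the chance of returning a poor local minimum of a multimodal objective.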