
scalation.minima

QuasiNewton

class QuasiNewton extends Minimizer with Error

The QuasiNewton class implements the Broyden–Fletcher–Goldfarb–Shanno (BFGS) Quasi-Newton algorithm for solving Non-Linear Programming (NLP) problems. BFGS determines a search direction by deflecting the steepest descent direction vector (opposite the gradient) by multiplying it by a matrix that approximates the inverse Hessian. Note that this implementation may be set up to work with the matrix 'b' (approximate Hessian) or directly with the 'binv' matrix (the inverse of 'b').

minimize f(x) subject to g(x) <= 0 [ optionally g(x) == 0 ]
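
A minimal usage sketch (assuming the ScalaTion 1.x package layout, in which VectorD lives in scalation.linalgebra and FunctionV2S is an alias for VectorD => Double; the object name and test function below are illustrative only):

    import scalation.linalgebra.VectorD
    import scalation.minima.QuasiNewton

    object QuasiNewtonExample extends App
    {
        // objective: f(x) = (x_0 - 3)^2 + (x_1 - 4)^2 + 1, minimized at (3, 4)
        def f (x: VectorD): Double = (x(0) - 3.0) * (x(0) - 3.0) +
                                     (x(1) - 4.0) * (x(1) - 4.0) + 1.0

        val optimizer = new QuasiNewton (f)          // unconstrained BFGS
        val x0  = new VectorD (2)                    // start at the origin (0, 0)
        val opt = optimizer.solve (x0)               // the minimizing point
        println (s"solution x = $opt, f(x) = ${f(opt)}")
    } // QuasiNewtonExample object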

Linear Supertypes
Error, Minimizer, AnyRef, Any

Instance Constructors

  1. new QuasiNewton(f: FunctionV2S, g: FunctionV2S = null, ineq: Boolean = true, exactLS: Boolean = false)

    f

    the objective function to be minimized

    g

    the constraint function to be satisfied, if any

    ineq

    whether the constraint is treated as inequality (default) or equality

    exactLS

    whether to use exact (e.g., GoldenLS) or inexact (e.g., WolfeLS) Line Search
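
As a hedged sketch of the constrained case (the constraint g and the flag choices below are illustrative assumptions, not taken from this page):

    import scalation.linalgebra.VectorD
    import scalation.minima.QuasiNewton

    object QuasiNewtonConstrained extends App
    {
        def f (x: VectorD): Double = (x(0) - 3.0) * (x(0) - 3.0) +
                                     (x(1) - 4.0) * (x(1) - 4.0) + 1.0

        // inequality constraint g(x) <= 0, here x_0 + x_1 - 1 <= 0
        def g (x: VectorD): Double = x(0) + x(1) - 1.0

        // ineq = true treats g as an inequality; exactLS = true selects the exact
        // (GoldenSectionLS) line search instead of the default inexact WolfeLS
        val optimizer = new QuasiNewton (f, g, ineq = true, exactLS = true)
        println (s"constrained solution x = ${optimizer.solve (new VectorD (2))}")
    } // QuasiNewtonConstrained object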

Type Members

  1. type Pair = (VectorD, VectorD)
    Definition Classes
    Minimizer

Value Members

  1. def fg(x: VectorD): Double

The objective function f plus a weighted penalty based on the constraint function g (a hedged sketch of such a penalty appears after the member list below).

    x

    the coordinate values of the current point

    Definition Classes
QuasiNewton → Minimizer
  2. final def flaw(method: String, message: String): Unit
    Definition Classes
    Error
  3. def lineSearch(x: VectorD, dir: VectorD, step: Double = STEP): Double

Perform an exact 'GoldenSectionLS' or inexact 'WolfeLS' Line Search. Search in direction 'dir', returning the distance 'z' to move in that direction. Defaults to the inexact 'WolfeLS' (pass 'exactLS = true' to the constructor to use 'GoldenSectionLS').

    x

    the current point

    dir

    the direction to move in

    step

    the initial step size

    Definition Classes
QuasiNewton → Minimizer
  4. def setDerivatives(partials: Array[FunctionV2S]): Unit

Set the partial derivative functions. If these functions are available, they are more efficient and more accurate than estimating the values using difference quotients (the default approach). A sketch of supplying explicit partials appears after the member list below.

    partials

    the array of partial derivative functions

  5. def setSteepest(): Unit

    Use the Gradient Descent algorithm rather than the default BFGS algorithm.

  6. def solve(x0: VectorD, step_: Double = STEP, toler: Double = TOL): VectorD

Solve the following Non-Linear Programming (NLP) problem using BFGS: min { f(x) | g(x) <= 0 }. To use explicit functions for the gradient, replace 'gradient (fg, x._1 + s)' with 'gradientD (df, x._1 + s)'.

    x0

    the starting point

    step_

    the initial step size

    toler

    the tolerance

    Definition Classes
QuasiNewton → Minimizer
  7. def updateBinv(s: VectorD, y: VectorD): Unit

Update the 'binv' matrix, which is used to deflect -gradient to a better search direction than steepest descent (-gradient). Compute the 'binv' matrix directly using the Sherman–Morrison formula (stated below).

    s

    the step vector (next point - current point)

    y

    the difference in the gradients (next - current)

    See also

    http://en.wikipedia.org/wiki/BFGS_method
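
For reference, the standard direct (Sherman–Morrison style) update of the inverse Hessian approximation used by BFGS, with s and y as the parameters above, is

    B_{k+1}^{-1} = B_k^{-1}
                 + \frac{(s^T y + y^T B_k^{-1} y)\,(s\, s^T)}{(s^T y)^2}
                 - \frac{B_k^{-1} y\, s^T + s\, y^T B_k^{-1}}{s^T y}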
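As referenced from 'fg' above: the actual penalty weight and functional form are internal to the implementation and not documented on this page, so the sketch below (the WEIGHT constant and the quadratic penalty are assumptions) only illustrates the general idea of penalizing violations of g(x) <= 0:

    import scalation.linalgebra.VectorD

    object PenaltySketch extends App
    {
        val WEIGHT = 1000.0                                // assumed penalty weight

        def f (x: VectorD): Double = x(0) * x(0) + x(1) * x(1)
        def g (x: VectorD): Double = 1.0 - x(0) - x(1)     // want g(x) <= 0

        // penalize only violations of g(x) <= 0 (quadratically, one common choice)
        def fgSketch (x: VectorD): Double =
        {
            val c = g(x)
            f(x) + (if (c > 0.0) WEIGHT * c * c else 0.0)
        } // fgSketch

        println (fgSketch (new VectorD (2)))               // origin violates g, so penalized
    } // PenaltySketch object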
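As referenced from 'setDerivatives' above, a sketch of supplying explicit partial derivatives (again assuming FunctionV2S = VectorD => Double):

    import scalation.linalgebra.VectorD
    import scalation.minima.QuasiNewton

    object DerivativesSketch extends App
    {
        def f (x: VectorD): Double = (x(0) - 3.0) * (x(0) - 3.0) +
                                     (x(1) - 4.0) * (x(1) - 4.0) + 1.0

        // hand-coded partial derivatives, one per coordinate
        val partials = Array [VectorD => Double] (
            x => 2.0 * (x(0) - 3.0),                 // df/dx_0
            x => 2.0 * (x(1) - 4.0))                 // df/dx_1

        val optimizer = new QuasiNewton (f)
        optimizer.setDerivatives (partials)          // use exact gradients
        println (optimizer.solve (new VectorD (2)))
    } // DerivativesSketch object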