LBFGS_NoLS

scalation.optimization.quasi_newton.LBFGS_NoLS
class LBFGS_NoLS(f: FunctionV2S, m: Int, n: Int, useLS: Boolean) extends Minimize

The LBFGS_NoLS class is used to find optima for functions of vectors. The solve method finds a local optimum using a Quasi-Newton method, the Limited-Memory BFGS (L-BFGS) method, which keeps track of the most recent m changes in x-position and gradient. The Ring class is used to store these most recent m vectors.

min f(x)    where f: R^n -> R
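The class description mentions that a Ring class stores the most recent m vectors. A minimal sketch of such a fixed-capacity ring buffer is shown below; the name RingBuffer and all details are illustrative stand-ins, not ScalaTion's actual Ring implementation.

```scala
// Illustrative fixed-capacity ring buffer holding the most recent m vectors.
// Plain Array[Double] stands in for ScalaTion's VectorD.
class RingBuffer(m: Int):
    private val buf   = new Array[Array[Double]](m)
    private var count = 0                              // total vectors ever added

    def add(v: Array[Double]): Unit =
        buf(count % m) = v                             // overwrite the oldest slot
        count += 1

    def size: Int = math.min(count, m)

    // i-th most recent vector, where i = 0 is the newest
    def apply(i: Int): Array[Double] = buf((count - 1 - i) % m)
end RingBuffer
```

Once more than m vectors have been added, each new vector silently evicts the oldest, which is exactly the behavior L-BFGS needs for its bounded history of s and y vectors.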

Value parameters

f

the vector-to-scalar function whose optima are to be found

m

the memory size or number of historical s and y vectors to maintain

n

the dimensionality of the vectors

useLS

whether to use Line Search (LS)

Attributes

See also

LBFGS for one that uses a different line search.

Supertypes
trait Minimize
class Object
trait Matchable
class Any

Members list

Value members

Concrete methods

def findDir(g: VectorD, k: Int): VectorD

Find the deflected gradient by passing in the current gradient and using the last m steps, i.e., the changes in x-position (the s vectors) and the changes in gradient (the y vectors).

Value parameters

g

the current gradient

k

the k-th iteration

Attributes
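A method like findDir typically computes the search direction via the standard L-BFGS two-loop recursion. The sketch below shows that recursion over plain arrays; it is illustrative and assumed, not ScalaTion's actual findDir code, and the names TwoLoop, dir, and dot are hypothetical.

```scala
// Illustrative L-BFGS two-loop recursion: given the current gradient g and
// the last m (s, y) pairs (index 0 = newest), return the search direction
// approximating -H·g, where H approximates the inverse Hessian.
object TwoLoop:
    def dot(a: Array[Double], b: Array[Double]): Double =
        a.zip(b).map { case (x, y) => x * y }.sum

    def dir(g: Array[Double], ss: Seq[Array[Double]], yy: Seq[Array[Double]]): Array[Double] =
        val q   = g.clone
        val rho = ss.zip(yy).map { case (s, y) => 1.0 / dot(y, s) }.toArray
        val a   = new Array[Double](ss.length)
        for i <- ss.indices do                         // first loop: newest to oldest
            a(i) = rho(i) * dot(ss(i), q)
            for j <- q.indices do q(j) -= a(i) * yy(i)(j)
        val gamma =                                    // initial Hessian scaling H0 = gamma·I
            if ss.nonEmpty then dot(ss.head, yy.head) / dot(yy.head, yy.head) else 1.0
        val r = q.map(_ * gamma)
        for i <- ss.indices.reverse do                 // second loop: oldest to newest
            val b = rho(i) * dot(yy(i), r)
            for j <- r.indices do r(j) += ss(i)(j) * (a(i) - b)
        r.map(-_)                                      // negate to get a descent direction
```

With an empty history the recursion reduces to plain steepest descent, returning -g.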
def solve(x0: VectorD, α: Double): FuncVec

Solve for an optimum by finding a local optimum close to the starting point/guess 'x0'. This version numerically approximates the first derivatives.

Value parameters

x0

the starting point/guess

α

the current learning rate

Attributes
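Since this solve overload approximates first derivatives numerically, the sketch below shows one common way to do that: a central-difference estimate of the gradient. It is an assumed illustration over plain arrays, not ScalaTion's actual code, and the name numGrad is hypothetical.

```scala
// Illustrative central-difference gradient approximation for f: R^n -> R.
// Each partial is estimated as (f(x + h·e_i) - f(x - h·e_i)) / (2h).
def numGrad(f: Array[Double] => Double, x: Array[Double], h: Double = 1e-6): Array[Double] =
    Array.tabulate(x.length) { i =>
        val xp = x.clone; xp(i) += h                   // step forward in dimension i
        val xm = x.clone; xm(i) -= h                   // step backward in dimension i
        (f(xp) - f(xm)) / (2.0 * h)
    }
```

Central differences cost two function evaluations per dimension but are accurate to O(h²), versus O(h) for one-sided differences.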

def solve2(x0: VectorD, grad: FunctionV2V, α: Double): FuncVec

Solve for an optimum by finding a local optimum close to the starting point/guess 'x0'. This version uses explicit functions for the gradient (partial derivatives).

Value parameters

grad

the gradient, given as explicit functions for the partial derivatives

x0

the starting point/guess

α

the current learning rate

Attributes
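To illustrate the kind of explicit gradient solve2 expects, the sketch below pairs f(x) = ||x||² with its analytic gradient, whose partials are ∂f/∂xᵢ = 2xᵢ. Plain arrays stand in for ScalaTion's VectorD/FunctionV2V types; this is an assumed illustration, not ScalaTion code.

```scala
// f(x) = ||x||^2 and its explicit (analytic) gradient 2x, in the style of
// the (f, grad) pair that solve2 takes.
val f:    Array[Double] => Double        = x => x.map(v => v * v).sum
val grad: Array[Double] => Array[Double] = x => x.map(2.0 * _)
```

Supplying an analytic gradient like this avoids the 2n function evaluations per iteration that numerical differentiation costs, and is exact rather than approximate.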