Newton_NoLS

scalation.optimization.Newton_NoLS
class Newton_NoLS(f: FunctionV2S, useLS: Boolean) extends Minimize

The Newton_NoLS class is used to find optima for functions of vectors. The solve method finds local optima using Newton's method, which deflects the gradient by premultiplying it by the inverse Hessian.

min f(x)    where f: R^n -> R
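
Concretely, the deflected-gradient step referred to above is the standard damped Newton update, scaled by the learning rate α:

x_k+1 = x_k - α H(x_k)^-1 ∇f(x_k)    where ∇f(x_k) is the gradient and H(x_k) the Hessian of f at the current iterate x_k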

Value parameters

f

the vector-to-scalar function whose optima are sought

useLS

whether to use Line Search (LS)

Attributes

See also

Newton for one that uses a different line search.

Supertypes
trait Minimize
class Object
trait Matchable
class Any

Members list

Value members

Concrete methods

def solve(x0: VectorD, α: Double): FuncVec

Solve for an optimum by finding a local optimum close to the starting point/guess 'x0'. This version numerically approximates the first and second derivatives (gradient and Hessian). A usage sketch is given after the parameter list below.

Value parameters

x0

the starting point/guess

α

the current learning rate

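A minimal usage sketch for solve follows; it is not part of the generated documentation. The import paths, the VectorD factory call, and the assumption that the returned FuncVec pairs the optimal function value with the optimal point are based on common ScalaTion conventions and should be checked against the actual library.

import scalation.mathstat.VectorD
import scalation.optimization.Newton_NoLS

@main def newton_NoLSTest (): Unit =
    // objective f: R^2 -> R with its minimum at (3, -1)
    def f (x: VectorD): Double =
        val d0 = x(0) - 3.0; val d1 = x(1) + 1.0
        d0 * d0 + d1 * d1

    val optimizer = new Newton_NoLS (f, useLS = false)        // numeric gradient and Hessian
    val opt = optimizer.solve (VectorD (0.0, 0.0), 0.5)       // assumed FuncVec: (optimal value, optimal point)
    println (s"solve: optimum = $opt")
end newton_NoLSTest

With useLS = false the step length is governed entirely by α, so a more conservative value may be needed for poorly scaled objectives.
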

def solve2(x0: VectorD, grd: Array[FunctionV2S], α: Double): FuncVec

Solve for an optimum by finding a local optimum close to the starting point/guess 'x0'. This version uses explicit functions for the gradient (partial derivatives). A usage sketch is given after the parameter list below.

Value parameters

grd

the gradient as explicit functions for the partial derivatives (in array form)

x0

the starting point/guess

α

the current learning rate

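A companion sketch for solve2 with an explicit gradient follows, under the same caveats and assumptions as above; in particular, FunctionV2S is assumed to alias VectorD => Double and to live in scalation.mathstat.

import scalation.mathstat.{FunctionV2S, VectorD}
import scalation.optimization.Newton_NoLS

@main def newton_NoLSTest2 (): Unit =
    def f (x: VectorD): Double =
        val d0 = x(0) - 3.0; val d1 = x(1) + 1.0
        d0 * d0 + d1 * d1

    // explicit partial derivatives: ∂f/∂x_0 = 2(x_0 - 3), ∂f/∂x_1 = 2(x_1 + 1)
    val grd = Array [FunctionV2S] ((x: VectorD) => 2.0 * (x(0) - 3.0),
                                   (x: VectorD) => 2.0 * (x(1) + 1.0))

    val optimizer = new Newton_NoLS (f, useLS = false)
    val opt = optimizer.solve2 (VectorD (0.0, 0.0), grd, 0.5)  // assumed FuncVec: (optimal value, optimal point)
    println (s"solve2: optimum = $opt")
end newton_NoLSTest2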