
scalation.minima

StochasticGradient

class StochasticGradient extends Minimizer with Error

The StochasticGradient class solves unconstrained Non-Linear Programming (NLP) problems using the Stochastic Gradient Descent algorithm. Given a function 'f' and a starting point 'x0', the algorithm computes the gradient and takes steps in the opposite direction, iterating until it converges. The algorithm is stochastic in the sense that only a single batch is used in each step of the optimization. Examples (a number of rows) are chosen for each batch. FIX - provide an option to randomly select the samples in each batch

See also

leon.bottou.org/publications/pdf/compstat-2010.pdf

minimize f(x)
dir_k = -gradient(x)
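
The descent scheme above can be sketched in a few lines of plain Scala. This is a minimal, self-contained illustration of mini-batch stochastic gradient descent on a least-squares objective; the names are illustrative and are not the ScalaTion API, and unlike StochasticGradient (which uses consecutive rows), this sketch shuffles the rows before batching.

```scala
import scala.util.Random

// Mini-batch SGD for f(b) = sum_i (x_i . b - y_i)^2, using plain arrays.
def sgdLeastSquares(x: Array[Array[Double]], y: Array[Double],
                    batch: Int, eta: Double, epochs: Int): Array[Double] = {
  val rng = new Random(0)
  val n   = x.length                                // number of examples (rows)
  val p   = x(0).length                             // number of parameters
  val b   = Array.fill(p)(0.0)                      // parameter vector, start at 0
  for (_ <- 1 to epochs) {
    val idx = rng.shuffle((0 until n).toVector)     // random order of the rows
    for (group <- idx.grouped(batch)) {             // one mini-batch at a time
      val grad = Array.fill(p)(0.0)
      for (i <- group) {                            // gradient over this batch only
        val err = (0 until p).map(j => x(i)(j) * b(j)).sum - y(i)
        for (j <- 0 until p) grad(j) += 2.0 * err * x(i)(j)
      }
      for (j <- 0 until p) b(j) -= eta * grad(j) / group.size  // dir = -gradient
    }
  }
  b
}
```

Because each step sees only one batch, the per-step gradient is a noisy estimate of the full gradient, which is what makes the method cheap on large data matrices.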

Linear Supertypes
Error, Minimizer, AnyRef, Any

Instance Constructors

  1. new StochasticGradient(fxy: (MatrixD, VectorD, VectorD) ⇒ Double, dx: MatrixD, dy: VectorD, batch: Int = 10, exactLS: Boolean = true)

fxy

the objective function, of the data matrix, the response vector, and the parameter vector

dx

the data matrix

    dy

    the response vector

    batch

    the batch size

    exactLS

    whether to use exact (e.g., GoldenLS) or inexact (e.g., WolfeLS) Line Search
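
Per the class description, each batch covers a block of consecutive rows of 'dx' and the matching entries of 'dy'. A small self-contained sketch of that partitioning (plain Ranges stand in here; this is not ScalaTion code):

```scala
// Group row indices 0 until rows into consecutive blocks of size `batch`,
// mirroring the role of the 'batch' constructor parameter; the last block
// may be smaller when batch does not divide rows evenly.
def batchRanges(rows: Int, batch: Int): Seq[Range] =
  (0 until rows by batch).map(start => start until math.min(start + batch, rows))
```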

Value Members

  1. def f(x: VectorD): Double

The objective function for the ith batch.

    x

    the vector to optimize (e.g., model parameters)

  2. def fg(x: VectorD): Double

The objective function 'f' plus a weighted penalty based on the constraint function 'g'. Override for constrained optimization and ignore for unconstrained optimization.

    x

    the coordinate values of the current point

    Definition Classes
    Minimizer
  3. final def flaw(method: String, message: String): Unit
    Definition Classes
    Error
  4. def lineSearch(x: VectorD, dir: VectorD, step: Double = STEP): Double

Perform an exact 'GoldenSectionLS' or inexact 'WolfeLS' line search. Search in direction 'dir', returning the distance 'z' to move in that direction.

    x

    the current point

    dir

    the direction to move in

    step

    the initial step size

    Definition Classes
StochasticGradient → Minimizer
  5. def solve(x0: VectorD, step: Double = STEP, toler: Double = EPSILON): VectorD

Solve the Non-Linear Programming (NLP) problem using the Stochastic Gradient Descent algorithm.

    x0

    the starting point

    step

    the initial step size

    toler

    the tolerance

    Definition Classes
StochasticGradient → Minimizer
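
Inside 'solve', each iteration picks a descent direction and then calls 'lineSearch' to choose how far to move. A hedged, self-contained sketch of an inexact line search (backtracking with the Armijo sufficient-decrease test); this is a generic stand-in for illustration, not ScalaTion's WolfeLS or GoldenSectionLS:

```scala
// Shrink the step until f decreases enough along the descent direction 'dir'.
def backtrack(f: Array[Double] => Double, x: Array[Double],
              dir: Array[Double], grad: Array[Double],
              step: Double = 1.0, shrink: Double = 0.5,
              c1: Double = 1e-4): Double = {
  val slope = (grad zip dir).map { case (g, d) => g * d }.sum   // directional derivative
  def at(z: Double): Array[Double] =                            // candidate point x + z*dir
    (x zip dir).map { case (xi, di) => xi + z * di }
  var z = step
  // Armijo test: accept z once f(x + z*dir) <= f(x) + c1 * z * slope
  while (f(at(z)) > f(x) + c1 * z * slope && z > 1e-12) z *= shrink
  z
}
```

An exact search (such as golden-section) instead locates the minimizer along the ray to high precision, trading extra function evaluations for a better step.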