SPSA
The SPSA
class implements the Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm for forming rough approximations of gradients. SPSA perturbs all coordinates simultaneously, so it needs only two evaluations of the objective function per iteration, regardless of dimension.
Value parameters
- checkCon
-
whether to check bounds constraints
- debug_
-
whether to call in debug mode (does tracing)
- f
-
the vector to scalar function whose approximate gradient is sought
- lower
-
the lower bounds vector
- max_iter
-
the maximum number of iterations
- upper
-
the upper bounds vector
Attributes
- See also
-
https://www.jhuapl.edu/spsa/PDF-SPSA/Matlab-SPSA_Alg.pdf (minimize f(x))
- Supertypes
-
trait MonitorEpochs, trait BoundsConstraint, trait Minimizer, class Object, trait Matchable, class Any
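As a rough illustration of the technique the class implements (this is a hedged sketch, not the library's code; `spsaGradient` and its signature are assumptions), the simultaneous-perturbation gradient estimate perturbs every coordinate at once with a random ±1 direction vector:

```scala
// Sketch of the SPSA gradient estimate (hypothetical helper, not the library method).
import scala.util.Random

def spsaGradient (f: Array [Double] => Double, x: Array [Double], ck: Double,
                  rng: Random = new Random ()): Array [Double] =
    // delta: random +/-1 (Rademacher) simultaneous-perturbation direction
    val delta = Array.fill (x.length) (if rng.nextBoolean () then 1.0 else -1.0)
    val xp    = x.indices.map (i => x(i) + ck * delta(i)).toArray    // x + ck * delta
    val xm    = x.indices.map (i => x(i) - ck * delta(i)).toArray    // x - ck * delta
    val diff  = (f (xp) - f (xm)) / (2.0 * ck)                       // central difference
    delta.map (di => diff / di)                                      // per-coordinate estimate
```

Note that only two evaluations of f are made, whatever the length of x; a coordinate-wise finite-difference scheme would need 2n.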
Members list
Value members
Concrete methods
Return a random vector of {-1, 1} values.
Value parameters
- n
-
the size of the vector
- p
-
the probability of 1
- stream
-
the random number stream
Attributes
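A minimal sketch of what such a method might look like (the name `ranPM1` and the use of the stream id as a seed are assumptions, not the library's actual behavior):

```scala
// Sketch of a {-1, 1} random vector generator (assumed semantics).
import scala.util.Random

def ranPM1 (n: Int, p: Double = 0.5, stream: Int = 0): Array [Double] =
    val rng = new Random (stream)                                     // stream id used as seed here
    Array.fill (n) (if rng.nextDouble () < p then 1.0 else -1.0)      // 1 with probability p
```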
This method is not supported.
Attributes
Reset the parameters.
Value parameters
- params
-
the given starting parameters of a VectorD
Attributes
Inherited methods
Constrain the current point x, so that lower <= x <= upper, by bouncing back from a violated constraint.
Value parameters
- x
-
the current point
Attributes
- Inherited from:
- BoundsConstraint
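Bounce-back constraining can be sketched as reflecting a violating coordinate off the violated bound (a hedged illustration under assumed semantics, not the trait's actual code):

```scala
// Sketch of bounce-back bounds constraining (assumed semantics).
def constrain (x: Array [Double], lower: Array [Double], upper: Array [Double]): Array [Double] =
    Array.tabulate (x.length) { i =>
        if x(i) < lower(i)      then 2.0 * lower(i) - x(i)    // reflect off lower bound
        else if x(i) > upper(i) then 2.0 * upper(i) - x(i)    // reflect off upper bound
        else x(i)                                             // in bounds: unchanged
    }
```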
The objective function f plus a weighted penalty based on the constraint function g. Override for constrained optimization and ignore for unconstrained optimization.
Value parameters
- x
-
the coordinate values of the current point
Attributes
- Inherited from:
- Minimizer
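One common form of such a penalized objective (the exact weighting used by Minimizer is not shown here; this is an assumed sketch) adds a weighted penalty only when the constraint g(x) <= 0 is violated:

```scala
// Sketch of an objective plus weighted constraint penalty (assumed form).
def fg (f: Array [Double] => Double, g: Array [Double] => Double, w: Double)
       (x: Array [Double]): Double =
    f (x) + w * math.max (0.0, g (x))    // penalize only when g(x) > 0 (violation)
```

For unconstrained problems the penalty term is zero everywhere, so fg reduces to f.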
Return the loss function for each epoch.
Attributes
- Inherited from:
- MonitorEpochs
Solve the following Non-Linear Programming (NLP) problem: min { f(x) | g(x) <= 0 }. To use explicit functions for gradient, replace gradient (fg, x._1 + s) with gradientD (df, x._1 + s). This method uses multiple random restarts.
Value parameters
- n
-
the dimensionality of the search space
- step_
-
the initial step size
- toler
-
the tolerance
Attributes
- Inherited from:
- Minimizer
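Putting the pieces together, an SPSA minimization loop typically combines the two-evaluation gradient estimate with decaying gain sequences a_k = a / (k + 1 + A)^alpha and c_k = c / (k + 1)^gamma. The sketch below is a hedged illustration (the function name, signature, and coefficient values are assumptions, not the class's actual solve method; it also omits restarts and bounds constraining):

```scala
// Sketch of a basic SPSA iteration with standard decaying gains (assumed coefficients).
import scala.util.Random

def spsaSolve (f: Array [Double] => Double, x0: Array [Double],
               maxIter: Int = 1000): Array [Double] =
    val (a, bigA, c, alpha, gamma) = (0.16, 100.0, 0.1, 0.602, 0.101)
    val rng = new Random ()
    var x   = x0.clone ()
    for k <- 0 until maxIter do
        val ak    = a / math.pow (k + 1 + bigA, alpha)                 // step-size gain
        val ck    = c / math.pow (k + 1, gamma)                        // perturbation size
        val delta = Array.fill (x.length) (if rng.nextBoolean () then 1.0 else -1.0)
        val xp    = x.indices.map (i => x(i) + ck * delta(i)).toArray
        val xm    = x.indices.map (i => x(i) - ck * delta(i)).toArray
        val diff  = (f (xp) - f (xm)) / (2.0 * ck)                     // central difference
        x = x.indices.map (i => x(i) - ak * diff / delta(i)).toArray   // descent step
    x
```

The method documented above additionally uses multiple random restarts, which helps escape poor local minima on non-convex objectives.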
Inherited fields
Attributes
- Inherited from:
- MonitorEpochs