the objective function to be minimized
the constraint function to be satisfied, if any
whether the constraint is treated as inequality (default) or equality
whether to use exact (e.g., GoldenSectionLS) or inexact (e.g., WolfeLS) Line Search
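As a point of reference, here is a minimal sketch of a constructor taking these four parameters. The class name QuasiNewton, the type alias FunctionV2S, and the default values shown are assumptions for illustration, not taken from this documentation.

    // hypothetical constructor sketch; names, types, and defaults are assumed
    type FunctionV2S = Vector[Double] => Double          // vector-to-scalar function

    class QuasiNewton (f: FunctionV2S,                   // objective function to minimize
                       g: FunctionV2S = null,            // constraint function, if any
                       ineq: Boolean  = true,            // inequality (true) vs. equality constraint
                       exactLS: Boolean = true)          // exact vs. inexact line search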
The objective function f plus a weighted penalty based on the constraint function g.
the coordinate values of the current point
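A minimal, self-contained sketch of such a penalty-based merit function follows. The weight WEIGHT, the quadratic form of the penalty, and the max(0, g(x)) treatment of inequality constraints are assumptions for illustration.

    object PenaltySketch {
      type Vec = Vector[Double]
      val WEIGHT = 1000.0                                // penalty weight (assumed value)

      // f plus a weighted penalty based on the constraint g (quadratic penalty assumed)
      def fg (f: Vec => Double, g: Vec => Double, ineq: Boolean = true)(x: Vec): Double =
        if (g == null) f(x)                              // unconstrained: just the objective
        else {
          val gx  = g(x)
          val vio = if (ineq) math.max (0.0, gx)         // inequality: only g(x) > 0 violates
                    else math.abs (gx)                   // equality: any deviation violates
          f(x) + WEIGHT * vio * vio                      // objective plus weighted penalty
        }
    }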
Show the flaw by printing the error message.
the method where the error occurred
the error message
Perform an exact (GoldenSectionLS) or inexact (WolfeLS) Line Search. Search in direction 'dir', returning the distance 'z' to move in that direction. Whether the search is exact or inexact is governed by the 'exactLS' flag.
the current point
the direction to move in
the initial step size
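The sketch below shows the essence of this step: restrict the objective to the ray x + z*dir and minimize the resulting one-dimensional function. A small inline golden-section search stands in for GoldenSectionLS here; the search bracket [0, step] and the stopping tolerance are assumptions.

    object LineSearchSketch {
      type Vec = Vector[Double]

      def add (x: Vec, y: Vec): Vec      = x.zip (y).map { case (a, b) => a + b }
      def scale (x: Vec, c: Double): Vec = x.map (_ * c)

      // return the distance z to move from x in direction dir
      def lineSearch (f: Vec => Double, x: Vec, dir: Vec, step: Double = 1.0): Double = {
        def f1D (z: Double): Double = f (add (x, scale (dir, z)))  // f along the ray x + z*dir
        val phi = (math.sqrt (5.0) - 1.0) / 2.0                    // golden ratio conjugate
        var (a, b) = (0.0, step)                                   // assumed search bracket
        while (b - a > 1e-8) {                                     // shrink the bracket each pass
          val c = b - phi * (b - a)
          val d = a + phi * (b - a)
          if (f1D (c) < f1D (d)) b = d else a = c
        }
        (a + b) / 2.0                                              // distance z
      }
    }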
Set the partial derivative functions. If these functions are available, they are more efficient and more accurate than estimating the values using difference quotients (the default approach).
the array of partial derivative functions
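The sketch below contrasts the two gradient computations this refers to: difference quotients (the stated default) versus user-supplied partial derivative functions. The names gradient and gradientD are modeled on the identifiers quoted in the solve description below; the step EPS is an assumed value.

    object GradientSketch {
      type Vec = Vector[Double]
      val EPS = 1e-6                                     // step for difference quotients (assumed)

      // numerical gradient via forward difference quotients (the default approach)
      def gradient (f: Vec => Double, x: Vec): Vec =
        Vector.tabulate (x.length) { i =>
          (f (x.updated (i, x(i) + EPS)) - f (x)) / EPS
        }

      // gradient from explicit partial derivative functions (more efficient and accurate)
      def gradientD (partials: Array[Vec => Double], x: Vec): Vec =
        partials.toVector.map (df => df (x))
    }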
Use the Steepest-Descent algorithm rather than the default BFGS algorithm.
Solve the following Non-Linear Programming (NLP) problem using BFGS: min { f(x) | g(x) <= 0 }. To use explicit functions for gradient, replace 'gradient (fg, x._1 + s)' with 'gradientD (df, x._1 + s)'.
the starting point
the initial step size
the tolerance
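A self-contained sketch of the iteration just described: compute the gradient, deflect -gradient through 'binv', line search along that direction, step, and update 'binv'. The backtracking line search, the iteration cap, and the convergence test on the gradient norm are simplifications assumed for brevity.

    object BFGSSolveSketch {
      type Vec = Vector[Double]
      type Mat = Vector[Vec]

      def dot (x: Vec, y: Vec): Double    = x.zip (y).map { case (a, b) => a * b }.sum
      def add (x: Vec, y: Vec): Vec       = x.zip (y).map { case (a, b) => a + b }
      def sub (x: Vec, y: Vec): Vec       = x.zip (y).map { case (a, b) => a - b }
      def scale (x: Vec, c: Double): Vec  = x.map (_ * c)
      def mul (m: Mat, x: Vec): Vec       = m.map (row => dot (row, x))
      def outer (x: Vec, y: Vec): Mat     = x.map (a => y.map (b => a * b))
      def mAdd (a: Mat, b: Mat): Mat      = a.zip (b).map { case (r, s) => add (r, s) }
      def mScale (a: Mat, c: Double): Mat = a.map (row => scale (row, c))
      def eye (n: Int): Mat = Vector.tabulate (n, n) ((i, j) => if (i == j) 1.0 else 0.0)

      def gradient (f: Vec => Double, x: Vec, eps: Double = 1e-6): Vec =
        Vector.tabulate (x.length) { i => (f (x.updated (i, x(i) + eps)) - f (x)) / eps }

      // Sherman-Morrison based inverse update; see the focused sketch further below
      def updateBinv (binv: Mat, s: Vec, y: Vec): Mat = {
        val sy = dot (s, y)
        if (math.abs (sy) < 1e-12) binv                  // skip a degenerate update
        else {
          val by = mul (binv, y)
          val t1 = mScale (outer (s, s), (sy + dot (y, by)) / (sy * sy))
          val t2 = mScale (mAdd (outer (by, s), outer (s, by)), -1.0 / sy)
          mAdd (mAdd (binv, t1), t2)
        }
      }

      def solve (f: Vec => Double, x0: Vec, step: Double = 1.0, toler: Double = 1e-6): Vec = {
        var x    = x0
        var binv = eye (x0.length)                       // approximate inverse Hessian
        var g    = gradient (f, x)
        var it   = 0
        while (math.sqrt (dot (g, g)) > toler && it < 200) {
          val dir = scale (mul (binv, g), -1.0)          // deflected steepest-descent direction
          var z   = step                                 // backtracking line search (assumed)
          while (f (add (x, scale (dir, z))) >= f (x) && z > 1e-12) z /= 2.0
          val s   = scale (dir, z)                       // step vector (next - current point)
          val xn  = add (x, s)
          val gn  = gradient (f, xn)
          binv = updateBinv (binv, s, sub (gn, g))       // difference in gradients (next - current)
          x = xn; g = gn; it += 1
        }
        x                                                // approximate minimizer
      }
    }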
Update the 'binv' matrix, which is used to deflect -gradient to a better search direction than steepest descent (-gradient). Compute the 'binv' matrix directly using the Sherman–Morrison formula.
the step vector (next point - current point)
the difference in the gradients (next - current)
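The focused sketch below spells out that direct update, using the same assumed Vec/Mat helpers as the solve sketch above. The guard that skips a near-zero 's . y' update is a common safeguard, assumed rather than stated here.

    // binv' = binv + ((s.y + y'binv y)/(s.y)^2) (s s') - (binv y s' + s y' binv)/(s.y)
    def updateBinv (binv: Mat, s: Vec, y: Vec): Mat = {
      val sy = dot (s, y)                                // curvature s . y
      if (math.abs (sy) < 1e-12) binv                    // skip a degenerate update (assumed guard)
      else {
        val by = mul (binv, y)                           // binv * y
        val t1 = mScale (outer (s, s), (sy + dot (y, by)) / (sy * sy))
        val t2 = mScale (mAdd (outer (by, s), outer (s, by)), -1.0 / sy)
        mAdd (mAdd (binv, t1), t2)                       // updated inverse approximation
      }
    }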
http://en.wikipedia.org/wiki/BFGS_method
Broyden–Fletcher–Goldfarb–Shanno (BFGS) Quasi-Newton Algorithm for solving Non-Linear Programming (NLP) problems. BFGS determines a search direction by deflecting the steepest descent direction vector (opposite the gradient) by multiplying it by a matrix that approximates the inverse Hessian. Note, this implementation may be set up to work with the matrix 'b' (approximate Hessian) or directly with the 'binv' matrix (the inverse of b).
minimize f(x) subject to g(x) <= 0 [ optionally g(x) == 0 ]
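A hypothetical usage sketch tying the pieces above together on a toy problem. The class name QuasiNewton and its method names follow the descriptions above but are assumptions; the test functions are illustrative only.

    // min f(x) = (x0 - 3)^2 + (x1 - 4)^2   subject to   g(x) = x0 + x1 - 5 <= 0
    val f: Vector[Double] => Double = x => (x(0) - 3) * (x(0) - 3) + (x(1) - 4) * (x(1) - 4)
    val g: Vector[Double] => Double = x => x(0) + x(1) - 5

    // assumed API, per the constructor parameters and solve description above:
    // val solver = new QuasiNewton (f, g)               // inequality constraint by default
    // val xStar  = solver.solve (Vector (0.0, 0.0))     // from starting point (0, 0)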