scalation.analytics

RidgeRegression

class RidgeRegression[MatT <: MatriD, VecT <: VectoD] extends Predictor with Error

The RidgeRegression class supports multiple linear regression. In this case, 'x' is multi-dimensional [x_1, ... x_k]. Both the input matrix 'x' and the response vector 'y' are centered (zero mean). Fit the parameter vector 'b' in the regression equation

y = b dot x + e = b_1 * x_1 + ... + b_k * x_k + e

where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

b = fac.solve (.) with regularization, i.e., solve '(x.t * x + λ * I) * b = x.t * y'

Four factorization techniques are provided:

'QR'       // QR Factorization: slower, more stable (default)
'Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice)
'SVD'      // Singular Value Decomposition: slowest, most robust
'Inverse'  // Inverse/Gaussian Elimination, classical textbook technique
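The regularized fit can be sketched in plain Scala, independent of ScalaTion. This sketch assumes centered inputs and substitutes Gaussian elimination for the four factorizations; all names here ('RidgeSolve', 'ridge', 'solve', etc.) are illustrative, not ScalaTion API:

```scala
// Minimal sketch of the regularized Least-Squares step:
// solve (x.t * x + λ * I) * b = x.t * y for the parameter vector b.
object RidgeSolve {
  type Mat = Array[Array[Double]]

  /** Compute 'x.t * x' (the n-by-n Gram matrix of the m-by-n matrix 'x'). */
  def xtx(x: Mat): Mat = {
    val (m, n) = (x.length, x(0).length)
    Array.tabulate(n, n)((i, j) => (0 until m).map(k => x(k)(i) * x(k)(j)).sum)
  }

  /** Compute 'x.t * y'. */
  def xty(x: Mat, y: Array[Double]): Array[Double] = {
    val (m, n) = (x.length, x(0).length)
    Array.tabulate(n)(i => (0 until m).map(k => x(k)(i) * y(k)).sum)
  }

  /** Solve 'a * b = c' by Gaussian elimination with partial pivoting. */
  def solve(a: Mat, c: Array[Double]): Array[Double] = {
    val n = a.length
    val m = a.map(_.clone); val d = c.clone
    for (k <- 0 until n) {
      val p = (k until n).maxBy(i => math.abs(m(i)(k)))   // pivot row
      val t = m(k); m(k) = m(p); m(p) = t
      val s = d(k); d(k) = d(p); d(p) = s
      for (i <- k + 1 until n) {
        val f = m(i)(k) / m(k)(k)
        for (j <- k until n) m(i)(j) -= f * m(k)(j)
        d(i) -= f * d(k)
      }
    }
    val b = new Array[Double](n)
    for (i <- n - 1 to 0 by -1)                           // back substitution
      b(i) = (d(i) - (i + 1 until n).map(j => m(i)(j) * b(j)).sum) / m(i)(i)
    b
  }

  /** Fit the ridge parameter vector for centered 'x' and 'y'. */
  def ridge(x: Mat, y: Array[Double], lambda: Double): Array[Double] = {
    val a = xtx(x)
    for (i <- a.indices) a(i)(i) += lambda                // x.t * x + λ * I
    solve(a, xty(x, y))
  }
}
```

With λ = 0 this reduces to ordinary least squares; increasing λ shrinks the coefficients toward zero.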

See also

statweb.stanford.edu/~tibs/ElemStatLearn/

Linear Supertypes
Error, Predictor, AnyRef, Any

Instance Constructors

  1. new RidgeRegression(x: MatT, y: VecT, lambda_: Double = 0.1, technique: RegTechnique = Cholesky)

    x

    the centered input/design m-by-n matrix NOT augmented with a first column of ones

    y

    the centered response vector

    lambda_

    the shrinkage parameter (0 => OLS) in the penalty term 'lambda * b dot b'

    technique

    the technique used to solve for b in x.t*x*b = x.t*y
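Since the constructor expects both 'x' and 'y' already centered (zero mean), a minimal sketch of that preprocessing step in plain Scala (the object and helper names are illustrative, not ScalaTion API):

```scala
// Centering: subtract each column's mean from 'x' and the mean of 'y'
// from 'y', so both have zero mean as the constructor requires.
object Centering {
  /** Center a response vector by subtracting its mean. */
  def center(y: Array[Double]): Array[Double] = {
    val mu = y.sum / y.length
    y.map(_ - mu)
  }

  /** Center each column of an m-by-n matrix by subtracting its column mean. */
  def centerCols(x: Array[Array[Double]]): Array[Array[Double]] = {
    val n  = x(0).length
    val mu = Array.tabulate(n)(j => x.map(_(j)).sum / x.length)
    x.map(row => Array.tabulate(n)(j => row(j) - mu(j)))
  }
}
```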

Type Members

  1. type Fac_QR = Fac_QR_H[MatT]

Value Members

  1. def backElim(): (Int, VectoD, VectorD)

Perform backward elimination to remove the least predictive variable from the model, returning the variable to eliminate, the new parameter vector and the new quality of fit.

  2. def coefficient: VectoD

Return the vector of coefficient/parameter values.

    Definition Classes
    Predictor
  3. def diagnose(yy: VectoD): Unit

Compute diagnostics for the regression model.

    yy

    the response vector

    Definition Classes
RidgeRegression → Predictor
  4. def fit: VectorD

Return the quality of fit.

    Definition Classes
RidgeRegression → Predictor
  5. def fitLabels: Seq[String]

Return the labels for the fit.

    Definition Classes
RidgeRegression → Predictor
  6. final def flaw(method: String, message: String): Unit
    Definition Classes
    Error
  7. def gcv(yy: VectoD): Double

Find an optimal value for the shrinkage parameter 'λ' using Generalized Cross Validation (GCV).

    yy

    the response vector
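A common form of the GCV criterion is GCV(λ) = (sse / m) / (1 - tr(H) / m)^2, where 'H' is the hat matrix, 'sse' the sum of squared errors and 'm' the number of rows; whether this exact form matches the implementation is an assumption. A minimal sketch of the score under that assumption:

```scala
// Sketch of the assumed GCV score (not necessarily the exact formula this
// method uses): the λ minimizing it balances fit (sse) against the
// effective degrees of freedom tr(H) of the hat matrix.
object GCV {
  def score(sse: Double, trH: Double, m: Int): Double =
    (sse / m) / math.pow(1.0 - trH / m, 2)
}
```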

  8. def predict(z: VectoD): Double

Predict the value of y = f(z) by evaluating the regression formula.

    z

    the new vector to predict

    Definition Classes
RidgeRegression → Predictor
  9. def predict(z: VectoI): Double

Given a new discrete data vector z, predict the y-value of f(z).

    z

    the vector to use for prediction

    Definition Classes
    Predictor
  10. def residual: VectoD

Return the vector of residuals/errors.

    Definition Classes
    Predictor
  11. def train(): Unit

Train the predictor by fitting the parameter vector (b-vector) in the multiple regression equation using the least squares method on 'y'.

    Definition Classes
RidgeRegression → Predictor
  12. def train(yy: VectoD): Unit

Train the predictor by fitting the parameter vector (b-vector) in the multiple regression equation

    yy = b dot x + e = [b_1, ... b_k] dot [x_1, ... x_k] + e

    using the least squares method.

    yy

    the response vector

    Definition Classes
RidgeRegression → Predictor
  13. def vif: VectorD

Compute the Variance Inflation Factor 'VIF' for each variable to test for multi-collinearity by regressing 'xj' against the rest of the variables. A VIF over 10 indicates that over 90% of the variance of 'xj' can be predicted from the other variables, so 'xj' is a candidate for removal from the model.
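The VIF relation described above can be sketched directly: VIF_j = 1 / (1 - R_j^2), where R_j^2 is the coefficient of determination from regressing 'xj' on the remaining variables (plain Scala; the helper is illustrative, not the ScalaTion implementation):

```scala
// VIF from the R-squared of regressing one column on the rest:
// R^2 near 1 (column well predicted by the others) blows VIF up.
object VIF {
  def vif(rSq: Double): Double = 1.0 / (1.0 - rSq)
}
```

R_j^2 = 0.9 gives VIF = 10, matching the removal threshold mentioned above.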

  14. def xtx_λI(λ: Double): Unit

    Compute the regularized matrix 'x.t * x + λ * I' for shrinkage parameter 'λ'.