
scalation.analytics.par

PolyRegression

class PolyRegression extends Predictor with Error

The PolyRegression class supports polynomial regression. In this case, 't' is expanded to [1, t, t^2, ..., t^k]. Fit the parameter vector 'b' in the regression equation

y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 + ... + b_k * t^k + e

where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

b = x_pinv * y

where 'x_pinv' is the pseudo-inverse.

See also

www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
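
A minimal usage sketch: the sample data is hypothetical and the VectorD import (here assumed to come from scalation.linalgebra) may differ between ScalaTion versions.

    import scalation.analytics.par.PolyRegression
    import scalation.linalgebra.VectorD

    object PolyRegressionExample extends App
    {
        // hypothetical sample data generated from y = 2 + 3t + 0.5 t^2
        val t = VectorD (1.0, 2.0, 3.0, 4.0, 5.0, 6.0)
        val y = VectorD (5.5, 10.0, 15.5, 22.0, 29.5, 38.0)

        val prg = new PolyRegression (t, y, 2)            // quadratic fit (k = 2), default QR technique
        prg.train ()                                      // fit the parameter vector b
        prg.eval ()                                       // compute residuals and diagnostics
        println ("coefficient = " + prg.coefficient)      // expect approximately [2.0, 3.0, 0.5]
        println ("fit         = " + prg.fit)              // quality of fit, including rSquared
        println ("predict (7) = " + prg.predict (7.0))    // expect approximately 47.5
    }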

Linear Supertypes
Error, Predictor, AnyRef, Any

Instance Constructors

  1. new PolyRegression(t: VectorD, y: VectorD, k: Int, technique: RegTechnique = QR)

    t

    the input vector: t_i expands to x_i = [1, t_i, t_i^2, ..., t_i^k]

    y

    the response vector

    k

    the order of the polynomial

    technique

    the technique used to solve for b in x.t*x*b = x.t*y
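
    With the default QR technique, the expanded data matrix is factored as x = Q * R (Q orthogonal, R upper triangular), so the normal equations x.t*x*b = x.t*y reduce to R * b = Q.t * y, which is solved by back-substitution. Other RegTechnique values (which ones are available depends on the ScalaTion version) solve the normal equations directly.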

Value Members

  1. def backElim(): (Int, VectoD, VectoD)

    Perform backward elimination to remove the least predictive variable from the model, returning the variable to eliminate, the new parameter vector, the new R-squared value and the new F statistic.

  2. def coefficient: VectoD

    Return the vector of coefficient/parameter values.

    Definition Classes
    Predictor
  3. def eval(): Unit

    Compute the error and useful diagnostics.

    Definition Classes
    PolyRegression → Predictor
  4. def eval(xx: MatriD, yy: VectoD): Unit

    Compute the error and useful diagnostics for the test dataset.

    xx

    the test data matrix

    yy

    the test response vector FIX - implement in classes

    Definition Classes
    Predictor
  5. def expand(t: Double): VectorD

    Expand the scalar 't' into a vector of powers of 't': [1, t, t^2, ..., t^k].

    t

    the scalar to expand into the vector
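
    A hypothetical standalone sketch of this expansion in plain Scala (for illustration only; not the library's own implementation):

        def expandPowers (t: Double, k: Int): Array [Double] =
            Array.tabulate (k + 1) (j => math.pow (t, j))        // [1, t, t^2, ..., t^k]

        expandPowers (3.0, 4)                                    // Array(1.0, 3.0, 9.0, 27.0, 81.0)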

  6. def fit: VectoD

    Return the quality of fit including 'rSquared'.

  7. final def flaw(method: String, message: String): Unit
    Definition Classes
    Error
  8. def predict(z: VectoD): Double

    Predict the value of y = f(z) by evaluating the formula y = b dot z, e.g., (b_0, b_1, b_2) dot (1, z_1, z_2).

    z

    the new vector to predict

    Definition Classes
    PolyRegression → Predictor
  9. def predict(z: Double): Double

    Predict the value of y = f(z) by evaluating the formula y = b dot expand (z), e.g., (b_0, b_1, b_2) dot (1, z, z^2).

    z

    the new scalar to predict
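
    A hypothetical worked example of the b dot expand (z) computation in plain Scala (illustration only, with made-up coefficients):

        val b    = Array (2.0, 3.0, 0.5)                 // fitted coefficients b_0, b_1, b_2
        val z    = 4.0
        val zExp = Array (1.0, z, z * z)                 // expand (z) = [1, z, z^2]
        val yHat = (b zip zExp).map { case (bi, zi) => bi * zi }.sum    // 2 + 12 + 8 = 22.0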

  10. def predict(z: VectoI): Double

    Given a new discrete data vector z, predict the y-value of f(z).

    z

    the vector to use for prediction

    Definition Classes
    Predictor
  11. def residual: VectoD

    Return the vector of residuals/errors.

    Definition Classes
    Predictor
  12. val rg: Regression
  13. def train(yy: VectoD): Regression

    Retrain the predictor by fitting the parameter vector (b-vector) in the multiple regression equation yy = b dot x + e = [b_0, ..., b_k] dot [1, t, t^2, ..., t^k] + e using the least squares method.

    yy

    the new response vector

    Definition Classes
    PolyRegression → Predictor
  14. def train(): Regression

    Train the predictor by fitting the parameter vector (b-vector) in the regression equation y = b dot x + e = [b_0, ..., b_k] dot [1, t, t^2, ..., t^k] + e using the least squares method.

  15. def vif: VectorD

    Compute the Variance Inflation Factor 'VIF' for each variable to test for multi-collinearity by regressing 'xj' against the rest of the variables. A VIF over 10 indicates that over 90% of the variance of 'xj' can be predicted from the other variables, so 'xj' is a candidate for removal from the model.
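
    The threshold follows from VIF_j = 1 / (1 - rSq_j), where rSq_j is the R-squared from regressing 'xj' on the remaining columns, so rSq_j = 0.9 gives VIF_j = 10. A one-line illustration of the formula (not the library's own method):

        def vifFromRSquared (rSq: Double): Double = 1.0 / (1.0 - rSq)

        vifFromRSquared (0.9)                            // 10.0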

  16. val x: MatrixD