
package par

The par package contains classes, traits and objects for parallel analytics including clustering and prediction.

Linear Supertypes
AnyRef, Any

Type Members

  1. class ANCOVA extends Regression

    The ANCOVA class supports ANalysis of COVAriance 'ANCOVA'. It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'd_j' to distinguish the treatment levels. The problem is again to fit the parameter vector 'b' in the augmented regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 + ... b_k * x_k + b_k+1 * d_1 + b_k+2 * d_2 + ... b_k+l * d_l + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    't' has categorical values/levels, e.g., treatment levels (0, ... 't.max ()')

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
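
    As a concrete illustration of the dummy-variable step, the following standalone sketch expands a categorical treatment column into indicator columns and appends them to the data rows. It is an illustrative sketch only, not the ScalaTion 'par' API; the object and method names below are hypothetical, and level 0 is assumed to be the baseline level.

    object ANCOVADummySketch extends App
    {
        // expand a categorical treatment column 't' (levels 0 .. lvls-1) into
        // lvls-1 indicator columns, one per non-baseline level
        def dummies (t: Array [Int], lvls: Int): Array [Array [Double]] =
            t.map (ti => Array.tabulate (lvls - 1) (j => if (ti == j + 1) 1.0 else 0.0))

        val x = Array (Array (1.0, 2.0, 3.0),              // data rows, first column is the intercept
                       Array (1.0, 4.0, 5.0),
                       Array (1.0, 6.0, 7.0))
        val t = Array (0, 1, 2)                            // treatment level for each row
        val xAug = (x zip dummies (t, 3)).map { case (xi, di) => xi ++ di }
        xAug.foreach (row => println (row.mkString ("[", ", ", "]")))
    } // ANCOVADummySketch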

  2. class PolyRegression extends PredictorVec

    The PolyRegression class supports polynomial regression. In this case, 't' is expanded to [1, t, t^2 ... t^k]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 ... b_k * t^k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y

    where 'x_pinv' is the pseudo-inverse.

    See also

    www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
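
    The term expansion itself is simple; the following standalone sketch (illustrative only, not the ScalaTion API; names are hypothetical) builds the expanded vector [1, t, t^2, ..., t^k] for a given scalar 't'.

    object PolyExpandSketch extends App
    {
        // expand scalar 't' into the vector [1, t, t^2, ..., t^k]
        def expand (t: Double, k: Int): Array [Double] =
            Array.tabulate (k + 1) (j => math.pow (t, j))

        println (expand (2.0, 4).mkString ("[", ", ", "]"))   // prints [1.0, 2.0, 4.0, 8.0, 16.0]
    } // PolyExpandSketch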

  3. abstract class PredictorVec extends Predictor with Error

    The PredictorVec class supports term expanded regression (work is delegated to the Regression class). Fit the parameter vector 'b' in the regression equation. Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)
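
    The Normal Equations can be illustrated with the following standalone sketch, which forms x.t * x and x.t * y explicitly and solves the resulting square system by Gaussian elimination. It is an illustrative sketch only (no pivoting, names hypothetical); the ScalaTion classes instead delegate the solve to a matrix factorization.

    object NormalEquationsSketch extends App
    {
        type Mat = Array [Array [Double]]                    // matrix stored as an array of rows

        def xtx (x: Mat): Mat =                              // x.t * x
            Array.tabulate (x(0).length, x(0).length) ((i, j) => x.map (r => r(i) * r(j)).sum)

        def xty (x: Mat, y: Array [Double]): Array [Double] = // x.t * y
            Array.tabulate (x(0).length) (i => (x zip y).map { case (r, yi) => r(i) * yi }.sum)

        def gaussSolve (a0: Mat, b0: Array [Double]): Array [Double] =
        {
            val a = a0.map (_.clone); val b = b0.clone; val n = b.length
            for (p <- 0 until n; i <- p + 1 until n) {       // forward elimination (no pivoting)
                val f = a(i)(p) / a(p)(p)
                for (j <- p until n) a(i)(j) -= f * a(p)(j)
                b(i) -= f * b(p)
            } // for
            val sol = new Array [Double] (n)
            for (i <- n - 1 to 0 by -1)                      // back substitution
                sol(i) = (b(i) - (i + 1 until n).map (j => a(i)(j) * sol(j)).sum) / a(i)(i)
            sol
        } // gaussSolve

        val x = Array (Array (1.0, 1.0), Array (1.0, 2.0), Array (1.0, 3.0))
        val y = Array (1.0, 3.0, 3.0)
        val b = gaussSolve (xtx (x), xty (x, y))             // solve x.t * x * b = x.t * y
        println (b.mkString ("[", ", ", "]"))                // ~ [0.333, 1.0]
    } // NormalEquationsSketch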

  4. class Regression extends PredictorMat

    The Regression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y [ alternative: b = solve (y) ]

    where 'x_pinv' is the pseudo-inverse. Three techniques are provided:

    'Fac_QR'       // QR Factorization: slower, more stable (default)
    'Fac_Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice)
    'Inverse'      // Inverse/Gaussian Elimination, classical textbook technique (outdated)

    This version uses parallel processing to speed up execution.

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
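
    A minimal standalone sketch of the QR route (classical Gram-Schmidt, no pivoting, full-rank columns assumed) is shown below. It is illustrative only and is not the 'Fac_QR' implementation used by the class; names are hypothetical. QR avoids forming x.t * x explicitly, which is why it is the more numerically stable default.

    object QRLeastSquaresSketch extends App
    {
        type Cols = Array [Array [Double]]                   // matrix stored as an array of columns

        // classical Gram-Schmidt QR factorization of the column set 'a'
        def qr (a: Cols): (Cols, Array [Array [Double]]) =
        {
            val n = a.length
            val q = Array.ofDim [Array [Double]] (n)
            val r = Array.fill (n, n) (0.0)
            for (j <- 0 until n) {
                var v = a(j).clone
                for (i <- 0 until j) {
                    r(i)(j) = (q(i) zip a(j)).map { case (qi, aj) => qi * aj }.sum
                    v = (v zip q(i)).map { case (vk, qk) => vk - r(i)(j) * qk }
                } // for
                r(j)(j) = math.sqrt (v.map (e => e * e).sum)
                q(j) = v.map (_ / r(j)(j))
            } // for
            (q, r)
        } // qr

        // solve r * b = q.t * y by back substitution
        def solve (q: Cols, r: Array [Array [Double]], y: Array [Double]): Array [Double] =
        {
            val n   = q.length
            val qty = q.map (col => (col zip y).map { case (c, yi) => c * yi }.sum)
            val b   = new Array [Double] (n)
            for (i <- n - 1 to 0 by -1)
                b(i) = (qty(i) - (i + 1 until n).map (j => r(i)(j) * b(j)).sum) / r(i)(i)
            b
        } // solve

        val cols   = Array (Array (1.0, 1.0, 1.0), Array (1.0, 2.0, 3.0))   // columns [1 | x_1]
        val y      = Array (1.0, 3.0, 3.0)
        val (q, r) = qr (cols)
        println (solve (q, r, y).mkString ("[", ", ", "]"))  // ~ [0.333, 1.0]
    } // QRLeastSquaresSketch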

  5. class RidgeRegression extends PredictorMat

    The RidgeRegression class supports multiple linear regression. In this case, 'x' is multi-dimensional [x_1, ... x_k]. Both the input matrix 'x' and the response vector 'y' are centered (zero mean). Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y [ alternative: b = solve (y) ]

    where 'x_pinv' is the pseudo-inverse. Three techniques are provided:

    'Fac_QR'       // QR Factorization: slower, more stable (default)
    'Fac_Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice)
    'Inverse'      // Inverse/Gaussian Elimination, classical textbook technique (outdated)

    This version uses parallel processing to speed up execution.

    See also

    statweb.stanford.edu/~tibs/ElemStatLearn/
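
    Ridge regression typically adds a shrinkage parameter 'lambda' to the diagonal of x.t * x before solving. The standalone sketch below centers 'x' and 'y' and then solves (x.t * x + lambda * I) * b = x.t * y by Gaussian elimination. It is an illustrative sketch under that assumption only, not the ScalaTion API; the names and the choice of 'lambda' are hypothetical.

    object RidgeSketch extends App
    {
        type Mat = Array [Array [Double]]                    // matrix stored as an array of rows

        def center (x: Mat): Mat =                           // subtract the column means
        {
            val mu = x(0).indices.map (j => x.map (_(j)).sum / x.length)
            x.map (row => row.indices.map (j => row(j) - mu(j)).toArray)
        } // center

        // solve (x.t * x + lambda * I) * b = x.t * y by Gaussian elimination (no pivoting)
        def ridge (x: Mat, y: Array [Double], lambda: Double): Array [Double] =
        {
            val n = x(0).length
            val a = Array.tabulate (n, n) ((i, j) =>
                        x.map (r => r(i) * r(j)).sum + (if (i == j) lambda else 0.0))
            val b = Array.tabulate (n) (i => (x zip y).map { case (r, yi) => r(i) * yi }.sum)
            for (p <- 0 until n; i <- p + 1 until n) {
                val f = a(i)(p) / a(p)(p)
                for (j <- p until n) a(i)(j) -= f * a(p)(j)
                b(i) -= f * b(p)
            } // for
            val sol = new Array [Double] (n)
            for (i <- n - 1 to 0 by -1)
                sol(i) = (b(i) - (i + 1 until n).map (j => a(i)(j) * sol(j)).sum) / a(i)(i)
            sol
        } // ridge

        val x  = Array (Array (1.0, 2.0), Array (2.0, 1.0), Array (3.0, 3.0))
        val y  = Array (3.0, 3.0, 6.0)
        val xc = center (x)                                  // center the predictor columns
        val ym = y.sum / y.length
        val yc = y.map (_ - ym)                              // center the response
        println (ridge (xc, yc, 0.1).mkString ("[", ", ", "]"))
    } // RidgeSketch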

Value Members

  1. val BASE_DIR: String

    The relative path for the base directory.

  2. object ANCOVA extends Error

    The ANCOVA companion object provides helper functions.

  3. object ANCOVATest extends App

    The ANCOVATest object tests the ANCOVA class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

    > runMain scalation.analytics.ANCOVATest

  4. object ANCOVATest2 extends App

    The ANCOVATest2 object tests the ANCOVA object related to encoding a column 'x1' of strings.

    > runMain scalation.analytics.ANCOVATest2

  5. object PolyRegressionTest extends App

    The PolyRegressionTest object tests the PolyRegression class using the following regression equation.

    y = b dot x = b_0 + b_1*t + b_2*t^2.

  6. object PredictorMatO

    The PredictorMat companion object provides a method for splitting a combined data matrix into a predictor matrix and a response vector.

  7. object RegressionTest extends App

    The RegressionTest object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    Test regression and backward elimination.

    See also

    http://statmaster.sdu.dk/courses/st111/module03/index.html

  8. object RegressionTest2 extends App

    The RegressionTest2 object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    Test regression using QR Decomposition and Gaussian Elimination for computing the pseudo-inverse.

  9. object RegressionTest3 extends App

    The RegressionTest3 object tests the multi-collinearity method in the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4 * x_4

    See also

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt

  10. object RidgeRegression

    The RidgeRegression companion object is used to center the input matrix 'x'. This is done by subtracting the column means from each value.

  11. object RidgeRegressionTest extends App

    The RidgeRegressionTest object tests the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2.

    Test regression and backward elimination.

    See also

    http://statmaster.sdu.dk/courses/st111/module03/index.html

  12. object RidgeRegressionTest2 extends App

    The RidgeRegressionTest2 object tests the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2.

    Test regression using QR Decomposition and Gaussian Elimination for computing the pseudo-inverse.

  13. object RidgeRegressionTest3 extends App

    The RidgeRegressionTest3 object tests the multi-collinearity method in the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4 * x_4

    See also

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt
