Packages

package par

The par package contains classes, traits and objects for parallel analytics including clustering and prediction.

Linear Supertypes
AnyRef, Any

Type Members

  1. class ANCOVA extends Predictor with Error

    The ANCOVA class supports ANalysis of COVAriance (ANCOVA). It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'dj' to distinguish the treatment level. The problem is again to fit the parameter vector 'b' in the augmented regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 + ... + b_k * x_k + b_{k+1} * d_1 + b_{k+2} * d_2 + ... + b_{k+l} * d_l + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y

    where 'x_pinv' is the pseudo-inverse.

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
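
    As a rough illustration of the dummy-variable encoding (a plain Scala fragment, not the ANCOVA class's actual API; the name 'dummyVars' is assumed), a treatment column with levels 0 .. l is expanded into l dummy columns, level 0 being the baseline:

        // expand treatment levels 0 .. l into l dummy columns d_1 .. d_l;
        // level 0 (the baseline) maps to all zeros
        def dummyVars (t: Array [Int], l: Int): Array [Array [Double]] =
            t.map (level => Array.tabulate (l) (j => if (level == j + 1) 1.0 else 0.0))

        val d = dummyVars (Array (0, 1, 2, 1), l = 2)
        // rows: [0,0], [1,0], [0,1], [1,0] -- appended as extra columns of 'x'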

  2. trait GLM extends AnyRef

    A General Linear Model 'GLM' can be developed using the GLM trait and object (see below). The implementation currently supports univariate models; multivariate models (where each response is a vector) are planned for the future. This version uses parallel processing to speed up execution. It provides factory methods for the following special types of GLMs:

    Regression - multiple linear regression
    RidgeRegression - robust multiple linear regression
    TranRegression - transformed (e.g., log) multiple linear regression
    PolyRegression - polynomial regression
    TrigRegression - trigonometric regression
    ResponseSurface - response surface regression
    ANCOVA - GLM form of ANalysis of COVAriance

    The following special types are excluded since they do not utilize large matrices:

    SimpleRegression - simple linear regression
    ANOVA - GLM form of ANalysis Of VAriance
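
    The trait-plus-object idiom mentioned above can be sketched as follows; the names and signatures are illustrative only, not ScalaTion's actual API:

        // a trait holding factory-style methods that other classes may inherit
        trait GLMLike
        {
            def regression (x: Array [Array [Double]], y: Array [Double]): String =
                "stub: would construct a multiple linear regression model here"
        } // GLMLike

        object GLMLike extends GLMLike      // call the methods directly: GLMLike.regression (x, y)

        class ModelBuilder extends GLMLike  // ... or inherit them by extending the trait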

  3. class PolyRegression extends Predictor with Error

    The PolyRegression class supports polynomial regression. In this case, 't' is expanded to [1, t, t^2 ... t^k]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 + ... + b_k * t^k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y

    where 'x_pinv' is the pseudo-inverse.

    See also

    www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
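
    The expansion itself is a one-liner; as an illustrative plain-Scala fragment (independent of the PolyRegression class, with the name 'polyRow' assumed):

        // expand a predictor value t into the row [1, t, t^2, ..., t^k]
        def polyRow (t: Double, k: Int): Array [Double] =
            Array.tabulate (k + 1) (j => math.pow (t, j))

        polyRow (2.0, 3)                    // Array(1.0, 2.0, 4.0, 8.0)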

  4. class Regression extends Predictor with Error

    The Regression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y [ alternative: b = solve (y) ]

    where 'x_pinv' is the pseudo-inverse. Three techniques are provided:

    'Fac_QR'         // QR Factorization: slower, more stable (default)
    'Fac_Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
    'Inverse'        // Inverse/Gaussian Elimination, classical textbook technique (outdated)

    This version uses parallel processing to speed up execution.

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
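
    A minimal sketch of the least-squares fit in plain Scala is given below; it forms the normal equations (x.t * x) * b = x.t * y and solves them with unpivoted Gaussian elimination (roughly the 'Inverse' route above), kept short for illustration, whereas the class itself favors the more stable QR or Cholesky factorizations. The name 'lsFit' is assumed.

        // fit b by forming the normal equations and solving with Gaussian elimination
        def lsFit (x: Array [Array [Double]], y: Array [Double]): Array [Double] =
        {
            val n = x(0).length
            val a = Array.ofDim [Double] (n, n + 1)            // augmented [x.t*x | x.t*y]
            for (i <- 0 until n; j <- 0 until n) a(i)(j) = x.indices.map (r => x(r)(i) * x(r)(j)).sum
            for (i <- 0 until n) a(i)(n) = x.indices.map (r => x(r)(i) * y(r)).sum
            for (p <- 0 until n; i <- p + 1 until n) {         // forward elimination (no pivoting)
                val f = a(i)(p) / a(p)(p)
                for (j <- p to n) a(i)(j) -= f * a(p)(j)
            }
            val b = Array.ofDim [Double] (n)                   // back substitution
            for (i <- n - 1 to 0 by -1)
                b(i) = (a(i)(n) - (i + 1 until n).map (j => a(i)(j) * b(j)).sum) / a(i)(i)
            b
        } // lsFit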

  5. class ResponseSurface extends Predictor with Error

    The ResponseSurface class uses multiple regression to fit a quadratic/cubic surface to the data. For example in 2D, the quadratic regression equation is

    y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_0*x_1, x_1^2] + e

    See also

    scalation.metamodel.QuadraticFit
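
    Building the quadratic terms for a 2D point is direct; an illustrative fragment (the name 'quadRow2D' is assumed, not part of the class):

        // the quadratic response-surface row [1, x_0, x_0^2, x_1, x_0*x_1, x_1^2]
        def quadRow2D (x0: Double, x1: Double): Array [Double] =
            Array (1.0, x0, x0 * x0, x1, x0 * x1, x1 * x1)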

  6. class RidgeRegression extends Predictor with Error

    The RidgeRegression class supports multiple linear regression. In this case, 'x' is multi-dimensional [x_1, ... x_k]. Both the input matrix 'x' and the response vector 'y' are centered (zero mean). Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y [ alternative: b = solve (y) ]

    where 'x_pinv' is the pseudo-inverse. Three techniques are provided:

    'Fac_QR'         // QR Factorization: slower, more stable (default)
    'Fac_Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
    'Inverse'        // Inverse/Gaussian Elimination, classical textbook technique (outdated)

    This version uses parallel processing to speed up execution.

    See also

    statweb.stanford.edu/~tibs/ElemStatLearn/

  7. class TranRegression extends Predictor with Error

    The TranRegression class supports transformed multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the transformed regression equation

    transform (y) = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model) and 'transform' is the function (defaults to log) used to transform the response vector 'y'. Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y

    where 'x_pinv' is the pseudo-inverse.

    See also

    www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
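
    The idea can be sketched in a few lines (illustrative only, not the class's API; the names 'transform', 'tInverse' and 'transformResponse' are assumed):

        def transform (y: Double): Double = math.log (y)       // default transform
        def tInverse  (z: Double): Double = math.exp (z)       // its inverse

        // fit the parameter vector b against the transformed response ...
        def transformResponse (y: Array [Double]): Array [Double] = y.map (transform)

        // ... and report a prediction yp made on the transformed scale as
        // tInverse (yp) on the original scale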

  8. class TrigRegression extends Predictor with Error

    The TrigRegression class supports trigonometric regression. In this case, 't' is expanded to [1, sin (wt), cos (wt), sin (2wt), cos (2wt), ...]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 sin (wt) + b_2 cos (wt) + b_3 sin (2wt) + b_4 cos (2wt) + ... + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector

    b = x_pinv * y

    where 'x_pinv' is the pseudo-inverse.

    See also

    link.springer.com/article/10.1023%2FA%3A1022436007242#page-1
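
    The expansion can be sketched as follows (a plain Scala fragment, independent of the class; the names 'trigRow', 'w' for the base frequency and 'k' for the number of harmonics are assumed):

        // expand t into [1, sin(wt), cos(wt), sin(2wt), cos(2wt), ..., sin(kwt), cos(kwt)]
        def trigRow (t: Double, w: Double, k: Int): Array [Double] =
            Array (1.0) ++ (1 to k).flatMap (j => Seq (math.sin (j * w * t), math.cos (j * w * t)))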

Value Members

  1. val BASE_DIR: String

    The relative path for the base directory.

  2. object ANCOVATest extends App

    The ANCOVATest object tests the ANCOVA class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

  3. object GLM extends GLM

    The GLM object makes the GLM trait's methods directly available. This approach (using traits and objects) allows the methods to also be inherited.

  4. object GLMTest extends App

    The GLMTest object tests the GLM object using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

  5. object PolyRegressionTest extends App

    The PolyRegressionTest object tests the PolyRegression class using the following regression equation.

    y = b dot x = b_0 + b_1*t + b_2*t^2.

  6. object RegressionTest extends App

    The RegressionTest object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    Test regression and backward elimination.

    See also

    http://statmaster.sdu.dk/courses/st111/module03/index.html

  7. object RegressionTest2 extends App

    The RegressionTest2 object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    Test regression using QR Decomposition and Gaussian Elimination for computing the pseudo-inverse.

  8. object RegressionTest3 extends App

    The RegressionTest3 object tests the multi-collinearity method in the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4

    See also

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

  9. object ResponseSurfaceTest extends App

    The ResponseSurfaceTest object is used to test the ResponseSurface class.

  10. object RidgeRegression

    The RidgeRegression companion object is used to center the input matrix 'x'. This is done by subtracting the column means from each value.
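
    A minimal sketch of the centering step on plain arrays (not the companion object's actual signature; the name 'center' is assumed):

        // subtract each column's mean, so every column of the result has zero mean
        def center (x: Array [Array [Double]]): Array [Array [Double]] =
        {
            val m     = x.length.toDouble
            val means = x(0).indices.map (j => x.map (_ (j)).sum / m)
            x.map (row => row.indices.map (j => row(j) - means(j)).toArray)
        } // center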

  11. object RidgeRegressionTest extends App

    The RidgeRegressionTest object tests the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2.

    Test regression and backward elimination.

    See also

    http://statmaster.sdu.dk/courses/st111/module03/index.html

  12. object RidgeRegressionTest2 extends App

    The RidgeRegressionTest2 object tests the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2.

    Test regression using QR Decomposition and Gaussian Elimination for computing the pseudo-inverse.

  13. object RidgeRegressionTest3 extends App

    The RidgeRegressionTest3 object tests the multi-collinearity method in the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4

    See also

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

  14. object TranRegressionTest extends App

    The TranRegressionTest object tests the TranRegression class using the following regression equation.

    log (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2.

  15. object TrigRegressionTest extends App

    The TrigRegressionTest object tests the TrigRegression class using the following regression equation.

    y = b dot x = b_0 + b_1*t + b_2*t^2.
