package analytics
The analytics package contains classes, traits and objects for analytics, including clustering and prediction.
Type Members
- class ANCOVA extends Predictor with Error
The ANCOVA class supports ANalysis of COVAriance 'ANCOVA'. It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'd_j' to distinguish the treatment levels. The problem is again to fit the parameter vector 'b' in the augmented regression equation
y = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 + ... + b_k * x_k + b_k+1 * d_1 + b_k+2 * d_2 + ... + b_k+l * d_l + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
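To make the dummy-variable encoding concrete, the sketch below expands a treatment column with three levels into two zero/one dummy columns (a minimal sketch in plain Scala; the names and data are made up and this is not the ANCOVA class's actual API):
object DummyEncodingSketch extends App
{
    // treatment level for each of m data points (levels 0 .. lvls-1)
    val t    = Array (0, 2, 1, 0, 2)
    val lvls = 3

    // one dummy column per non-baseline level: d_j = 1 if t_i == j, else 0
    def dummies (ti: Int): Array [Double] =
        (1 until lvls).map (j => if (ti == j) 1.0 else 0.0).toArray

    val d = t.map (dummies)                                // m-by-(lvls-1) dummy matrix
    d.foreach (row => println (row.mkString (" ")))
} // DummyEncodingSketch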
- class ANOVA1 extends PredictorVec
The ANOVA1 class supports one-way ANalysis Of VAriance (ANOVA), i.e., it allows only one binary/categorical treatment variable. It is framed using General Linear Model 'GLM' notation and supports the use of one binary/categorical treatment variable 't'. This is done by introducing dummy variables 'd_j' to distinguish the treatment levels. The problem is again to fit the parameter vector 'b' in the following equation
y = b dot x + e = b_0 + b_1 * d_1 + b_2 * d_2 + ... + b_k * d_k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
psych.colorado.edu/~carey/Courses/PSYC5741/handouts/GLM%20Theory.pdf
ANCOVA
for models with multiple variables
- class CanCorrelation extends Reducer with Error
The CanCorrelation class performs Canonical Correlation Analysis 'CCA' on two random vectors. Samples for the first one are stored in the 'x' data matrix and samples for the second are stored in the 'y' data matrix. Find vectors a and b that maximize the correlation between x * a and y * b:
max { rho (x * a, y * b) }
Additional vectors orthogonal to a and b can also be found.
- class ExpRegression extends PredictorMat
The ExpRegression class supports exponential regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the exponential regression equation
log (mu (x)) = b dot x = b_0 + b_1 * x_1 + ... + b_k * x_k
- See also
www.stat.uni-muenchen.de/~leiten/Lehre/Material/GLM_0708/chapterGLM.pdf
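Under this log link, a fitted parameter vector 'b' predicts the mean response as exp (b dot x). A minimal sketch of that prediction step (plain Scala with a made-up 'b'; not the ExpRegression class's API):
object ExpLinkSketch extends App
{
    val b = Array (0.5, 0.3)                               // hypothetical fitted parameters
    val x = Array (1.0, 2.0)                               // input [1, x_1]

    def dot (a: Array [Double], c: Array [Double]) = a.zip (c).map { case (u, v) => u * v }.sum

    val mu = math.exp (dot (b, x))                         // mu(x) = exp (b dot x)
    println (s"predicted mean mu(x) = $mu")
} // ExpLinkSketch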
- class Fit extends AnyRef
The Fit class provides methods to determine basic quality of fit measures.
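Typical measures include the Sum of Squared Errors 'SSE' and the coefficient of determination 'R^2'. A minimal sketch of how such measures are computed from a response vector 'y' and predictions 'yp' (illustrative only; not the Fit class's actual methods):
object FitSketch extends App
{
    val y  = Array (1.0, 2.0, 3.0, 4.0)                    // actual responses
    val yp = Array (1.1, 1.9, 3.2, 3.8)                    // predicted responses

    val ybar = y.sum / y.length
    val sse  = y.zip (yp).map { case (a, p) => (a - p) * (a - p) }.sum   // error sum of squares
    val sst  = y.map (a => (a - ybar) * (a - ybar)).sum                  // total sum of squares
    val rSq  = 1.0 - sse / sst                                           // R-squared

    println (s"sse = $sse, rSq = $rSq")
} // FitSketch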
- trait GLM extends AnyRef
A General Linear Model 'GLM' can be developed using the GLM trait and object (see below). The implementation currently supports univariate models, with multivariate models (where each response is a vector) planned for the future. It provides factory methods for the following special types of GLMs:
SimpleRegression - simple linear regression
Regression - multiple linear regression using Ordinary Least Squares 'OLS'
Regression_WLS - multiple linear regression using Weighted Least Squares 'WLS'
RidgeRegression - robust multiple linear regression
TranRegression - transformed (e.g., log) multiple linear regression
PolyRegression - polynomial regression
TrigRegression - trigonometric regression
ResponseSurface - response surface regression
ANOVA1 - GLM form of ANalysis Of VAriance
ANCOVA - GLM form of ANalysis of COVAriance
- class LassoRegression extends PredictorMat
The LassoRegression class supports multiple linear regression with an L1 (Lasso) penalty on the parameters. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * x_1 + ... + b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model).
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
- class NMFactorization extends AnyRef
The NMFactorization class factors a matrix 'v' into two non-negative matrices 'w' and 'h' such that 'v = wh' approximately.
- See also
en.wikipedia.org/wiki/Non-negative_matrix_factorization
- abstract class NeuralNet extends Predictor with Error
The NeuralNet abstract class provides the basic structure and API for a variety of Neural Networks.
- class NeuralNet_2L extends NeuralNet
The NeuralNet_2L class supports multi-output, 2-layer (input and output) Neural Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weights/parameters 'bb' connecting the layers, so that for a new input vector 'z', the net can predict the output value, i.e.,
yp_j = f (bb dot z)
where 'f' is the activation function and the parameter matrix 'bb' gives the weights between the input and output layers. Note, 'b0' is treated as the bias, so 'x0' must be 1.0.
- class NeuralNet_3L extends NeuralNet
The NeuralNet_3L class supports multi-output, 3-layer (input, hidden and output) Neural Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weights/parameters 'aa' and 'bb' connecting the layers, so that for a new input vector 'v', the net can predict the output value, i.e.,
yp = f2 (bb * f1V (aa * v))
where 'f1' and 'f2' are the activation functions and the parameter matrices 'aa' and 'bb' give the weights between the input-hidden and hidden-output layers. Note, if 'a0' is to be treated as bias/intercept, 'x0' must be 1.0.
- class NeuralNet_XL extends Predictor with Error
The NeuralNet_XL class supports basic 3-layer (input, hidden and output) Neural Networks. Given several input and output vectors (training data), fit the weights connecting the layers, so that for a new input vector 'zi', the net can predict the output vector 'zo' ('zh' is the intermediate value at the hidden layer), i.e.,
zi --> zh = f (w * zi) --> zo = g (v * zh)
Note, w_0 and v_0 are treated as biases, so zi_0 and zh_0 must be 1.0.
- class NonLinRegression extends PredictorMat
The NonLinRegression class supports non-linear regression. In this case, 'x' can be multi-dimensional '[1, x_1, ... x_k]' and the function 'f' is non-linear in the parameters 'b'. Fit the parameter vector 'b' in the regression equation
y = f(x, b) + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b' by using Non-linear Programming to minimize the Sum of Squared Errors 'SSE'.
- See also
www.bsos.umd.edu/socy/alan/stats/socy602_handouts/kut86916_ch13.pdf
- class NullModel extends Fit with Predictor with Error
The NullModel class implements the simplest type of predictive modeling technique, one that just predicts the response 'y' to be the mean. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + e
where 'e' represents the residuals (the part not explained by the model).
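In other words, the single parameter is b_0 = mean (y), and every prediction equals that mean. A minimal sketch (plain Scala; not the NullModel class's API):
object NullModelSketch extends App
{
    val y  = Array (2.0, 4.0, 6.0, 8.0)       // response vector
    val b0 = y.sum / y.length                 // fit: b0 is simply the mean of y
    println (s"prediction for any z = $b0")   // predict (z) = b0, regardless of z
} // NullModelSketch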
- class Perceptron extends PredictorMat
The Perceptron class supports single-output, 2-layer (input and output) Neural Networks. Although perceptrons are typically used for classification, this class is used for prediction. Given several input vectors and output values (training data), fit the weights/parameters 'b' connecting the layers, so that for a new input vector 'z', the net can predict the output value, i.e.,
yp = f (b dot z)
where 'f' is the activation function. The parameter vector 'b' (w) gives the weights between the input and output layers. Note, b0 is treated as the bias, so x0 must be 1.0.
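A minimal sketch of the forward computation yp = f (b dot z) with a sigmoid activation function (the weights here are made up and training is omitted; this is not the Perceptron class's API):
object PerceptronForwardSketch extends App
{
    def sigmoid (t: Double) = 1.0 / (1.0 + math.exp (-t))

    def dot (a: Array [Double], c: Array [Double]) = a.zip (c).map { case (u, v) => u * v }.sum

    val b  = Array (0.1, 0.4, 0.5)            // hypothetical weights (b0 is the bias)
    val z  = Array (1.0, 0.7, 0.2)            // new input vector (z0 must be 1.0)
    val yp = sigmoid (dot (b, z))             // predicted output value
    println (s"yp = $yp")
} // PerceptronForwardSketch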
- class PoissonRegression extends PredictorMat
The PoissonRegression class supports Poisson regression. In this case, 'x' may be multi-dimensional '[1, x_1, ... x_k]'. Fit the parameter vector 'b' in the Poisson regression equation
log (mu (x)) = b dot x = b_0 + b_1 * x_1 + ... + b_k * x_k
where 'e' represents the residuals (the part not explained by the model) and 'y' is now integer valued.
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
- class PolyRegression extends PredictorVec
The PolyRegression class supports polynomial regression. In this case, 't' is expanded to '[1, t, t^2, ... t^k]'. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 + ... + b_k * t^k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
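The key step is expanding the scalar 't' into the vector [1, t, t^2, ..., t^k]; ordinary multiple regression is then applied to the expanded data. A minimal sketch of the expansion (illustrative only; not the PolyRegression class's API):
object PolyExpandSketch extends App
{
    // expand t into [1, t, t^2, ..., t^k]
    def expand (t: Double, k: Int): Array [Double] =
        (0 to k).map (j => math.pow (t, j)).toArray

    println (expand (2.0, 4).mkString (", "))              // 1.0, 2.0, 4.0, 8.0, 16.0
} // PolyExpandSketch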
- trait Predictor extends AnyRef
The Predictor trait provides a common framework for several predictors. A predictor is for potentially unbounded responses (real or integer). When the number of distinct responses is bounded by some relatively small integer 'k', a classifier is likely more appropriate. Note, the 'train' method must be called first, followed by 'eval'.
- abstract class PredictorMat extends Fit with Predictor with Error
The PredictorMat abstract class supports multiple predictor analytics. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in, for example, the regression equation
y = b dot x + e = b_0 + b_1 * x_1 + ... + b_k * x_k + e
Note, the "protected val" arguments are required by ResponseSurface.
- abstract class PredictorVec extends Predictor with Error
The PredictorVec class supports term-expanded regression. Fit the parameter vector 'b' in the regression equation. Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- class PrincipalComponents extends Reducer with Error
The PrincipalComponents class performs Principal Component Analysis 'PCA' on data matrix 'x'. It can be used to reduce the dimensionality of the data. First find the Principal Components 'PC's by calling 'findPCs' and then call 'reduce' to reduce the data (i.e., reduce matrix 'x' to a lower dimensionality matrix).
- class QuadraticFit extends AnyRef
The QuadraticFit class uses multiple regression to fit a quadratic surface to the function 'f'. This is useful when computing 'f' is costly, for example in simulation optimization. The fit is over a multi-dimensional grid and can be used for interpolation and limited extrapolation.
- trait Reducer extends AnyRef
The Reducer trait provides a common framework for several data reduction algorithms.
- class Regression extends PredictorMat
The Regression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * x_1 + ... + b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
Five factorization techniques are provided:
'QR'       // QR Factorization: slower, more stable (default)
'Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice)
'SVD'      // Singular Value Decomposition: slowest, most robust
'LU'       // LU Factorization: better than Inverse
'Inverse'  // Inverse/Gaussian Elimination, classical textbook technique
- See also
en.wikipedia.org/wiki/Degrees_of_freedom_(statistics)
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
Note, not intended for use when the number of degrees of freedom 'df' is negative.
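For a concrete sense of the Normal Equations, the sketch below forms x.t * x and x.t * y for a tiny dataset and solves for 'b' by Gaussian elimination, roughly the 'Inverse' option above (a minimal sketch in plain Scala with no pivoting or error checking; not the Regression class's API):
object NormalEquationsSketch extends App
{
    val x = Array (Array (1.0, 1.0), Array (1.0, 2.0), Array (1.0, 3.0), Array (1.0, 4.0))
    val y = Array (1.0, 3.0, 3.0, 5.0)
    val n = x(0).length                                              // number of parameters

    // form the Normal Equations: a = x.t * x, c = x.t * y
    val a = Array.tabulate (n, n) ((i, j) => x.map (r => r(i) * r(j)).sum)
    val c = Array.tabulate (n) (i => x.zip (y).map { case (r, yi) => r(i) * yi }.sum)

    // solve a * b = c by Gaussian elimination (forward elimination, then back substitution)
    for (p <- 0 until n; i <- p + 1 until n) {
        val f = a(i)(p) / a(p)(p)
        for (j <- p until n) a(i)(j) -= f * a(p)(j)
        c(i) -= f * c(p)
    }
    val b = Array.ofDim [Double] (n)
    for (i <- n - 1 to 0 by -1)
        b(i) = (c(i) - (i + 1 until n).map (j => a(i)(j) * b(j)).sum) / a(i)(i)

    println (s"b = ${b.mkString (", ")}")                            // intercept and slope
} // NormalEquationsSketch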
- class Regression_WLS extends Regression
The Regression_WLS class supports weighted multiple linear regression. In this case, 'xx' is multi-dimensional [1, xx_1, ... xx_k]. Weights are set to the inverse of a variable's variance, so they can compensate for such variability (heteroscedasticity). Fit the parameter vector 'b' in the regression equation
yy = b dot xx + e = b_0 + b_1 * xx_1 + ... + b_k * xx_k + e
where 'e' represents the residuals (the part not explained by the model). Use Weighted Least-Squares (minimizing the residuals) to fit the parameter vector
b = fac.solve (.)
The data matrix 'xx' is reweighted 'x = rootW * xx' and the response vector 'yy' is reweighted 'y = rootW * yy', where 'rootW' is the square root of the weights.
- See also
www.markirwin.net/stat149/Lecture/Lecture3.pdf
en.wikipedia.org/wiki/Least_squares#Weighted_least_squares
These (the reweighted 'x' and 'y') are then passed to Ordinary Least Squares 'OLS' Regression. Five factorization techniques are provided:
'QR'       // QR Factorization: slower, more stable (default)
'Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice)
'SVD'      // Singular Value Decomposition: slowest, most robust
'LU'       // LU Factorization: better than Inverse
'Inverse'  // Inverse/Gaussian Elimination, classical textbook technique
- class ResponseSurface extends Regression
The ResponseSurface class uses multiple regression to fit a quadratic/cubic surface to the data. For example, in 2D the quadratic regression equation is
y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_0*x_1, x_1^2] + e
- See also
scalation.metamodel.QuadraticFit
- class RidgeRegression extends PredictorMat
The RidgeRegression class supports multiple linear ridge regression. In this case, 'x' is multi-dimensional [x_1, ... x_k]. Ridge regression puts a penalty on the L2 norm of the parameters 'b' to reduce the chance of them taking on large values that may lead to less robust models. Both the input matrix 'x' and the response vector 'y' are centered (zero mean). Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_1 * x_1 + ... + b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the regularized Normal Equations:
(x.t * x + λ * I) * b = x.t * y
b = fac.solve (.)
Five factorization techniques are provided:
'QR'       // QR Factorization: slower, more stable (default)
'Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice)
'SVD'      // Singular Value Decomposition: slowest, most robust
'LU'       // LU Factorization: similar, but better than Inverse
'Inverse'  // Inverse/Gaussian Elimination, classical textbook technique
- See also
statweb.stanford.edu/~tibs/ElemStatLearn/
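To see the effect of the penalty, note that for a single centered predictor the regularized Normal Equations reduce to b = (x dot y) / (x dot x + λ), so a larger λ shrinks 'b' toward zero. A minimal sketch (plain Scala with made-up data; not the RidgeRegression class's API):
object RidgeShrinkSketch extends App
{
    val x = Array (-1.5, -0.5, 0.5, 1.5)      // centered predictor (zero mean)
    val y = Array (-3.1, -0.9, 1.1, 2.9)      // centered response (zero mean)

    def dot (a: Array [Double], c: Array [Double]) = a.zip (c).map { case (u, v) => u * v }.sum

    for (lambda <- Seq (0.0, 1.0, 10.0)) {
        val b = dot (x, y) / (dot (x, x) + lambda)         // shrinks toward 0 as lambda grows
        println (s"lambda = $lambda, b = $b")
    }
} // RidgeShrinkSketch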
- class SimpleRegression extends PredictorMat
The SimpleRegression class supports simple linear regression. In this case, the vector 'x' consists of the constant one and a single variable 'x1', i.e., (1, x1). Fit the parameter vector 'b' in the regression equation
y = b dot x + e = [b0, b1] dot [1, x1] + e = b0 + b1 * x1 + e
where 'e' represents the residuals (the part not explained by the model).
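For simple linear regression the Normal Equations have the familiar closed-form solution b1 = cov (x1, y) / var (x1) and b0 = mean (y) - b1 * mean (x1). A minimal sketch (plain Scala with made-up data; not the SimpleRegression class's API):
object SimpleRegressionSketch extends App
{
    val x1 = Array (1.0, 2.0, 3.0, 4.0)
    val y  = Array (1.0, 3.0, 3.0, 5.0)

    val xBar = x1.sum / x1.length
    val yBar = y.sum / y.length
    val b1   = x1.zip (y).map { case (xi, yi) => (xi - xBar) * (yi - yBar) }.sum /
               x1.map (xi => (xi - xBar) * (xi - xBar)).sum
    val b0   = yBar - b1 * xBar

    println (s"b0 = $b0, b1 = $b1")            // b0 = 0.0, b1 = 1.2 for this data
} // SimpleRegressionSketch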
- class SimpleTest[VecT <: VectoD] extends AnyRef
- class SimplerRegression extends PredictorMat
The SimplerRegression class supports simpler linear regression. In this case, the vector 'x' consists of a single variable 'x0'. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = [b0] dot [x0] + e = b0 * x0 + e
where 'e' represents the residuals (the part not explained by the model). The simpler regression model has no intercept parameter, only a slope parameter.
- See also
SimpleRegression
for both intercept and slope parameters
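With no intercept, minimizing the residuals gives the single parameter b0 = (x0 dot y) / (x0 dot x0). A minimal sketch (plain Scala with made-up data; not the SimplerRegression class's API):
object SimplerRegressionSketch extends App
{
    val x0 = Array (1.0, 2.0, 3.0, 4.0)
    val y  = Array (2.1, 3.9, 6.2, 7.8)

    def dot (a: Array [Double], c: Array [Double]) = a.zip (c).map { case (u, v) => u * v }.sum

    val b0 = dot (x0, y) / dot (x0, x0)        // least-squares slope through the origin
    println (s"b0 = $b0")
} // SimplerRegressionSketch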
- class TranRegression extends Regression
The TranRegression class supports transformed multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the transformed regression equation
transform (y) = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 + ... + b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model) and 'transform' is the function (defaulting to log) used to transform the response vector 'y'. Common transforms include 'log (y)', 'sqrt (y)' when 'y > 0', or even 'sq (y)', 'exp (y)'. More generally, a Box-Cox Transformation may be applied.
- See also
www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.469.7176&rep=rep1&type=pdf
Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b'. Note: this class does not provide transformations on the columns of matrix 'x'.
- class TrigRegression extends PredictorVec
The TrigRegression class supports trigonometric regression. In this case, 't' is expanded to '[1, sin (wt), cos (wt), sin (2wt), cos (2wt), ...]'. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 sin (wt) + b_2 cos (wt) + b_3 sin (2wt) + b_4 cos (2wt) + ... + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
link.springer.com/article/10.1023%2FA%3A1022436007242#page-1
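As with PolyRegression, the scalar 't' is expanded, here into harmonics of a base frequency 'w', and multiple regression is applied to the expanded data. A minimal sketch of the expansion (illustrative only; not the TrigRegression class's API):
object TrigExpandSketch extends App
{
    // expand t into [1, sin (wt), cos (wt), ..., sin (kwt), cos (kwt)]
    def expand (t: Double, w: Double, k: Int): Array [Double] =
        1.0 +: (1 to k).flatMap (j => Seq (math.sin (j * w * t), math.cos (j * w * t))).toArray

    println (expand (0.5, math.Pi, 2).mkString (", "))
} // TrigExpandSketch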
Value Members
- val BASE_DIR: String
The relative path for the base directory.
- object ANCOVATest extends App
The ANCOVATest object tests the ANCOVA class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
> runMain scalation.analytics.ANCOVATest
- object ANOVA1Test extends App
The ANOVA1Test object tests the ANOVA1 class using the following regression equation:
y = b dot x = b_0 + b_1*d_1 + b_2*d_2
> runMain scalation.analytics.ANOVA1Test
- object ActivationFun
The ActivationFun object contains common activation functions and provides both scalar and vector versions.
- See also
en.wikipedia.org/wiki/Activation_function
Convention:
fun    activation function (e.g., sigmoid)
funV   vector version of the activation function (e.g., sigmoidV)
funM   matrix version of the activation function (e.g., sigmoidM)
funDV  vector version of the derivative (e.g., sigmoidDV)
funDM  matrix version of the derivative (e.g., sigmoidDM)
Supports: id, reLU, tanh, sigmoid, gaussian, softmax
Related functions: logistic, logit
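A minimal sketch of the scalar/vector convention above, using sigmoid as the example (plain Scala arrays rather than ScalaTion's vector types; the function names mirror the convention but are illustrative):
object SigmoidSketch extends App
{
    def sigmoid (t: Double): Double = 1.0 / (1.0 + math.exp (-t))                   // scalar version
    def sigmoidV (t: Array [Double]): Array [Double] = t.map (sigmoid)              // vector version
    def sigmoidDV (y: Array [Double]): Array [Double] = y.map (v => v * (1.0 - v))  // derivative, given y = sigmoidV (t)

    println (sigmoidV (Array (-1.0, 0.0, 1.0)).mkString (", "))
} // SigmoidSketch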
- object ActivationFunTest extends App
The ActivationFunTest object is used to test the ActivationFun object.
> runMain scalation.analytics.ActivationFunTest
- object ActivationFunTest2 extends App
The ActivationFunTest2 object is used to test the ActivationFun object.
- See also
en.wikipedia.org/wiki/Softmax_function
> runMain scalation.analytics.ActivationFunTest2
- object ExampleConcrete
The ExampleConcrete object stores a medium-sized example dataset from the UCI Machine Learning Repository. "Abstract: Concrete is a highly complex material. The slump flow of concrete is not only determined by the water content, but that is also influenced by other concrete ingredients."
- object ExampleConcreteTest extends App
The ExampleConcreteTest object is used to test the ExampleConcrete object. It compares Regression, Perceptron, and NeuralNet_2L. This one runs Regression.
> runMain scalation.analytics.ExampleConcreteTest
- object ExampleConcreteTest2 extends App
The ExampleConcreteTest2 object is used to test the ExampleConcrete object. It compares Regression, Perceptron, and NeuralNet_2L. This one runs Perceptron.
> runMain scalation.analytics.ExampleConcreteTest2
- object ExampleConcreteTest3 extends App
The ExampleConcreteTest3 object is used to test the ExampleConcrete object. It compares Regression, Perceptron, and NeuralNet_2L. This one runs NeuralNet_2L with 'sigmoid'.
> runMain scalation.analytics.ExampleConcreteTest3
- object ExampleConcreteTest4 extends App
The ExampleConcreteTest4 object is used to test the ExampleConcrete object. It compares Regression, Perceptron, and NeuralNet_2L. This one runs NeuralNet_2L with 'tanh'.
> runMain scalation.analytics.ExampleConcreteTest4
- object ExampleConcreteTest5 extends App
The ExampleConcreteTest5 object is used to test the ExampleConcrete object. It compares Regression, Perceptron, and NeuralNet_2L. This one runs NeuralNet_2L with 'id'.
> runMain scalation.analytics.ExampleConcreteTest5
- object ExampleConcreteTest6 extends App
The ExampleConcreteTest6 object is used to test the ExampleConcrete object. It compares Regression, Perceptron, NeuralNet_2L, and NeuralNet_3L. This one runs NeuralNet_3L with 'sigmoid'.
> runMain scalation.analytics.ExampleConcreteTest6
- object ExpRegressionTest extends App
The ExpRegressionTest object tests the ExpRegression class using the following exponential regression problem.
> runMain scalation.analytics.ExpRegressionTest
- object ExpRegressionTest2 extends App
The ExpRegressionTest2 object has a basic test for the ExpRegression class.
> runMain scalation.analytics.ExpRegressionTest2
- object Fit
The Fit companion object provides factory methods for assessing quality of fit for standard types of modeling techniques.
- object GLM extends GLM
The GLM object makes the GLM trait's methods directly available. This approach (using traits and objects) allows the methods to also be inherited.
- object GLMTest extends App
The GLMTest object tests the GLM object using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
- object GZLM extends GLM
A Generalized Linear Model 'GZLM' can be developed using the GZLM object. It provides factory methods for General Linear Models 'GLM' via inheritance and for proper Generalized Linear Models:
LogisticRegression - logistic regression (see the classifier package)
PoissonRegression - Poisson regression
ExpRegression - Exponential regression
- object GZLMTest extends App
The GZLMTest object tests the GZLM object using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
- object LassoRegressionTest extends App
The LassoRegressionTest object tests the LassoRegression class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2
- See also
http://statmaster.sdu.dk/courses/st111/module03/index.html
> runMain scalation.analytics.LassoRegressionTest
- object MatrixTransform
The MatrixTransform object is used to transform the columns of a data matrix 'x'. Such pre-processing of the data is required by some modeling techniques.
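A typical pre-processing transform is centering each column to zero mean (as required, for example, by RidgeRegression). A minimal sketch on a plain two-dimensional array (illustrative only; not the MatrixTransform object's actual API):
object CenterColumnsSketch extends App
{
    val x = Array (Array (1.0, 10.0), Array (2.0, 20.0), Array (3.0, 30.0))

    // subtract each column's mean from its entries
    val means    = x(0).indices.map (j => x.map (row => row(j)).sum / x.length)
    val centered = x.map (row => row.indices.map (j => row(j) - means(j)).toArray)

    centered.foreach (row => println (row.mkString (" ")))
} // CenterColumnsSketch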
- object MatrixTransformTest extends App
The MatrixTransformTest object is used to test the MatrixTransform object.
> runMain scalation.analytics.MatrixTransformTest
- object NMFactorizationTest extends App
The NMFactorizationTest object is used to test the NMFactorization class.
- object NeuralNet_2LTest extends App
The NeuralNet_2LTest object is used to test the NeuralNet_2L class. For this test, training data is used to fit the weights before using them for prediction.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf
> runMain scalation.analytics.NeuralNet_2LTest
- object NeuralNet_3LTest extends App
The NeuralNet_3LTest object is used to test the NeuralNet_3L class. For this test, training data is used to fit the weights before using them for prediction.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf
> runMain scalation.analytics.NeuralNet_3LTest
- object NeuralNet_XLTest extends App
The NeuralNet_XLTest object is used to test the NeuralNet_XL class. For this test, the initial weights are used for prediction.
> runMain scalation.analytics.NeuralNet_XLTest
- object NeuralNet_XLTest2 extends App
The NeuralNet_XLTest2 object is used to test the NeuralNet_XL class. For this test, training data is used to fit the weights before using them for prediction.
- See also
http://www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf
> runMain scalation.analytics.NeuralNet_XLTest2
- object NonLinRegressionTest extends App
The NonLinRegressionTest object tests the NonLinRegression class: y = f(x; b) = b0 + exp (b1 * x0).
- See also
www.bsos.umd.edu/socy/alan/stats/socy602_handouts/kut86916_ch13.pdf
Answers: sse = 49.45929986243339, fit = (VectorD (58.606566327280426, -0.03958645286504356), 0.9874574894685292), predict (VectorD (50.0)) = 8.09724678182599
FIX: check this example
- object NullModelTest extends App
The NullModelTest object is used to test the NullModel class.
y = b dot x + e = b0 + e
> runMain scalation.analytics.NullModelTest
- object NullModelTest2 extends App
The NullModelTest2 object is used to test the NullModel class.
y = b dot x + e = b0 + e
> runMain scalation.analytics.NullModelTest2
- object PerceptronTest extends App
The PerceptronTest object is used to test the Perceptron class. For this test, training data is used to fit the weights before using them for prediction.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf
> runMain scalation.analytics.PerceptronTest
- object PerceptronTest2 extends App
The PerceptronTest2 object trains a perceptron on a small dataset of temperatures from counties in Texas, where the variables/factors to consider are Latitude (x1), Elevation (x2) and Longitude (x3). The regression equation is the following:
y = sigmoid (w dot x) = sigmoid (w0 + w1*x1 + w2*x2 + w3*x3)
> runMain scalation.analytics.PerceptronTest2
- object PoissonRegression
- object PoissonRegressionTest extends App
The PoissonRegressionTest object tests the PoissonRegression class.
- See also
http://www.cookbook-r.com/Statistical_analysis/Logistic_regression/
Answer: b = (-8.8331, 0.4304), n_dev = 43.860, r_dev = 25.533, aci = 29.533, pseudo_rSq = 0.4178
- object PoissonRegressionTest2 extends App
The PoissonRegressionTest2 object tests the PoissonRegression class.
- See also
www.stat.wisc.edu/~mchung/teaching/.../GLM.logistic.Rpackage.pdf
statmaster.sdu.dk/courses/st111/module03/index.html
- object PolyRegressionTest extends App
The PolyRegressionTest object tests the PolyRegression class using the following regression equation:
y = b dot x = b_0 + b_1*t + b_2*t^2 + ... + b_k*t^k
Note, the 'order' at which R-Squared drops is QR(7), Cholesky(14), SVD(6), Inverse(13).
> runMain scalation.analytics.PolyRegressionTest
- object PolyRegressionTest2 extends App
The PolyRegressionTest2 object tests the PolyRegression class using the following regression equation. This version uses orthogonal polynomials.
y = b dot x = b_0 + b_1*t + b_2*t^2 + ... + b_k*t^k
> runMain scalation.analytics.PolyRegressionTest2
- object PredictorTest extends App
The PredictorTest object tests all the classes in the scalation.analytics package that directly or indirectly extend the Predictor trait.
FIX - make the first test uniform so that the modeling techniques may be compared.
> runMain scalation.analytics.PredictorTest
- object PrincipalComponentsTest extends App
The PrincipalComponentsTest object is used to test the PrincipalComponents class.
- See also
www.ce.yildiz.edu.tr/personal/songul/file/1097/principal_components.pdf
> runMain scalation.analytics.PrincipalComponentsTest
- object Probability extends Error
The Probability object provides methods for operating on univariate and bivariate probability distributions of discrete random variables 'X' and 'Y'. A probability distribution is specified by its probability mass function (pmf), stored either as a "probability vector" for a univariate distribution or a "probability matrix" for a bivariate distribution.
joint probability matrix:       pxy(i, j)  = P(X = x_i, Y = y_j)
marginal probability vector:    px(i)      = P(X = x_i)
conditional probability matrix: px_y(i, j) = P(X = x_i | Y = y_j)
In addition to computing joint, marginal and conditional probabilities, methods for computing entropy and mutual information are also provided. Entropy provides a measure of disorder or randomness. If there is little randomness, entropy will be close to 0, while when randomness is high, entropy will be close to its maximum, e.g., 'log2 (px.dim)'. Mutual information provides a robust measure of dependency between random variables (contrast with correlation).
- See also
scalation.stat.StatVector
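A minimal sketch of deriving a marginal probability vector from a joint probability matrix and computing its entropy (plain Scala; illustrative only, not the Probability object's actual API):
object EntropySketch extends App
{
    val log2 = (p: Double) => math.log (p) / math.log (2.0)

    val pxy = Array (Array (0.1, 0.2), Array (0.3, 0.4))               // joint pmf P(X = x_i, Y = y_j)
    val px  = pxy.map (_.sum)                                          // marginal pmf P(X = x_i)

    val entropy = -(px.filter (_ > 0.0).map (p => p * log2 (p)).sum)   // H(X) in bits
    println (s"px = ${px.mkString (", ")}, entropy = $entropy")        // upper bound is log2 (px.length)
} // EntropySketch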
- object ProbabilityTest extends App
The ProbabilityTest object is used to test the Probability object.
- object ProbabilityTest2 extends App
The ProbabilityTest2 object provides an upper bound for 'entropy' and 'entropy_k'.
- object QuadraticFitTest extends App
The QuadraticFitTest object is used to test the QuadraticFit class for a two dimensional case.
- object QuadraticFitTest2 extends App
The QuadraticFitTest2 object is used to test the QuadraticFit class for a three dimensional case.
- object QuadraticFitTest3 extends App
The QuadraticFitTest3 object is used to test the QuadraticFit class for a three dimensional case with noise.
- object RegTechnique extends Enumeration
The RegTechnique object defines the implementation techniques available.
- object Regression
The Regression companion object provides a testing method.
object
RegressionTest extends App
The
RegressionTest
object testsRegression
class using the following regression equation.The
RegressionTest
object testsRegression
class using the following regression equation.y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
- See also
statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.RegressionTest
- object RegressionTest2 extends App
The RegressionTest2 object tests the Regression class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2
> runMain scalation.analytics.RegressionTest2
- object RegressionTest3 extends App
The RegressionTest3 object tests the multi-collinearity method in the Regression class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4
- See also
online.stat.psu.edu/online/development/stat501/data/bloodpress.txt
online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html
> runMain scalation.analytics.RegressionTest3
- object RegressionTest4 extends App
The RegressionTest4 object tests the multi-collinearity method in the Regression class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4
- See also
online.stat.psu.edu/online/development/stat501/data/bloodpress.txt
online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html
> runMain scalation.analytics.RegressionTest4
- object RegressionTest5 extends App
The RegressionTest5 object tests the Regression class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2
> runMain scalation.analytics.RegressionTest5
- object Regression_WLS
The Regression_WLS companion object provides methods for setting weights and testing.
- object Regression_WLSTest extends App
The Regression_WLSTest object tests the Regression_WLS class using the following regression equation:
y = b dot x = b_0 + b_1*x_1 + b_2*x_2
- See also
statmaster.sdu.dk/courses/st111/module03/index.html
> runMain scalation.analytics.Regression_WLSTest
- object ResponseSurface
The ResponseSurface companion object provides methods for creating functional forms.
- object ResponseSurfaceTest extends App
The ResponseSurfaceTest object is used to test the ResponseSurface class.
> runMain scalation.analytics.ResponseSurfaceTest
- object RidgeRegressionTest extends App
The RidgeRegressionTest object tests the RidgeRegression class using the following regression equation:
y = b dot x = b_1*x_1 + b_2*x_2
Test regression and backward elimination.
- See also
http://statmaster.sdu.dk/courses/st111/module03/index.html
- object RidgeRegressionTest2 extends App
The RidgeRegressionTest2 object tests the RidgeRegression class using the following regression equation:
y = b dot x = b_1*x_1 + b_2*x_2
- object RidgeRegressionTest3 extends App
The RidgeRegressionTest3 object tests the multi-collinearity method in the RidgeRegression class using the following regression equation:
y = b dot x = b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4
- See also
online.stat.psu.edu/online/development/stat501/data/bloodpress.txt
online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html
- object SimpleRegression
The SimpleRegression companion object provides a simple factory method for building simple linear regression models.
- object SimpleRegressionTest extends App
The SimpleRegressionTest object is used to test the SimpleRegression class:
y = b0 + b1 * x
> runMain scalation.analytics.SimpleRegressionTest
- object SimpleRegressionTest2 extends App
The SimpleRegressionTest2 object is used to test the SimpleRegression class:
y = b dot x = [b0, b1] dot [1, x1]
- See also
http://www.analyzemath.com/statistics/linear_regression.html
> runMain scalation.analytics.SimpleRegressionTest2
- object SimpleRegressionTest3 extends App
The SimpleRegressionTest3 object is used to test the SimpleRegression class:
y = b dot x = b0 + b1 * x1
- See also
http://mathbits.com/mathbits/tisection/Statistics2/linear.htm
> runMain scalation.analytics.SimpleRegressionTest3
- object SimpleTest extends App
- object SimplerRegression
The SimplerRegression companion object provides a simple factory method for building simpler linear regression models.
- object SimplerRegressionTest extends App
The SimplerRegressionTest object is used to test the SimplerRegression class:
y = b0 * x + e
> runMain scalation.analytics.SimplerRegressionTest
- object SimplerRegressionTest2 extends App
The SimplerRegressionTest2 object is used to test the SimplerRegression class:
y = b dot x + e = [b0] dot [x0] + e
> runMain scalation.analytics.SimplerRegressionTest2
- object SimplerRegressionTest3 extends App
The SimplerRegressionTest3 object is used to test the SimplerRegression class:
y = b dot x = b0 * x0
- See also
http://mathbits.com/mathbits/tisection/Statistics2/linear.htm
> runMain scalation.analytics.SimplerRegressionTest3
- object TranRegression
The TranRegression companion object provides transformation and inverse-transformation functions based on the parameter 'lambda'. It supports the family of Box-Cox transformations.
- object TranRegressionTest extends App
The TranRegressionTest object tests the TranRegression class using the following regression equation:
log (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2
> runMain scalation.analytics.TranRegressionTest
- object TranRegressionTest2 extends App
The TranRegressionTest2 object tests the TranRegression class using the following regression equation:
sqrt (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2
> runMain scalation.analytics.TranRegressionTest2
- object TrigRegressionTest extends App
The TrigRegressionTest object tests the TrigRegression class using the following regression equation:
y = b dot x = b_0 + b_1 sin wt + b_2 cos wt + ... + b_2k-1 sin kwt + b_2k cos kwt + e
The data is generated from a noisy cubic function.
> runMain scalation.analytics.TrigRegressionTest
- object TrigRegressionTest2 extends App
The TrigRegressionTest2 object tests the TrigRegression class using the following regression equation:
y = b dot x = b_0 + b_1 sin wt + b_2 cos wt + ... + b_2k-1 sin kwt + b_2k cos kwt + e
The data is generated from periodic noisy cubic functions.
> runMain scalation.analytics.TrigRegressionTest2