package analytics
The analytics package contains classes, traits and objects for analytics, including clustering and prediction.
Type Members
-
case class
AFF(f: FunctionS2S, fV: FunctionV_2V, fM: FunctionM_2M, dV: FunctionV_2V, dM: FunctionM_2M, bounds: (Double, Double) = null) extends Product with Serializable
The AFF class holds an Activation Function Family (AFF).
- f
the activation function itself
- fV
the vector version of the activation function
- fM
the matrix version of the activation function
- dV
the vector version of the activation function derivative
- dM
the matrix version of the activation function derivative
- bounds
the (lower, upper) bounds on the range of the activation function
-
class
ANCOVA extends Regression
The ANCOVA class supports ANalysis of COVAriance 'ANCOVA'. It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'd_j' to distinguish the treatment levels. The problem is again to fit the parameter vector 'b' in the augmented regression equation
y = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 + ... b_k * x_k + b_k+1 * d_1 + b_k+2 * d_2 + ... b_k+l * d_l + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
't' has categorical values/levels, e.g., treatment levels (0, ... 't.max ()'); a sketch of the dummy-variable encoding follows this entry.
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
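A minimal plain-Scala sketch of the dummy-variable encoding described above (a hypothetical helper, not the ScalaTion implementation), assuming level 0 is the baseline so that l = t.max dummy columns d_1 .. d_l are produced:
    // one-hot style dummies: d_j = 1.0 if t == j, else 0.0, for j = 1 .. lvls
    def dummyVars (t: Int, lvls: Int): Array [Double] =
        Array.tabulate (lvls)(j => if (t == j + 1) 1.0 else 0.0)
    val d = dummyVars (2, 3)                        // treatment level 2 of 0..3 -> Array (0.0, 1.0, 0.0)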
-
class
ANCOVA1 extends Regression
The ANCOVA1 class supports ANalysis of COVAriance 'ANCOVA1'. It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'd_j' to distinguish the treatment levels. The problem is again to fit the parameter vector 'b' in the augmented regression equation
y = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 + ... b_k * x_k + b_k+1 * d_1 + b_k+2 * d_2 + ... b_k+l * d_l + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
't' has categorical values/levels, e.g., treatment levels (0, ... 't.max ()').
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
-
class
ANOVA1 extends PredictorVec
The ANOVA1 class supports one-way ANalysis Of VAriance (ANOVA), i.e., it allows only one binary/categorical treatment variable. It is framed using General Linear Model 'GLM' notation and supports the use of one binary/categorical treatment variable 't'. This is done by introducing dummy variables 'd_j' to distinguish the treatment levels. The problem is again to fit the parameter vector 'b' in the following equation
y = b dot x + e = b_0 + b_1 * d_1 + b_2 * d_2 + ... b_k * d_k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
psych.colorado.edu/~carey/Courses/PSYC5741/handouts/GLM%20Theory.pdf
ANCOVA
for models with multiple variables
-
case class
AddGate() extends Product with Serializable
Performs addition for vectors.
-
class
CanCorrelation extends Reducer with Error
The CanCorrelation class performs Canonical Correlation Analysis 'CCA' on two random vectors. Samples for the first one are stored in the 'x' data matrix and samples for the second are stored in the 'y' data matrix. Find vectors a and b that maximize the correlation between x * a and y * b, i.e.,
max { rho (x * a, y * b) }
Additional vectors orthogonal to a and b can also be found.
-
class
ExpRegression extends PredictorMat
The ExpRegression class supports exponential regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the exponential regression equation
log (mu (x)) = b dot x = b_0 + b_1 * x_1 + ... b_k * x_k
- See also
www.stat.uni-muenchen.de/~leiten/Lehre/Material/GLM_0708/chapterGLM.pdf
-
class
Fit extends AnyRef
The Fit class provides methods to determine basic quality of fit measures.
-
trait
GLM extends AnyRef
A General Linear Model 'GLM' can be developed using the GLM trait and object (see below). The implementation currently supports univariate models, with multivariate models (where each response is a vector) planned for the future. It provides factory methods for the following special types of GLMs:
SimpleRegression - simple linear regression
Regression - multiple linear regression using Ordinary Least Squares 'OLS'
Regression_WLS - multiple linear regression using Weighted Least Squares 'WLS'
RidgeRegression - robust multiple linear regression
TranRegression - transformed (e.g., log) multiple linear regression
PolyRegression - polynomial regression
TrigRegression - trigonometric regression
ResponseSurface - response surface regression
ANOVA1 - GLM form of ANalysis Of VAriance
ANCOVA1 - GLM form of ANalysis of COVAriance
-
class
HyperParameter extends Cloneable
The HyperParameter class provides a simple and flexible means for handling model hyper-parameters. A model may have one or more hyper-parameters that are organized into a map 'name -> (value, defaultV)'.
-
class
KNN_Predictor extends PredictorMat
The KNN_Predictor class is used to predict a response value for a new vector 'z'. It works by finding the 'kappa' nearest neighbors of 'z'. These neighbors essentially vote according to their prediction; the consensus is the average of the individual predictions for 'z'. Using a distance metric, the 'kappa' vectors nearest to 'z' are found in the training data, which are stored row-wise in data matrix 'x'. The corresponding response values are given in the vector 'y', such that the response value for vector 'x(i)' is given by 'y(i)'. See the sketch below.
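A minimal plain-Scala sketch of the kappa-nearest-neighbor averaging step (a hypothetical helper, independent of the ScalaTion API); the distances from 'z' to each training row are assumed to be precomputed with whatever metric the class uses.
    // dist(i) = distance from query z to training row x(i); y(i) = response for row i
    def knnPredict (dist: Array [Double], y: Array [Double], kappa: Int): Double =
    {
        val nearest = dist.zipWithIndex.sortBy (_._1).take (kappa).map (_._2)   // indices of kappa nearest rows
        nearest.map (y (_)).sum / kappa                                         // consensus = average response
    } // knnPredict
-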
class
LassoRegression extends PredictorMat
The LassoRegression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model).
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
-
case class
MultiplyGate() extends Product with Serializable
MultiplyGate performs the dot product of the input and the weights 'w'.
-
class
NMFactorization extends AnyRef
The NMFactorization class factors a matrix 'v' into two non-negative matrices 'w' and 'h', such that 'v = wh' approximately.
- See also
en.wikipedia.org/wiki/Non-negative_matrix_factorization
-
abstract
class
NeuralNet extends Predictor with Error
The NeuralNet abstract class provides the basic structure and API for a variety of Neural Networks.
-
class
NeuralNet_2L extends NeuralNet
The NeuralNet_2L class supports multi-output, 2-layer (input and output) Neural-Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weights/parameters 'bb' connecting the layers, so that for a new input vector 'z', the net can predict the output value, i.e.,
yp_j = f (bb dot z)
where 'f' is the activation function and the parameter matrix 'bb' gives the weights between input and output layers. No batching is used for this algorithm. Note, 'b0' is treated as the bias, so 'x0' must be 1.0.
-
class
NeuralNet_3L extends NeuralNet
The NeuralNet_3L class supports multi-output, 3-layer (input, hidden and output) Neural-Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weights/parameters 'aa' and 'bb' connecting the layers, so that for a new input vector 'v', the net can predict the output value, i.e.,
yp = f2 (bb * f1V (aa * v))
where 'f1' and 'f2' are the activation functions and the parameter matrices 'aa' and 'bb' give the weights between the input-hidden and hidden-output layers. Note, if 'a0' is to be treated as bias/intercept, 'x0' must be 1.0.
-
class
NeuralNet_XL extends NeuralNet
The NeuralNet_XL class supports multi-output, multi-layer (input, multiple hidden and output) Neural-Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weight and bias parameters connecting the layers, so that for a new input vector 'v', the net can predict the output value. This implementation is partially adapted from Michael Nielsen's Python implementation found in:
- See also
github.com/MichalDanielDobrzanski/DeepLearningPython35/blob/master/network2.py
github.com/mnielsen/neural-networks-and-deep-learning/blob/master/src/network2.py
-
case class
Node(f: Int, branch: Int, yp: Double, thresh: Double, depth: Int, pthresh: Double, pfea: Int, leaf: Boolean = false) extends Product with Serializable
Class that contains information for a tree node.
- f
the feature of the node; if it is a leaf, contains the feature of its parent
- branch
the branch value (0 => left, 1 => right)
- yp
leaf node's prediction for y
- thresh
the threshold for a continuous feature
- depth
the current depth of the node
- pthresh
the threshold of the parent node
- pfea
the feature of the parent node
- leaf
Boolean value indicating whether this is a leaf node
-
class
NonLinRegression extends PredictorMat
The NonLinRegression class supports non-linear regression. In this case, 'x' can be multi-dimensional '[1, x1, ... xk]' and the function 'f' is non-linear in the parameters 'b'. Fit the parameter vector 'b' in the regression equation
y = f(x, b) + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b' by using Non-linear Programming to minimize Sum of Squares Error 'SSE'.
- See also
www.bsos.umd.edu/socy/alan/stats/socy602_handouts/kut86916_ch13.pdf
-
class
NullModel extends Fit with Predictor with Error
The NullModel class implements the simplest type of predictive modeling technique that just predicts the response 'y' to be the mean. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b0 + e
where 'e' represents the residuals (the part not explained by the model).
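A minimal plain-Scala sketch of the null model's fit and prediction (a hypothetical helper, independent of the ScalaTion API): the single parameter b0 is the mean of the training responses, and every prediction equals it.
    // fit: b0 = mean of the training responses; predict: always return b0
    def nullModelFit (y: Array [Double]): Double = y.sum / y.length
    val b0 = nullModelFit (Array (1.0, 3.0, 5.0))   // b0 = 3.0
    val yp = b0                                     // prediction for any new input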
-
class
Perceptron extends PredictorMat
The Perceptron class supports single-output, 2-layer (input and output) Neural-Networks. Although perceptrons are typically used for classification, this class is used for prediction. Given several input vectors and output values (training data), fit the weights/parameters 'b' connecting the layers, so that for a new input vector 'z', the net can predict the output value, i.e.,
yp = f (b dot z)
The parameter vector 'b' (w) gives the weights between the input and output layers. Note, 'b0' is treated as the bias, so 'x0' must be 1.0.
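A minimal plain-Scala sketch of the forward computation yp = f (b dot z) with a sigmoid activation (hypothetical helpers, independent of the ScalaTion API); note that z(0) = 1.0 so that b(0) acts as the bias.
    def sigmoid (t: Double): Double = 1.0 / (1.0 + math.exp (-t))
    def dot (b: Array [Double], z: Array [Double]): Double = (b zip z).map { case (bi, zi) => bi * zi }.sum
    val b  = Array (0.5, -1.2, 0.8)                 // weights: bias, w1, w2
    val z  = Array (1.0,  2.0, 3.0)                 // input with x0 = 1.0 for the bias
    val yp = sigmoid (dot (b, z))                   // predicted output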
-
class
PoissonRegression extends PredictorMat
The PoissonRegression class supports Poisson regression. In this case, 'x' may be multi-dimensional '[1, x_1, ... x_k]'. Fit the parameter vector 'b' in the Poisson regression equation
log (mu (x)) = b dot x = b_0 + b_1 * x_1 + ... b_k * x_k
where 'e' represents the residuals (the part not explained by the model) and 'y' is now integer valued.
- See also
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
-
class
PolyRegression extends PredictorVec
The PolyRegression class supports polynomial regression. In this case, 't' is expanded to '[1, t, t^2 ... t^k]' (see the expansion sketch following this entry). Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 + ... b_k * t^k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
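A minimal plain-Scala sketch of the term expansion that turns a scalar 't' into the row '[1, t, t^2, ..., t^k]' (a hypothetical helper, independent of the ScalaTion API):
    // expand scalar t into a polynomial feature row of degree k
    def expand (t: Double, k: Int): Array [Double] = Array.tabulate (k + 1)(j => math.pow (t, j))
    val row = expand (2.0, 3)                       // Array (1.0, 2.0, 4.0, 8.0)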
-
trait
Predictor extends AnyRef
The Predictor trait provides a common framework for several predictors. A predictor is for potentially unbounded responses (real or integer). When the number of distinct responses is bounded by some relatively small integer 'k', a classifier is likely more appropriate. Note, the 'train' method must be called first, followed by 'eval'.
-
abstract
class
PredictorMat extends Fit with Predictor with Error
The PredictorMat abstract class supports multiple predictor analytics. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in, for example, the regression equation
y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e
Note, "protected val" arguments are required by ResponseSurface.
-
abstract
class
PredictorVec extends Predictor with Error
The PredictorVec class supports term expanded regression. Fit the parameter vector 'b' in the regression equation. Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
-
class
PrincipalComponents extends Reducer with Error
The PrincipalComponents class performs Principal Component Analysis 'PCA' on data matrix 'x'. It can be used to reduce the dimensionality of the data. First find the Principal Components 'PC's by calling 'findPCs' and then call 'reduce' to reduce the data (i.e., reduce matrix 'x' to a lower dimensionality matrix).
-
class
QuadRegression extends Regression
The QuadRegression class uses multiple regression to fit a quadratic surface to the data. For example in 2D, the quadratic regression equation is
y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_1^2] + e
It has no interaction/cross-terms and adds a constant term for the intercept (the initial data matrix must not already include an intercept column of ones).
- See also
scalation.metamodel.QuadraticFit
-
class
QuadraticFit extends AnyRef
The QuadraticFit class uses multiple regression to fit a quadratic surface to the function 'f'. This is useful when computing 'f' is costly, for example in simulation optimization. The fit is over a multi-dimensional grid and can be used for interpolation and limited extrapolation.
-
class
RecurrentNeuralNet extends Error
The RecurrentNeuralNet class feeds input in sequential time into the hidden layer. It uses parameters U, W and V in the network, where U is the parameter for input x, W is for the hidden layer, and V is for output y. We have 'S(t) = Activate (U dot x(t) + W dot S(t-1))' and 'y(t) = softmax (V dot S(t))' (see the sketch following this entry).
- See also
github.com/pangolulu/rnn-from-scratch
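A minimal plain-Scala sketch of a single recurrent step using toy scalar dimensions (hypothetical helpers, independent of the ScalaTion API); it shows the two update rules with tanh as the activation and a softmax over the output scores.
    def softmax (v: Array [Double]): Array [Double] =
    {
        val e = v.map (math.exp); val s = e.sum; e.map (_ / s)
    } // softmax

    // one time step: s(t) = tanh (U * x(t) + W * s(t-1)); y(t) = softmax (V * s(t))
    def rnnStep (u: Double, w: Double, vOut: Array [Double], xt: Double, sPrev: Double): (Double, Array [Double]) =
    {
        val st = math.tanh (u * xt + w * sPrev)
        (st, softmax (vOut.map (_ * st)))
    } // rnnStep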
-
class
RecurrentNeuralNetLayer extends AnyRef
The RecurrentNeuralNetLayer is a 3-layer where 'x' denotes the input, 'y' denotes the output and 's' is the intermediate/hidden value. We have 'S(t) = Activate (U dot x(t) + W dot S(t-1))' and 'y(t) = softmax (V dot S(t))'.
-
trait
Reducer extends AnyRef
The Reducer trait provides a common framework for several data reduction algorithms.
-
class
Regression extends PredictorMat
The Regression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
Five factorization techniques are provided (a usage sketch follows this entry):
'QR'         // QR Factorization: slower, more stable (default)
'Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
'SVD'        // Singular Value Decomposition: slowest, most robust
'LU'         // LU Factorization: better than Inverse
'Inverse'    // Inverse/Gaussian Elimination, classical textbook technique
- See also
en.wikipedia.org/wiki/Degrees_of_freedom_(statistics)
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
Note, not intended for use when the number of degrees of freedom 'df' is negative.
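A minimal usage sketch (assuming the MatrixD/VectorD types come from scalation.linalgebra, that the constructor takes the data matrix and response vector, and the 'train'/'eval'/'predict' calls described for the Predictor trait; exact signatures may differ between ScalaTion versions):
    import scalation.analytics.Regression
    import scalation.linalgebra.{MatrixD, VectorD}

    // assumed ScalaTion 1.x style constructor: Regression (x, y)
    val x = new MatrixD ((5, 3), 1.0, 1.0, 1.0,     // data matrix with an intercept column of ones
                                 1.0, 2.0, 1.0,
                                 1.0, 3.0, 2.0,
                                 1.0, 4.0, 2.0,
                                 1.0, 5.0, 3.0)
    val y  = VectorD (1.0, 3.0, 6.0, 7.0, 10.0)     // response vector
    val rg = new Regression (x, y)                  // multiple linear regression model
    rg.train ()                                     // fit the parameter vector b
    rg.eval ()                                      // compute quality of fit measures
    val yp = rg.predict (VectorD (1.0, 4.0, 2.0))   // predict the response for a new vector z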
-
class
RegressionTree extends PredictorMat
The RegressionTree class implements a RegressionTree that selects splitting features using minimal variance in the children nodes. To avoid an exponential number of choices in the selection, only ordinal features are currently supported. Using the companion object is recommended for generating a RegressionTree.
-
class
RegressionTree_GB extends PredictorMat
The RegressionTree_GB class uses Gradient Boosting on RegressionTree. One tree at a time is added to the model, chosen to reduce the gradient.
-
class
Regression_WLS extends Regression
The Regression_WLS class supports weighted multiple linear regression. In this case, 'xx' is multi-dimensional [1, xx_1, ... xx_k]. Weights are set to the inverse of a variable's variance, so they can compensate for such variability (heteroscedasticity). Fit the parameter vector 'b' in the regression equation
yy = b dot xx + e = b_0 + b_1 * xx_1 + ... b_k * xx_k + e
where 'e' represents the residuals (the part not explained by the model). Use Weighted Least-Squares (minimizing the residuals) to fit the parameter vector
b = fac.solve (.)
The data matrix 'xx' is reweighted 'x = rootW * xx' and the response vector 'yy' is reweighted 'y = rootW * yy' where 'rootW' is the square root of the weights.
- See also
www.markirwin.net/stat149/Lecture/Lecture3.pdf
en.wikipedia.org/wiki/Least_squares#Weighted_least_squares
These are then passed to Ordinary Least Squares 'OLS' Regression. Five factorization techniques are provided:
'QR'         // QR Factorization: slower, more stable (default)
'Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
'SVD'        // Singular Value Decomposition: slowest, most robust
'LU'         // LU Factorization: better than Inverse
'Inverse'    // Inverse/Gaussian Elimination, classical textbook technique
-
class
ResponseSurface extends Regression
The ResponseSurface class uses multiple regression to fit a quadratic/cubic surface to the data. For example in 2D, the quadratic regression equation is
y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_0*x_1, x_1^2] + e
It adds a constant term for the intercept (the initial data matrix must not already include an intercept column of ones).
- See also
scalation.metamodel.QuadraticFit
-
class
RidgeRegression extends PredictorMat
The
RidgeRegression
class supports multiple linear ridge regression.The
RidgeRegression
class supports multiple linear ridge regression. In this case, 'x' is multi-dimensional [x_1, ... x_k]. Ridge regression puts a penalty on the L2 norm of the parameters b to reduce the chance of them taking on large values that may lead to less robust models. Both the input matrix 'x' and the response vector 'y' are centered (zero mean). Fit the parameter vector 'b' in the regression equationy = b dot x + e = b_1 * x_1 + ... b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the regularized Normal Equations:
b = fac.solve (.) with regularization x.t * x + λ * I
Five factorization techniques are provided:
'QR' // QR Factorization: slower, more stable (default) 'Cholesky' // Cholesky Factorization: faster, less stable (reasonable choice) 'SVD' // Singular Value Decomposition: slowest, most robust 'LU' // LU Factorization: similar, but better than inverse 'Inverse' // Inverse/Gaussian Elimination, classical textbook technique
- See also
statweb.stanford.edu/~tibs/ElemStatLearn/
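A minimal plain-Scala sketch of the regularized solve for the single-predictor, centered case, where the normal equations reduce to the scalar b = (x dot y) / (x dot x + λ) (a hypothetical helper, independent of the ScalaTion API):
    // centered single-column x and centered y; lambda is the ridge penalty
    def ridge1 (x: Array [Double], y: Array [Double], lambda: Double): Double =
    {
        val xy = (x zip y).map { case (xi, yi) => xi * yi }.sum
        val xx = x.map (xi => xi * xi).sum
        xy / (xx + lambda)                          // shrinks toward 0 as lambda grows
    } // ridge1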
-
class
RoundRegression extends Regression
The RoundRegression class supports rounded multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the transformed regression equation
y = round (b dot x) + e = round (b_0 + b_1 * x_1 + b_2 * x_2 ... b_k * x_k) + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b'.
-
class
SimpleRegression extends PredictorMat
The SimpleRegression class supports simple linear regression. In this case, the vector 'x' consists of the constant one and a single variable 'x1', i.e., (1, x1). Fit the parameter vector 'b' in the regression equation
y = b dot x + e = [b0, b1] dot [1, x1] + e = b0 + b1 * x1 + e
where 'e' represents the residuals (the part not explained by the model).
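A minimal plain-Scala sketch of the closed-form fit b1 = cov(x1, y) / var(x1) and b0 = mean(y) - b1 * mean(x1) (a hypothetical helper, independent of the ScalaTion API):
    def simpleFit (x1: Array [Double], y: Array [Double]): (Double, Double) =
    {
        val mx = x1.sum / x1.length; val my = y.sum / y.length
        val b1 = (x1 zip y).map { case (xi, yi) => (xi - mx) * (yi - my) }.sum /
                 x1.map (xi => (xi - mx) * (xi - mx)).sum
        (my - b1 * mx, b1)                          // (intercept b0, slope b1)
    } // simpleFit
    val (b0, b1) = simpleFit (Array (1.0, 2.0, 3.0), Array (2.0, 4.0, 6.0))   // b0 = 0.0, b1 = 2.0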
-
class
SimplerRegression extends PredictorMat
The SimplerRegression class supports simpler linear regression. In this case, the vector 'x' consists of a single variable 'x0'. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = [b0] dot [x0] + e = b0 * x0 + e
where 'e' represents the residuals (the part not explained by the model). The simpler regression model has no intercept parameter, only a slope parameter.
- See also
SimpleRegression
for both intercept and slope parameters
-
class
Softmax extends AnyRef
The Softmax class calculates the softmax for the input.
-
type
Strings = Array[String]
Shorthand for an array of strings.
-
class
Tanh extends AnyRef
The Tanh class implements the tanh activation function and its derivative for the vector version.
-
class
TranRegression extends Regression
The
TranRegression
class supports transformed multiple linear regression.The
TranRegression
class supports transformed multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the transformed regression equationtransform (y) = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 ... b_k * x_k + e
where 'e' represents the residuals (the part not explained by the model) and 'transform' is the function (defaults to log) used to transform the response vector 'y'. Common transforms include 'log (y)', 'sqrt (y)' when 'y > 0', or even 'sq (y)', 'exp (y)'. More generally, a Box-Cox Transformation may be applied.
- See also
www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.469.7176&rep=rep1&type=pdf
Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b'. Note: this class does not provide transformations on columns of matrix 'x'.
-
class
TrigRegression extends PredictorVec
The
TrigRegression
class supports trigonometric regression.The
TrigRegression
class supports trigonometric regression. In this case, 't' is expanded to '[1, sin (wt), cos (wt), sin (2wt), cos (2wt), ...]'. Fit the parameter vector 'b' in the regression equationy = b dot x + e = b_0 + b_1 sin (wt) + b_2 cos (wt) + b_3 sin (2wt) + b_4 cos (2wt) + ... + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y b = fac.solve (.)
- See also
link.springer.com/article/10.1023%2FA%3A1022436007242#page-1
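A minimal plain-Scala sketch of the trigonometric term expansion '[1, sin (wt), cos (wt), ..., sin (kwt), cos (kwt)]' for a scalar 't' (a hypothetical helper, independent of the ScalaTion API):
    // expand scalar t into 1 followed by k sine/cosine harmonic pairs at base frequency w
    def expandTrig (t: Double, k: Int, w: Double): Array [Double] =
        1.0 +: (1 to k).toArray.flatMap (j => Array (math.sin (j * w * t), math.cos (j * w * t)))
    val row = expandTrig (0.5, 2, math.Pi)          // [1, sin(wt), cos(wt), sin(2wt), cos(2wt)] at t = 0.5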
Value Members
-
val
BASE_DIR: String
The relative path for the base directory.
-
object
ANCOVA extends Error
The ANCOVA companion object provides helper functions.
-
object
ANCOVA1 extends Error
The ANCOVA1 companion object provides helper functions.
-
object
ANCOVA1Test extends App
The ANCOVA1Test object tests the ANCOVA1 class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
> runMain scalation.analytics.ANCOVA1Test
-
object
ANCOVATest extends App
The ANCOVATest object tests the ANCOVA class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
> runMain scalation.analytics.ANCOVATest
-
object
ANOVA1Test extends App
The ANOVA1Test object tests the ANOVA1 class using the following regression equation.
y = b dot x = b_0 + b_1*d_1 + b_2*d_2
> runMain scalation.analytics.ANOVA1Test
-
object
ActivationFun
The ActivationFun object contains common Activation functions and provides both scalar and vector versions (a sketch of the scalar/vector convention follows this entry).
- See also
en.wikipedia.org/wiki/Activation_function
Convention:
fun    activation function (e.g., sigmoid)
funV   vector version of the activation function (e.g., sigmoidV)
funM   matrix version of the activation function (e.g., sigmoidM)
funDV  vector version of the derivative (e.g., sigmoidDV)
funDM  matrix version of the derivative (e.g., sigmoidDM)
Supports: id, reLU, lreLU, eLU, tanh, sigmoid, gaussian, softmax
Related functions: logistic, logit
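A minimal plain-Scala sketch of the scalar/vector convention for one family member, sigmoid (hypothetical helpers, independent of the ScalaTion API):
    def sigmoid  (t: Double): Double = 1.0 / (1.0 + math.exp (-t))               // scalar version (fun)
    def sigmoidV (v: Array [Double]): Array [Double] = v.map (sigmoid)           // vector version (funV)
    def sigmoidDV (v: Array [Double]): Array [Double] =                          // vector derivative (funDV)
        v.map (t => { val s = sigmoid (t); s * (1.0 - s) })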
-
object
ActivationFunTest extends App
The ActivationFunTest object is used to test the ActivationFun object.
> runMain scalation.analytics.ActivationFunTest
-
object
ActivationFunTest2 extends App
The ActivationFunTest2 object is used to test the ActivationFun object.
- See also
en.wikipedia.org/wiki/Softmax_function
> runMain scalation.analytics.ActivationFunTest2
-
object
ExampleBasketBall
The ExampleBasketBall object stores a medium-sized example dataset with data about basketball players that can be used to predict their scoring average.
-
object
ExampleConcrete
The ExampleConcrete object stores a medium-sized example dataset from the UCI Machine Learning Repository: "Abstract: Concrete is a highly complex material. The slump flow of concrete is not only determined by the water content, but that is also influenced by other concrete ingredients."
-
object
ExampleConcreteTest extends App
The ExampleConcreteTest object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs Regression.
> runMain scalation.analytics.ExampleConcreteTest
-
object
ExampleConcreteTest10 extends App
The ExampleConcreteTest10 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_XL with 4 layers.
> runMain scalation.analytics.ExampleConcreteTest10
-
object
ExampleConcreteTest2 extends App
The ExampleConcreteTest2 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs Perceptron.
> runMain scalation.analytics.ExampleConcreteTest2
-
object
ExampleConcreteTest3 extends App
The ExampleConcreteTest3 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs Perceptron.
> runMain scalation.analytics.ExampleConcreteTest3
-
object
ExampleConcreteTest4 extends App
The ExampleConcreteTest4 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'sigmoid'.
> runMain scalation.analytics.ExampleConcreteTest4
-
object
ExampleConcreteTest5 extends App
The ExampleConcreteTest5 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'sigmoid'.
> runMain scalation.analytics.ExampleConcreteTest5
-
object
ExampleConcreteTest6 extends App
The ExampleConcreteTest6 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'tanh'.
> runMain scalation.analytics.ExampleConcreteTest6
-
object
ExampleConcreteTest7 extends App
The ExampleConcreteTest7 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'id'.
> runMain scalation.analytics.ExampleConcreteTest7
-
object
ExampleConcreteTest8 extends App
The ExampleConcreteTest8 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_3L with 'sigmoid'.
> runMain scalation.analytics.ExampleConcreteTest8
-
object
ExampleConcreteTest9 extends App
The ExampleConcreteTest9 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_3L with 'sigmoid'.
> runMain scalation.analytics.ExampleConcreteTest9
-
object
ExpRegressionTest extends App
The ExpRegressionTest object tests the ExpRegression class using the following exponential regression problem.
> runMain scalation.analytics.ExpRegressionTest
-
object
ExpRegressionTest2 extends App
The ExpRegressionTest2 object has a basic test for the ExpRegression class.
> runMain scalation.analytics.ExpRegressionTest2
-
object
Fit
The Fit companion object provides factory methods for assessing quality of fit for standard types of modeling techniques.
-
object
GLM extends GLM
The GLM object makes the GLM trait's methods directly available. This approach (using traits and objects) allows the methods to also be inherited.
-
object
GLMTest extends App
The GLMTest object tests the GLM object using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
-
object
GZLM extends GLM
A Generalized Linear Model 'GZLM' can be developed using the GZLM object. It provides factory methods for General Linear Models 'GLM' via inheritance and for proper Generalized Linear Models:
LogisticRegression - logistic regression (see the classifier package)
PoissonRegression - Poisson regression
ExpRegression - exponential regression
-
object
GZLMTest extends App
The GZLMTest object tests the GZLM object using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2
-
object
HyperParameterTest extends App
The HyperParameterTest object is used to test the HyperParameter class.
> runMain scalation.analytics.HyperParameterTest
-
object
KNN_Predictor
The KNN_Predictor companion object provides factory functions.
-
object
KNN_PredictorTest extends App
The KNN_PredictorTest object is used to test the KNN_Predictor class.
> runMain scalation.analytics.KNN_PredictorTest
-
object
KNN_PredictorTest2 extends App
The KNN_PredictorTest2 object is used to test the KNN_Predictor class.
> runMain scalation.analytics.KNN_PredictorTest2
-
object
LassoRegression extends Error
The LassoRegression companion object provides factory methods for the LassoRegression class.
-
object
LassoRegressionTest extends App
The LassoRegressionTest object tests the LassoRegression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
- See also
http://statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.LassoRegressionTest
-
object
MatrixTransform
The MatrixTransform object is used to transform the columns of a data matrix 'x'. Such pre-processing of the data is required by some modeling techniques.
-
object
MatrixTransformTest extends App
The MatrixTransformTest object is used to test the MatrixTransform object.
> runMain scalation.analytics.MatrixTransformTest
-
object
NMFactorizationTest extends App
The NMFactorizationTest object is used to test the NMFactorization class.
-
object
NeuralNet_2LTest extends App
The NeuralNet_2LTest object is used to test the NeuralNet_2L class. For this test, training data is used to fit the weights before using them for prediction.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.NeuralNet_2LTest
-
object
NeuralNet_3LTest extends App
The NeuralNet_3LTest object is used to test the NeuralNet_3L class.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.NeuralNet_3LTest
-
object
NeuralNet_XLTest extends App
The NeuralNet_XLTest object is used to test the NeuralNet_XL class.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.NeuralNet_XLTest
-
object
NonLinRegressionTest extends App
The NonLinRegressionTest object tests the NonLinRegression class: y = f(x; b) = b0 + exp (b1 * x0).
- See also
www.bsos.umd.edu/socy/alan/stats/socy602_handouts/kut86916_ch13.pdf
Answers: sse = 49.45929986243339, fit = (VectorD (58.606566327280426, -0.03958645286504356), 0.9874574894685292), predict (VectorD (50.0)) = 8.09724678182599. FIX: check this example.
-
object
NullModel extends Error
The NullModel companion object provides a simple factory method for building null models.
-
object
NullModelTest extends App
The NullModelTest object is used to test the NullModel class.
y = b dot x + e = b0 + e
> runMain scalation.analytics.NullModelTest
-
object
NullModelTest2 extends App
The NullModelTest2 object is used to test the NullModel class.
y = b dot x + e = b0 + e
> runMain scalation.analytics.NullModelTest2
-
object
Optimizer
The Optimizer object provides functions to optimize the parameters/weights of Neural Networks with various numbers of layers.
-
object
Perceptron
The Perceptron companion object provides factory methods for building perceptrons.
-
object
PerceptronTest extends App
The PerceptronTest object is used to test the Perceptron class. For this test, training data is used to fit the weights before using them for prediction.
- See also
www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.PerceptronTest
-
object
PerceptronTest2 extends App
The PerceptronTest2 object trains a perceptron on a small dataset of temperatures from counties in Texas, where the variables/factors to consider are Latitude (x1), Elevation (x2) and Longitude (x3). The regression equation is the following:
y = sigmoid (w dot x) = sigmoid (w0 + w1*x1 + w2*x2 + w3*x3)
This test case illustrates how to transform the columns of the matrix so that the 'sigmoid' activation function can work effectively. Since the dataset is very small, one should use 'train0', which does no batching.
> runMain scalation.analytics.PerceptronTest2
-
object
PerceptronTest3 extends App
The PerceptronTest3 object trains a perceptron on a small dataset with variables x1 and x2. The regression equation is the following:
y = sigmoid (b dot x) = sigmoid (b0 + b1*x1 + b2*x2)
This version does not call the 'train' method; the improvement steps for sigmoid are coded explicitly below.
> runMain scalation.analytics.PerceptronTest3
-
object
PerceptronTest4 extends App
The PerceptronTest4 object trains a perceptron on a small dataset with variables x1 and x2. The regression equation is the following:
y = sigmoid (b dot x) = sigmoid (b0 + b1*x1 + b2*x2)
This version calls the 'train0' method.
> runMain scalation.analytics.PerceptronTest4
-
object
PerceptronTest5 extends App
The PerceptronTest5 object trains a perceptron on a small dataset with variables.
> runMain scalation.analytics.PerceptronTest5
-
object
PerceptronTest6 extends App
The PerceptronTest6 object trains a perceptron on a small dataset with 4 variables.
> runMain scalation.analytics.PerceptronTest6
-
object
PerceptronTest7 extends App
The PerceptronTest7 object trains a perceptron on the ExampleBasketBall dataset.
> runMain scalation.analytics.PerceptronTest7
-
object
PoissonRegression
The PoissonRegression companion object provides factory functions.
-
object
PoissonRegressionTest extends App
The PoissonRegressionTest object tests the PoissonRegression class.
- See also
http://www.cookbook-r.com/Statistical_analysis/Logistic_regression/
Answer: b = (-8.8331, 0.4304), n_dev = 43.860, r_dev = 25.533, aci = 29.533, pseudo_rSq = 0.4178
> runMain scalation.analytics.PoissonRegressionTest
-
object
PoissonRegressionTest2 extends App
The PoissonRegressionTest2 object tests the PoissonRegression class.
- See also
www.stat.wisc.edu/~mchung/teaching/.../GLM.logistic.Rpackage.pdf
statmaster.sdu.dk/courses/st111/module03/index.html
> runMain scalation.analytics.PoissonRegressionTest2
-
object
PolyRegressionTest extends App
The PolyRegressionTest object tests the PolyRegression class using the following regression equation.
y = b dot x = b_0 + b_1*t + b_2*t^2 + ... b_k*t^k
Note, the 'order' at which R-Squared drops is QR(7), Cholesky(14), SVD(6), Inverse(13).
> runMain scalation.analytics.PolyRegressionTest
-
object
PolyRegressionTest2 extends App
The PolyRegressionTest2 object tests the PolyRegression class using the following regression equation. This version uses orthogonal polynomials.
y = b dot x = b_0 + b_1*t + b_2*t^2 + ... b_k*t^k
> runMain scalation.analytics.PolyRegressionTest2
-
object
PredictorMat
The PredictorMat companion object provides a method for splitting a combined data matrix into a predictor matrix and a response vector.
-
object
PredictorMatTest extends App
The PredictorMatTest object is used to test the PredictorMat abstract class and its derived classes using the ExampleBasketBall dataset containing data matrix 'x' and response vector 'y'.
> runMain scalation.analytics.PredictorMatTest
-
object
PredictorTest extends App
The PredictorTest object tests all the classes in the scalation.analytics package that directly or indirectly extend the Predictor trait. FIX - make the first test uniform so that the modeling techniques may be compared.
> runMain scalation.analytics.PredictorTest
-
object
PrincipalComponentsTest extends App
The PrincipalComponentsTest object is used to test the PrincipalComponents class.
- See also
www.ce.yildiz.edu.tr/personal/songul/file/1097/principal_components.pdf > runMain scalation.analytics.PrincipalComponentsTest
-
object
Probability extends Error
The Probability object provides methods for operating on univariate and bivariate probability distributions of discrete random variables 'X' and 'Y'. A probability distribution is specified by its probability mass function (pmf) stored either as a "probability vector" for a univariate distribution or a "probability matrix" for a bivariate distribution.
joint probability matrix:       pxy(i, j)  = P(X = x_i, Y = y_j)
marginal probability vector:    px(i)      = P(X = x_i)
conditional probability matrix: px_y(i, j) = P(X = x_i|Y = y_j)
In addition to computing joint, marginal and conditional probabilities, methods for computing entropy and mutual information are also provided. Entropy provides a measure of disorder or randomness; if there is little randomness, entropy will be close to 0, while when randomness is high, entropy will be close to, e.g., 'log2 (px.dim)'. Mutual information provides a robust measure of dependency between random variables (contrast with correlation). A sketch of the entropy computation follows this entry.
- See also
scalation.stat.StatVector
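A minimal plain-Scala sketch of the entropy of a probability vector, H(px) = -sum px(i) * log2 px(i) (a hypothetical helper, independent of the ScalaTion API):
    def log2 (p: Double): Double = math.log (p) / math.log (2.0)
    def entropy (px: Array [Double]): Double =
        -px.filter (_ > 0.0).map (p => p * log2 (p)).sum
    val h = entropy (Array (0.5, 0.5))              // = 1.0 bit; a uniform px maximizes entropy at log2 (px.length)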
-
object
ProbabilityTest extends App
The ProbabilityTest object is used to test the Probability object.
-
object
ProbabilityTest2 extends App
The ProbabilityTest2 object provides upper bounds for 'entropy' and 'entropy_k'.
-
object
QuadRegression
The QuadRegression companion object provides methods for creating functional forms.
-
object
QuadRegressionTest extends App
The QuadRegressionTest object is used to test the QuadRegression class.
> runMain scalation.analytics.QuadRegressionTest
-
object
QuadraticFitTest extends App
The QuadraticFitTest object is used to test the QuadraticFit class for a two dimensional case.
-
object
QuadraticFitTest2 extends App
The QuadraticFitTest2 object is used to test the QuadraticFit class for a three dimensional case.
-
object
QuadraticFitTest3 extends App
The QuadraticFitTest3 object is used to test the QuadraticFit class for a three dimensional case with noise.
-
object
RecurrentNeuralNetTest extends App
The RecurrentNeuralNetTest object is used to test the RecurrentNeuralNet class.
> runMain scalation.analytics.RecurrentNeuralNetTest
-
object
RegTechnique extends Enumeration
The RegTechnique object defines the implementation techniques available.
-
object
Regression extends Error
The Regression companion object provides factory apply functions and a testing method.
-
object
RegressionTest extends App
The RegressionTest object tests the Regression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
- See also
statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.RegressionTest
-
object
RegressionTest2 extends App
The RegressionTest2 object tests the Regression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
> runMain scalation.analytics.RegressionTest2
-
object
RegressionTest3 extends App
The RegressionTest3 object tests the multi-collinearity method in the Regression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4
- See also
online.stat.psu.edu/online/development/stat501/data/bloodpress.txt > runMain scalation.analytics.RegressionTest3
online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html
-
object
RegressionTest4 extends App
The RegressionTest4 object tests the multi-collinearity method in the Regression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4
- See also
online.stat.psu.edu/online/development/stat501/data/bloodpress.txt > runMain scalation.analytics.RegressionTest4
online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html
-
object
RegressionTest5 extends App
The RegressionTest5 object tests the Regression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
> runMain scalation.analytics.RegressionTest5
-
object
RegressionTest6 extends App
The RegressionTest6 object tests the Regression class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
> runMain scalation.analytics.RegressionTest6
-
object
RegressionTree
The RegressionTree companion object is used to count the number of leaves.
-
object
RegressionTreeTest extends App
The RegressionTreeTest object is used to test the RegressionTree class. It tests a simple case that does not require a file to be read.
- See also
translate.google.com/translate?hl=en&sl=zh-CN&u=https://www.hrwhisper.me/machine-learning-decision-tree/&prev=search > runMain scalation.analytics.RegressionTreeTest
-
object
RegressionTree_GB
The RegressionTree_GB companion object defines hyper-parameters and provides a factory function.
-
object
RegressionTree_GBTest extends App
The RegressionTree_GBTest object is used to test the RegressionTree_GB class. It tests a simple case that does not require a file to be read.
> runMain scalation.analytics.RegressionTree_GBTest
-
object
Regression_WLS
The Regression_WLS companion object provides methods for setting weights and testing.
-
object
Regression_WLSTest extends App
The Regression_WLSTest object tests the Regression_WLS class using the following regression equation.
y = b dot x = b_0 + b_1*x_1 + b_2*x_2.
- See also
statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.Regression_WLSTest
-
object
ResponseSurface
The ResponseSurface companion object provides methods for creating functional forms.
-
object
ResponseSurfaceTest extends App
The ResponseSurfaceTest object is used to test the ResponseSurface class.
> runMain scalation.analytics.ResponseSurfaceTest
-
object
RidgeRegression extends Error
The RidgeRegression companion object defines hyper-parameters and provides factory functions for the RidgeRegression class.
-
object
RidgeRegressionTest extends App
The RidgeRegressionTest object tests the RidgeRegression class using the following regression equation.
y = b dot x = b_1*x_1 + b_2*x_2.
Test regression and backward elimination.
- See also
http://statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.RidgeRegressionTest
-
object
RidgeRegressionTest2 extends App
The RidgeRegressionTest2 object tests the RidgeRegression class using the following regression equation.
y = b dot x = b_1*x_1 + b_2*x_2.
> runMain scalation.analytics.RidgeRegressionTest2
-
object
RidgeRegressionTest3 extends App
The RidgeRegressionTest3 object tests the multi-collinearity method in the RidgeRegression class using the following regression equation.
y = b dot x = b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4*x_4
- See also
online.stat.psu.edu/online/development/stat501/data/bloodpress.txt > runMain scalation.analytics.RidgeRegressionTest3
online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html
-
object
RoundRegression
The RoundRegression companion object provides a factory method.
-
object
RoundRegressionTest extends App
The RoundRegressionTest object tests the RoundRegression class using the following regression equation.
y = round (b dot x) = round (b_0 + b_1*x_1 + b_2*x_2).
> runMain scalation.analytics.RoundRegressionTest
-
object
SimpleRegression extends Error
The SimpleRegression companion object provides a simple factory method for building simple linear regression models.
-
object
SimpleRegressionTest extends App
The SimpleRegressionTest object tests the SimpleRegression class:
y = b0 + b1 * x
> runMain scalation.analytics.SimpleRegressionTest
-
object
SimpleRegressionTest2 extends App
The SimpleRegressionTest2 object is used to test the SimpleRegression class.
y = b dot x = [b0, b1] dot [1, x1]
- See also
http://www.analyzemath.com/statistics/linear_regression.html > runMain scalation.analytics.SimpleRegressionTest2
-
object
SimpleRegressionTest3 extends App
The SimpleRegressionTest3 object is used to test the SimpleRegression class.
y = b dot x = b0 + b1 * x1
- See also
http://mathbits.com/mathbits/tisection/Statistics2/linear.htm > runMain scalation.analytics.SimpleRegressionTest3
-
object
SimplerRegression extends Error
The SimplerRegression companion object provides a simple factory method for building simpler linear regression models.
-
object
SimplerRegressionTest extends App
The SimplerRegressionTest object is used to test the SimplerRegression class.
y = b0 * x + e
> runMain scalation.analytics.SimplerRegressionTest
-
object
SimplerRegressionTest2 extends App
The SimplerRegressionTest2 object is used to test the SimplerRegression class.
y = b dot x + e = [b0] dot [x0] + e
> runMain scalation.analytics.SimplerRegressionTest2
-
object
SimplerRegressionTest3 extends App
The SimplerRegressionTest3 object is used to test the SimplerRegression class.
y = b dot x = b0 * x0
- See also
http://mathbits.com/mathbits/tisection/Statistics2/linear.htm > runMain scalation.analytics.SimplerRegressionTest3
- object SimplerRegression_exer_1 extends App
-
object
TranRegression
The TranRegression companion object provides transformation and inverse transformation functions based on the parameter 'lambda'. It supports the family of Box-Cox transformations.
-
object
TranRegressionTest extends App
The TranRegressionTest object tests the TranRegression class using the following regression equation.
log (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2.
> runMain scalation.analytics.TranRegressionTest
-
object
TranRegressionTest2 extends App
The TranRegressionTest2 object tests the TranRegression class using the following regression equation.
sqrt (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2.
> runMain scalation.analytics.TranRegressionTest2
-
object
TrigRegressionTest extends App
The TrigRegressionTest object tests the TrigRegression class using the following regression equation.
y = b dot x = b_0 + b_1 sin wt + b_2 cos wt + ... b_2k-1 sin kwt + b_2k cos kwt + e
The data is generated from a noisy cubic function.
> runMain scalation.analytics.TrigRegressionTest
-
object
TrigRegressionTest2 extends App
The TrigRegressionTest2 object tests the TrigRegression class using the following regression equation.
y = b dot x = b_0 + b_1 sin wt + b_2 cos wt + ... b_2k-1 sin kwt + b_2k cos kwt + e
The data is generated from periodic noisy cubic functions.
> runMain scalation.analytics.TrigRegressionTest2