package scalation.analytics

The analytics package contains classes, traits and objects for analytics including clustering and prediction.

Type Members

  1. case class AFF(f: FunctionS2S, fV: FunctionV_2V, fM: FunctionM_2M, dV: FunctionV_2V, dM: FunctionM_2M, bounds: (Double, Double) = null) extends Product with Serializable

    The AFF class holds an Activation Function Family (AFF).

    f        the activation function itself
    fV       the vector version of the activation function
    fM       the matrix version of the activation function
    dV       the vector version of the activation function derivative
    dM       the matrix version of the activation function derivative
    bounds   the (lower, upper) bounds on the range of the activation function
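
    For illustration, a minimal sketch in plain Scala (not the ScalaTion API; the names are hypothetical) of how a family pairs a scalar activation function with its vectorized form and derivative:

      // sigmoid, its derivative, and vector versions obtained by mapping
      val f : Double => Double = x => 1.0 / (1.0 + math.exp (-x))
      val df: Double => Double = x => { val s = f (x); s * (1.0 - s) }
      val fV: Array [Double] => Array [Double] = _.map (f)
      val dV: Array [Double] => Array [Double] = _.map (df)
      val sigmoidFamily = (f, fV, dV, (0.0, 1.0))          // (scalar, vector, vector derivative, bounds)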

  2. class ANCOVA extends Regression

    The ANCOVA class supports ANalysis of COVAriance 'ANCOVA'. It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'dj' to distinguish the treatment level. The problem is again to fit the parameter vector 'b' in the augmented regression equation

    y = b dot x + e = b0 + b_1 * x_1 + b_2 * x_2 + ... b_k * x_k + b_k+1 * d_1 + b_k+2 * d_2 + ... b_k+l * d_l + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    't' has categorical values/levels, e.g., treatment levels (0, ... 't.max ()')

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
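
    As an aside, a minimal sketch in plain Scala (not the ScalaTion API; 'dummyVars' is hypothetical) of turning a treatment column with levels 0 .. t.max into dummy columns d_1 .. d_l, which are then appended to the data matrix:

      // level 0 is the baseline (all dummies zero); level j sets dummy d_j to 1
      def dummyVars (t: Array [Int]): Array [Array [Double]] = {
        val l = t.max                                       // number of dummy columns
        t.map (lvl => Array.tabulate (l)(j => if (lvl == j + 1) 1.0 else 0.0))
      }
      // dummyVars (Array (0, 1, 2)) gives rows (0, 0), (1, 0), (0, 1)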

  3. class ANCOVA1 extends Regression

    The ANCOVA1 class supports ANalysis of COVAriance 'ANCOVA1'. It allows the addition of a categorical treatment variable 't' into a multiple linear regression. This is done by introducing dummy variables 'dj' to distinguish the treatment level. The problem is again to fit the parameter vector 'b' in the augmented regression equation

    y = b dot x + e = b0 + b_1 * x_1 + b_2 * x_2 + ... b_k * x_k + b_k+1 * d_1 + b_k+2 * d_2 + ... b_k+l * d_l + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    't' has categorical values/levels, e.g., treatment levels (0, ... 't.max ()')

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf

  4. class ANOVA1 extends PredictorVec

    The ANOVA1 class supports one-way ANalysis Of VAriance (ANOVA), i.e., it allows only one binary/categorical treatment variable. It is framed using General Linear Model 'GLM' notation and supports the use of one binary/categorical treatment variable 't'. This is done by introducing dummy variables 'd_j' to distinguish the treatment level. The problem is again to fit the parameter vector 'b' in the following equation

    y = b dot x + e = b_0 + b_1 * d_1 + b_2 * d_2 + ... + b_k * d_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    See also

    psych.colorado.edu/~carey/Courses/PSYC5741/handouts/GLM%20Theory.pdf

    ANCOVA for models with multiple variables

  5. case class AddGate() extends Product with Serializable

    The AddGate case class performs the Add operation for vectors.

  6. class CanCorrelation extends Reducer with Error

    The CanCorrelation class performs Canonical Correlation Analysis 'CCA' on two random vectors. Samples for the first one are stored in the 'x' data matrix and samples for the second are stored in the 'y' data matrix. Find vectors a and b that maximize the correlation between x * a and y * b.

    max {rho (x * a, y * b)}

    Additional vectors orthogonal to a and b can also be found.

  7. class ExpRegression extends PredictorMat

    The ExpRegression class supports exponential regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the exponential regression equation

    log (mu (x)) = b dot x = b_0 + b_1 * x_1 + ... b_k * x_k

    See also

    www.stat.uni-muenchen.de/~leiten/Lehre/Material/GLM_0708/chapterGLM.pdf

  8. class Fit extends AnyRef

    The Fit class provides methods to determine basic quality of fit measures.

  9. trait GLM extends AnyRef

    A General Linear Model 'GLM' can be developed using the GLM trait and object (see below). The implementation currently supports univariate models, with multivariate models (where each response is a vector) planned for the future. It provides factory methods for the following special types of GLMs:

    SimpleRegression - simple linear regression
    Regression - multiple linear regression using Ordinary Least Squares 'OLS'
    Regression_WLS - multiple linear regression using Weighted Least Squares 'WLS'
    RidgeRegression - robust multiple linear regression
    TranRegression - transformed (e.g., log) multiple linear regression
    PolyRegression - polynomial regression
    TrigRegression - trigonometric regression
    ResponseSurface - response surface regression
    ANOVA1 - GLM form of ANalysis Of VAriance
    ANCOVA1 - GLM form of ANalysis of COVAriance

  10. class HyperParameter extends Cloneable

    The HyperParameter class provides a simple and flexible means for handling model hyper-parameters. A model may have one or more hyper-parameters that are organized into a map 'name -> (value, defaultV)'.
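
    A minimal sketch of the underlying idea in plain Scala (not the HyperParameter API itself; the names are illustrative):

      // hyper-parameters kept as name -> (current value, default value)
      var hp = Map ("eta" -> (0.1, 0.1), "maxEpochs" -> (500.0, 500.0))
      hp += "eta" -> (0.05, hp ("eta")._2)                  // update the current value, keep the default
      val eta = hp ("eta")._1                               // look up the current value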

  11. class KNN_Predictor extends PredictorMat

    The KNN_Predictor class is used to predict a response value for new vector 'z'. It works by finding its 'kappa' nearest neighbors. These neighbors essentially vote according to their prediction. The consensus is the average of the individual predictions for 'z'. Using a distance metric, the 'kappa' vectors nearest to 'z' are found in the training data, which are stored row-wise in data matrix 'x'. The corresponding response values are given in the vector 'y', such that the response value for vector 'x(i)' is given by 'y(i)'.
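
    For illustration only, a self-contained sketch in plain Scala (not the KNN_Predictor API) of predicting by averaging the responses of the 'kappa' nearest rows under Euclidean distance:

      def knnPredict (x: Array [Array [Double]], y: Array [Double],
                      z: Array [Double], kappa: Int): Double = {
        def dist (u: Array [Double], v: Array [Double]) =
          math.sqrt ((u zip v).map { case (a, b) => (a - b) * (a - b) }.sum)
        val near = x.indices.sortBy (i => dist (x(i), z)).take (kappa)   // indices of nearest rows
        near.map (y(_)).sum / kappa                                      // average their responses
      }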

  12. class LassoRegression extends PredictorMat

    The LassoRegression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model).

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf

  13. case class MultiplyGate() extends Product with Serializable

    The MultiplyGate case class performs the dot product of the input and the weights 'w'.

  14. class NMFactorization extends AnyRef

    The NMFactorization class factors a matrix 'v' into two non-negative matrices 'w' and 'h' such that 'v = wh' approximately.

    See also

    en.wikipedia.org/wiki/Non-negative_matrix_factorization

  15. abstract class NeuralNet extends Predictor with Error

    The NeuralNet abstract class provides the basic structure and API for a variety of Neural Networks.

  16. class NeuralNet_2L extends NeuralNet

    The NeuralNet_2L class supports multi-output, 2-layer (input and output) Neural-Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weights/parameters 'bb' connecting the layers, so that for a new input vector 'z', the net can predict the output value, i.e.,

    yp_j = f (bb dot z)

    where 'f' is the activation function and the parameter matrix 'bb' gives the weights between input and output layers. No batching is used for this algorithm. Note, 'b0' is treated as the bias, so 'x0' must be 1.0.
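
    As a rough illustration (plain Scala, not the NeuralNet_2L API), the prediction step amounts to applying 'f' to each column of 'bb' dotted with 'z':

      // yp_j = f (column j of bb dot z); bb has one row per input and one column per output
      def predict2L (bb: Array [Array [Double]], z: Array [Double], f: Double => Double): Array [Double] =
        Array.tabulate (bb(0).length)(j => f (z.indices.map (i => bb(i)(j) * z(i)).sum))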

  17. class NeuralNet_3L extends NeuralNet

    The NeuralNet_3L class supports multi-output, 3-layer (input, hidden and output) Neural-Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weights/parameters 'aa' and 'bb' connecting the layers, so that for a new input vector 'v', the net can predict the output value, i.e.,

    yp = f2 (bb * f1V (aa * v))

    where 'f1' and 'f2' are the activation functions and the parameter matrices 'aa' and 'bb' give the weights between the input-hidden and hidden-output layers. Note, if 'a0' is to be treated as bias/intercept, 'x0' must be 1.0.

  18. class NeuralNet_XL extends NeuralNet

    The NeuralNet_XL class supports multi-output, multi-layer (input, multiple hidden and output) Neural-Networks. It can be used for both classification and prediction, depending on the activation functions used. Given several input vectors and output vectors (training data), fit the weight and bias parameters connecting the layers, so that for a new input vector 'v', the net can predict the output value. This implementation is partially adapted from Michael Nielsen's Python implementation found in

    See also

    github.com/MichalDanielDobrzanski/DeepLearningPython35/blob/master/network2.py

    github.com/mnielsen/neural-networks-and-deep-learning/blob/master/src/network2.py

  19. case class Node(f: Int, branch: Int, yp: Double, thresh: Double, depth: Int, pthresh: Double, pfea: Int, leaf: Boolean = false) extends Product with Serializable

    Class that contains information for a tree node.

    f        the feature of the node; if it is a leaf, contains the feature of its parent
    branch   the branch value (0 => left, 1 => right)
    yp       the leaf node's prediction for y
    thresh   the threshold for a continuous feature
    depth    the current depth of the node
    pthresh  the threshold of the parent node
    pfea     the feature of the parent node
    leaf     Boolean value indicating whether this is a leaf node

  20. class NonLinRegression extends PredictorMat

    The NonLinRegression class supports non-linear regression. In this case, 'x' can be multi-dimensional '[1, x1, ... xk]' and the function 'f' is non-linear in the parameters 'b'. Fit the parameter vector 'b' in the regression equation

    y = f(x, b) + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b' by using Non-linear Programming to minimize Sum of Squares Error 'SSE'.

    See also

    www.bsos.umd.edu/socy/alan/stats/socy602_handouts/kut86916_ch13.pdf

  21. class NullModel extends Fit with Predictor with Error

    The NullModel class implements the simplest type of predictive modeling technique that just predicts the response 'y' to be the mean. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b0 + e

    where 'e' represents the residuals (the part not explained by the model).

  22. class Perceptron extends PredictorMat

    The Perceptron class supports single-output, 2-layer (input and output) Neural-Networks. Although perceptrons are typically used for classification, this class is used for prediction. Given several input vectors and output values (training data), fit the weights/parameters 'b' connecting the layers, so that for a new input vector 'z', the net can predict the output value, i.e.,

    yp = f (b dot z)

    The parameter vector 'b' (w) gives the weights between input and output layers. Note, 'b0' is treated as the bias, so 'x0' must be 1.0.
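
    The idea can be sketched in plain Scala (not the Perceptron API; 'gradStep' is hypothetical): a prediction is the sigmoid of 'b dot z', and one batch gradient-descent step moves 'b' against the gradient of the sum of squared errors:

      def sigmoid (t: Double) = 1.0 / (1.0 + math.exp (-t))
      def predict (b: Array [Double], z: Array [Double]) =
        sigmoid ((b zip z).map { case (w, v) => w * v }.sum)

      def gradStep (x: Array [Array [Double]], y: Array [Double],
                    b: Array [Double], eta: Double): Array [Double] = {
        val yp = x.map (predict (b, _))                                       // current predictions
        val d  = yp.indices.map (i => (yp(i) - y(i)) * yp(i) * (1.0 - yp(i))) // error * sigmoid slope
        b.indices.map (j => b(j) - eta * x.indices.map (i => x(i)(j) * d(i)).sum).toArray
      }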

  23. class PoissonRegression extends PredictorMat

    The PoissonRegression class supports Poisson regression. In this case, 'x' may be multi-dimensional '[1, x_1, ... x_k]'. Fit the parameter vector 'b' in the Poisson regression equation

    log (mu(x)) = b dot x = b_0 + b_1 * x_1 + ... b_k * x_k

    where 'e' represents the residuals (the part not explained by the model) and 'y' is now integer valued.

    See also

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf

  24. class PolyRegression extends PredictorVec

    The PolyRegression class supports polynomial regression. In this case, 't' is expanded to '[1, t, t^2 ... t^k]'. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 + ... + b_k * t^k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    See also

    www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx

  25. trait Predictor extends AnyRef

    The Predictor trait provides a common framework for several predictors. A predictor is for potentially unbounded responses (real or integer). When the number of distinct responses is bounded by some relatively small integer 'k', a classifier is likely more appropriate. Note, the 'train' method must be called first, followed by 'eval'.

  26. abstract class PredictorMat extends Fit with Predictor with Error

    The PredictorMat abstract class supports multiple predictor analytics. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in, for example, the regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e

    Note, "protected val" arguments required by ResponseSurface.

  27. abstract class PredictorVec extends Predictor with Error

    The PredictorVec class supports term expanded regression. Fit the parameter vector 'b' in the regression equation. Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

  28. class PrincipalComponents extends Reducer with Error

    The PrincipalComponents class performs the Principal Component Analysis 'PCA' on data matrix 'x'. It can be used to reduce the dimensionality of the data. First find the Principal Components 'PC's by calling 'findPCs' and then call 'reduce' to reduce the data (i.e., reduce matrix 'x' to a lower dimensionality matrix).

  29. class QuadRegression extends Regression

    The QuadRegression class uses multiple regression to fit a quadratic surface to the data. For example in 2D, the quadratic regression equation is

    y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_1^2] + e

    It has no interaction/cross-terms and adds a constant term for the intercept (the initial data matrix must not already include an intercept column of ones).

    See also

    scalation.metamodel.QuadraticFit

  30. class QuadraticFit extends AnyRef

    The QuadraticFit class uses multiple regression to fit a quadratic surface to the function 'f'. This is useful when computing 'f' is costly, for example in simulation optimization. The fit is over a multi-dimensional grid and can be used for interpolation and limited extrapolation.

  31. class RecurrentNeuralNet extends Error

    The RecurrentNeuralNet class feeds input in sequential time into the hidden layer. It uses the parameters U, W and V in the network, where U is the parameter for the input x, W is for the hidden layer z, and V is for the output y. We have 'St = Activate (U dot x(t) + W dot x(t-1))' and 'y(t) = softmax (V dot St)'.

    See also

    github.com/pangolulu/rnn-from-scratch

  32. class RecurrentNeuralNetLayer extends AnyRef

    The RecurrentNeuralNetLayer is a 3-layer where 'x' denotes the input, 'y' denotes the output and 's' is the intermediate/hidden value. We have 'St = Activate (U dot x(t) + W dot x(t-1))' and 'y(t) = softmax (V dot St)'.

  33. trait Reducer extends AnyRef

    The Reducer trait provides a common framework for several data reduction algorithms.

  34. class Regression extends PredictorMat

    The Regression class supports multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    Five factorization techniques are provided:

    'QR'         // QR Factorization: slower, more stable (default)
    'Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
    'SVD'        // Singular Value Decomposition: slowest, most robust
    'LU'         // LU Factorization: better than Inverse
    'Inverse'    // Inverse/Gaussian Elimination, classical textbook technique

    See also

    en.wikipedia.org/wiki/Degrees_of_freedom_(statistics)

    see.stanford.edu/materials/lsoeldsee263/05-ls.pdf

    Note, not intended for use when the number of degrees of freedom 'df' is negative.
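
    To make the Normal Equations concrete, here is a naive, self-contained sketch in plain Scala (not the Regression API and none of the five factorizations above; just Gaussian elimination without pivoting, for illustration only):

      // solve (x.t * x) * b = x.t * y for b
      def solveOLS (x: Array [Array [Double]], y: Array [Double]): Array [Double] = {
        val n = x(0).length
        val a = Array.tabulate (n, n)((j, k) => x.indices.map (i => x(i)(j) * x(i)(k)).sum)  // x.t * x
        val c = Array.tabulate (n)(j => x.indices.map (i => x(i)(j) * y(i)).sum)             // x.t * y
        for (p <- 0 until n; r <- p + 1 until n) {            // forward elimination
          val m = a(r)(p) / a(p)(p)
          for (k <- p until n) a(r)(k) -= m * a(p)(k)
          c(r) -= m * c(p)
        }
        val b = new Array [Double] (n)                        // back substitution
        for (j <- n - 1 to 0 by -1)
          b(j) = (c(j) - (j + 1 until n).map (k => a(j)(k) * b(k)).sum) / a(j)(j)
        b
      }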

  35. class RegressionTree extends PredictorMat

    The RegressionTree class implements a RegressionTree that selects splitting features using minimal variance in the children nodes. To avoid an exponential number of choices in the selection, only ordinal features are currently supported. Using the companion object to generate a RegressionTree is recommended.

  36. class RegressionTree_GB extends PredictorMat

    The RegressionTree_GB class uses Gradient Boosting on RegressionTree. One tree at a time is added to the model, chosen to reduce the gradient.

  37. class Regression_WLS extends Regression

    The Regression_WLS class supports weighted multiple linear regression. In this case, 'xx' is multi-dimensional [1, xx_1, ... xx_k]. Weights are set to the inverse of a variable's variance, so they can compensate for such variability (heteroscedasticity). Fit the parameter vector 'b' in the regression equation

    yy = b dot xx + e = b_0 + b_1 * xx_1 + ... b_k * xx_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Weighted Least-Squares (minimizing the residuals) to fit the parameter vector

    b = fac.solve (.)

    The data matrix 'xx' is reweighted 'x = rootW * xx' and the response vector 'yy' is reweighted 'y = rootW * yy' where 'rootW' is the square root of the weights.

    See also

    www.markirwin.net/stat149/Lecture/Lecture3.pdf

    en.wikipedia.org/wiki/Least_squares#Weighted_least_squares

    These are then passed to Ordinary Least Squares 'OLS' Regression. Five factorization techniques are provided:

    'QR'         // QR Factorization: slower, more stable (default)
    'Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
    'SVD'        // Singular Value Decomposition: slowest, most robust
    'LU'         // LU Factorization: better than Inverse
    'Inverse'    // Inverse/Gaussian Elimination, classical textbook technique

  38. class ResponseSurface extends Regression

    The ResponseSurface class uses multiple regression to fit a quadratic/cubic surface to the data. For example in 2D, the quadratic regression equation is

    y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_0*x_1, x_1^2] + e

    It adds a constant term for the intercept (the initial data matrix must not already include an intercept column of ones).

    See also

    scalation.metamodel.QuadraticFit

  39. class RidgeRegression extends PredictorMat

    The RidgeRegression class supports multiple linear ridge regression. In this case, 'x' is multi-dimensional [x_1, ... x_k]. Ridge regression puts a penalty on the L2 norm of the parameters b to reduce the chance of them taking on large values that may lead to less robust models. Both the input matrix 'x' and the response vector 'y' are centered (zero mean). Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_1 * x_1 + ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the regularized Normal Equations:

    b = fac.solve (.) with regularization x.t * x + λ * I

    Five factorization techniques are provided:

    'QR'         // QR Factorization: slower, more stable (default)
    'Cholesky'   // Cholesky Factorization: faster, less stable (reasonable choice)
    'SVD'        // Singular Value Decomposition: slowest, most robust
    'LU'         // LU Factorization: similar, but better than inverse
    'Inverse'    // Inverse/Gaussian Elimination, classical textbook technique

    See also

    statweb.stanford.edu/~tibs/ElemStatLearn/
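
    In sketch form (plain Scala, illustrative only, not the RidgeRegression API), the only change relative to ordinary least squares is adding 'lambda' to the diagonal of 'x.t * x' before solving:

      // add lambda to the diagonal of a square matrix xtx (intercept excluded since the data are centered)
      def ridgeMatrix (xtx: Array [Array [Double]], lambda: Double): Array [Array [Double]] =
        Array.tabulate (xtx.length, xtx.length)((j, k) => xtx(j)(k) + (if (j == k) lambda else 0.0))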

  40. class RoundRegression extends Regression

    The RoundRegression class supports rounded multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the transformed regression equation

    y = round (b dot x) + e = round (b_0 + b_1 * x_1 + b_2 * x_2 ... b_k * x_k) + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b'

  41. class SimpleRegression extends PredictorMat

    The SimpleRegression class supports simple linear regression. In this case, the vector 'x' consists of the constant one and a single variable 'x1', i.e., (1, x1). Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = [b0, b1] dot [1, x1] + e = b0 + b1 * x1 + e

    where 'e' represents the residuals (the part not explained by the model).
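
    Since this model has a closed-form solution, it can be sketched directly in plain Scala (illustrative only, not the SimpleRegression API):

      // b1 = cov (x1, y) / var (x1) and b0 = mean (y) - b1 * mean (x1)
      def simpleRegression (x1: Array [Double], y: Array [Double]): (Double, Double) = {
        val n  = x1.length
        val mx = x1.sum / n; val my = y.sum / n
        val b1 = (x1 zip y).map { case (a, b) => (a - mx) * (b - my) }.sum /
                 x1.map (a => (a - mx) * (a - mx)).sum
        (my - b1 * mx, b1)                                   // (b0, b1)
      }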

  42. class SimplerRegression extends PredictorMat

    The SimplerRegression class supports simpler linear regression. In this case, the vector 'x' consists of a single variable 'x0'. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = [b0] dot [x0] + e = b0 * x0 + e

    where 'e' represents the residuals (the part not explained by the model). The simpler regression model has no intercept parameter, only a slope parameter.

    See also

    SimpleRegression for both intercept and slope parameters

  43. class Softmax extends AnyRef

    The Softmax class calculates softmax regularization for the input.
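
    For reference, a minimal numerically stable softmax in plain Scala (illustrative; not the Softmax class itself):

      // subtract the maximum before exponentiating to avoid overflow
      def softmax (x: Array [Double]): Array [Double] = {
        val e = x.map (v => math.exp (v - x.max))
        e.map (_ / e.sum)
      }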

  44. type Strings = Array[String]

    Shorthand for array of strings

  45. class Tanh extends AnyRef

    The Tanh class implements the tanh activation function and its derivative for the vector version.

  46. class TranRegression extends Regression

    The TranRegression class supports transformed multiple linear regression. In this case, 'x' is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector 'b' in the transformed regression equation

    transform (y) = b dot x + e = b_0 + b_1 * x_1 + b_2 * x_2 ... b_k * x_k + e

    where 'e' represents the residuals (the part not explained by the model) and 'transform' is the function (defaults to log) used to transform the response vector 'y'. Common transforms include 'log (y)', 'sqrt (y)' when 'y > 0', or even 'sq (y)', 'exp (y)'. More generally, a Box-Cox Transformation may be applied.

    See also

    www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx

    citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.469.7176&rep=rep1&type=pdf

    Use Least-Squares (minimizing the residuals) to fit the parameter vector 'b'. Note: this class does not provide transformations on columns of matrix 'x'.

  47. class TrigRegression extends PredictorVec

    The TrigRegression class supports trigonometric regression. In this case, 't' is expanded to '[1, sin (wt), cos (wt), sin (2wt), cos (2wt), ...]'. Fit the parameter vector 'b' in the regression equation

    y = b dot x + e = b_0 + b_1 sin (wt) + b_2 cos (wt) + b_3 sin (2wt) + b_4 cos (2wt) + ... + e

    where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:

    x.t * x * b = x.t * y
    b = fac.solve (.)

    See also

    link.springer.com/article/10.1023%2FA%3A1022436007242#page-1

Value Members

  1. val BASE_DIR: String

    The relative path for base directory

  2. object ANCOVA extends Error

    The ANCOVA companion object provides helper functions.

  3. object ANCOVA1 extends Error

    The ANCOVA1 companion object provides helper functions.

  4. object ANCOVA1Test extends App

    The ANCOVA1Test object tests the ANCOVA1 class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

    > runMain scalation.analytics.ANCOVA1Test

  5. object ANCOVATest extends App

    The ANCOVATest object tests the ANCOVA class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

    > runMain scalation.analytics.ANCOVATest

  6. object ANOVA1Test extends App

    The ANOVA1Test object tests the ANOVA1 class using the following regression equation.

    y = b dot x = b_0 + b_1*d_1 + b_2*d_2

    > runMain scalation.analytics.ANOVA1Test

  7. object ActivationFun

    The ActivationFun object contains common Activation functions and provides both scalar and vector versions.

    See also

    en.wikipedia.org/wiki/Activation_function

    Convention: fun      the activation function (e.g., sigmoid)
                funV     the vector version of the activation function (e.g., sigmoidV)
                funM     the matrix version of the activation function (e.g., sigmoidM)
                funDV    the vector version of the derivative (e.g., sigmoidDV)
                funDM    the matrix version of the derivative (e.g., sigmoidDM)

    Supports: id, reLU, lreLU, eLU, tanh, sigmoid, gaussian, softmax. Related functions: logistic, logit.

  8. object ActivationFunTest extends App

    The ActivationFunTest is used to test the ActivationFun object.

    > runMain scalation.analytics.ActivationFunTest

  9. object ActivationFunTest2 extends App

    The ActivationFunTest2 is used to test the ActivationFun object.

    See also

    en.wikipedia.org/wiki/Softmax_function > runMain scalation.analytics.ActivationFunTest2

  10. object ExampleBasketBall

    The ExampleBasketBall class stores a medium-sized example dataset with data about basketball players that can be used to predict their scoring average.

  11. object ExampleConcrete

    The ExampleConcrete class stores a medium-sized example dataset from the UCI Machine Learning Repository, "Abstract: Concrete is a highly complex material. The slump flow of concrete is not only determined by the water content, but that is also influenced by other concrete ingredients."

  12. object ExampleConcreteTest extends App

    The ExampleConcreteTest object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs Regression.

    > runMain scalation.analytics.ExampleConcreteTest

  13. object ExampleConcreteTest10 extends App

    The ExampleConcreteTest10 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_XL with 4 layers.

    > runMain scalation.analytics.ExampleConcreteTest10

  14. object ExampleConcreteTest2 extends App

    The ExampleConcreteTest2 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs Perceptron.

    > runMain scalation.analytics.ExampleConcreteTest2

  15. object ExampleConcreteTest3 extends App

    The ExampleConcreteTest3 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs Perceptron.

    > runMain scalation.analytics.ExampleConcreteTest3

  16. object ExampleConcreteTest4 extends App

    The ExampleConcreteTest4 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'sigmoid'.

    > runMain scalation.analytics.ExampleConcreteTest4

  17. object ExampleConcreteTest5 extends App

    The ExampleConcreteTest5 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'sigmoid'.

    > runMain scalation.analytics.ExampleConcreteTest5

  18. object ExampleConcreteTest6 extends App

    The ExampleConcreteTest6 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'tanh'.

    > runMain scalation.analytics.ExampleConcreteTest6

  19. object ExampleConcreteTest7 extends App

    The ExampleConcreteTest7 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_2L with 'id'.

    > runMain scalation.analytics.ExampleConcreteTest7

  20. object ExampleConcreteTest8 extends App

    The ExampleConcreteTest8 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_3L with 'sigmoid'.

    > runMain scalation.analytics.ExampleConcreteTest8

  21. object ExampleConcreteTest9 extends App

    The ExampleConcreteTest9 object is used to test the ExampleConcrete object. It compares several modeling techniques. This one runs NeuralNet_3L with 'sigmoid'.

    > runMain scalation.analytics.ExampleConcreteTest9

  22. object ExpRegressionTest extends App

    The ExpRegressionTest object tests the ExpRegression class using the following exponential regression problem.

    > runMain scalation.analytics.ExpRegressionTest

  23. object ExpRegressionTest2 extends App

    The ExpRegressionTest2 object has a basic test for the ExpRegression class.

    > runMain scalation.analytics.ExpRegressionTest2

  24. object Fit

    The Fit companion object provides factory methods for assessing quality of fit for standard types of modeling techniques.

  25. object GLM extends GLM

    The GLM object makes the GLM trait's methods directly available. This approach (using traits and objects) allows the methods to also be inherited.

  26. object GLMTest extends App

    The GLMTest object tests the GLM object using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

  27. object GZLM extends GLM

    A Generalized Linear Model 'GZLM' can be developed using the GZLM object. It provides factory methods for General Linear Models 'GLM' via inheritance and for proper Generalized Linear Models: LogisticRegression - logistic regression (see the classifier package), PoissonRegression - Poisson regression, ExpRegression - exponential regression.

  28. object GZLMTest extends App

    The GZLMTest object tests the GZLM object using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*d_1 + b_4*d_2

  29. object HyperParameterTest extends App

    The HyperParameterTest object is used to test the HyperParameter class.

    > runMain scalation.analytics.HyperParameterTest

  30. object KNN_Predictor

    The KNN_Predictor companion object provides factory functions.

  31. object KNN_PredictorTest extends App

    The KNN_PredictorTest object is used to test the KNN_Predictor class.

    > runMain scalation.analytics.KNN_PredictorTest

  32. object KNN_PredictorTest2 extends App

    The KNN_PredictorTest2 object is used to test the KNN_Predictor class.

    > runMain scalation.analytics.KNN_PredictorTest2

  33. object LassoRegression extends Error

    The LassoRegression companion object provides factory methods for the LassoRegression class.

  34. object LassoRegressionTest extends App

    The LassoRegressionTest object tests the LassoRegression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    See also

    http://statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.LassoRegressionTest

  35. object MatrixTransform

    The MatrixTransform object is used to transform the columns of a data matrix 'x'. Such pre-processing of the data is required by some modeling techniques.

  36. object MatrixTransformTest extends App

    The MatrixTransformTest is used to test the MatrixTransform object.

    > runMain scalation.analytics.MatrixTransformTest

  37. object NMFactorizationTest extends App

    The NMFactorizationTest object is used to test the NMFactorization class.

  38. object NeuralNet_2LTest extends App

    The NeuralNet_2LTest object is used to test the NeuralNet_2L class. For this test, training data is used to fit the weights before using them for prediction.

    See also

    www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.NeuralNet_2LTest

  39. object NeuralNet_3LTest extends App

    The NeuralNet_3LTest object is used to test the NeuralNet_3L class.

    See also

    www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.NeuralNet_3LTest

  40. object NeuralNet_XLTest extends App

    The NeuralNet_XLTest object is used to test the NeuralNet_XL class.

    See also

    www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.NeuralNet_XLTest

  41. object NonLinRegressionTest extends App

    The NonLinRegressionTest object tests the NonLinRegression class: y = f(x; b) = b0 + exp (b1 * x0).

    See also

    www.bsos.umd.edu/socy/alan/stats/socy602_handouts/kut86916_ch13.pdf

    Answers: sse = 49.45929986243339, fit = (VectorD (58.606566327280426, -0.03958645286504356), 0.9874574894685292), predict (VectorD (50.0)) = 8.09724678182599

    FIX: check this example

    > runMain scalation.analytics.NonLinRegressionTest

  42. object NullModel extends Error

    The NullModel companion object provides a simple factory method for building null models.

  43. object NullModelTest extends App

    The NullModelTest object is used to test the NullModel class.

    y = b dot x + e = b0 + e

    > runMain scalation.analytics.NullModelTest

  44. object NullModelTest2 extends App

    The NullModelTest2 object is used to test the NullModel class.

    y = b dot x + e = b0 + e

    > runMain scalation.analytics.NullModelTest2

  45. object Optimizer

    The Optimizer object provides functions to optimize the parameters/weights of Neural Networks with various numbers of layers.

  46. object Perceptron

    The Perceptron companion object provides factory methods for building perceptrons.

  47. object PerceptronTest extends App

    The PerceptronTest object is used to test the Perceptron class. For this test, training data is used to fit the weights before using them for prediction.

    See also

    www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf > runMain scalation.analytics.PerceptronTest

  48. object PerceptronTest2 extends App

    The PerceptronTest2 object trains a perceptron on a small dataset of temperatures from counties in Texas where the variables/factors to consider are Latitude (x1), Elevation (x2) and Longitude (x3). The regression equation is the following:

    y = sigmoid (w dot x) = sigmoid (w0 + w1*x1 + w2*x2 + w3*x3)

    This test case illustrates how to transform the columns of the matrix so that the 'sigmoid' activation function can work effectively. Since the dataset is very small, 'train0', which does no batching, should be used.

    > runMain scalation.analytics.PerceptronTest2

  49. object PerceptronTest3 extends App

    The PerceptronTest3 object trains a perceptron on a small dataset with variables x1 and x2. The regression equation is the following:

    y = sigmoid (b dot x) = sigmoid (b0 + b1*x1 + b2*x2)

    It does not call the 'train' method; the improvement steps for sigmoid are coded explicitly below.

    > runMain scalation.analytics.PerceptronTest3

  50. object PerceptronTest4 extends App

    The PerceptronTest4 object trains a perceptron on a small dataset with variables x1 and x2. The regression equation is the following:

    y = sigmoid (b dot x) = sigmoid (b0 + b1*x1 + b2*x2)

    This version calls the 'train0' method.

    > runMain scalation.analytics.PerceptronTest4

  51. object PerceptronTest5 extends App

    The PerceptronTest5 object trains a perceptron on a small dataset with variables.

    > runMain scalation.analytics.PerceptronTest5

  52. object PerceptronTest6 extends App

    The PerceptronTest6 object trains a perceptron on a small dataset with 4 variables.

    > runMain scalation.analytics.PerceptronTest6

  53. object PerceptronTest7 extends App

    The PerceptronTest7 object trains a perceptron on the ExampleBasketBall dataset.

    > runMain scalation.analytics.PerceptronTest7

  54. object PoissonRegression

    The PoissonRegression companion object provides factory functions.

  55. object PoissonRegressionTest extends App

    The PoissonRegressionTest object tests the PoissonRegression class.

    See also

    http://www.cookbook-r.com/Statistical_analysis/Logistic_regression/

    Answer: b = (-8.8331, 0.4304), n_dev = 43.860, r_dev = 25.533, aci = 29.533, pseudo_rSq = 0.4178

    > runMain scalation.analytics.PoissonRegressionTest

  56. object PoissonRegressionTest2 extends App

    The PoissonRegressionTest2 object tests the PoissonRegression class.

    See also

    www.stat.wisc.edu/~mchung/teaching/.../GLM.logistic.Rpackage.pdf > runMain scalation.analytics.PoissonRegressionTest2

    statmaster.sdu.dk/courses/st111/module03/index.html

  57. object PolyRegressionTest extends App

    The PolyRegressionTest object tests the PolyRegression class using the following regression equation.

    y = b dot x = b_0 + b_1*t + b_2*t^2 + ... + b_k*t^k

    Note, the 'order' at which R-Squared drops is QR(7), Cholesky(14), SVD(6), Inverse(13).

    > runMain scalation.analytics.PolyRegressionTest

  58. object PolyRegressionTest2 extends App

    The PolyRegressionTest2 object tests the PolyRegression class using the following regression equation. This version uses orthogonal polynomials.

    y = b dot x = b_0 + b_1*t + b_2*t^2 + ... + b_k*t^k

    > runMain scalation.analytics.PolyRegressionTest2

  59. object PredictorMat

    The PredictorMat companion object provides a method for splitting a combined data matrix into a predictor matrix and a response vector.

  60. object PredictorMatTest extends App

    The PredictorMatTest is used to test the PredictorMat abstract class and its derived classes using the ExampleBasketBall dataset containing data matrix 'x' and response vector 'y'.

    > runMain scalation.analytics.PredictorMatTest

  61. object PredictorTest extends App

    The PredictorTest object tests all the classes in the scalation.analytics package that directly or indirectly extend the Predictor trait. FIX - make first test uniform so that the modeling techniques may be compared.

    > runMain scalation.analytics.PredictorTest

  62. object PrincipalComponentsTest extends App

    The PrincipalComponentsTest object is used to test the PrincipalComponents class.

    See also

    www.ce.yildiz.edu.tr/personal/songul/file/1097/principal_components.pdf > runMain scalation.analytics.PrincipalComponentsTest

  63. object Probability extends Error

    The Probability object provides methods for operating on univariate and bivariate probability distributions of discrete random variables 'X' and 'Y'. A probability distribution is specified by its probability mass functions (pmf) stored either as a "probability vector" for a univariate distribution or a "probability matrix" for a bivariate distribution.

    joint probability matrix:       pxy(i, j)  = P(X = x_i, Y = y_j)
    marginal probability vector:    px(i)      = P(X = x_i)
    conditional probability matrix: px_y(i, j) = P(X = x_i|Y = y_j)

    In addition to computing joint, marginal and conditional probabilities, methods for computing entropy and mutual information are also provided. Entropy provides a measure of disorder or randomness. If there is little randomness, entropy will be close to 0, while when randomness is high, entropy will be close to, e.g., 'log2 (px.dim)'. Mutual information provides a robust measure of dependency between random variables (contrast with correlation).

    See also

    scalation.stat.StatVector
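
    A minimal sketch in plain Scala (illustrative only, not the Probability API) of a marginal distribution and the (base-2) entropy of a probability vector:

      // marginal px(i) = sum over j of pxy(i)(j); entropy in bits, skipping zero probabilities
      def margX   (pxy: Array [Array [Double]]): Array [Double] = pxy.map (_.sum)
      def entropy (px: Array [Double]): Double =
        -px.filter (_ > 0.0).map (p => p * math.log (p) / math.log (2.0)).sum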

  64. object ProbabilityTest extends App

    The ProbabilityTest object is used to test the Probability object.

  65. object ProbabilityTest2 extends App

    The ProbabilityTest2 provides upper bound for 'entropy' and 'entropy_k'.

  66. object QuadRegression

    The QuadRegression companion object provides methods for creating functional forms.

  67. object QuadRegressionTest extends App

    The QuadRegressionTest object is used to test the QuadRegression class.

    > runMain scalation.analytics.QuadRegressionTest

  68. object QuadraticFitTest extends App

    The QuadraticFitTest object is used to test the QuadraticFit class for a two dimensional case.

  69. object QuadraticFitTest2 extends App

    The QuadraticFitTest2 object is used to test the QuadraticFit class for a three dimensional case.

  70. object QuadraticFitTest3 extends App

    The QuadraticFitTest3 object is used to test the QuadraticFit class for a three dimensional case with noise.

  71. object RecurrentNeuralNetTest extends App

    The RecurrentNeuralNetTest object is used to test the RecurrentNeuralNet class.

    > runMain scalation.analytics.RecurrentNeuralNetTest

  72. object RegTechnique extends Enumeration

    The RegTechnique object defines the implementation techniques available.

  73. object Regression extends Error

    The Regression companion object provides factory apply functions and a testing method.

  74. object RegressionTest extends App

    The RegressionTest object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    See also

    statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.RegressionTest

  75. object RegressionTest2 extends App

    The RegressionTest2 object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x1 + b_2*x_2.

    > runMain scalation.analytics.RegressionTest2

  76. object RegressionTest3 extends App

    The RegressionTest3 object tests the multi-collinearity method in the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4 * x_4

    See also

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt > runMain scalation.analytics.RegressionTest3

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

  77. object RegressionTest4 extends App

    The RegressionTest4 object tests the multi-collinearity method in the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4 * x_4

    See also

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt > runMain scalation.analytics.RegressionTest4

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

  78. object RegressionTest5 extends App

    The RegressionTest5 object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x1 + b_2*x_2.

    > runMain scalation.analytics.RegressionTest5

  79. object RegressionTest6 extends App

    The RegressionTest6 object tests the Regression class using the following regression equation.

    y = b dot x = b_0 + b_1*x1 + b_2*x_2.

    > runMain scalation.analytics.RegressionTest6

  80. object RegressionTree

    The RegressionTree companion object is used to count the number of leaves.

  81. object RegressionTreeTest extends App

    The RegressionTreeTest object is used to test the RegressionTree class. It tests a simple case that does not require a file to be read.

    See also

    translate.google.com/translate?hl=en&sl=zh-CN&u=https://www.hrwhisper.me/machine-learning-decision-tree/&prev=search > runMain scalation.analytics.RegressionTreeTest

  82. object RegressionTree_GB

    The RegressionTree_GB companion object defines hyper-parameters and provides a factory function.

  83. object RegressionTree_GBTest extends App

    The RegressionTree_GBTest object is used to test the RegressionTree_GB class. It tests a simple case that does not require a file to be read.

    > runMain scalation.analytics.RegressionTree_GBTest

  84. object Regression_WLS

    The Regression_WLS companion object provides methods for setting weights and testing.

  85. object Regression_WLSTest extends App

    The Regression_WLSTest object tests the Regression_WLS class using the following regression equation.

    y = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    See also

    statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.Regression_WLSTest

  86. object ResponseSurface

    The ResponseSurface companion object provides methods for creating functional forms.

  87. object ResponseSurfaceTest extends App

    The ResponseSurfaceTest object is used to test the ResponseSurface class.

    > runMain scalation.analytics.ResponseSurfaceTest

  88. object RidgeRegression extends Error

    The RidgeRegression companion object defines hyper-parameters and provides factory functions for the RidgeRegression class.

  89. object RidgeRegressionTest extends App

    The RidgeRegressionTest object tests the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2.

    Test regression and backward elimination.

    See also

    http://statmaster.sdu.dk/courses/st111/module03/index.html > runMain scalation.analytics.RidgeRegressionTest

  90. object RidgeRegressionTest2 extends App

    The RidgeRegressionTest2 object tests the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x1 + b_2*x_2.

    > runMain scalation.analytics.RidgeRegressionTest2

  91. object RidgeRegressionTest3 extends App

    The RidgeRegressionTest3 object tests the multi-collinearity method in the RidgeRegression class using the following regression equation.

    y = b dot x = b_1*x_1 + b_2*x_2 + b_3*x_3 + b_4 * x_4

    See also

    online.stat.psu.edu/online/development/stat501/data/bloodpress.txt > runMain scalation.analytics.RidgeRegressionTest3

    online.stat.psu.edu/online/development/stat501/12multicollinearity/05multico_vif.html

  92. object RoundRegression

    The RoundRegression companion object provides a factory method.

  93. object RoundRegressionTest extends App

    The RoundRegressionTest object tests the RoundRegression class using the following regression equation.

    y = round (b dot x) = round (b_0 + b_1*x_1 + b_2*x_2).

    > runMain scalation.analytics.RoundRegressionTest

  94. object SimpleRegression extends Error

    The SimpleRegression companion object provides a simple factory method for building simple linear regression models.

  95. object SimpleRegressionTest extends App

    The SimpleRegressionTest object is used to test the SimpleRegression class:

    y = b0 + b1 * x

    > runMain scalation.analytics.SimpleRegressionTest

  96. object SimpleRegressionTest2 extends App

    The SimpleRegressionTest2 object is used to test the SimpleRegression class.

    y = b dot x = [b0, b1] dot [1, x1]

    See also

    http://www.analyzemath.com/statistics/linear_regression.html > runMain scalation.analytics.SimpleRegressionTest2

  97. object SimpleRegressionTest3 extends App

    The SimpleRegressionTest3 object is used to test the SimpleRegression class.

    y = b dot x = b0 + b1 * x1

    See also

    http://mathbits.com/mathbits/tisection/Statistics2/linear.htm > runMain scalation.analytics.SimpleRegressionTest3

  98. object SimplerRegression extends Error

    The SimplerRegression companion object provides a simple factory method for building simpler linear regression models.

  99. object SimplerRegressionTest extends App

    The SimplerRegressionTest object is used to test the SimplerRegression class.

    y = b0 * x + e

    > runMain scalation.analytics.SimplerRegressionTest

  100. object SimplerRegressionTest2 extends App

    The SimplerRegressionTest2 object is used to test the SimplerRegression class.

    y = b dot x + e = [b0] dot [x0] + e

    > runMain scalation.analytics.SimplerRegressionTest2

  101. object SimplerRegressionTest3 extends App

    The SimplerRegressionTest3 object is used to test the SimplerRegression class.

    y = b dot x = b0 * x0

    See also

    http://mathbits.com/mathbits/tisection/Statistics2/linear.htm > runMain scalation.analytics.SimplerRegressionTest3

  102. object SimplerRegression_exer_1 extends App
  103. object TranRegression

    The TranRegression companion object provides transformation and inverse transformation functions based on the parameter 'lambda'. It supports the family of Box-Cox transformations.

  104. object TranRegressionTest extends App

    The TranRegressionTest object tests the TranRegression class using the following regression equation.

    log (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    > runMain scalation.analytics.TranRegressionTest

  105. object TranRegressionTest2 extends App

    The TranRegressionTest2 object tests the TranRegression class using the following regression equation.

    sqrt (y) = b dot x = b_0 + b_1*x_1 + b_2*x_2.

    > runMain scalation.analytics.TranRegressionTest2

  106. object TrigRegressionTest extends App

    The TrigRegressionTest object tests the TrigRegression class using the following regression equation.

    y = b dot x = b_0 + b_1 sin wt + b_2 cos wt + ... b_2k-1 sin kwt + b_2k cos kwt + e

    The data is generated from a noisy cubic function.

    > runMain scalation.analytics.TrigRegressionTest

  107. object TrigRegressionTest2 extends App

    The TrigRegressionTest2 object tests the TrigRegression class using the following regression equation.

    y = b dot x = b_0 + b_1 sin wt + b_2 cos wt + ... b_2k-1 sin kwt + b_2k cos kwt + e

    The data is generated from periodic noisy cubic functions.

    > runMain scalation.analytics.TrigRegressionTest2
