class PolyRegression extends Predictor with Error
The PolyRegression class supports polynomial regression. In this case, 't' is expanded to '[1, t, t^2 ... t^k]'. Fit the parameter vector 'b' in the regression equation
y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 ... b_k * t^k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to solve for the parameter vector 'b' using the Normal Equations:
x.t * x * b = x.t * y
b = fac.solve (.)
- See also
www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx
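The Normal Equations approach described above can be sketched in plain Scala, without ScalaTion. The object and method names below (`PolyFitSketch`, `fit`, `solve`) are illustrative, not part of the library; the sketch builds the design matrix from expanded powers of 't', forms x.t*x and x.t*y, and solves the resulting linear system by Gaussian elimination (ScalaTion instead uses the factorization selected by 'technique').

```scala
// Minimal sketch of polynomial regression via the Normal Equations,
// in plain Scala (no ScalaTion dependency); all names are illustrative.
object PolyFitSketch {
  // Expand scalar t to [1, t, t^2, ..., t^k]
  def expand(t: Double, k: Int): Array[Double] =
    Array.tabulate(k + 1)(j => math.pow(t, j))

  // Solve A b = c by Gaussian elimination with partial pivoting
  def solve(a: Array[Array[Double]], c: Array[Double]): Array[Double] = {
    val n = c.length
    val m = a.map(_.clone); val b = c.clone
    for (p <- 0 until n) {
      val piv = (p until n).maxBy(i => math.abs(m(i)(p)))
      val tm = m(p); m(p) = m(piv); m(piv) = tm      // swap rows p and piv
      val tb = b(p); b(p) = b(piv); b(piv) = tb
      for (i <- p + 1 until n) {
        val f = m(i)(p) / m(p)(p)
        for (j <- p until n) m(i)(j) -= f * m(p)(j)
        b(i) -= f * b(p)
      }
    }
    val x = new Array[Double](n)
    for (i <- n - 1 to 0 by -1)
      x(i) = (b(i) - (i + 1 until n).map(j => m(i)(j) * x(j)).sum) / m(i)(i)
    x
  }

  // Fit b in x.t*x*b = x.t*y, where row i of x is expand(t_i, k)
  def fit(t: Array[Double], y: Array[Double], k: Int): Array[Double] = {
    val x = t.map(expand(_, k))                      // design matrix rows
    val n = k + 1
    val xtx = Array.tabulate(n, n)((i, j) => x.map(r => r(i) * r(j)).sum)
    val xty = Array.tabulate(n)(i => x.zip(y).map { case (r, yy) => r(i) * yy }.sum)
    solve(xtx, xty)
  }
}
```

For data generated exactly by a quadratic, e.g. y = 1 + 2t + t^2, the recovered parameter vector is (1, 2, 1) up to rounding.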
- By Inheritance
- PolyRegression
- Error
- Predictor
- AnyRef
- Any
Instance Constructors
-
new
PolyRegression(t: VectoD, y: VectoD, k: Int, technique: RegTechnique = Cholesky, raw: Boolean = true)
- t
the input vector: t_i expands to x_i = [1, t_i, t_i^2, ... t_i^k]
- y
the response vector
- k
the order of the polynomial (max degree)
- technique
the technique used to solve for b in x.t*x*b = x.t*y
- raw
whether the polynomials are raw or orthogonal
Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
val
b: VectoD
- Attributes
- protected
- Definition Classes
- Predictor
-
def
backwardElim(cols: Set[Int]): (Int, VectoD, VectoD)
Perform backward elimination to remove the least predictive variable from the existing model, returning the variable to eliminate, the new parameter vector and the new quality of fit. May be called repeatedly.
- cols
the columns of matrix x included in the existing model
-
def
build(x: MatriD, y: VectoD): Predictor
- Definition Classes
- Predictor
-
def
clone(): AnyRef
- Attributes
- protected[java.lang]
- Definition Classes
- AnyRef
- Annotations
- @native() @throws( ... )
-
def
coefficient: VectoD
Return the vector of coefficients.
- Definition Classes
- PolyRegression → Predictor
-
def
corrMatrix: MatriD
Return the correlation matrix for the columns in data matrix 'x'.
-
def
diagnose(yy: VectoD): Unit
Compute diagnostics for the predictor. Override to add more diagnostics. Note, for 'mse' and 'rmse', 'sse' is divided by the number of instances 'm' rather than the degrees of freedom.
- yy
the response vector
- Attributes
- protected
- Definition Classes
- Predictor
- See also
en.wikipedia.org/wiki/Mean_squared_error
-
val
e: VectoD
- Attributes
- protected
- Definition Classes
- Predictor
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
eval(yy: VectoD = y): Unit
Compute the error and useful diagnostics.
- yy
the response vector
- Definition Classes
- PolyRegression → Predictor
-
def
expand(t: Double): VectoD
Expand the scalar 't' into a vector of powers of 't': '[1, t, t^2 ... t^k]'.
- t
the scalar to expand into the vector
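The expansion can be sketched as a one-liner in plain Scala. The stand-alone object below is illustrative (ScalaTion's `expand` returns a `VectoD`, not a `Vector[Double]`):

```scala
object ExpandSketch {
  // Illustrative stand-alone version of expand:
  // the vector of powers [1, t, t^2, ..., t^k]
  def expand(t: Double, k: Int): Vector[Double] =
    Vector.tabulate(k + 1)(j => math.pow(t, j))
}
```

For example, expanding t = 2 with order k = 3 yields the vector (1, 2, 4, 8).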
-
def
finalize(): Unit
- Attributes
- protected[java.lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
-
def
fit: VectoD
Return the quality of fit including 'rSquared'.
- Definition Classes
- PolyRegression → Predictor
-
def
fitLabels: Seq[String]
Return the labels for the fit.
- Definition Classes
- PolyRegression → Predictor
-
final
def
flaw(method: String, message: String): Unit
- Definition Classes
- Error
-
def
forwardSel(cols: Set[Int]): (Int, VectoD, VectoD)
Perform forward selection to add the most predictive variable to the existing model, returning the variable to add, the new parameter vector and the new quality of fit. May be called repeatedly.
- cols
the columns of matrix x included in the existing model
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
val
index_rSq: Int
- Definition Classes
- Predictor
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
-
val
mae: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
def
metrics: Map[String, Any]
Build a map of selected quality of fit measures/metrics.
- Definition Classes
- Predictor
-
val
mse: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
def
orthoVector(v: VectoD): VectoD
Follow the same transformations used to orthogonalize the data/design matrix 'x', on vector 'v', so its elements are correctly mapped.
- v
the vector to be transformed based on the orthogonalization procedure
-
def
orthogonalize(x: MatrixD): (MatrixD, MatrixD)
Orthogonalize the data/design matrix 'x' using Gram-Schmidt Orthogonalization, returning a new orthogonal matrix 'z' and the orthogonalization multipliers 'a'. This will eliminate the multi-collinearity problem.
- x
the matrix to orthogonalize
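Classical Gram-Schmidt can be sketched in plain Scala on a column-major list of columns. The sketch below is an illustrative stand-in, not ScalaTion's `orthogonalize` (which works on `MatrixD`); it returns the orthogonal columns 'z' and the multipliers 'a', so that each original column x_j equals z_j plus the 'a'-weighted sum of the earlier z_i.

```scala
object GramSchmidtSketch {
  // Dot product of two equal-length columns
  def dot(u: Vector[Double], v: Vector[Double]): Double =
    u.zip(v).map { case (a, b) => a * b }.sum

  // Classical Gram-Schmidt on the columns of x.
  // Returns (z, a): orthogonal columns z and multipliers a, where
  // x(j) = z(j) + sum over i < j of a(i)(j) * z(i)
  def orthogonalize(x: Vector[Vector[Double]]): (Vector[Vector[Double]], Array[Array[Double]]) = {
    val n = x.length
    val a = Array.ofDim[Double](n, n)
    val z = scala.collection.mutable.ArrayBuffer.empty[Vector[Double]]
    for (j <- 0 until n) {
      var v = x(j)
      for (i <- 0 until j) {
        val mult = dot(x(j), z(i)) / dot(z(i), z(i))   // projection coefficient
        a(i)(j) = mult
        v = v.zip(z(i)).map { case (vv, zz) => vv - mult * zz }
      }
      z += v
    }
    (z.toVector, a)
  }
}
```

For the columns (1, 1, 1, 1) and (0, 1, 2, 3) — the first two columns of a raw polynomial design matrix — the multiplier a(0)(1) is 1.5 and the resulting columns have zero dot product.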
-
def
predict(z: VectoD): Double
Predict the value of y = f(z) by evaluating the formula y = b dot z, e.g., (b_0, b_1, b_2) dot (1, z_1, z_2).
- z
the new expanded/orthogonalized vector to predict
- Definition Classes
- PolyRegression → Predictor
-
def
predict(z: Double): Double
Predict the value of 'y = f(z)' by evaluating the formula 'y = b dot expand (z)', e.g., '(b_0, b_1, b_2) dot (1, z, z^2)'.
- z
the new scalar to predict
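Evaluating 'b dot expand (z)' is just polynomial evaluation, which can be sketched with Horner's rule — an equivalent but numerically more stable form of the same dot product. The object below is illustrative, not ScalaTion's `predict`:

```scala
object PredictSketch {
  // b dot expand(z) via Horner's rule: b0 + z*(b1 + z*(b2 + ...))
  def predict(b: Vector[Double], z: Double): Double =
    b.foldRight(0.0)((bi, acc) => bi + z * acc)
}
```

For b = (1, 2, 1) and z = 3, this evaluates 1 + 2*3 + 1*9 = 16.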
-
def
predict(z: VectoI): Double
Given a new discrete data vector z, predict the y-value of f(z).
- z
the vector to use for prediction
- Definition Classes
- Predictor
-
val
rSq: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
def
residual: VectoD
Return the vector of residuals/errors.
- Definition Classes
- PolyRegression → Predictor
-
val
rmse: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
val
sse: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
val
ssr: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
val
sst: Double
- Attributes
- protected
- Definition Classes
- Predictor
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
def
toString(): String
- Definition Classes
- AnyRef → Any
-
def
train(yy: VectoD = y): Regression[MatrixD, VectoD]
Train the predictor by fitting the parameter vector 'b' in the multiple regression equation
yy = b dot x + e = [b_0, ... b_k] dot [1, t, t^2 ... t^k] + e
using the least squares method.
- yy
the response vector
- Definition Classes
- PolyRegression → Predictor
-
def
vif: VectoD
Compute the Variance Inflation Factor (VIF) for each variable to test for multi-collinearity by regressing 'xj' against the rest of the variables. A VIF over 10 indicates that over 90% of the variance of 'xj' can be predicted from the other variables, so 'xj' is a candidate for removal from the model.
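The VIF threshold quoted above follows directly from the formula: regressing predictor x_j on the remaining predictors yields an R-squared value R_j^2, and VIF_j = 1 / (1 - R_j^2), so R_j^2 = 0.90 gives exactly VIF = 10. The object below sketches just this formula (not the regressions ScalaTion's `vif` performs); its name is illustrative:

```scala
object VifSketch {
  // VIF for one predictor, given the R^2 of regressing it on the others
  def vif(rSqJ: Double): Double = 1.0 / (1.0 - rSqJ)
}
```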
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @throws( ... )