the input vector: t_i expands to x_i = [1, t_i, t_i^2, ..., t_i^k]
the response vector
the order of the polynomial
the technique used to solve for b in x.t*x*b = x.t*y
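For reference, the normal equations x.t*x*b = x.t*y named above arise from minimizing the residual sum of squares; in LaTeX notation:

    \min_b \| y - X b \|^2, \qquad
    \nabla_b \| y - X b \|^2 = -2 X^\top (y - X b) = 0
    \;\Rightarrow\; X^\top X \, b = X^\top y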
Perform backward elimination to remove the least predictive variable from the model, returning the variable to eliminate, the new parameter vector, the new R-squared value and the new F statistic.
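A minimal sketch of the idea in plain Scala (not ScalaTion's API): it assumes a hypothetical 'regress' helper that refits the model on a subset of column indices and returns the parameter vector and R-squared; the F statistic is omitted for brevity.

    // Try dropping each candidate column j (never column 0, the intercept)
    // and keep the refit whose R-squared suffers least, i.e., eliminate
    // the least predictive variable.  'regress' is a hypothetical helper.
    def backwardElim (cols: Set [Int],
                      regress: Set [Int] => (Array [Double], Double)): (Int, Array [Double], Double) = {
        val trials = for (j <- cols if j != 0) yield {
            val (b, rSq) = regress (cols - j)
            (j, b, rSq)
        }
        trials.maxBy (_._3)            // removing this variable hurts R-squared the least
    }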
Expand the scalar 't' into a vector of powers of 't': [1, t, t^2, ..., t^k].
the scalar to expand into the vector
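A one-line plain-Scala sketch of this expansion (ScalaTion's actual expand works on its own vector types):

    // Expand scalar t into the power vector [1, t, t^2, ..., t^k].
    def expand (t: Double, k: Int): Array [Double] =
        Array.tabulate (k + 1) (j => math.pow (t, j))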
Return the fit (parameter vector b, quality of fit including rSquared).
Show the flaw by printing the error message.
the method where the error occurred
the error message
Predict the value of y = f(z) by evaluating the formula y_i = b dot z_i for each row z_i of matrix z.
the new matrix to predict
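A plain-Scala sketch of this row-wise prediction, assuming plain arrays rather than ScalaTion's matrix types:

    // Predict y_i = b dot z_i for each row z_i of matrix z.
    def predictAll (b: Array [Double], z: Array [Array [Double]]): Array [Double] =
        z.map (zi => (b zip zi).map { case (bj, zj) => bj * zj }.sum)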
Predict the value of y = f(z) by evaluating the formula y = b dot z, e.g., (b_0, b_1, b_2) dot (1, z_1, z_2).
the new vector to predict
Predict the value of y = f(z) by evaluating the formula y = b dot expand (z), e.g., (b_0, b_1, b_2) dot (1, z, z^2).
the new scalar to predict
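A plain-Scala sketch: expand the scalar into its power vector, then take the dot product with b.

    // Predict y = b dot expand (z) = b_0 + b_1 * z + ... + b_k * z^k.
    def predict (b: Array [Double], z: Double): Double = {
        val x = Array.tabulate (b.length) (j => math.pow (z, j))   // [1, z, ..., z^k]
        (b zip x).map { case (bj, xj) => bj * xj }.sum
    }

The same polynomial can be evaluated with fewer multiplications by Horner's rule: b.foldRight (0.0) ((bj, acc) => bj + z * acc).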
Given a new discrete data vector z, predict the y-value of f(z).
the vector to use for prediction
Retrain the predictor by fitting the parameter vector (b-vector) in the multiple regression equation yy = b dot x + e = [b_0, ..., b_k] dot [1, t, t^2, ..., t^k] + e using the least squares method.
the new response vector
Train the predictor by fitting the parameter vector (b-vector) in the regression equation y = b dot x + e = [b_0, ..., b_k] dot [1, t, t^2, ..., t^k] + e using the least squares method.
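Putting the pieces together, a self-contained plain-Scala sketch of least-squares training via the normal equations; Gaussian elimination with partial pivoting stands in for the pluggable factorization technique (QR, Cholesky, SVD, ...). This is an illustration under those assumptions, not ScalaTion's implementation.

    object PolyFit {
        // Expand scalar t into the power vector [1, t, t^2, ..., t^k].
        def expand (t: Double, k: Int): Array [Double] =
            Array.tabulate (k + 1) (j => math.pow (t, j))

        // Fit b by forming and solving the normal equations (x.t * x) b = x.t * y.
        def train (t: Array [Double], y: Array [Double], k: Int): Array [Double] = {
            val x = t.map (expand (_, k))                                             // design matrix
            val n = k + 1
            val a = Array.tabulate (n, n) ((p, q) => x.map (r => r(p) * r(q)).sum)    // x.t * x
            val c = Array.tabulate (n) (p => x.indices.map (i => x(i)(p) * y(i)).sum) // x.t * y
            solve (a, c)
        }

        // Solve a * b = c by Gaussian elimination with partial pivoting.
        def solve (a: Array [Array [Double]], c: Array [Double]): Array [Double] = {
            val n = a.length
            val m = a.map (_.clone); val v = c.clone
            for (p <- 0 until n) {                                   // forward elimination
                val piv = (p until n).maxBy (i => math.abs (m(i)(p)))
                val tm = m(p); m(p) = m(piv); m(piv) = tm            // pivot rows
                val tv = v(p); v(p) = v(piv); v(piv) = tv
                for (i <- p + 1 until n) {
                    val f = m(i)(p) / m(p)(p)
                    for (j <- p until n) m(i)(j) -= f * m(p)(j)
                    v(i) -= f * v(p)
                }
            }
            val b = new Array [Double] (n)                           // back substitution
            for (i <- n - 1 to 0 by -1) {
                var s = v(i)
                for (j <- i + 1 until n) s -= m(i)(j) * b(j)
                b(i) = s / m(i)(i)
            }
            b
        }
    }

For example, PolyFit.train (Array (0.0, 1.0, 2.0, 3.0), Array (0.0, 1.0, 4.0, 9.0), 2) recovers b ≈ [0, 0, 1], i.e., y = t^2.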
Compute the Variance Inflation Factor (VIF) for each variable to test for multicollinearity by regressing x_j against the rest of the variables. A VIF over 10 indicates that over 90% of the variance of x_j can be predicted from the other variables, so x_j is a candidate for removal from the model.
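A sketch of the computation, assuming a hypothetical rSq helper that returns the R-squared of regressing a response on a set of columns (not ScalaTion's API):

    // For each column j, regress x_j on the remaining columns and convert
    // the resulting R-squared into a variance inflation factor:
    // VIF_j = 1 / (1 - R_j^2), so VIF_j > 10 <=> R_j^2 > 0.9.
    def vif (cols: IndexedSeq [Array [Double]],
             rSq: (IndexedSeq [Array [Double]], Array [Double]) => Double): IndexedSeq [Double] =
        for (j <- cols.indices) yield {
            val rj2 = rSq (cols.patch (j, Nil, 1), cols(j))          // drop column j
            1.0 / (1.0 - rj2)
        }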
The PolyRegression class supports polynomial regression. In this case, 't' is expanded to [1, t, t^2, ..., t^k]. Fit the parameter vector 'b' in the regression equation y = b dot x + e = b_0 + b_1 * t + b_2 * t^2 + ... + b_k * t^k + e
where 'e' represents the residuals (the part not explained by the model). Use Least-Squares (minimizing the residuals) to fit the parameter vector
b = x_pinv * y
where 'x_pinv' is the pseudo-inverse.
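When x has full column rank, the pseudo-inverse has a closed form, so b = x_pinv * y is exactly the solution of the normal equations; in LaTeX notation:

    X^{+} = (X^\top X)^{-1} X^\top
    \quad\Rightarrow\quad
    b = X^{+} y \iff X^\top X \, b = X^\top y

Implementations typically compute x_pinv via QR or SVD rather than forming the inverse explicitly; SVD also handles the rank-deficient case.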
www.ams.sunysb.edu/~zhu/ams57213/Team3.pptx