the input vectors/points
the response vector
the order of the surface (defaults to quadratic, else cubic)
the technique used to solve for b in x.t*x*b = x.t*y
Create all forms/terms for each point placing them in a new matrix.
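A minimal sketch of this idea (plain Scala arrays, not the ScalaTion code): apply a forms/terms function to every input point to get one expanded row per point. The 1D quadratic terms function used here is only an illustrative stand-in.

```scala
// Sketch only: build the expanded design matrix by applying a forms/terms
// function to every point (one row of terms per input point).
object AllFormsSketch {
  // illustrative terms function for a 1D point: (1, x, x^2)
  def qForms1D(p: Array[Double]): Array[Double] =
    Array(1.0, p(0), p(0) * p(0))

  // map the terms function over the rows of the input matrix
  def allForms(x: Array[Array[Double]], forms: Array[Double] => Array[Double]): Array[Array[Double]] =
    x.map(forms)

  def main(args: Array[String]): Unit =
    allForms(Array(Array(2.0), Array(3.0)), qForms1D).foreach(r => println(r.mkString(", ")))
}
```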
Perform backward elimination to remove the least predictive variable from the model, returning the variable to eliminate, the new parameter vector, the new R-squared value and the new F statistic.
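The selection step can be pictured as follows; this is an assumed sketch, with the regression refit abstracted behind a hypothetical rSquared function rather than the class's own fitting code.

```scala
// Sketch only: pick the least predictive column by refitting without each
// candidate column and keeping the removal that leaves the best R^2.
// 'rSquared' stands in for a full regression fit and is an assumption here.
object BackwardElimSketch {
  // drop column j from every row
  def dropCol(x: Array[Array[Double]], j: Int): Array[Array[Double]] =
    x.map(row => row.zipWithIndex.collect { case (v, k) if k != j => v })

  // returns (column to eliminate, R^2 of the reduced model);
  // column 0 (the intercept) is never a candidate
  def backwardElim(x: Array[Array[Double]], y: Array[Double],
                   rSquared: (Array[Array[Double]], Array[Double]) => Double): (Int, Double) =
    (1 until x(0).length).map(j => (j, rSquared(dropCol(x, j), y))).maxBy(_._2)
}
```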
Given a vector/point 'p', compute the values for all of its cubic, quadratic, linear and constant forms/terms, returning them as a vector. For 1D: p = (x_0) => VectorD (1, x_0, x_0^2, x_0^3). For 2D: p = (x_0, x_1) => VectorD (1, x_0, x_0^2, x_0^3, x_0*x_1, x_0^2*x_1, x_0*x_1^2, x_1, x_1^2, x_1^3).
the source vector/point for creating forms/terms
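A minimal sketch of the 2D cubic expansion in the order documented above, using plain Scala arrays in place of VectorD:

```scala
// Sketch only: cubic forms/terms for a 2D point p = (x_0, x_1).
object CFormsSketch {
  def cForms(p: Array[Double]): Array[Double] = {
    val x0 = p(0); val x1 = p(1)
    Array(1.0,
          x0, x0 * x0, x0 * x0 * x0,
          x0 * x1, x0 * x0 * x1, x0 * x1 * x1,
          x1, x1 * x1, x1 * x1 * x1)
  }

  def main(args: Array[String]): Unit =
    println(cForms(Array(2.0, 3.0)).mkString(", "))
}
```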
Return the fit (parameter vector b, quality of fit including rSquared).
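The descriptions do not spell out how rSquared is computed; the conventional definition R^2 = 1 - SSE/SST is assumed in this sketch.

```scala
// Sketch only (assumed definition): R^2 = 1 - SSE/SST, computed from the
// response vector y and the corresponding predictions yp.
object RSquaredSketch {
  def rSquared(y: Array[Double], yp: Array[Double]): Double = {
    val mean = y.sum / y.length
    val sse  = y.zip(yp).map { case (yi, ypi) => (yi - ypi) * (yi - ypi) }.sum
    val sst  = y.map(yi => (yi - mean) * (yi - mean)).sum
    1.0 - sse / sst
  }
}
```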
Show the flaw by printing the error message.
the method where the error occurred
the error message
Predict the value of y = f(z) by evaluating the formula y = b dot zi for each row zi of matrix z.
the new matrix to predict
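In sketch form (plain Scala, not the library code), this is one dot product per row:

```scala
// Sketch only: predict a response for each already expanded row zi as b dot zi.
object PredictMatrixSketch {
  def dot(a: Array[Double], b: Array[Double]): Double =
    a.zip(b).map { case (u, v) => u * v }.sum

  def predict(z: Array[Array[Double]], b: Array[Double]): Array[Double] =
    z.map(zi => dot(b, zi))
}
```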
Given a point z, use the quadratic rsm regression equation to predict a value for the function at z. For 1D: b_0 + b_1*z_0 + b_2*z_0^2. For 2D: b_0 + b_1*z_0 + b_2*z_0^2 + b_3*z_1 + b_4*z_1*z_0 + b_5*z_1^2.
the point/vector whose functional value is to be predicted
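Written out term by term for the 2D case above (a plain Scala sketch, assuming b has six elements and z has two):

```scala
// Sketch only: the 2D quadratic prediction, matching the formula above.
object PredictPointSketch {
  def predict(b: Array[Double], z: Array[Double]): Double = {
    val z0 = z(0); val z1 = z(1)
    b(0) + b(1) * z0 + b(2) * z0 * z0 + b(3) * z1 + b(4) * z1 * z0 + b(5) * z1 * z1
  }
}
```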
Given a new discrete data vector z, predict the y-value of f(z).
the vector to use for prediction
Given a vector/point 'p', compute the values for all of its quadratic, linear and constant forms/terms, returning them as a vector. For 1D: p = (x_0) => VectorD (1, x_0, x_0^2). For 2D: p = (x_0, x_1) => VectorD (1, x_0, x_0^2, x_0*x_1, x_1, x_1^2).
the source vector/point for creating forms/terms
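A minimal sketch of the 2D quadratic expansion in the order documented above, using plain Scala arrays in place of VectorD:

```scala
// Sketch only: quadratic forms/terms for a 2D point p = (x_0, x_1).
object QFormsSketch {
  def qForms(p: Array[Double]): Array[Double] = {
    val x0 = p(0); val x1 = p(1)
    Array(1.0, x0, x0 * x0, x0 * x1, x1, x1 * x1)
  }
}
```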
Retrain the predictor by fitting the parameter vector (b-vector) in the quadratic rsm regression equation, e.g., for 2D yy = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_1*x_0, x_1^2] + e, using the least squares method.
the new response vector
Train the predictor by fitting the parameter vector (b-vector) in the quadratic rsm regression equation, e.g., for 2D y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_1*x_0, x_1^2] + e, using the least squares method.
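The actual solver for b is chosen by the 'technique' parameter, so the sketch below (normal equations plus Gaussian elimination, in plain Scala) is only one possible illustration of solving x.t*x*b = x.t*y, not the class's implementation.

```scala
// Sketch only: least squares fit of b via the normal equations (x.t*x)*b = x.t*y.
object TrainSketch {
  // x.t * x (m-by-m), where x is n-by-m
  def xtx(x: Array[Array[Double]]): Array[Array[Double]] = {
    val n = x.length; val m = x(0).length
    Array.tabulate(m, m)((i, j) => (0 until n).map(k => x(k)(i) * x(k)(j)).sum)
  }

  // x.t * y (length m)
  def xty(x: Array[Array[Double]], y: Array[Double]): Array[Double] = {
    val n = x.length; val m = x(0).length
    Array.tabulate(m)(i => (0 until n).map(k => x(k)(i) * y(k)).sum)
  }

  // solve the square system a * b = v by Gaussian elimination with partial pivoting
  def solve(a0: Array[Array[Double]], v0: Array[Double]): Array[Double] = {
    val n = v0.length
    val a = a0.map(_.clone); val v = v0.clone
    for (col <- 0 until n) {
      val piv = (col until n).maxBy(r => math.abs(a(r)(col)))      // pivot row
      val tmpR = a(col); a(col) = a(piv); a(piv) = tmpR
      val tmpV = v(col); v(col) = v(piv); v(piv) = tmpV
      for (r <- col + 1 until n) {                                  // eliminate below
        val f = a(r)(col) / a(col)(col)
        for (c <- col until n) a(r)(c) -= f * a(col)(c)
        v(r) -= f * v(col)
      }
    }
    val b = new Array[Double](n)
    for (r <- n - 1 to 0 by -1)                                     // back substitution
      b(r) = (v(r) - (r + 1 until n).map(c => a(r)(c) * b(c)).sum) / a(r)(r)
    b
  }

  // least squares fit: solve (x.t * x) * b = x.t * y
  def train(x: Array[Array[Double]], y: Array[Double]): Array[Double] =
    solve(xtx(x), xty(x, y))

  def main(args: Array[String]): Unit = {
    // 1D quadratic terms (1, x, x^2) with exact response y = 1 + 2x + 3x^2
    val xs = Array(0.0, 1.0, 2.0, 3.0)
    val xm = xs.map(x => Array(1.0, x, x * x))
    val y  = xs.map(x => 1.0 + 2.0 * x + 3.0 * x * x)
    println(train(xm, y).mkString(", "))   // approximately 1.0, 2.0, 3.0
  }
}
```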
Compute the Variance Inflation Factor (VIF) for each variable to test for multicollinearity by regressing x_j against the rest of the variables. A VIF over 10 indicates that over 90% of the variance of x_j can be predicted from the other variables, so x_j is a candidate for removal from the model.
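Given R_j^2 from regressing x_j on the remaining variables (that regression is not repeated here), the VIF itself is a one-liner; the helper name is an assumption.

```scala
// Sketch only: VIF_j = 1 / (1 - R_j^2); R_j^2 > 0.9 corresponds to VIF > 10.
object VifSketch {
  def vif(rSqJ: Double): Double = 1.0 / (1.0 - rSqJ)
}
```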
The ResponseSurface class uses multiple regression to fit a quadratic/cubic surface to the data. For example in 2D, the quadratic regression equation is y = b dot x + e = [b_0, ... b_k] dot [1, x_0, x_0^2, x_1, x_0*x_1, x_1^2] + e.
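A hypothetical usage sketch follows; the class, train and predict names come from the descriptions above, but the import paths, constructor arguments and exact signatures are assumptions that may differ across ScalaTion versions.

```scala
// Hypothetical usage sketch: names follow the descriptions above, but the
// imports and exact signatures are assumptions and may vary by version.
import scalation.linalgebra.{MatrixD, VectorD}   // assumed import path
import scalation.analytics.ResponseSurface       // assumed import path

object ResponseSurfaceUsage {
  def main(args: Array[String]): Unit = {
    val x = new MatrixD ((4, 2), 1.0, 1.0,       // input points (one per row)
                                 2.0, 1.0,
                                 1.0, 2.0,
                                 2.0, 2.0)
    val y   = VectorD (3.0, 5.0, 6.0, 9.0)       // response vector
    val rsr = new ResponseSurface (x, y)         // quadratic surface by default
    rsr.train ()                                 // fit b by least squares
    println (rsr.predict (VectorD (1.5, 1.5)))   // predict at a new point
  }
}
```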
See also: scalation.metamodel.QuadraticFit