the data matrix to reduce, stored column-wise
Assuming mean-centered data, compute the covariance matrix.
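For illustration, a minimal NumPy sketch of this covariance step, assuming the mean-centered data matrix has one observation per row and one variable per column (the function name here is hypothetical, not the class's API):

import numpy as np

def compute_covariance(xc: np.ndarray) -> np.ndarray:
    """Sample covariance of mean-centered data xc (n observations x d variables)."""
    n = xc.shape[0]
    return xc.T @ xc / (n - 1)    # d x d covariance matrix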
Compute the unit eigenvectors for the covariance matrix.
the vector of eigenvalues for the covariance matrix
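A hedged sketch of the eigen-decomposition step in NumPy; np.linalg.eigh is used because the covariance matrix is symmetric, and the eigenvectors it returns are already unit length (names are illustrative only):

import numpy as np

def compute_eigen(cov: np.ndarray):
    """Return the unit eigenvectors (as columns) and eigenvalues of the covariance
    matrix, sorted so the largest eigenvalue comes first."""
    eigenvalues, eigenvectors = np.linalg.eigh(cov)   # eigh returns ascending order
    order = np.argsort(eigenvalues)[::-1]             # re-sort: largest first
    return eigenvectors[:, order], eigenvalues[order]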
Find the Principal Components/Features, i.e., the eigenvectors with the k highest eigenvalues.
the number of Principal Components (PCs) to find
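A sketch of this selection step (hypothetical names, not the class's signature): keep the k eigenvectors with the largest eigenvalues as the feature matrix.

import numpy as np

def find_pcs(eigenvectors: np.ndarray, eigenvalues: np.ndarray, k: int) -> np.ndarray:
    """Select the k eigenvectors with the highest eigenvalues (the feature matrix)."""
    top = np.argsort(eigenvalues)[::-1][:k]   # indices of the k largest eigenvalues
    return eigenvectors[:, top]               # d x k feature matrix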
Show the flaw by printing the error message.
the method where the error occurred
the error message
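A minimal, purely illustrative sketch of such an error reporter:

def flaw(method: str, message: str) -> None:
    """Print an error message tagged with the method where the flaw occurred."""
    print(f"ERROR @ {method}: {message}")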
Center the data about the means (i.e., subtract the means) and return the mean vector (i.e., the mean for each variable/dimension).
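A minimal NumPy sketch of this centering step, assuming rows are observations and columns are variables (names are illustrative):

import numpy as np

def mean_center(x: np.ndarray):
    """Subtract the per-variable (column) mean; return the centered data and the mean vector."""
    mu = x.mean(axis=0)    # mean of each variable/dimension
    return x - mu, mu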
Approximately recover the original data by multiplying the reduced matrix by the inverse (via transpose) of the feature matrix and then adding back the means.
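A sketch of the approximate recovery, assuming the reduced matrix is n x k and the feature matrix holds the k unit eigenvectors as columns (its transpose serves as the inverse because the columns are orthonormal); names are hypothetical:

import numpy as np

def recover(reduced: np.ndarray, features: np.ndarray, mu: np.ndarray) -> np.ndarray:
    """Approximately reconstruct the original data: (n x k) @ (k x d) + means."""
    return reduced @ features.T + mu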
Multiply the zero mean data matrix by the feature matrix to reduce dimensionality.
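A sketch of the reduction step under the same assumptions (zero-mean data n x d, feature matrix d x k); names are illustrative:

import numpy as np

def reduce(x_centered: np.ndarray, features: np.ndarray) -> np.ndarray:
    """Project the zero-mean data onto the k principal components: (n x d) @ (d x k)."""
    return x_centered @ features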
The 'PrincipalComponents' class performs Principal Component Analysis (PCA) on data matrix 'x'. It can be used to reduce the dimensionality of the data. First find the PCs by calling 'findPCs' and then call 'reduce' to reduce the data (i.e., reduce matrix 'x' to a lower dimensionality matrix).
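Putting the pieces together, a hedged end-to-end sketch in NumPy of the same workflow (find the PCs, then reduce); this mirrors the 'findPCs'/'reduce' usage described above but is not the class's actual API:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 5))               # 100 observations, 5 variables

mu = x.mean(axis=0)
xc = x - mu                                 # mean-center the data
cov = xc.T @ xc / (xc.shape[0] - 1)         # covariance of the centered data
eigenvalues, eigenvectors = np.linalg.eigh(cov)
order = np.argsort(eigenvalues)[::-1]
features = eigenvectors[:, order[:2]]       # k = 2 principal components

reduced = xc @ features                     # 100 x 2 reduced data
recovered = reduced @ features.T + mu       # approximate reconstruction
print(reduced.shape, recovered.shape)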