Package gov.sandia.cognition.learning.algorithm.pca

Provides implementations of Principal Components Analysis (PCA).


Interface Summary
PrincipalComponentsAnalysis Principal Components Analysis is a family of algorithms that map from a high-dimensional input space to a low-dimensional output space.
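The mapping this interface describes can be sketched in a few lines: subtract the data mean, then project onto a small set of component vectors. The class and method names below are hypothetical illustrations of the idea, not the Foundry's actual API.

```java
// Minimal sketch of a PCA mapping: y = components * (x - mean).
// PcaSketch and transform are illustrative names, not the library's API.
public class PcaSketch {

    /** Projects x onto each component row after subtracting the mean. */
    static double[] transform(double[][] components, double[] mean, double[] x) {
        double[] y = new double[components.length];
        for (int i = 0; i < components.length; i++) {
            double sum = 0.0;
            for (int j = 0; j < x.length; j++) {
                sum += components[i][j] * (x[j] - mean[j]);
            }
            y[i] = sum;
        }
        return y;
    }

    public static void main(String[] args) {
        // One component along the first axis; mean at the origin.
        double[][] components = { { 1.0, 0.0, 0.0 } };
        double[] mean = { 0.0, 0.0, 0.0 };
        double[] y = transform(components, mean, new double[] { 2.0, 5.0, -1.0 });
        System.out.println(y[0]); // 2.0
    }
}
```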
 

Class Summary
AbstractPrincipalComponentsAnalysis Abstract implementation of PCA.
GeneralizedHebbianAlgorithm Implementation of the Generalized Hebbian Algorithm, also known as Sanger's Rule, which is a generalization of Oja's Rule.
KernelPrincipalComponentsAnalysis<DataType> An implementation of the Kernel Principal Components Analysis (KPCA) algorithm.
KernelPrincipalComponentsAnalysis.Function<DataType> The resulting transformation function learned by Kernel Principal Components Analysis.
PrincipalComponentsAnalysisFunction This VectorFunction maps a high-dimensional input space onto a (hopefully simpler) low-dimensional output space by subtracting the mean of the input data and passing the zero-mean input through a dimension-reducing matrix multiplication function.
ThinSingularValueDecomposition Computes the "thin" singular value decomposition of a dataset.
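The update rule behind the Generalized Hebbian Algorithm (Sanger's Rule) listed above can be sketched directly: each component's weights move toward the residual left after subtracting the reconstructions of the preceding components. This is only an illustration of the rule under a single-sample update, with made-up names; it is not the class's real interface.

```java
// Hedged sketch of one Sanger's-rule (GHA) update for one sample x:
//   w_i += rate * y_i * (x - sum_{j <= i} y_j * w_j),  where y_i = w_i . x
// SangerSketch and update are illustrative names only.
public class SangerSketch {

    /** Applies one GHA update in place to the component rows of w. */
    static void update(double[][] w, double[] x, double rate) {
        int k = w.length, d = x.length;
        double[] y = new double[k];
        for (int i = 0; i < k; i++) {
            for (int j = 0; j < d; j++) {
                y[i] += w[i][j] * x[j];
            }
        }
        double[] residual = x.clone();
        for (int i = 0; i < k; i++) {
            // Subtract this component's reconstruction (old weights),
            // then update its weights toward the remaining residual.
            for (int j = 0; j < d; j++) residual[j] -= y[i] * w[i][j];
            for (int j = 0; j < d; j++) w[i][j] += rate * y[i] * residual[j];
        }
    }

    public static void main(String[] args) {
        double[][] w = { { 0.5, 0.0 } };
        update(w, new double[] { 2.0, 0.0 }, 0.1);
        System.out.println(w[0][0]); // grows toward the dominant direction
    }
}
```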
 

Package gov.sandia.cognition.learning.algorithm.pca Description

Provides implementations of Principal Components Analysis (PCA). This can also be referred to as Latent Semantic Analysis (LSA) or Latent Semantic Indexing (LSI) when applied to documents in the field of Information Retrieval.
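The connection underlying all of these classes is that the principal components are the top singular vectors of the mean-centered data matrix, equivalently the top eigenvectors of its covariance. The sketch below recovers a single dominant component by power iteration on a small covariance matrix; it only illustrates that idea under assumed names, and is not the algorithm used by the classes in this package.

```java
// Hedged sketch: the first principal component is the dominant eigenvector
// of the data covariance. Recovered here by plain power iteration.
// ThinSvdSketch and dominantEigenvector are illustrative names only.
public class ThinSvdSketch {

    /** Returns the dominant eigenvector of a symmetric matrix a. */
    static double[] dominantEigenvector(double[][] a, int iterations) {
        double[] v = new double[a.length];
        v[0] = 1.0;
        for (int it = 0; it < iterations; it++) {
            double[] next = new double[v.length];
            for (int i = 0; i < a.length; i++) {
                for (int j = 0; j < v.length; j++) {
                    next[i] += a[i][j] * v[j];
                }
            }
            double norm = 0.0;
            for (double c : next) norm += c * c;
            norm = Math.sqrt(norm);
            for (int i = 0; i < v.length; i++) v[i] = next[i] / norm;
        }
        return v;
    }

    public static void main(String[] args) {
        // Covariance with variance 4 along x and 1 along y: first PC is the x axis.
        double[][] cov = { { 4.0, 0.0 }, { 0.0, 1.0 } };
        double[] pc = dominantEigenvector(cov, 50);
        System.out.println(Math.abs(pc[0])); // 1.0
    }
}
```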

Since:
2.0
Author:
Justin Basilico