Regression and function approximation


simpleR: simple Regression toolbox

The simple Regression toolbox, simpleR, contains a set of functions in Matlab to illustrate the capabilities of several statistical regression algorithms. simpleR contains simple educational code for linear regression (LR), decision trees (TREE), neural networks (NN), support vector regression (SVR), kernel ridge regression (KRR), also known as Least Squares SVM, Gaussian Process Regression (GPR), and Variational Heteroscedastic Gaussian Process Regression (VHGPR). We also include a dataset of collected spectra and associated chlorophyll content to illustrate the training/testing procedures. This is just a demo with a default initialization; training is not optimized at all. Other initializations, optimization techniques, and training strategies may of course be better suited to achieve improved results on this or other problems. We simply did it in the standard way for illustration and educational purposes, as well as to disseminate these models.
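Of the models listed, kernel ridge regression has the most compact closed-form solution, so it serves well to illustrate the train/test pattern. The following NumPy sketch is an independent illustration (not the toolbox's Matlab code); the RBF bandwidth `sigma` and regularizer `lam` are arbitrary demo values:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_fit(X, y, lam=1e-3, sigma=1.0):
    # Dual weights: alpha = (K + lam*I)^{-1} y
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, X_test, alpha, sigma=1.0):
    # Prediction is a kernel expansion over the training samples
    return rbf_kernel(X_test, X_train, sigma) @ alpha
```

The other listed models differ in how the weights are obtained (trees, backpropagation, marginal likelihood, etc.), but the fit/predict workflow the demo illustrates is the same.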

MSVR: Multioutput Support Vector Regression

The standard SVR formulation considers only the single-output problem. When several output variables must be predicted, other methods (neural networks, kernel ridge regression) can be deployed, but the good properties of SVR are lost: the robust ε-insensitive loss function and sparsity. The proposed M-SVR model extends single-output SVR by accounting for nonlinear relations not only between the input features but also among the output variables, which are typically inter-dependent.
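The key ingredient is the loss: rather than penalizing each output independently, M-SVR-style formulations apply an ε-insensitive cost to the norm of the whole output-error vector, so all outputs share the same support vectors. A minimal NumPy sketch of that multidimensional cost (the quadratic variant; `eps` is an illustrative value, and this is not the full M-SVR solver):

```python
import numpy as np

def multioutput_eps_loss(Y, Y_hat, eps=0.1):
    # One residual norm per sample, taken jointly across ALL outputs
    u = np.linalg.norm(Y - Y_hat, axis=1)
    # Samples whose joint error lies inside the eps-tube cost nothing,
    # which is what couples the outputs and preserves sparsity
    return np.where(u < eps, 0.0, (u - eps) ** 2).sum()
```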

KARMA: Kernel AutoRegressive Moving Average with the Support Vector Machine

Nonlinear system identification based on Support Vector Machines (SVM) has usually been addressed by means of standard SVM regression (SVR), which can be seen as an implicit nonlinear AutoRegressive Moving Average (ARMA) model in some Reproducing Kernel Hilbert Space (RKHS). The proposal here is twofold: first, an explicit ARMA model in RKHS (SVM-ARMA2K) is proposed; second, a general class of SVM-based nonlinear system identification models is presented, based on the use of composite Mercer kernels.
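Concretely, an ARMA-type regressor stacks past outputs and past inputs into one feature vector, and a (kernel) machine is then trained on these vectors. A NumPy sketch of the regressor construction (the lag orders `p`, `q` and function names are illustrative choices, not the paper's notation):

```python
import numpy as np

def arma_regressors(y, u, p=2, q=2):
    # Predict y[t] from p past outputs and q past exogenous inputs:
    #   x_t = [y[t-1], ..., y[t-p], u[t-1], ..., u[t-q]]
    t0 = max(p, q)
    X = np.array([np.r_[y[t - p:t][::-1], u[t - q:t][::-1]]
                  for t in range(t0, len(y))])
    return X, y[t0:]
```

Training any kernel regressor (e.g., SVR) on these pairs yields an implicit nonlinear ARMA model; SVM-ARMA2K instead applies separate (composite) kernels to the output-lag and input-lag blocks of each regressor.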

Classification


SS-Graph: Semi-supervised Graph-based Classification

A graph-based method for semi-supervised learning: an affinity matrix is computed, the graph Laplacian is normalized, and a spreading function is iterated until convergence. The algorithm can be understood intuitively in terms of spreading activation networks from experimental psychology, and explained as random walks on graphs. We successfully apply it to hyperspectral image classification, incorporating contextual information through a full family of composite kernels. Since the graph method relies on inverting a huge kernel matrix formed by both labeled and unlabeled samples, we introduce the Nyström method into the formulation to speed up the classification process.
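The described iteration can be sketched compactly in NumPy. This is an illustrative sketch only, using an RBF affinity; `alpha`, `sigma`, the convention `-1 = unlabeled`, and the fixed iteration count are demo choices, and no Nyström approximation is applied here:

```python
import numpy as np

def label_spreading(X, y, alpha=0.9, sigma=1.0, n_iter=100):
    # y holds class indices 0..K-1 for labeled samples, -1 for unlabeled
    n = len(X)
    d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))     # affinity matrix
    np.fill_diagonal(W, 0.0)               # no self-affinity
    d = W.sum(1)
    S = W / np.sqrt(np.outer(d, d))        # symmetric normalization D^-1/2 W D^-1/2
    K = int(y.max()) + 1
    Y = np.zeros((n, K))
    Y[y >= 0, y[y >= 0]] = 1.0             # one-hot seeds from the labeled samples
    F = Y.copy()
    for _ in range(n_iter):
        # Spread label mass along the graph while clamping toward the seeds
        F = alpha * S @ F + (1 - alpha) * Y
    return F.argmax(1)
```

Every unlabeled sample ends up with the class whose mass dominates after spreading; in practice the fixed loop would be replaced by a convergence check.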

Feature extraction and manifold learning


simFEAT: Matlab toolbox for linear and kernel feature extraction methods

The simple feature extraction toolbox, simFEAT, contains a set of functions in Matlab to illustrate the capabilities of several multivariate linear and kernel feature extraction algorithms:
  • Linear methods: PCA, MNF, PCC, PLS, OPLS, LDA, PLDA, WLDA, RLDA, OLDA, ULDA
  • Kernel methods: KPCA, KICA, HSIC, KCCA, KPLS, KOPLS, KSNR and KECA
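As a taste of the kernel branch, kernel PCA reduces to an eigendecomposition of the centered kernel matrix. The NumPy sketch below is an independent illustration with an RBF kernel (not the toolbox's interface; `sigma` is a demo value) and returns the leading nonlinear component scores:

```python
import numpy as np

def kpca(X, n_components=2, sigma=1.0):
    n = len(X)
    d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))      # RBF kernel matrix
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                          # center in feature space
    w, V = np.linalg.eigh(Kc)               # ascending eigenvalues
    idx = np.argsort(w)[::-1][:n_components]
    w, V = w[idx], V[:, idx]
    # Component scores: eigenvectors scaled by sqrt(eigenvalue)
    return V * np.sqrt(np.maximum(w, 0.0))
```

The supervised variants in the list (KOPLS, KCCA, ...) replace this eigenproblem with generalized eigenproblems that also involve the target variables.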

RBIG: Rotation-based Iterative Gaussianization

The RBIG Toolbox is a Matlab toolbox for estimating the Gaussianization transform of given multidimensional signals, based on PCA (or any kind of orthogonal transform, random rotations included!).
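The core loop alternates two steps: Gaussianize each marginal, then apply an orthogonal rotation (here PCA). A minimal NumPy sketch of that idea, using rank-based marginal Gaussianization through the standard-normal inverse CDF; this is not the toolbox code, and it omits the toolbox's stopping criteria:

```python
import numpy as np
from statistics import NormalDist

def rbig(X, n_iter=10):
    nd = NormalDist()
    X = X.astype(float).copy()
    n = len(X)
    for _ in range(n_iter):
        # 1) marginal Gaussianization via the empirical CDF of each column
        for j in range(X.shape[1]):
            ranks = X[:, j].argsort().argsort() + 1
            u = ranks / (n + 1)                      # uniform scores in (0, 1)
            X[:, j] = [nd.inv_cdf(v) for v in u]
        # 2) orthogonal rotation: here the PCA basis of the current data
        Xc = X - X.mean(0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        X = Xc @ Vt.T
    return X
```

Each pass makes the marginals Gaussian and the rotation exposes remaining dependence to the next pass; random rotations can replace the PCA step, as noted above.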

More software and toolboxes


Visit http://isp.uv.es/soft.htm

Databases


Visit http://isp.uv.es/soft.htm