AUTHOR
Sandra Jiménez
PRINCIPAL POLYNOMIAL ANALYSIS
© 2014 World Scientific Publishing Company. This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance with curves instead of straight lines. Unlike previous approaches, PPA reduces to a sequence of simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA exhibits a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, such an inverse can be obtained…
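The invertibility property can be illustrated with a minimal sketch of a single PPA step, assuming 2-D data, a fixed leading direction `v`, and a quadratic conditional-mean curve `m(a)`; all names (`v`, `coeffs`, `ppa_forward`, `ppa_inverse`) are illustrative and not taken from the paper's code:

```python
import numpy as np

v = np.array([1.0, 0.0])            # assumed leading direction (unit norm)
u = np.array([0.0, 1.0])            # orthogonal complement of v in 2-D
coeffs = np.array([0.3, 0.0, 0.0])  # assumed curve m(a) = 0.3 * a**2 along u


def m(a):
    """Polynomial conditional-mean curve evaluated at projection a."""
    return np.polyval(coeffs, a)


def ppa_forward(x):
    """Map x -> (a, w): projection onto v, residual after removing the curve."""
    a = x @ v
    w = x @ u - m(a)
    return a, w


def ppa_inverse(a, w):
    """Exact inverse: add the curve back and recombine both coordinates."""
    return a * v + (w + m(a)) * u


x = np.array([1.5, 0.9])
a, w = ppa_forward(x)
x_rec = ppa_inverse(a, w)
print(np.allclose(x, x_rec))        # round-trip recovers x exactly
```

Because each step only shifts the orthogonal residual by a function of the projection `a`, the step is a shear-like, volume-preserving map, and undoing it amounts to adding the curve back.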
Nonlinear data description with Principal Polynomial Analysis
Principal Component Analysis (PCA) has been widely used for manifold description and dimensionality reduction. The performance of PCA is hampered, however, when the data exhibit nonlinear feature relations. In this work, we propose a new framework for manifold learning based on a sequence of Principal Polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) is shown to generalize PCA. Unlike recently proposed nonlinear methods (e.g., spectral/kernel methods, projection-pursuit techniques, and neural networks), PPA features are easily interpretable, and the method leads to a fully invertible transform, which is a desirable property…
Principal polynomial analysis for remote sensing data processing
Inspired by the concept of Principal Curves, in this paper we define Principal Polynomials as a nonlinear generalization of Principal Components that overcomes the conditional-mean-independence restriction of PCA. Principal Polynomials deform the straight Principal Components by minimizing the regression error (or variance) in the corresponding orthogonal subspaces. We propose projecting onto a sequence of these polynomials to define a new nonlinear data representation: Principal Polynomial Analysis (PPA). We prove that the dimensionality-reduction error of PPA is always lower than that of PCA. Lower truncation error and increased independence suggest that unsupervised PPA features can be b…
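The deformation step described above reduces to a univariate polynomial regression from the leading PCA projection to the orthogonal residual. The sketch below illustrates the claimed variance reduction on toy curved 2-D data, assuming a degree-3 fit; the data, degree, and variable names are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
# toy curved manifold: points along a parabola plus small noise
t = rng.uniform(-3, 3, 400)
X = np.column_stack([t, 0.3 * t**2]) + 0.05 * rng.standard_normal((400, 2))
X -= X.mean(axis=0)

# leading principal direction via SVD (standard PCA)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
v = Vt[0]                       # first principal direction
alpha = X @ v                   # projections onto v (leading coordinate)
R = X - np.outer(alpha, v)      # residuals in the orthogonal subspace

# univariate regression: predict the orthogonal residual from alpha
deg = 3
A = np.vander(alpha, deg + 1)   # design matrix [alpha^3, alpha^2, alpha, 1]
W, *_ = np.linalg.lstsq(A, R, rcond=None)
R_hat = A @ W                   # estimated conditional mean (the "curve")

# residual after the principal polynomial removes the curvature
residual = R - R_hat
print("PCA residual variance:", R.var())
print("PPA residual variance:", residual.var())
```

Since the straight principal component cannot absorb the curvature, the residual variance left by the polynomial fit is strictly smaller here, in line with the truncation-error bound stated in the abstract.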
Nonlinearities and Adaptation of Color Vision from Sequential Principal Curves Analysis
Mechanisms of human color vision are characterized by two phenomenological aspects: the system is nonlinear, and it adapts to changing environments. Conventional attempts to derive these features from statistics use separate arguments for each aspect. The few statistical explanations that do consider both phenomena simultaneously follow parametric formulations based on empirical models. Therefore, it may be argued that the behavior does not come directly from the color statistics but from the convenient functional form adopted. In addition, the whole statistical analysis is often based on simplified databases that disregard relevant physical effects in the input signal, such as…