Search results for "Principal Component Analysis"
Showing 10 of 71 documents
Gamma Kernel Intensity Estimation in Temporal Point Processes
2011
In this article, we propose a nonparametric approach for estimating the intensity function of temporal point processes based on kernel estimators. In particular, we use asymmetric kernel estimators based on the gamma distribution in order to adequately describe the features of observed point patterns. Some characteristics of these estimators are analyzed and discussed through both simulation results and applications to real data from different seismic catalogs.
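A minimal sketch of the gamma-kernel idea behind this kind of intensity estimator (assuming Chen-style gamma kernels with shape t/b + 1 and scale b; bandwidth selection and the paper's treatment of seismic catalogs are not reproduced):

```python
# Sketch of a gamma-kernel intensity estimate for a temporal point process:
# the kernel placed at evaluation time t is a Gamma(shape = t/b + 1, scale = b)
# density, supported on [0, inf), which avoids boundary bias near t = 0.
import numpy as np
from scipy.stats import gamma

def gamma_kernel_intensity(event_times, eval_times, bandwidth):
    """Estimate lambda(t) = sum_i K_{t/b+1, b}(t_i) on a grid of evaluation times."""
    t = np.asarray(eval_times, dtype=float)[:, None]   # (m, 1) evaluation grid
    x = np.asarray(event_times, dtype=float)[None, :]  # (1, n) observed event times
    shape = t / bandwidth + 1.0                        # shape parameter varies with t
    # gamma.pdf broadcasts shape against x, giving an (m, n) kernel matrix
    return gamma.pdf(x, a=shape, scale=bandwidth).sum(axis=1)

# Toy usage: events concentrated early in the observation window
rng = np.random.default_rng(0)
events = np.sort(rng.exponential(scale=5.0, size=200))
grid = np.linspace(0.0, 30.0, 300)
intensity_hat = gamma_kernel_intensity(events, grid, bandwidth=1.0)
```

Summing (rather than averaging) the kernels yields an intensity estimate; dividing by the number of events would instead give a density estimate.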
Functional Principal Component Analysis for the explorative analysis of multisite-multivariate air pollution time series with long gaps
2013
Knowledge of urban air quality is the first step in addressing air pollution issues. Over the last decades, many cities have been able to rely on a network of monitoring stations recording concentration values for the main pollutants. This paper focuses on functional principal component analysis (FPCA) to investigate multiple pollutant datasets measured over time at multiple sites within a given urban area. Our purpose is to extend what has been proposed in the literature to data that are both multisite and multivariate. The approach proves effective in highlighting some relevant statistical features of the time series, giving the opportunity to identify significant pollutants and…
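A minimal sketch of functional PCA on curves sampled on a common grid (a simplified discretized version; the basis smoothing, long-gap handling and multisite-multivariate extensions discussed in the abstract are not reproduced):

```python
# Sketch of discretized FPCA: each row of `curves` is one sampled curve
# (e.g., one site-day concentration profile); the SVD of the centered matrix
# gives the principal modes of variation and the scores of each curve.
import numpy as np

def fpca(curves, n_components=3):
    """curves: (n_curves, n_timepoints) array sampled on a common time grid."""
    mean_curve = curves.mean(axis=0)
    centered = curves - mean_curve
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]   # FPC scores per curve
    eigenfunctions = vt[:n_components]                 # discretized eigenfunctions
    explained = (s**2) / (s**2).sum()                  # variance share per mode
    return mean_curve, eigenfunctions, scores, explained[:n_components]

# Toy usage with synthetic daily concentration profiles
rng = np.random.default_rng(1)
grid = np.linspace(0, 24, 96)
curves = 30 + 10 * np.sin(2 * np.pi * grid / 24) + rng.normal(0, 2, (50, grid.size))
mean_curve, efuncs, scores, expl = fpca(curves)
```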
Applications of Kernel Methods
2009
In this chapter, we give a survey of applications of the kernel methods introduced in the previous chapter. We focus on application domains that are particularly active both in the direct application of well-known kernel methods and in new algorithmic developments suited to a particular problem. In particular, we consider the following application fields: biomedical engineering (comprising both biological signal processing and bioinformatics), communications, and signal, speech and image processing.
Learning non-linear time-scales with kernel γ-filters
2009
A family of kernel methods, based on the γ-filter structure, is presented for non-linear system identification and time series prediction. The kernel trick allows us to develop the natural non-linear extension of the (linear) support vector machine (SVM) γ-filter [G. Camps-Valls, M. Martinez-Ramon, J.L. Rojo-Alvarez, E. Soria-Olivas, Robust γ-filter using support vector machines, Neurocomput. J. 62(12) (2004) 493-499], but this approach yields a rigid system model without non-linear cross-relations between time-scales. Several functional analysis properties allow us to develop a full, principled family of kernel γ-filters. The improved performance in several application examples suggests…
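A hedged sketch of the gamma-memory structure underlying γ-filters, followed by a kernel regressor on the memory taps (an illustration of the general idea, not the paper's kernel γ-filter; the choice of μ, the number of taps and the use of kernel ridge regression are assumptions):

```python
# Sketch of gamma-memory features plus a kernel regressor on the taps.
# Each tap k is a leaky integrator of the previous tap, so a single parameter
# mu trades memory depth against resolution (hypothetical settings throughout).
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def gamma_features(x, n_taps=5, mu=0.5):
    """Return a (len(x), n_taps + 1) matrix of gamma-memory taps of the signal x."""
    n = len(x)
    G = np.zeros((n, n_taps + 1))
    G[:, 0] = x                                        # tap 0 is the raw input
    for t in range(1, n):
        for k in range(1, n_taps + 1):
            G[t, k] = (1 - mu) * G[t - 1, k] + mu * G[t - 1, k - 1]
    return G

# Toy one-step-ahead prediction of a noisy sine from its gamma-memory taps
rng = np.random.default_rng(1)
x = np.sin(np.linspace(0.0, 20.0, 400)) + rng.normal(0.0, 0.1, 400)
features, target = gamma_features(x)[:-1], x[1:]
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0).fit(features, target)
prediction = model.predict(features)
```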
Time Trends in the Joint Distributions of Income and Age
2001
We propose a method of analyzing time changes of joint income-age densities. Change is decomposed into time-invariant components which act on the densities as deformations with time-varying strength. The functional form of these components is estimated nonparametrically from cross-sectional data. The method is applied to analyze British household data on income and age for the years 1968–95. We find that for the young and middle-aged there is a trend towards increasing inequality, while in the early eighties there appears to be a reversal in the evolution of the income distribution for the old.
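An illustrative sketch of the decomposition idea (not the authors' estimator): estimate each year's joint density on a common grid, then factor the deviations from the average density into time-invariant components with time-varying strengths:

```python
# Sketch: kernel-density estimates of the joint income-age distribution per
# year on a common grid, then an SVD of the deviations from the average density
# so each year's change is a sum of fixed "deformation" components scaled by
# year-specific strengths (grids and component count are hypothetical).
import numpy as np
from scipy.stats import gaussian_kde

def density_components(samples_by_year, income_grid, age_grid, n_components=2):
    """samples_by_year: list of (2, n_t) arrays of (income, age) observations per year."""
    ii, aa = np.meshgrid(income_grid, age_grid, indexing="ij")
    grid = np.vstack([ii.ravel(), aa.ravel()])
    densities = np.array([gaussian_kde(s)(grid) for s in samples_by_year])  # (years, grid points)
    mean_density = densities.mean(axis=0)
    deviations = densities - mean_density
    u, s, vt = np.linalg.svd(deviations, full_matrices=False)
    strengths = u[:, :n_components] * s[:n_components]   # time-varying strength per year
    components = vt[:n_components]                        # time-invariant deformations on the grid
    return mean_density, components, strengths
```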
Nonlinear data description with Principal Polynomial Analysis
2012
Principal Component Analysis (PCA) has been widely used for manifold description and dimensionality reduction. The performance of PCA is, however, hampered when the data exhibit nonlinear feature relations. In this work, we propose a new framework for manifold learning based on the use of a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) is shown to generalize PCA. Unlike recently proposed nonlinear methods (e.g., spectral/kernel methods, projection pursuit techniques, and neural networks), PPA features are easily interpretable and the method leads to a fully invertible transform, which is a desirable property…
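A one-step sketch of the principal-polynomial idea (a simplified reading of PPA, not the authors' implementation; the polynomial degree and the absence of the full sequential deflation are assumptions of this sketch):

```python
# Sketch: project the data onto the leading PCA direction, then model the
# remaining dimensions as a polynomial function of that projection, so the
# first "component" is a curve rather than a straight line.
import numpy as np

def principal_polynomial_step(X, degree=2):
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    direction = vt[0]                                     # leading PCA direction
    scores = Xc @ direction                               # 1-D projection
    residual = Xc - np.outer(scores, direction)           # part the line cannot explain
    design = np.vander(scores, degree + 1)                # [s^degree, ..., s, 1]
    coeffs, *_ = np.linalg.lstsq(design, residual, rcond=None)
    fitted = design @ coeffs                              # polynomial prediction of the residual
    curve = np.outer(scores, direction) + fitted          # the principal polynomial
    return scores, curve, residual - fitted               # remaining residual

# Toy usage on data lying near a parabola
rng = np.random.default_rng(2)
s = rng.uniform(-2, 2, 300)
X = np.c_[s, s**2] + rng.normal(0, 0.05, (300, 2))
scores, curve, resid = principal_polynomial_step(X)
```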
Spectral clustering with the probabilistic cluster kernel
2015
This letter introduces a probabilistic cluster kernel for data clustering. The proposed kernel is computed as the composition of dot products between the posterior probabilities obtained via GMM clustering. The kernel is learned directly from the data, is parameter-free, and captures the data manifold structure at different scales. The projections into the kernel space induced by this kernel are useful for general feature extraction purposes and are exploited here in spectral clustering with the canonical k-means. The kernel structure, informative content and optimality are studied. Analysis and performance are illustrated on several real datasets.
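A minimal sketch of the probabilistic cluster kernel as described above (the range of scales, the number of runs and the downstream clustering settings are assumptions):

```python
# Sketch: fit GMMs at several scales (numbers of components) and with several
# random restarts, and accumulate dot products between posterior-probability
# vectors; points that repeatedly fall in the same clusters get high similarity.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cluster import SpectralClustering

def probabilistic_cluster_kernel(X, n_components_range=range(2, 11), n_runs=5, seed=0):
    n = X.shape[0]
    K = np.zeros((n, n))
    count = 0
    for k in n_components_range:
        for r in range(n_runs):
            gmm = GaussianMixture(n_components=k, random_state=seed + r).fit(X)
            Z = gmm.predict_proba(X)           # (n, k) posterior probabilities
            K += Z @ Z.T                       # dot products between posteriors
            count += 1
    return K / count

# Toy usage: spectral clustering with the learned kernel as the affinity matrix
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
K = probabilistic_cluster_kernel(X)
labels = SpectralClustering(n_clusters=2, affinity="precomputed", random_state=0).fit_predict(K)
```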
Feature extraction from remote sensing data using Kernel Orthonormalized PLS
2007
This paper presents a study of a sparse kernel-based method for non-linear feature extraction in the context of remote sensing classification and regression problems. The so-called kernel orthonormalized PLS algorithm with reduced complexity (rKOPLS) has two core parts: (i) a kernel version of OPLS (called KOPLS), and (ii) a sparse (reduced) approximation for large-scale data sets, which ultimately leads to rKOPLS. The method demonstrates good capabilities in terms of the expressive power of the extracted features and scalability.
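A dense KOPLS sketch (the sparse reduced-complexity step that turns KOPLS into rKOPLS is not reproduced; the RBF kernel, the ridge term and the one-hot label matrix are assumptions):

```python
# Sketch of dense KOPLS: with a centered kernel Kc and a label matrix Y, the
# dual projections solve the generalized eigenproblem
# (Kc Y Y^T Kc) a = lambda (Kc Kc) a, so extracted features align with the labels.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def rbf_kernel(X, Z, sigma):
    return np.exp(-cdist(X, Z, "sqeuclidean") / (2.0 * sigma**2))

def kopls_fit(X, Y, sigma=1.0, n_features=2, ridge=1e-6):
    """X: (n, d) inputs; Y: (n, c) one-hot labels (or real-valued targets)."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n                  # centering matrix
    Kc = H @ rbf_kernel(X, X, sigma) @ H
    align = Kc @ Y @ Y.T @ Kc                            # label-alignment term
    norm = Kc @ Kc + ridge * np.eye(n)                   # normalization (+ ridge for stability)
    _, vecs = eigh(align, norm)                          # generalized symmetric eigenproblem
    A = vecs[:, ::-1][:, :n_features]                    # leading dual directions
    return A, Kc

# Training-set features: Kc @ A (test features use the centered
# test-versus-train kernel in place of Kc).
```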
Semisupervised Kernel Feature Extraction for Remote Sensing Image Analysis
2014
This paper presents a novel semisupervised kernel partial least squares (KPLS) algorithm for nonlinear feature extraction to tackle both land-cover classification and biophysical parameter retrieval problems. The proposed method finds projections of the original input data that align with the target variable (labels) and incorporates the wealth of unlabeled information to deal with small or underrepresented data sets. The method relies on combining two kernel functions: the standard radial-basis-function kernel based on labeled information, and a generative, i.e., probabilistic, kernel directly learned by clustering the data many times and at different scales across the data manifold. Th…
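A minimal sketch of the composite-kernel construction suggested by the abstract (the mixing weight, the clustering scales and the downstream KPLS step are assumptions):

```python
# Sketch: blend a supervised RBF kernel with a generative cluster kernel
# learned from labeled plus unlabeled samples, so unlabeled structure
# regularizes the feature extractor that is later plugged into KPLS.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.mixture import GaussianMixture

def composite_kernel(X_all, mu=0.5, gamma=0.5, scales=(2, 4, 8), n_runs=3, seed=0):
    K_rbf = rbf_kernel(X_all, gamma=gamma)                 # standard RBF similarity
    K_cluster = np.zeros_like(K_rbf)
    runs = 0
    for k in scales:                                       # cluster at several scales
        for r in range(n_runs):
            Z = GaussianMixture(n_components=k, random_state=seed + r).fit(X_all).predict_proba(X_all)
            K_cluster += Z @ Z.T
            runs += 1
    K_cluster /= runs
    return mu * K_rbf + (1.0 - mu) * K_cluster             # blended kernel for KPLS
```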
A family of kernel anomaly change detectors
2014
This paper introduces the nonlinear extension of the anomaly change detection algorithms in [1] based on the theory of reproducing kernels. The presented methods generalize their linear counterparts, under both the Gaussian and elliptically-contoured assumptions, and produce both improved detection accuracies and reduced false alarm rates. We study the Gaussianity of the data in Hilbert spaces with kernel dependence estimates, provide low-rank kernel versions to cope with the high computational cost of the methods, and give prescriptions about the selection of the kernel functions and their parameters. We illustrate the performance of the introduced kernel methods in both pervasive and anom…
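A sketch of the linear Gaussian anomalous-change score that kernel detectors of this kind generalize (the reproducing-kernel machinery, the elliptically-contoured variants and the low-rank approximations mentioned in the abstract are not reproduced):

```python
# Sketch of a linear Gaussian anomalous-change score for two co-registered
# images: a pixel pair scores high when its joint statistics are more
# surprising than each image's own statistics can explain (the regularizer
# eps and the per-pixel layout are assumptions of this sketch).
import numpy as np

def gaussian_acd_scores(X, Y, eps=1e-6):
    """X, Y: (n_pixels, n_bands) images of the same scene at two times."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Z = np.hstack([Xc, Yc])                               # stacked pixel pairs

    def mahalanobis(D):
        C = np.cov(D, rowvar=False) + eps * np.eye(D.shape[1])
        return np.einsum("ij,jk,ik->i", D, np.linalg.inv(C), D)

    # joint "surprise" minus what each image already explains on its own
    return mahalanobis(Z) - mahalanobis(Xc) - mahalanobis(Yc)
```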