Search results for "Kernel method"
Showing 10 of 79 documents
Explicit Recursive and Adaptive Filtering in Reproducing Kernel Hilbert Spaces
2014
This brief presents a methodology to develop recursive filters in reproducing kernel Hilbert spaces. Unlike previous approaches that exploit the kernel trick on filtered and then mapped samples, we explicitly define the model recursivity in the Hilbert space. For that, we exploit some properties of functional analysis and recursive computation of dot products without the need for pre-imaging or a training dataset. We illustrate the feasibility of the methodology in the particular case of the $\gamma$-filter, which is an infinite impulse response filter with controlled stability and memory depth. Different algorithmic formulations emerge from the signal model. Experiments in chaotic and elect…
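The $\gamma$-filter structure itself is standard; the sketch below is the plain linear (non-kernel) $\gamma$-filter with illustrative weights and a made-up step input, not the paper's explicit RKHS recursion:

```python
import numpy as np

def gamma_filter(x, w, mu):
    """Linear gamma filter: an IIR structure whose memory depth is
    controlled by mu in (0, 1]; mu = 1 recovers an ordinary FIR filter.
    Tap recursion: s_k[n] = (1 - mu) * s_k[n-1] + mu * s_{k-1}[n-1]."""
    K = len(w)                       # filter order
    s = np.zeros(K)                  # gamma memory taps
    y = np.empty(len(x))
    for n, xn in enumerate(x):
        prev = s.copy()
        s[0] = xn                    # tap 0 is the raw input
        for k in range(1, K):
            s[k] = (1 - mu) * prev[k] + mu * prev[k - 1]
        y[n] = w @ s
    return y

# unit-step input: every tap converges to 1, so the output
# converges to the sum of the weights
x = np.ones(50)
y = gamma_filter(x, w=np.ones(3) / 3, mu=0.5)
```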
Estimating biophysical variable dependences with kernels
2010
This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationshi…
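The empirical HSIC estimator mentioned above has a well-known closed form, $\operatorname{tr}(KHLH)/(n-1)^2$ with $H$ the centering matrix; a minimal NumPy sketch on synthetic 1-D data (kernel widths and sample sizes are illustrative, not from the paper):

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # pairwise RBF kernel for a 1-D sample vector
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=200)
dep = hsic(x, x**2)                  # nonlinearly dependent pair
indep = hsic(x, rng.normal(size=200))  # independent pair
# a nonlinear dependence invisible to linear correlation
# still yields a clearly larger HSIC score
```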
An Introduction to Kernel Methods
2009
Machine learning experienced a great advance in the eighties and nineties due to the active research in artificial neural networks and adaptive systems. These tools have demonstrated good results in many real applications, since they require neither a priori knowledge about the distribution of the available data nor assumptions about the relationships among the independent variables. Overfitting due to small training data sets is controlled by means of a regularized functional which minimizes the complexity of the machine. Working with high-dimensional input spaces is no longer a problem thanks to the use of kernel methods. Such methods also provide us with new ways to interpret the cl…
Explicit recursivity into reproducing kernel Hilbert spaces
2011
This paper presents a methodology to develop recursive filters in reproducing kernel Hilbert spaces (RKHS). Unlike previous approaches that exploit the kernel trick on filtered and then mapped samples, we explicitly define model recursivity in the Hilbert space. The method exploits some properties of functional analysis and recursive computation of dot products without the need of pre-imaging. We illustrate the feasibility of the methodology in the particular case of the gamma-filter, an infinite impulse response (IIR) filter with controlled stability and memory depth. Different algorithmic formulations emerge from the signal model. Experiments in chaotic and electroencephalographic time se…
Multiset Kernel CCA for multitemporal image classification
2013
The analysis of multitemporal remote sensing images is becoming an increasingly important problem because of the upcoming scenario of multispectral satellite constellations monitoring our planet. Algorithms that can analyze such amounts of heterogeneous information are necessary. While linear techniques have been extensively deployed, this work considers a kernel method that finds nonlinear correlations between all image sources and the class labels. We introduce in this context kernel canonical correlation analysis (KCCA) to exploit the wealth of temporal image information and to handle nonlinear relations in a natural way via kernels. To achieve this goal, we use the generalization of …
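The regularized KCCA eigenproblem (in the standard Hardoon-style formulation, $(K_x+\kappa I)^{-1}K_y(K_y+\kappa I)^{-1}K_x\,\alpha = \lambda^2\alpha$) can be sketched as follows; two synthetic views of a shared latent variable stand in for the multitemporal images, and all hyperparameters are illustrative:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def center(K):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kcca_top_corr(X, Y, reg=0.1, sigma=1.0):
    """Top kernel canonical correlation via the regularized
    eigenproblem (Kx+rI)^-1 Ky (Ky+rI)^-1 Kx a = lam^2 a."""
    n = len(X)
    Kx, Ky = center(rbf(X, X, sigma)), center(rbf(Y, Y, sigma))
    I = np.eye(n)
    M = np.linalg.solve(Kx + reg * I, Ky) @ np.linalg.solve(Ky + reg * I, Kx)
    lam2 = np.max(np.real(np.linalg.eigvals(M)))
    return np.sqrt(max(lam2, 0.0))

rng = np.random.default_rng(2)
t = rng.uniform(-2, 2, size=(150, 1))            # shared latent variable
X = np.cos(t) + 0.05 * rng.normal(size=t.shape)  # view 1: nonlinear in t
Y = t**2 + 0.05 * rng.normal(size=t.shape)       # view 2: nonlinear in t
rho = kcca_top_corr(X, Y)   # high, although X and Y are only
                            # nonlinearly related through t
```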
Signal-to-noise ratio in reproducing kernel Hilbert spaces
2018
This paper introduces the kernel signal-to-noise ratio (kSNR) for different machine learning and signal processing applications. The kSNR seeks to maximize the signal variance while minimizing the estimated noise variance explicitly in a reproducing kernel Hilbert space (RKHS). The kSNR makes it possible to consider complex signal-to-noise relations beyond additive noise models, and can be seen as a useful signal-to-noise regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to deal with cases where signal and noise cannot be assumed inde…
High-order Runge–Kutta–Nyström geometric methods with processing
2001
We present new families of sixth- and eighth-order Runge–Kutta–Nyström geometric integrators with processing for ordinary differential equations. Both the processor and the kernel are composed of explicitly computable flows associated with nontrivial elements belonging to the Lie algebra involved in the problem. Their efficiency is found to be superior to other previously known algorithms of equivalent order, in some cases by up to four orders of magnitude.
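For context, the simplest geometric integrator of Nyström type is Störmer–Verlet; the sketch below is second order only, not the sixth- and eighth-order processed schemes of the paper, but it shows the long-time near-conservation of energy that motivates such methods:

```python
def verlet(accel, q0, p0, h, steps):
    """Stormer-Verlet (leapfrog): a symplectic integrator for the
    second-order ODE q'' = accel(q), split into half-step kicks
    and a full-step drift."""
    q, p = float(q0), float(p0)
    for _ in range(steps):
        p += 0.5 * h * accel(q)   # half kick
        q += h * p                # drift
        p += 0.5 * h * accel(q)   # half kick
    return q, p

# harmonic oscillator q'' = -q, integrated to t = 100:
# the energy (p^2 + q^2) / 2 stays near its initial value 0.5
q, p = verlet(lambda q: -q, 1.0, 0.0, 0.05, 2000)
energy = 0.5 * (p**2 + q**2)
```

A non-symplectic scheme of the same order (e.g. explicit Euler applied twice) would show a secular drift in the energy over this horizon.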
2016
The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensor characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of the same building). We call these different acquisition modes domains, and refer to the adaptation problem as d…
Improved Neural Networks with Random Weights for Short-Term Load Forecasting
2015
An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on the improved neural networks with random weights (INNRW). The key is to introduce a weighting technique to the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as the inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load…
A support vector domain method for change detection in multitemporal images
2010
This paper formulates the problem of distinguishing changed from unchanged pixels in multitemporal remote sensing images as a minimum enclosing ball (MEB) problem with changed pixels as the target class. The definition of the sphere-shaped decision boundary with minimal volume that embraces changed pixels is approached in the context of the support vector formalism adopting a support vector domain description (SVDD) one-class classifier. SVDD maps the data into a high-dimensional feature space where the spherical support of the high-dimensional distribution of changed pixels is computed. Unlike the standard SVDD, the proposed formulation of the SVDD uses both target and outlier samples for defi…
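The minimum enclosing ball in feature space can be approximated with a simple Frank–Wolfe iteration on kernel distances; the sketch below is a generic one-class MEB on synthetic "target" points, not the paper's modified SVDD that also uses outlier samples:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def meb(K, iters=500):
    """Frank-Wolfe approximation of the minimum enclosing ball in
    feature space: the center is c = sum_i a_i phi(x_i), and each step
    moves a small amount of weight onto the farthest point."""
    n = K.shape[0]
    a = np.full(n, 1.0 / n)
    for t in range(1, iters + 1):
        # squared feature-space distance of every point to the center
        d2 = np.diag(K) - 2 * K @ a + a @ K @ a
        j = np.argmax(d2)              # farthest point
        eta = 1.0 / (t + 1)
        a = (1 - eta) * a
        a[j] += eta
    d2 = np.diag(K) - 2 * K @ a + a @ K @ a
    return a, d2.max()                 # weights and squared radius

rng = np.random.default_rng(4)
target = rng.normal(size=(100, 2))     # stand-in for "changed" pixels
K = rbf(target, target)
a, R2 = meb(K)

def dist2_to_center(x_new):
    k = rbf(x_new[None, :], target)[0]
    return 1.0 - 2 * k @ a + a @ K @ a  # K(x, x) = 1 for the RBF kernel

inlier = dist2_to_center(np.zeros(2)) <= R2          # central point
outlier = dist2_to_center(np.array([10.0, 10.0])) > R2  # far point
```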