Search results for "ESTIMATION"
Showing 10 of 924 documents
Basic Statistical Techniques
2012
FastSLAM 2.0: Least-Squares Approach
2006
In this paper, we present a set of robust and efficient algorithms with O(N) cost for the following situations: object detection with a laser ranger, mobile robot pose estimation, and an improved FastSLAM implementation. Object detection is mainly based on a novel multiple-line-fitting method related to the walls of the environment; the method assumes that these walls form regular, constrained angles. A line-based pose estimation method is also proposed, based on Least-Squares (LS). This method matches detected lines against estimated map lines and can provide a global pose estimate under the assumption of known data association. FastSLAM 1.0 has been imp…
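The least-squares line fitting this abstract builds on can be illustrated with a minimal, self-contained sketch (not the paper's constrained multi-line method or its line-matching step): a total-least-squares fit of a line to 2D range points via the eigenvectors of their scatter matrix. All data and names below are hypothetical.

import numpy as np

def fit_line_tls(points):
    """Total-least-squares line fit to 2D points.

    Returns (n, d) with the line written as n . p = d, ||n|| = 1.
    The normal n is the eigenvector of the point scatter matrix with
    the smallest eigenvalue; d follows from the centroid.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    cov = centered.T @ centered            # 2x2 scatter matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    normal = eigvecs[:, 0]                 # smallest eigenvalue -> line normal
    d = normal @ centroid
    return normal, d

# Usage: noisy points along y = 0.5 x + 1
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
pts = np.c_[x, 0.5 * x + 1.0] + 0.01 * rng.standard_normal((50, 2))
n, d = fit_line_tls(pts)
print(n, d)   # normal roughly proportional to (-0.5, 1), up to sign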
Denoising Autoencoders for Fast Combinatorial Black Box Optimization
2015
Estimation of Distribution Algorithms (EDAs) require flexible probability models that can be efficiently learned and sampled. Autoencoders (AE) are generative stochastic networks with these desired properties. We integrate a special type of AE, the Denoising Autoencoder (DAE), into an EDA and evaluate the performance of DAE-EDA on several combinatorial optimization problems with a single objective. We assess the number of fitness evaluations as well as the required CPU times. We compare the results to the performance of the Bayesian Optimization Algorithm (BOA) and RBM-EDA, another EDA based on a generative neural network that has proven competitive with BOA. For the considered pro…
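As a hedged illustration of the EDA skeleton that DAE-EDA instantiates, the sketch below runs the generic select / fit-model / sample loop on a toy onemax problem, with a univariate marginal (UMDA-style) model standing in for the denoising autoencoder; DAE-EDA replaces the two model steps with training and sampling a DAE on the selected individuals. Names and parameters are illustrative, not taken from the paper.

import numpy as np

def onemax(x):
    # Toy single-objective combinatorial fitness: number of ones.
    return x.sum(axis=1)

def eda(fitness, n_bits=40, pop_size=100, n_select=50, n_gens=30, seed=0):
    """Generic EDA loop: select -> fit probability model -> sample new population."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for _ in range(n_gens):
        fit = fitness(pop)
        best = pop[np.argsort(fit)[-n_select:]]        # truncation selection
        probs = best.mean(axis=0).clip(0.05, 0.95)     # "train" the stand-in model
        pop = (rng.random((pop_size, n_bits)) < probs).astype(int)  # sample it
    return pop[np.argmax(fitness(pop))]

print(eda(onemax).sum())   # close to 40 on this toy problem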
A probabilistic estimation and prediction technique for dynamic continuous social science models: The evolution of the attitude of the Basque Country…
2015
In this paper, a computational technique to deal with uncertainty in dynamic continuous models in the Social Sciences is presented. Considering data from surveys, the method consists of determining the probability distribution of the survey output, which allows us to sample data and fit the model to the sampled data using a goodness-of-fit criterion based on the χ2-test. Taking the fitted parameters that were not rejected by the χ2-test, substituting them into the model, and computing the corresponding outputs, 95% confidence intervals capturing the uncertainty of the survey data at each time instant (probabilistic estimation) are built. Using the same set of obtained model parameters, a prediction over …
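A minimal sketch of the sampling-and-acceptance idea described above, under assumptions of my own: a logistic curve stands in for the social-dynamics model, survey data are Gaussian around a known mean, and the goodness-of-fit criterion is a simple chi-square statistic. Only the overall procedure (sample survey data, accept parameters that fit, build pointwise 95% bands) mirrors the abstract; all numbers are hypothetical.

import numpy as np
from scipy.stats import chi2

t = np.linspace(0.0, 10.0, 11)

def model(r, K, x0=5.0):
    # Hypothetical dynamic model: logistic growth.
    return K / (1.0 + (K / x0 - 1.0) * np.exp(-r * t))

rng = np.random.default_rng(1)
survey_mean = model(0.8, 60.0)
survey_std = 3.0 * np.ones_like(t)

accepted = []
for _ in range(5000):
    data = rng.normal(survey_mean, survey_std)         # sampled survey realisation
    r, K = rng.uniform(0.2, 1.5), rng.uniform(40.0, 80.0)
    out = model(r, K)
    stat = np.sum((data - out) ** 2 / survey_std ** 2)  # chi-square style statistic
    if stat <= chi2.ppf(0.95, df=len(t)):               # keep parameters not rejected
        accepted.append(out)

accepted = np.array(accepted)
lo, hi = np.percentile(accepted, [2.5, 97.5], axis=0)   # pointwise 95% band
print(lo.round(1), hi.round(1), sep="\n")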
Optimized Kernel Entropy Components
2016
This work addresses two main issues of the standard Kernel Entropy Component Analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA essentially reduces to sorting kernel eigenvectors by entropy contribution instead of by variance, as in Kernel Principal Component Analysis. In this work, we propose an extension of the KECA method, named Optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular…
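A small sketch of the entropy ranking that KECA is built on, as usually formulated in the literature (OKECA's optimisation of the kernel parameter and of the decomposition is not shown): the quadratic Renyi entropy estimate decomposes over kernel eigenpairs, each contributing lambda_i * (1^T e_i)^2, and KECA keeps the largest contributors instead of the largest eigenvalues as kernel PCA would.

import numpy as np

def rbf_kernel(X, sigma):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def keca_ranking(K):
    """Rank kernel eigen-directions by their Renyi-entropy contribution."""
    eigvals, eigvecs = np.linalg.eigh(K)
    contrib = eigvals * (eigvecs.sum(axis=0) ** 2)   # lambda_i * (1^T e_i)^2
    return np.argsort(contrib)[::-1], contrib

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
order, contrib = keca_ranking(rbf_kernel(X, sigma=1.0))
print(order[:5], contrib[order[:5]].round(3))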
Warped Gaussian Processes in Remote Sensing Parameter Estimation and Causal Inference
2018
This letter introduces warped Gaussian process (WGP) regression in remote sensing applications. WGP models output observations as a parametric nonlinear transformation of a GP. The parameters of such a prior model are then learned via standard maximum likelihood. We show the good performance of the proposed model for the estimation of oceanic chlorophyll content from multispectral data, vegetation parameters (chlorophyll, leaf area index, and fractional vegetation cover) from hyperspectral data, and in the detection of the causal direction in a collection of 28 bivariate geoscience and remote sensing causal problems. The model consistently performs better than the standard GP and the more a…
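A hedged sketch of the warping idea described above: WGP places a GP prior on a parametric transformation of the outputs and learns the transformation by maximum likelihood; here a fixed log warp stands in for the learned transformation, using scikit-learn's GP regressor on synthetic, chlorophyll-like positive data.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(80, 1))
y = np.exp(0.6 * X[:, 0] + 0.2 * rng.standard_normal(80))   # positive, skewed target

# Fixed log warp as the simplest stand-in for the learned parametric
# warping g(y; theta); the GP prior is placed on z = log(y).
z = np.log(y)
gp = GaussianProcessRegressor(RBF() + WhiteKernel(), normalize_y=True).fit(X, z)

X_test = np.linspace(0.0, 5.0, 5).reshape(-1, 1)
z_mean = gp.predict(X_test)
print(np.exp(z_mean).round(2))   # map predictions back through the inverse warp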
Kernel methods and their derivatives: Concept and perspectives for the earth system sciences.
2020
Kernel methods are powerful machine learning techniques which implement generic non-linear functions to solve complex tasks in a simple way. They have a solid mathematical background and exhibit excellent performance in practice. However, kernel machines are still considered black-box models, as the feature mapping is not directly accessible and is difficult to interpret. The aim of this work is to show that the functions learned by various kernel methods can be interpreted intuitively despite their complexity. Specifically, we show that derivatives of these functions have a simple mathematical formulation, are easy to compute, and can be applied to many different problems. We n…
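As a hedged example of this derivatives-of-kernel-methods idea, the sketch below fits a kernel ridge regression with an RBF kernel and evaluates the analytic derivative of the learned function with respect to the input. The paper covers a broader family of kernel methods; the task and parameters here are synthetic.

import numpy as np

def rbf(a, b, sigma):
    return np.exp(-np.sum((a - b) ** 2, axis=-1) / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(60, 1))
y = np.sin(X[:, 0])
sigma, lam = 1.0, 1e-3

K = rbf(X[:, None, :], X[None, :, :], sigma)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # kernel ridge weights

def predict_and_grad(x):
    """f(x) = sum_i alpha_i k(x, x_i) and its analytic input derivative.

    For the RBF kernel, dk/dx = k(x, x_i) * (x_i - x) / sigma^2, so the
    gradient of the learned function is just another weighted sum.
    """
    k = rbf(x, X, sigma)                       # shape (N,)
    f = alpha @ k
    grad = (alpha * k) @ (X - x) / sigma ** 2  # shape (1,)
    return f, grad

f, g = predict_and_grad(np.array([0.5]))
print(f.round(3), g.round(3), np.cos(0.5).round(3))  # gradient approx. cos(0.5)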
Randomized kernels for large scale Earth observation applications
2020
Current remote sensing applications of bio-geophysical parameter estimation and image classification have to deal with an unprecedented amount of heterogeneous and complex data sources. New satellite sensors with improved temporal, spatial and spectral resolutions give rise to challenging computational problems. Standard physical inversion techniques cannot cope efficiently with this new scenario. Land cover classification of the new image sources has also turned out to be a complex problem requiring large amounts of memory and processing time. In order to cope with these problems, statistical learning has greatly helped in the last years to develop st…
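The abstract is truncated before the specific randomised technique is named; as one standard example of a randomized kernel approximation, the sketch below builds random Fourier features that approximate the RBF kernel, so that large-scale problems can be handled with cheap linear models. Sizes and parameters are illustrative only.

import numpy as np

def random_fourier_features(X, n_features=500, sigma=1.0, seed=0):
    """Random Fourier feature map approximating the RBF kernel.

    z(x) = sqrt(2/D) * cos(W x + b) with W ~ N(0, 1/sigma^2) and
    b ~ U(0, 2*pi) gives z(x) . z(y) ~= exp(-||x - y||^2 / (2 sigma^2)).
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_features)) / sigma
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 4))
Z = random_fourier_features(X)
exact = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / 2.0)
print(np.abs(Z @ Z.T - exact).max())   # error shrinks as n_features grows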
Futures pricing in electricity markets based on stable CARMA spot models
2012
We present a new model for the electricity spot price dynamics, which is able to capture seasonality, low-frequency dynamics and the extreme spikes in the market. Instead of the usual purely deterministic trend we introduce a non-stationary independent-increments process for the low-frequency dynamics, and model the large fluctuations by a non-Gaussian stable CARMA process. The model allows for analytic futures prices, and we apply these to model and estimate the whole market consistently. Besides standard parameter estimation, an estimation procedure is suggested where we fit the non-stationary trend using futures data with a long time until delivery, and a robust L1-filter to find the states of …
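As a simplified, hedged stand-in for the stable CARMA component (not the paper's full model with seasonality and a non-stationary trend), the sketch below Euler-simulates a CAR(1), i.e. OU-type, process driven by alpha-stable increments, which already produces the heavy-tailed spikes the abstract refers to. All parameter values are hypothetical.

import numpy as np
from scipy.stats import levy_stable

def stable_car1(n_steps=1000, dt=1.0 / 252, kappa=5.0, alpha=1.7,
                scale=0.1, seed=0):
    """Euler scheme for dY_t = -kappa * Y_t dt + dL_t with alpha-stable
    Levy increments, whose scale grows like dt**(1/alpha)."""
    rng = np.random.default_rng(seed)
    incr = levy_stable.rvs(alpha, 0.0, scale=scale * dt ** (1.0 / alpha),
                           size=n_steps, random_state=rng)
    y = np.zeros(n_steps + 1)
    for k in range(n_steps):
        y[k + 1] = y[k] - kappa * y[k] * dt + incr[k]
    return y

path = stable_car1()
print(path.min().round(3), path.max().round(3))   # occasional heavy-tailed spikes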
Fractional generalized cumulative entropy and its dynamic version
2021
Following the theory of information measures based on the cumulative distribution function, we propose the fractional generalized cumulative entropy and its dynamic version. These entropies are particularly suitable for dealing with distributions satisfying the proportional reversed hazard model. We study the connection with fractional integrals, and some bounds and comparisons based on stochastic orderings, which allow us to show that the proposed measure is actually a variability measure. The investigation also involves various notions of reliability theory, since the considered dynamic measure is a suitable extension of the mean inactivity time. We also introduce the empirical generalized fract…
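A numerical sketch of one common form of the fractional generalized cumulative entropy, assumed from the cumulative-entropy literature rather than quoted from the paper: CE_nu(X) = integral of F(x) * (-log F(x))^nu / Gamma(nu + 1) over x, which reduces to the classical cumulative entropy at nu = 1. The check below uses the Exp(1) distribution, whose nu = 1 value is pi^2/6 - 1.

import numpy as np
from scipy.special import gamma

def frac_gen_cumulative_entropy(cdf, nu, upper, n=200_000):
    """Trapezoid evaluation of int_0^upper F(x) (-log F(x))**nu / Gamma(nu+1) dx."""
    x = np.linspace(1e-9, upper, n)
    F = cdf(x)
    integrand = F * (-np.log(F)) ** nu / gamma(nu + 1.0)
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x))

def exp_cdf(x):
    return 1.0 - np.exp(-x)

print(frac_gen_cumulative_entropy(exp_cdf, nu=1.0, upper=50.0))
print(np.pi ** 2 / 6 - 1)   # classical cumulative entropy of Exp(1)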