
AUTHOR

David Luengo

showing 12 related works from this author

Physics-Aware Gaussian Processes for Earth Observation

2017

Earth observation from satellite sensory data poses challenging problems, where machine learning is currently a key player. In recent years, Gaussian Process (GP) regression and other kernel methods have excelled in biophysical parameter estimation tasks from space. GP regression is based on solid Bayesian statistics and generally yields efficient and accurate parameter estimates. However, GPs are typically used for inverse modeling based on concurrent observations and in situ measurements only. Very often, though, a forward model encoding the well-understood physical relations is available. In this work, we review three GP models that respect and learn the physics of the underlying processes …
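As an illustration of the kind of GP regression described above, here is a minimal NumPy sketch (not the authors' physics-aware models); the function names, squared-exponential kernel, and hyperparameter values are assumptions chosen for clarity:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and pointwise variance of a zero-mean GP."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

x = np.linspace(0, 2 * np.pi, 20)
y = np.sin(x)                      # stand-in for an in situ biophysical variable
mu, var = gp_predict(x, y, np.array([np.pi / 2]))
```

The posterior mean interpolates the training data and the posterior variance quantifies uncertainty, which is the property that makes GPs attractive for biophysical parameter retrieval.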

Matemáticas; Geología; Estimation theory; Missing data; Bayesian statistics; Kernel method; Gaussian process; Gaussian process emulator; Interpolation; Algorithm
researchProduct

Heretical Multiple Importance Sampling

2016

Multiple Importance Sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have been recently proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of an increase in the computational cost. A recent work has shown that it is possible to achieve a trade-off between variance reduction and computational effort by performing an a priori random clustering of the proposals (partial DM algorithm). In this paper, we propose a novel "heretical" MIS framework, where the clustering …
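The deterministic-mixture weighting mentioned above can be sketched in a few lines: each sample's weight uses the full mixture of proposals in its denominator rather than only the proposal that generated it. This is a minimal illustration with an assumed standard-normal target and two Gaussian proposals, not the paper's heretical scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    """Unnormalized target: standard normal density."""
    return np.exp(-0.5 * x ** 2)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Two Gaussian proposals; M samples drawn from each.
mus, sigma, M = np.array([-1.0, 1.0]), 1.5, 5000
samples = rng.normal(mus, sigma, size=(M, 2)).ravel()

# Deterministic-mixture (DM) weights: the denominator is the full mixture
# density, which is what reduces the variance at extra evaluation cost.
mixture = np.mean([gauss_pdf(samples, m, sigma) for m in mus], axis=0)
w_dm = target(samples) / mixture
est_mean = np.sum(w_dm * samples) / np.sum(w_dm)   # self-normalized estimate
```

The extra cost is visible here: every sample requires evaluating all proposal densities, which is exactly the trade-off the partial-DM clustering relaxes.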

Computer science; Applied Mathematics; Signal Processing; Mean squared error; Estimator; Variance; Variance reduction; A priori and a posteriori; Cluster analysis; Importance sampling; Statistics - Computation (stat.CO); Electrical and Electronic Engineering

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors combine optimally all of the information about the unknowns in the observations with the information that is present in their …
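The Bayesian estimation setting described here can be illustrated with self-normalized importance sampling on a toy conjugate model (an assumption made so that the exact posterior mean is known in closed form and the estimate can be checked):

```python
import numpy as np

rng = np.random.default_rng(1)

# Conjugate scalar model: x ~ N(0, 1), y | x ~ N(x, 0.5^2), observed y = 1.
# The exact posterior mean is (1 / 0.5**2) / (1 + 1 / 0.5**2) * y = 0.8.
y, noise_std = 1.0, 0.5

# Self-normalized IS with the prior as proposal: the importance weight of a
# prior draw reduces to the likelihood evaluated at that draw.
x = rng.normal(0.0, 1.0, size=100_000)
logw = -0.5 * ((y - x) / noise_std) ** 2
w = np.exp(logw - logw.max())           # stabilized in the log domain
post_mean = np.sum(w * x) / np.sum(w)
```

In realistic problems the prior is a poor proposal, which is precisely the motivation for the adaptive schemes the article surveys.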

Computer science; Applied Mathematics; Signal Processing; Bayesian probability; Prior probability; Posterior probability; Probability distribution; Inference; Approximate inference; Multidimensional signal processing; Machine learning; Artificial intelligence; Importance sampling; Electrical and Electronic Engineering

Adaptive Population Importance Samplers: A General Perspective

2016

Importance sampling (IS) is a well-known Monte Carlo method, widely used to approximate a distribution of interest using a random measure composed of a set of weighted samples generated from another proposal density. Since the performance of the algorithm depends on the mismatch between the target and the proposal densities, a set of proposals is often iteratively adapted in order to reduce the variance of the resulting estimator. In this paper, we review several well-known adaptive population importance samplers, providing a unified common framework and classifying them according to the nature of their estimation and adaptive procedures. Furthermore, we interpret the underlying motivation …
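A minimal sketch of an adaptive population importance sampler in the spirit of the methods reviewed (an assumed toy target and a resampling-based adaptation rule; not any specific algorithm from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

def target(x):
    """Unnormalized target: N(3, 1)."""
    return np.exp(-0.5 * (x - 3.0) ** 2)

# Population of proposal means, deliberately initialized far from the target.
N, sigma = 10, 1.0
means = rng.normal(-5.0, 1.0, size=N)

for _ in range(20):
    # Draw one sample per proposal and weight it (standard MIS weights).
    x = rng.normal(means, sigma)
    q = np.exp(-0.5 * ((x - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    w = target(x) / q
    w /= w.sum()
    # Adaptation: resample the proposal locations according to the weights,
    # so the population migrates toward high-probability regions.
    means = x[rng.choice(N, size=N, p=w)]

adapted_center = means.mean()
```

The choice of weighting (standard vs deterministic mixture) and of adaptation rule (resampling vs moment matching) is exactly what distinguishes the samplers classified in the paper.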

Computer science; Matemáticas; Monte Carlo method; Monte Carlo integration; Population; Estimator; Random measure; Statistical classification; Particle filter; Importance sampling; Machine learning; Artificial intelligence; Data mining

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities which become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points which is constructed iteratively based on previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
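The sticky construction builds a non-parametric proposal by interpolating a growing set of support points; as a much cruder stand-in, the sketch below runs an independence Metropolis-Hastings sampler whose Gaussian proposal is periodically refit to the chain history (an illustrative simplification; validity of such adaptation relies on the adaptation vanishing over time, which the paper's construction ensures):

```python
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    """Log of an unnormalized target density: N(2, 1)."""
    return -0.5 * (x - 2.0) ** 2

# Independence MH whose Gaussian proposal is periodically refit to the
# chain history: a crude stand-in for the non-parametric sticky proposal,
# which becomes closer and closer to the target as iterations increase.
mu_q, sig_q = 0.0, 3.0
x, chain = 0.0, []
for t in range(5000):
    z = rng.normal(mu_q, sig_q)
    # Independence-MH ratio: target ratio times reversed proposal ratio.
    log_alpha = (log_target(z) - log_target(x)) \
        + 0.5 * (((z - mu_q) / sig_q) ** 2 - ((x - mu_q) / sig_q) ** 2)
    if np.log(rng.uniform()) < log_alpha:
        x = z
    chain.append(x)
    if t % 500 == 499:
        # Refit the proposal; the floor on sig_q avoids a collapsing proposal.
        mu_q, sig_q = float(np.mean(chain)), max(float(np.std(chain)), 0.5)

post_mean = float(np.mean(chain[1000:]))
```

In the actual algorithms the proposal is built by interpolation over support points, and a test decides whether each new sample is added to the set, which is what controls the computational cost.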

Mathematical optimization; Monte Carlo methods; Adaptive Markov chain Monte Carlo (MCMC); Adaptive rejection Metropolis sampling (ARMS); Bayesian inference; Gibbs sampling; Metropolis-within-Gibbs; Hit-and-run algorithm; Markov chain Monte Carlo; Gaussian process; Computational statistics; Statistical signal processing; Machine Learning (stat.ML); Statistics - Computation (stat.CO); Signal Processing; Hardware and Architecture; Electrical and Electronic Engineering
EURASIP Journal on Advances in Signal Processing

Population Monte Carlo Schemes with Reduced Path Degeneracy

2017

Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is the so-called path degeneracy: the diversity in the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity, compared to the standard approach, while keeping the same implementation structure (sample generation, weighting and resampling). The new schemes combine different weighting and resampling strategies to reduce the path degeneracy and achieve a higher performance at a low additional computational cost. Computer si…
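Path degeneracy can be seen directly by counting distinct survivors after resampling. The sketch below contrasts global multinomial resampling with a per-proposal (local) variant on a toy target; the scheme names and population sizes are illustrative assumptions, not the paper's exact proposals:

```python
import numpy as np

rng = np.random.default_rng(4)

def target(x):
    """Unnormalized target: N(0, 1)."""
    return np.exp(-0.5 * x ** 2)

M, K, sigma = 50, 10, 1.0          # M proposals, K samples per proposal
means = rng.normal(0.0, 3.0, size=M)
x = rng.normal(means[:, None], sigma, size=(M, K))
q = np.exp(-0.5 * ((x - means[:, None]) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
w = target(x) / q

# Global multinomial resampling: a few dominant samples get duplicated many
# times, collapsing the population (path degeneracy).
p_global = (w / w.sum()).ravel()
new_global = x.ravel()[rng.choice(M * K, size=M, p=p_global)]

# Local resampling: each proposal resamples only among its own K samples,
# so every proposal keeps exactly one (distinct) offspring.
p_local = w / w.sum(axis=1, keepdims=True)
new_local = np.array([rng.choice(x[m], p=p_local[m]) for m in range(M)])

div_global = len(np.unique(new_global))
div_local = len(np.unique(new_local))
```

Counting unique survivors (`div_global` vs `div_local`) is a simple proxy for the diversity that the enhanced schemes aim to preserve.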

Computational complexity theory; Monte Carlo method; Approximation algorithm; Weighting; Resampling; Gaussian noise; Path (graph theory); Degeneracy (mathematics); Signal and Image Processing; Algorithm

Physics-aware Gaussian processes in remote sensing

2018

Earth observation from satellite sensory data poses challenging problems, where machine learning is currently a key player. In recent years, Gaussian Process (GP) regression has excelled in biophysical parameter estimation tasks from airborne and satellite observations. GP regression is based on solid Bayesian statistics, and generally yields efficient and accurate parameter estimates. However, GPs are typically used for inverse modeling based on concurrent observations and in situ measurements only. Very often a forward model encoding the well-understood physical relations between the state vector and the radiance observations is available though and could be useful to improve pre…

Signal Processing (eess.SP); Statistics - Applications (stat.AP); Estimation theory; Bayesian statistics; Bayesian optimization; Gaussian process; Gaussian process emulator; State vector; Missing data; Global Positioning System; Software
Applied Soft Computing

Integrating Domain Knowledge in Data-Driven Earth Observation With Process Convolutions

2022

The modelling of Earth observation data is a challenging problem, typically approached by either purely mechanistic or purely data-driven methods. Mechanistic models encode the domain knowledge and physical rules governing the system. Such models, however, need the correct specification of all interactions between variables in the problem and the appropriate parameterization is a challenge in itself. On the other hand, machine learning approaches are flexible data-driven tools, able to approximate arbitrarily complex functions, but lack interpretability and struggle when data is scarce or in extrapolation regimes. In this paper, we argue that hybrid learning schemes that combine both approa…

Earth observation; machine learning (ML); Gaussian process (GP); domain knowledge; interpretability; extrapolation; data-driven; convolution; ordinary differential equation (ODE); time series analysis; gap filling; physics; data mining; leaf area index (LAI); fraction of absorbed photosynthetically active radiation (faPAR); soil moisture (SM); Soil Moisture and Ocean Salinity (SMOS); Moderate Resolution Imaging Spectroradiometer (MODIS); Advanced Scatterometer (ASCAT); Advanced Microwave Scanning Radiometer 2 (AMSR-2); General Earth and Planetary Sciences
IEEE Transactions on Geoscience and Remote Sensing

Anti-tempered Layered Adaptive Importance Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference in signal processing, machine learning and statistics. In this work, we introduce an adaptive importance sampler which mixes together the benefits of the Importance Sampling (IS) and Markov Chain Monte Carlo (MCMC) approaches. Different parallel MCMC chains provide the location parameters of the proposal probability density functions (pdfs) used in an IS method. The MCMC algorithms consider a tempered version of the posterior distribution as invariant density. We also provide an exhaustive theoretical support explaining why, in the presented technique, even an anti-tempering strategy (reducing the scaling of the posterior) can …
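A rough sketch of the layered idea: an MCMC chain in the upper layer supplies the location parameters for a lower IS layer with deterministic-mixture weights. Tempering is omitted here for brevity, so this illustrates the architecture rather than the anti-tempered method itself; the toy target and chain settings are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def log_target(x):
    """Log of an unnormalized target density: N(4, 1)."""
    return -0.5 * (x - 4.0) ** 2

# Upper layer: a random-walk Metropolis chain supplies proposal locations.
mu, locations = 0.0, []
for _ in range(200):
    cand = mu + rng.normal(0.0, 1.0)
    if np.log(rng.uniform()) < log_target(cand) - log_target(mu):
        mu = cand
    locations.append(mu)

# Lower layer: one IS sample per chain location; the weight denominator is
# the mixture of all proposal pdfs (deterministic-mixture weighting).
locs, sig = np.array(locations), 1.0
x = rng.normal(locs, sig)
mix_q = np.mean(np.exp(-0.5 * ((x[:, None] - locs[None, :]) / sig) ** 2),
                axis=1) / (sig * np.sqrt(2 * np.pi))
w = np.exp(log_target(x)) / mix_q
est = np.sum(w * x) / np.sum(w)
```

In the paper the upper-layer chains target a tempered (or even anti-tempered) version of the posterior, while the mixture weighting keeps the lower-layer estimator consistent.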

Mathematical optimization; Markov chain Monte Carlo; Metropolis–Hastings algorithm; Rejection sampling; Slice sampling; Hybrid Monte Carlo; Parallel tempering; Particle filter; Importance sampling

Efficient linear fusion of partial estimators

2018

Many signal processing applications require performing statistical inference on large datasets, where computational and/or memory restrictions become an issue. In this big data setting, computing an exact global centralized estimator is often either unfeasible or impractical. Hence, several authors have considered distributed inference approaches, where the data are divided among multiple workers (cores, machines or a combination of both). The computations are then performed in parallel and the resulting partial estimators are finally combined to approximate the intractable global estimator. In this paper, we focus on the scenario where no communication exists among the workers, de…
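When the partial estimators are unbiased and independent, a classical linear fusion rule weights each one inversely to its variance. The sketch below illustrates this on simulated sample means computed from unequal data shares (a textbook special case under assumed Gaussian noise, not the paper's general MSE-optimal rules):

```python
import numpy as np

rng = np.random.default_rng(6)

# Global dataset split among L = 4 workers; each returns a partial sample
# mean whose variance shrinks with its share of the data.
true_mu, sigma = 2.0, 1.0
sizes = np.array([100, 400, 1000, 2500])          # unequal data shares
partial = np.array([rng.normal(true_mu, sigma / np.sqrt(n)) for n in sizes])

# Linear fusion with weights inversely proportional to each estimator's
# variance (and summing to one) minimizes the variance of the fused result.
var = sigma ** 2 / sizes
wts = (1.0 / var) / np.sum(1.0 / var)
fused = np.sum(wts * partial)
```

No inter-worker communication is needed: each worker only ships its partial estimate (and its variance) to the fusion center.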

Computer science; Applied Mathematics; Signal Processing; Bayesian probability; Statistical inference; Inference; Asymptotic distribution; Estimator; Fusion rules; Minimum mean square error; Constrained optimization; Artificial Intelligence; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Electrical and Electronic Engineering
Digital Signal Processing

Latent force models for earth observation time series prediction

2016

We introduce latent force models for Earth observation time series analysis. The model uses Gaussian processes and differential equations to combine data-driven modelling with a physical model of the system. The LFM presented here performs multi-output structured regression, adapts to the signal characteristics, copes with missing data in the time series, and provides explicit latent functions that allow system analysis and evaluation. We successfully illustrate the performance in challenging scenarios of crop monitoring from space, providing time-resolved time series predictions.
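A latent force model couples a GP prior on a latent forcing function with a differential equation. As a numerical illustration (an assumed first-order ODE and grid discretization, not the authors' crop-monitoring model), one can draw a latent GP sample and push it through the Green's function of du/dt = -gamma*u + f(t):

```python
import numpy as np

rng = np.random.default_rng(7)

# Time grid and a draw of the latent force f from a GP prior with a
# squared-exponential covariance (jitter added for numerical stability).
t = np.linspace(0.0, 10.0, 200)
dt = t[1] - t[0]
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.5 ** 2)
f = rng.multivariate_normal(np.zeros(len(t)), K + 1e-8 * np.eye(len(t)))

# First-order ODE  du/dt = -gamma * u + f(t), solved by convolving the
# latent force with the Green's function G(s) = exp(-gamma * s), s >= 0.
gamma = 1.0
G = np.exp(-gamma * t)
u = np.convolve(f, G)[: len(t)] * dt
```

Because the output is a linear functional of a GP, u is itself a GP, which is what lets LFMs do structured regression with physically meaningful latent functions.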

Earth observation; Time series; Series (mathematics); Differential equation; Data modeling; Data-driven; Missing data; Gaussian process; Matemáticas; Geología; Simulation

Novel weighting and resampling schemes in Population Monte Carlo

2017

In this paper we focus on novel population Monte Carlo schemes that present added flexibility and superior performance compared to the standard approach. The new schemes combine different weighting and resampling strategies to achieve more efficient performance at the cost of a reasonable computational complexity increment. Computer simulations compare the different alternatives when applied to the problem of frequency estimation in superimposed sinusoids.

[INFO.INFO-TS] Computer Science [cs] / Signal and Image Processing; [SPI.SIGNAL] Engineering Sciences [physics] / Signal and Image Processing