
AUTHOR

Victor Elvira

Showing 15 related works from this author

Heretical Multiple Importance Sampling

2016

Multiple Importance Sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have been recently proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of an increase in the computational cost. A recent work has shown that it is possible to achieve a trade-off between variance reduction and computational effort by performing an a priori random clustering of the proposals (partial DM algorithm). In this paper, we propose a novel "heretical" MIS framework, where the clustering …
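A minimal sketch (not the paper's algorithm) contrasting the two weighting schemes the abstract refers to: standard MIS weights evaluate only the sample's own proposal in the denominator, while deterministic-mixture (DM) weights evaluate the full mixture of all N proposals, which lowers variance at N times the proposal-evaluation cost. The Gaussian target, proposal locations, and scales are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative target (an assumption for the demo): a standard Gaussian.
target = stats.norm(0.0, 1.0)

# N Gaussian proposals with different locations; one sample drawn from each.
means = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
scale = 1.5
x = rng.normal(means, scale)

# Standard MIS weights: w_n = pi(x_n) / q_n(x_n)  (own proposal only).
w_std = target.pdf(x) / stats.norm.pdf(x, means, scale)

# Deterministic-mixture weights: w_n = pi(x_n) / [(1/N) sum_j q_j(x_n)].
mix = stats.norm.pdf(x[:, None], means[None, :], scale).mean(axis=1)
w_dm = target.pdf(x) / mix

# Both estimators of E[x^2] are unbiased; DM typically has lower variance
# at N times the proposal-evaluation cost.
print("standard MIS:", np.mean(w_std * x**2))
print("deterministic mixture:", np.mean(w_dm * x**2))
```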

Keywords: importance sampling, variance reduction, mean squared error, estimator, cluster analysis, signal processing, Statistics - Computation (stat.CO)

A new strategy for effective learning in population Monte Carlo sampling

2016

In this work, we focus on advancing the theory and practice of a class of Monte Carlo methods, population Monte Carlo (PMC) sampling, for dealing with inference problems with static parameters. We devise a new method for efficient adaptive learning from past samples and weights to construct improved proposal functions. It is based on the assumption that, at each iteration, there is an intermediate target, and that this target gradually approaches the true one. Computer simulations confirm the improvement of the proposed strategy over the traditional PMC method in a simple test scenario.
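For reference, a minimal sketch of a plain PMC iteration, the baseline the paper improves on: draw one sample per proposal, compute standard importance weights, and resample to obtain the next set of proposal locations. The target and all tuning constants are toy assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
log_target = stats.norm(3.0, 1.0).logpdf       # toy target (assumed)

N, T, sigma = 50, 20, 1.0                      # population, iterations, proposal scale
mu = rng.uniform(-10.0, 10.0, N)               # initial proposal locations

for t in range(T):
    x = rng.normal(mu, sigma)                                # one draw per proposal
    logw = log_target(x) - stats.norm.logpdf(x, mu, sigma)   # standard IS weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    mu = x[rng.choice(N, size=N, p=w)]         # resampled draws become new locations

print("adapted locations concentrate near 3:", mu.mean())
```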

Keywords: Monte Carlo methods, mathematical optimization, inference, sampling, rejection sampling, Markov chain Monte Carlo, Monte Carlo integration, particle filter

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors optimally combine all of the information about the unknowns in the observations with the information that is present in their …
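As background for the survey, a minimal sketch of the self-normalized importance sampling estimator of a posterior moment, the basic operation that adaptive importance samplers iterate on. The one-observation Gaussian model and the proposal are toy assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Toy Bayesian model (assumed): scalar theta, Gaussian prior and likelihood.
y = 1.2                                                   # a single observation
log_post = lambda th: (stats.norm.logpdf(th, 0.0, 2.0)    # prior
                       + stats.norm.logpdf(y, th, 0.5))   # likelihood

# Self-normalized IS with a broad Gaussian proposal.
prop_mu, prop_sd = 0.0, 3.0
th = rng.normal(prop_mu, prop_sd, 10_000)
logw = log_post(th) - stats.norm.logpdf(th, prop_mu, prop_sd)
w = np.exp(logw - logw.max())                 # stable unnormalized weights

print("posterior mean estimate:", np.sum(w * th) / np.sum(w))
```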

Keywords: importance sampling, Bayesian probability, posterior probability, prior probability, inference, approximate inference, probability distribution, signal processing

Particle Group Metropolis Methods for Tracking the Leaf Area Index

2020

Monte Carlo (MC) algorithms are widely used for Bayesian inference in statistics, signal processing, and machine learning. In this work, we introduce a Markov chain Monte Carlo (MCMC) technique driven by a particle filter. The resulting scheme is a generalization of the so-called Particle Metropolis-Hastings (PMH) method, where a suitable Markov chain of sets of weighted samples is generated. We also introduce a marginal version for jointly inferring dynamic and static variables. The proposed algorithms outperform the corresponding standard PMH schemes, as shown by numerical experiments.
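A hedged sketch of the standard PMH building block the paper generalizes: a bootstrap particle filter returns an unbiased estimate of the marginal likelihood, which is plugged into the Metropolis-Hastings acceptance ratio (pseudo-marginal MCMC). The linear-Gaussian toy model and the flat prior are assumptions; this is not the paper's group scheme or its leaf-area-index model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data from a toy model (assumed): x_t = 0.7 x_{t-1} + u_t, y_t = x_t + v_t.
T, x_true, y = 50, 0.0, []
for _ in range(T):
    x_true = 0.7 * x_true + rng.normal()
    y.append(x_true + rng.normal())

def log_marginal_likelihood(theta, y, N=200):
    """Bootstrap particle filter estimate of log p(y | theta)."""
    x, logZ = rng.normal(0.0, 1.0, N), 0.0
    for yt in y:
        x = theta * x + rng.normal(0.0, 1.0, N)        # propagate particles
        logw = -0.5 * (yt - x) ** 2                    # Gaussian log-lik (up to const)
        m = logw.max()
        w = np.exp(logw - m)
        logZ += m + np.log(w.mean())                   # incremental evidence
        x = x[rng.choice(N, N, p=w / w.sum())]         # resample
    return logZ

# Pseudo-marginal MH over theta (flat prior assumed for simplicity).
theta, logZ = 0.0, log_marginal_likelihood(0.0, y)
for _ in range(200):
    th_new = theta + rng.normal(0.0, 0.1)
    logZ_new = log_marginal_likelihood(th_new, y)
    if np.log(rng.uniform()) < logZ_new - logZ:
        theta, logZ = th_new, logZ_new

print("final theta (true value 0.7):", theta)
```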

Keywords: particle filtering, particle MCMC, Markov chain Monte Carlo, Bayesian inference, state-space models, tracking, signal processing

Adaptive Population Importance Samplers: A General Perspective

2016

Importance sampling (IS) is a well-known Monte Carlo method, widely used to approximate a distribution of interest using a random measure composed of a set of weighted samples generated from another proposal density. Since the performance of the algorithm depends on the mismatch between the target and the proposal densities, a set of proposals is often iteratively adapted in order to reduce the variance of the resulting estimator. In this paper, we review several well-known adaptive population importance samplers, providing a unified common framework and classifying them according to the nature of their estimation and adaptive procedures. Furthermore, we interpret the underlying motivation …
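One adaptation style that such reviews classify is moment matching: the proposal parameters are re-fitted to the weighted sample at each iteration, in the spirit of AMIS-type schemes. A minimal sketch under a toy Gaussian target (an illustration, not any specific reviewed algorithm):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
target = stats.norm(5.0, 2.0)                   # toy target (assumed)

mu, sigma = 0.0, 5.0                            # deliberately poor initial proposal
for t in range(15):
    x = rng.normal(mu, sigma, 1_000)
    w = target.pdf(x) / stats.norm.pdf(x, mu, sigma)
    w /= w.sum()
    mu = np.sum(w * x)                          # weighted mean
    sigma = np.sqrt(np.sum(w * (x - mu) ** 2))  # weighted standard deviation

print("adapted proposal:", mu, sigma)           # approaches (5, 2)
```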

Keywords: importance sampling, Monte Carlo methods, estimator, statistical classification, random measure, Monte Carlo integration, particle filter, mathematics

Recycling Gibbs sampling

2017

Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning and statistics. The key point for the successful application of the Gibbs sampler is the ability to draw samples from the full-conditional probability density functions efficiently. In the general case this is not possible, so, in order to speed up the convergence of the chain, auxiliary samples must be generated. However, this intermediate information is ultimately discarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency with no extra cost. Theoretical and exhaustive numerical co…
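A minimal sketch of the recycling idea, assuming a toy bivariate Gaussian target and an MH-within-Gibbs step for each full conditional (burn-in of the inner chains is ignored for brevity): instead of keeping only the last state of each inner chain, all of its states enter the estimator.

```python
import numpy as np

rng = np.random.default_rng(5)
rho = 0.8                          # correlation of a toy bivariate Gaussian target

def mh_conditional(x0, mean, n_aux=10):
    """Short Metropolis chain targeting the conditional N(mean, 1 - rho^2).
    Recycling Gibbs keeps all n_aux states, not just the last one."""
    xs, x = [], x0
    for _ in range(n_aux):
        prop = x + rng.normal(0.0, 0.5)
        logr = -0.5 * ((prop - mean) ** 2 - (x - mean) ** 2) / (1.0 - rho**2)
        if np.log(rng.uniform()) < logr:
            x = prop
        xs.append(x)
    return xs

x1, x2, recycled = 0.0, 0.0, []
for _ in range(2_000):
    aux = mh_conditional(x1, rho * x2)        # draw x1 | x2
    x1 = aux[-1]
    recycled += [(a, x2) for a in aux]        # recycle the auxiliary states
    aux = mh_conditional(x2, rho * x1)        # draw x2 | x1
    x2 = aux[-1]
    recycled += [(x1, a) for a in aux]

print("E[x1*x2] estimate (true 0.8):", np.mean([a * b for a, b in recycled]))
```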

Keywords: Gibbs sampling, Markov chain Monte Carlo, slice sampling, rejection sampling, probability density function, estimator
Published in: 2017 25th European Signal Processing Conference (EUSIPCO)

Compressed Particle Methods for Expensive Models With Application in Astronomy and Remote Sensing

2021

In many inference problems, the evaluation of complex and costly models is often required. In this context, Bayesian methods have become very popular in several fields in recent years, in order to perform parameter inversion, model selection, or uncertainty quantification. Bayesian inference requires the approximation of complicated integrals involving (often costly) posterior distributions. Generally, this approximation is obtained by means of Monte Carlo (MC) methods. In order to reduce the computational cost of the corresponding technique, surrogate models (also called emulators) are often employed. Another alternative approach is the so-called Approximate Bayesian Computation (ABC) sc…
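As a reference point for the ABC branch mentioned in the abstract, a minimal sketch of plain ABC rejection under a toy forward model (assumed, and deliberately cheap here): parameters are accepted when a summary of the simulated data falls within a tolerance of the observed summary.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate(theta, n=100):
    """Stand-in for an expensive forward model (assumed): Gaussian data."""
    return rng.normal(theta, 1.0, n)

y_obs = simulate(2.0)                         # observations, true theta = 2
s_obs = y_obs.mean()                          # summary statistic

accepted = []
for _ in range(20_000):
    theta = rng.uniform(-5.0, 5.0)            # draw from a uniform prior
    if abs(simulate(theta).mean() - s_obs) < 0.1:   # tolerance epsilon
        accepted.append(theta)

print("ABC posterior mean:", np.mean(accepted))
```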

Keywords: Bayesian inference, Monte Carlo methods, model selection, uncertainty quantification, approximate Bayesian computation, numerical inversion, importance sampling, particle filtering, remote sensing, astronomy
Published in: IEEE Transactions on Aerospace and Electronic Systems

Population Monte Carlo Schemes with Reduced Path Degeneracy

2017

Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is the so-called path degeneracy: the diversity of the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity, compared to the standard approach, while keeping the same implementation structure (sample generation, weighting, and resampling). The new schemes combine different weighting and resampling strategies to reduce the path degeneracy and achieve higher performance at the cost of a small increase in computational complexity. Computer si…
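A related standard device (borrowed from the sequential Monte Carlo literature, not one of the paper's schemes) is to resample only when the effective sample size drops below a threshold, which directly limits how often resampling can destroy diversity. A minimal sketch on a toy state-space model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy state-space model (assumed): random-walk state, noisy observations.
T, N = 50, 200
y = np.cumsum(rng.normal(0.0, 0.5, T)) + rng.normal(0.0, 0.3, T)

x, logw, n_resample = np.zeros(N), np.zeros(N), 0
for t in range(T):
    x = x + rng.normal(0.0, 0.5, N)                 # propagate
    logw += -0.5 * (y[t] - x) ** 2 / 0.3**2         # accumulate log-weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    if 1.0 / np.sum(w**2) < N / 2:                  # effective sample size test:
        x = x[rng.choice(N, N, p=w)]                # resample only when diversity
        logw = np.zeros(N)                          # is low, limiting degeneracy
        n_resample += 1

print("resampled at", n_resample, "of", T, "steps")
```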

Keywords: Monte Carlo methods, resampling, weighting, path degeneracy, computational complexity, approximation algorithms, Gaussian noise

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics, employed to draw samples from complicated high-dimensional posterior distributions. The key point for the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since in the general case this is not possible, in order to speed up the convergence of the chain, auxiliary samples must be generated, whose information is eventually discarded. In this work, we show that these auxiliary sample…

Keywords: Gibbs sampling, Markov chain Monte Carlo, Bayesian inference, slice sampling, Gaussian process, chain rule (probability)
Published in: Digital Signal Processing

Anti-tempered Layered Adaptive Importance Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference in signal processing, machine learning and statistics. In this work, we introduce an adaptive importance sampler which mixes together the benefits of the Importance Sampling (IS) and Markov Chain Monte Carlo (MCMC) approaches. Different parallel MCMC chains provide the location parameters of the proposal probability density functions (pdfs) used in an IS method. The MCMC algorithms consider a tempered version of the posterior distribution as invariant density. We also provide exhaustive theoretical support explaining why, in the presented technique, even an anti-tempering strategy (reducing the scaling of the posterior) can …
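A minimal sketch of the two-layer structure, under a toy posterior: an upper MH chain on a power-tempered posterior adapts the proposal location, and a lower IS layer weights samples against the untempered posterior. For brevity, the sketch uses plain (non-mixture) IS weights, whereas layered AIS typically uses a mixture denominator; the anti-tempering analysis itself is in the paper, not here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
log_post = stats.norm(4.0, 1.0).logpdf       # toy posterior (assumed)
beta = 0.5                                   # tempering exponent (the paper also
                                             # analyzes beta > 1, anti-tempering)
mu, sigma = 0.0, 2.0
xs, ws = [], []
for t in range(2_000):
    # Upper layer: MH step on the tempered posterior adapts the location mu.
    mu_prop = mu + rng.normal(0.0, 1.0)
    if np.log(rng.uniform()) < beta * (log_post(mu_prop) - log_post(mu)):
        mu = mu_prop
    # Lower layer: IS sample centered at the chain state, weighted against
    # the *untempered* posterior.
    x = rng.normal(mu, sigma)
    xs.append(x)
    ws.append(np.exp(log_post(x) - stats.norm.logpdf(x, mu, sigma)))

w = np.array(ws) / np.sum(ws)
print("posterior mean estimate (true 4):", np.sum(w * np.array(xs)))
```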

Keywords: importance sampling, Markov chain Monte Carlo, Metropolis-Hastings algorithm, parallel tempering, rejection sampling, slice sampling, particle filter, mathematical optimization

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample which compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
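A minimal sketch of the compression at the heart of GIS, under a toy target: each group of weighted samples is summarized by one particle (resampled within the group) and one summary weight (the group's average unnormalized weight), and the summaries are then fused as ordinary weighted samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
target = stats.norm(1.0, 1.0)                 # toy target (assumed)

def compress(mu, sigma=2.0, M=100):
    """Summarize M weighted samples from one proposal by a single particle
    (resampled within the group) and one summary weight (the group's
    average unnormalized weight)."""
    x = rng.normal(mu, sigma, M)
    w = target.pdf(x) / stats.norm.pdf(x, mu, sigma)
    return rng.choice(x, p=w / w.sum()), w.mean()

# Compress several groups, then fuse the summaries as weighted samples.
pairs = [compress(mu) for mu in (-3.0, 0.0, 3.0)]
xs, Ws = map(np.array, zip(*pairs))
print("fused estimate of E[x] (true 1):", np.sum(Ws * xs) / np.sum(Ws))
```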

Keywords: importance sampling, particle filter, Markov chain Monte Carlo, multiple-try Metropolis, resampling, posterior probability
Published in: Digital Signal Processing

Efficient linear fusion of partial estimators

2018

Many signal processing applications require performing statistical inference on large datasets, where computational and/or memory restrictions become an issue. In this big data setting, computing an exact global centralized estimator is often either infeasible or impractical. Hence, several authors have considered distributed inference approaches, where the data are divided among multiple workers (cores, machines, or a combination of both). The computations are then performed in parallel and the resulting partial estimators are finally combined to approximate the intractable global estimator. In this paper, we focus on the scenario where no communication exists among the workers, de…
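A minimal sketch of variance-weighted linear fusion in the simplest case the paper's setting contains, assuming each worker estimates a common mean from a disjoint data share and reports its estimate together with its variance; the fusion weights are proportional to the inverse variances.

```python
import numpy as np

rng = np.random.default_rng(10)
true_mu = 2.0

# Each worker holds a disjoint share of the data (sizes are assumptions).
partial = []
for n in (50, 200, 1_000):
    data = rng.normal(true_mu, 1.0, n)
    partial.append((data.mean(), 1.0 / n))    # (estimate, its variance)

est = np.array([e for e, _ in partial])
var = np.array([v for _, v in partial])
lam = (1.0 / var) / np.sum(1.0 / var)          # inverse-variance fusion weights
print("fused estimate:", np.sum(lam * est))    # equals the centralized mean here
```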

Keywords: statistical inference, fusion rules, estimator, minimum mean square error, asymptotic distribution, constrained optimization, Bayesian probability
Published in: Digital Signal Processing

Distributed Particle Metropolis-Hastings Schemes

2018

We introduce a Particle Metropolis-Hastings algorithm driven by several parallel particle filters. The communication with the central node requires the transmission of only a set of weighted samples, one per filter. Furthermore, the marginal version of the previous scheme, called the Distributed Particle Marginal Metropolis-Hastings (DPMMH) method, is also presented. DPMMH can be used for making inference on both dynamic and static variables of interest. Ergodicity is guaranteed, and numerical simulations show the advantages of the novel schemes.

Keywords: particle filter, Metropolis-Hastings algorithm, Bayesian inference, ergodicity, Monte Carlo methods
Published in: 2018 IEEE Statistical Signal Processing Workshop (SSP)

Group Metropolis Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing and machine learning. Two well-known classes of MC methods are the Importance Sampling (IS) techniques and the Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce the Group Importance Sampling (GIS) framework, where different sets of weighted samples are properly summarized with one summary particle and one summary weight. GIS facilitates the design of novel efficient MC techniques. For instance, we present the Group Metropolis Sampling (GMS) algorithm, which produces a Markov chain of sets of weighted samples. GMS in general outperforms other multiple try schemes…

Keywords: importance sampling, Markov chain Monte Carlo, Metropolis-Hastings algorithm, multiple-try Metropolis, Bayesian inference, rejection sampling, slice sampling, Monte Carlo integration, particle filter

Metropolis Sampling

2017

Monte Carlo (MC) sampling methods are widely applied in Bayesian inference, system simulation, and optimization problems. The Markov Chain Monte Carlo (MCMC) algorithms are a well-known class of MC methods which generate a Markov chain with the desired invariant distribution. In this document, we focus on the Metropolis-Hastings (MH) sampler, which can be considered as the atom of the MCMC techniques, introducing the basic notions and different properties. We describe in detail all the elements involved in the MH algorithm and the most relevant variants. Several improvements and recent extensions proposed in the literature are also briefly discussed, providing a quick but exhaustive overvie…
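A minimal sketch of the basic MH kernel the document describes, with a symmetric random-walk proposal so that the proposal ratio cancels in the acceptance probability; the standard Gaussian invariant density is a toy assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
log_target = stats.norm(0.0, 1.0).logpdf     # toy invariant density (assumed)

x, chain = 0.0, []
for _ in range(10_000):
    prop = x + rng.normal(0.0, 1.0)          # symmetric random-walk proposal
    # Acceptance probability; the proposal ratio cancels by symmetry.
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    chain.append(x)                          # a rejected move repeats the state

print("mean, var (true 0, 1):", np.mean(chain), np.var(chain))
```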

Keywords: Statistics - Computation (stat.CO), Statistics - Machine Learning (stat.ML), Statistics - Methodology (stat.ME)
Published in: Wiley StatsRef: Statistics Reference Online