Search results for "Importance sampling"

Showing 10 of 24 documents

Importance sampling for Lambda-coalescents in the infinitely many sites model

2011

We present and discuss new importance sampling schemes for the approximate computation of the sample probability of observed genetic types in the infinitely many sites model from population genetics. More specifically, we extend the 'classical framework', where genealogies are assumed to be governed by Kingman's coalescent, to the more general class of Lambda-coalescents and develop further Hobolth et al.'s (2008) idea of deriving importance sampling schemes based on 'compressed genetrees'. The resulting schemes extend earlier work by Griffiths and Tavaré (1994), Stephens and Donnelly (2000), Birkner and Blath (2008) and Hobolth et al. (2008). We conclude with a performance comparison o…

Keywords: importance sampling; coalescent theory; Lambda; Monte Carlo method; Markov chains; population genetics; gene frequency; mutation; molecular evolution; Probability (math.PR); Populations and Evolution (q-bio.PE); MSC 62F99 (primary), 62P10, 92D10, 92D20 (secondary)

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; and inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors optimally combine all of the information about the unknowns in the observations with the information that is present in their …

Keywords: importance sampling; Bayesian probability; posterior probability; prior probability; probability distribution; inference; approximate inference; signal processing; machine learning

Adaptive Population Importance Samplers: A General Perspective

2016

Importance sampling (IS) is a well-known Monte Carlo method, widely used to approximate a distribution of interest using a random measure composed of a set of weighted samples generated from another proposal density. Since the performance of the algorithm depends on the mismatch between the target and the proposal densities, a set of proposals is often iteratively adapted in order to reduce the variance of the resulting estimator. In this paper, we review several well-known adaptive population importance samplers, providing a unified common framework and classifying them according to the nature of their estimation and adaptive procedures. Furthermore, we interpret the underlying motivation …
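The basic self-normalized IS estimator this line of work builds on can be sketched in a few lines; the target, proposal, and sample size below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target density: a standard normal (constants may be dropped,
# since the self-normalized estimator cancels them).
def target(x):
    return np.exp(-0.5 * x**2)

# Proposal: a wider zero-mean normal, so its tails cover the target.
scale = 3.0
N = 100_000
x = rng.normal(0.0, scale, N)
q = np.exp(-0.5 * (x / scale)**2) / scale   # proposal density (up to a constant)

w = target(x) / q        # importance weights
w /= w.sum()             # self-normalization

# Weighted samples approximate expectations under the target:
# here E[X^2] = 1 for the standard normal.
second_moment = np.sum(w * x**2)
```

The mismatch between `target` and the proposal drives the weight variance, which is exactly what the adaptive schemes surveyed in the paper try to reduce.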

Keywords: importance sampling; Monte Carlo method; Monte Carlo integration; estimator; random measure; statistical classification; particle filter

Group Metropolis Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing and machine learning. Two well-known classes of MC methods are the Importance Sampling (IS) techniques and the Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce the Group Importance Sampling (GIS) framework where different sets of weighted samples are properly summarized with one summary particle and one summary weight. GIS facilitates the design of novel efficient MC techniques. For instance, we present the Group Metropolis Sampling (GMS) algorithm which produces a Markov chain of sets of weighted samples. GMS in general outperforms other multiple try schemes…
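The summary step described above can be illustrated with a minimal sketch; the weights and the resampling-based choice of summary particle below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# A group of weighted samples, e.g. produced by one IS iteration
# (hypothetical target centered at 1; weights are unnormalized).
x = rng.normal(size=50)
w = np.exp(-0.5 * (x - 1.0)**2)

# Summary weight: the group's average unnormalized weight
# (its estimate of the normalizing constant).
summary_weight = w.mean()

# Summary particle: one representative resampled in proportion
# to the weights within the group.
idx = rng.choice(len(x), p=w / w.sum())
summary_particle = x[idx]
```

The pair `(summary_particle, summary_weight)` can then stand in for the whole group in a higher-level sampler such as the GMS chain.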

Keywords: importance sampling; Monte Carlo method; Markov chain Monte Carlo; Metropolis–Hastings algorithm; multiple-try Metropolis; Hybrid Monte Carlo; rejection sampling; slice sampling; Monte Carlo integration; probability density function; Bayesian inference; particle filter; Monte Carlo methods in statistical physics

Theoretical Foundations of the Monte Carlo Method and Its Applications in Statistical Physics

2002

In this chapter we first introduce the basic concepts of Monte Carlo sampling, give some details on how Monte Carlo programs need to be organized, and then proceed to the interpretation and analysis of Monte Carlo results.
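A minimal example of the kind of Monte Carlo sampling and error analysis such a chapter introduces; the integrand and sample size are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(2)

# Plain Monte Carlo estimate of the integral of x^2 over [0, 1]
# (true value 1/3): average the integrand over uniform draws.
N = 200_000
u = rng.uniform(0.0, 1.0, N)
estimate = np.mean(u**2)

# The statistical error bar shrinks like 1/sqrt(N).
std_err = np.std(u**2, ddof=1) / np.sqrt(N)
```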

Keywords: Monte Carlo method; importance sampling; statistical physics; Ising model; thermodynamic limit; periodic boundary conditions

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
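A toy sketch of the approach: run ABC-MCMC with a generous tolerance, store each state's simulated distance, and post-process the same output for a finer tolerance. The Gaussian model, flat prior, and tolerance values are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(3)

y_obs = 0.5                      # observed summary statistic

def simulate(theta):
    # Hypothetical intractable model: one draw of a Gaussian summary.
    return rng.normal(theta, 1.0)

# ABC-MCMC with a deliberately large tolerance, so the chain mixes well.
eps_big, n_iter = 2.0, 20_000
theta = 0.0
dist = abs(simulate(theta) - y_obs)
chain, dists = [], []
for _ in range(n_iter):
    prop = theta + rng.normal(0.0, 1.0)            # symmetric random walk
    d_prop = abs(simulate(prop) - y_obs)
    # Flat prior + symmetric proposal: accept iff the simulated
    # summary falls within the (large) tolerance.
    if d_prop <= eps_big:
        theta, dist = prop, d_prop
    chain.append(theta)
    dists.append(dist)
chain, dists = np.array(chain), np.array(dists)

# Post-correction: reuse the stored distances to extract an estimator
# for a finer tolerance from the same chain output.
eps_fine = 0.5
fine_mean = chain[dists <= eps_fine].mean()
```

The point is that one expensive run at `eps_big` yields estimators for a whole range of finer tolerances, instead of rerunning the poorly mixing low-tolerance chain.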

Keywords: approximate Bayesian computation; Markov chain Monte Carlo; Markov chains; Monte Carlo methods; importance sampling; tolerance choice; adaptive algorithm; mixing; estimator; confidence interval; Bayesian methods

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample which compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…

Keywords: importance sampling; Monte Carlo method; Markov chain Monte Carlo; multiple-try Metropolis; resampling; particle filter; posterior probability; digital signal processing

Deep Importance Sampling based on Regression for Model Inversion and Emulation

2021

Understanding systems by forward and inverse modeling is a recurrent topic of research in many domains of science and engineering. In this context, Monte Carlo methods have been widely used as powerful tools for numerical inference and optimization. They require the choice of a suitable proposal density that is crucial for their performance. For this reason, several adaptive importance sampling (AIS) schemes have been proposed in the literature. We here present an AIS framework called Regression-based Adaptive Deep Importance Sampling (RADIS). In RADIS, the key idea is the adaptive construction via regression of a non-parametric proposal density (i.e., an emulator), which mimics the posteri…
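The adaptive emulator-as-proposal idea can be caricatured in one dimension: interpolate the target through a set of nodes, sample from the interpolant, and add the point where it underestimates the target most. Everything concrete here (the bimodal target, piecewise-linear interpolation, the node-addition rule, the discretized support) is an illustrative assumption, not RADIS itself:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical expensive target: an unnormalized bimodal posterior.
def target(x):
    return np.exp(-0.5 * (x - 2.0)**2) + 0.5 * np.exp(-0.5 * (x + 2.0)**2)

grid = np.linspace(-8.0, 8.0, 2001)     # discretized support, for simplicity
nodes = np.array([-6.0, 0.0, 6.0])      # initial design points

for _ in range(5):
    vals = target(nodes)
    # Emulator = piecewise-linear interpolation of the target at the nodes;
    # used (after normalization) as the proposal density on the grid.
    prop = np.maximum(np.interp(grid, nodes, vals), 1e-12)
    draws = rng.choice(grid, size=2000, p=prop / prop.sum())
    w = target(draws) / np.maximum(np.interp(draws, nodes, vals), 1e-12)
    # Adapt: add the sample where the emulator most underestimates the
    # target (largest importance weight) as a new interpolation node.
    nodes = np.sort(np.append(nodes, draws[np.argmax(w)]))
```

Each iteration spends target evaluations only where the current emulator is worst, which is the economy that matters when the true model is expensive.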

Keywords: importance sampling; adaptive regression; surrogate model; emulation; Gaussian process; model inversion; Bayesian inference; posterior probability; Monte Carlo method; remote sensing

Compressed Particle Methods for Expensive Models With Application in Astronomy and Remote Sensing

2021

In many inference problems, the evaluation of complex and costly models is often required. In this context, Bayesian methods have become very popular in several fields in recent years for parameter inversion, model selection and uncertainty quantification. Bayesian inference requires the approximation of complicated integrals involving (often costly) posterior distributions. Generally, this approximation is obtained by means of Monte Carlo (MC) methods. In order to reduce the computational cost of the corresponding technique, surrogate models (also called emulators) are often employed. Another alternative approach is the so-called Approximate Bayesian Computation (ABC) sc…

Keywords: Bayesian inference; Monte Carlo method; importance sampling; particle filtering; approximate Bayesian computation; model selection; numerical inversion; uncertainty quantification; remote sensing; astronomy
Published in: IEEE Transactions on Aerospace and Electronic Systems

Heretical Multiple Importance Sampling

2016

Multiple Importance Sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have been recently proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of an increase in the computational cost. A recent work has shown that it is possible to achieve a trade-off between variance reduction and computational effort by performing an a priori random clustering of the proposals (partial DM algorithm). In this paper, we propose a novel "heretical" MIS framework, where the clustering …
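The contrast between standard and deterministic-mixture weighting can be sketched as follows; the Gaussian target and proposals are illustrative, and only the weight denominators differ between the two estimators:

```python
import numpy as np

rng = np.random.default_rng(4)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * np.sqrt(2.0 * np.pi))

def target(x):                      # target density: standard normal
    return normal_pdf(x, 0.0, 1.0)

# Two Gaussian proposals, M samples from each.
mus, sigmas, M = [-1.0, 2.0], [1.5, 1.5], 50_000
samples = [rng.normal(mu, s, M) for mu, s in zip(mus, sigmas)]
x_all = np.concatenate(samples)

# Standard MIS weights: each sample is weighted by its own proposal only.
w_std = np.concatenate([
    target(x) / normal_pdf(x, mu, s)
    for x, mu, s in zip(samples, mus, sigmas)
])

# Deterministic mixture (DM) weights: the denominator is the full mixture
# of all proposals, at the cost of extra density evaluations per sample.
mix = np.mean([normal_pdf(x_all, mu, s) for mu, s in zip(mus, sigmas)], axis=0)
w_dm = target(x_all) / mix

# Both self-normalized estimators target E[X] = 0; DM trades computation
# for lower weight variance.
est_std = np.sum(w_std * x_all) / np.sum(w_std)
est_dm = np.sum(w_dm * x_all) / np.sum(w_dm)
```

The "heretical" schemes in the paper sit between these two extremes, evaluating only a clustered subset of the proposals in each denominator.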

Keywords: importance sampling; estimator; variance; variance reduction; mean squared error; cluster analysis; a priori clustering