Search results for "Markov chain Monte Carlo"

Showing 10 of 79 documents

Nonlinear impact estimation in spatial autoregressive models

2018

This paper extends the literature on the calculation and interpretation of impacts for spatial autoregressive models. Using a Bayesian framework, we show how the individual direct and indirect impacts associated with an exogenous variable introduced in a nonlinear way in such models can be computed, theoretically and empirically. Rather than averaging the individual impacts, we suggest graphically analyzing them along with their confidence intervals calculated via Markov chain Monte Carlo (MCMC). We also explicitly derive the form of the gap between individual impacts in the spatial autoregressive model and the corresponding model without a spatial lag and show, in…
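As a rough illustration of the impact calculation described above, here is a minimal numpy sketch (not the authors' code): it uses a quadratic term as a stand-in for the paper's spline-based nonlinearity, a toy row-normalised weight matrix, and made-up MCMC draws of the parameters, and reports pointwise 95% credible bands for the individual direct impacts.

```python
# Sketch only: individual direct/indirect impacts in a SAR model
# y = rho*W*y + f(x)*beta + eps, with f(x) = x + gamma*x^2 standing in for the
# paper's nonlinear (spline) term. All parameter values and draws are hypothetical.
import numpy as np

def individual_impacts(rho, beta, gamma, x, W):
    """Direct (diagonal) and indirect (off-diagonal row-sum) impacts of x."""
    n = len(x)
    dfdx = beta + 2.0 * gamma * x                               # derivative of f at each observation
    S = np.linalg.solve(np.eye(n) - rho * W, np.diag(dfdx))     # (I - rho*W)^-1 diag(f'(x))
    direct = np.diag(S)
    indirect = S.sum(axis=1) - direct
    return direct, indirect

rng = np.random.default_rng(0)
n = 5
W = rng.random((n, n)); np.fill_diagonal(W, 0.0); W /= W.sum(axis=1, keepdims=True)
x = rng.normal(size=n)

# Pretend these are posterior (MCMC) draws of (rho, beta, gamma)
draws = [(0.3 + 0.05 * rng.normal(), 1.0 + 0.1 * rng.normal(), -0.2 + 0.05 * rng.normal())
         for _ in range(200)]
per_draw = np.array([np.concatenate(individual_impacts(r, b, g, x, W)) for r, b, g in draws])
lo, hi = np.percentile(per_draw, [2.5, 97.5], axis=0)           # pointwise 95% credible bands
print("direct-impact bands:", list(zip(lo[:n].round(2), hi[:n].round(2))))
```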

Economics and Econometrics; Econometrics; Spatial econometrics; Marginal impacts; Markov chain Monte Carlo; Bayesian framework; Spline (mathematics); Confidence interval; Nonlinear system; Autoregressive model; Lag; Social and economic geography; Mathematics; Finance; Humanities and Social Sciences/Economics and Finance [SHS.ECO]; Economics Letters

Monte Carlo simulation of DNA electrophoresis

1989

This paper describes an attempt to study the electrophoretic mobility of a DNA molecule in a gel by means of a Monte Carlo simulation. We find that the electrophoretic mobility μ can be well described by the empirical equation μ ≈ κ₁/N + κ₂E², with N the number of monomers of the model chain and E the applied field. For small E the data collapse onto the linear-response result μ = κ₁/N. The paper also discusses necessary extensions of the present approach.
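A quick numerical illustration of the fitted form (my own toy values for κ₁ and κ₂, not from the paper):

```python
# Evaluate the empirical mobility law mu ≈ kappa1/N + kappa2*E^2 on a small grid;
# the E = 0 column reproduces the linear-response limit mu = kappa1/N.
import numpy as np

kappa1, kappa2 = 1.0, 0.05           # hypothetical constants
N = np.array([50, 100, 200])         # number of monomers in the model chain
E = np.array([0.0, 0.5, 1.0, 2.0])   # applied field strengths

mu = kappa1 / N[:, None] + kappa2 * E[None, :] ** 2
print(mu)                            # rows indexed by N, columns by E
```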

Electrophoresis; Gel electrophoresis of nucleic acids; DNA; Gels; Monte Carlo method; Markov chain Monte Carlo; Hybrid Monte Carlo; Dynamic Monte Carlo method; Monte Carlo molecular modeling; Computer simulation; Statistical physics; Physics; Biochemistry; Clinical Biochemistry; Analytical Chemistry; Molecular weight; Models, Chemical; Kappa; Quantitative Biology::Biomolecules

Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling

2011

Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. Quantifying the uncertainty associated with these models is essential, yet it is rarely done in practice. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been working on the development of a framework for defining and assessing uncertainties in the field of urban drainage modelling. A part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessm…
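For context, one of the techniques usually compared in this setting (GLUE) can be sketched in a few lines; the one-parameter runoff model, the behavioural threshold, and all numbers below are hypothetical, not the study's setup.

```python
# Minimal GLUE-style sketch: Monte Carlo parameter sampling, an informal likelihood
# measure (Nash-Sutcliffe efficiency), and bounds from the behavioural parameter set.
import numpy as np

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, size=100)                   # synthetic rainfall series
obs = 0.6 * rain + rng.normal(0.0, 0.5, size=100)      # synthetic "observed" runoff

def model(k):                                          # hypothetical runoff-coefficient model
    return k * rain

samples = rng.uniform(0.0, 1.5, size=5000)             # prior samples of the runoff coefficient
nse = np.array([1.0 - np.sum((obs - model(k)) ** 2) / np.sum((obs - obs.mean()) ** 2)
                for k in samples])
behavioural = samples[nse > 0.7]                       # keep runs above a behavioural threshold
print("5-95% GLUE parameter bounds:", np.percentile(behavioural, [5, 95]))
```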

MCMC; GLUE; SCEM-UA; AMALGAM; MICA; Multi-objective auto-calibration; Bayesian inference; Bayesian probability; Parameter probability distribution; Probability distribution; Likelihood functions; Uncertainty; Urban drainage model; Runoff model; Rain; Water quality; Cities; Pollution; Markov chains; Markov chain Monte Carlo; Metropolis–Hastings algorithm; Monte Carlo method; Computer simulation; Models, Theoretical; Data mining; Statistics; Algorithms; Software; Engineering; Environmental Engineering; Civil and Structural Engineering; Water Science and Technology; Waste Management and Disposal; Ecological Modeling; Settore ICAR/03 - Ingegneria Sanitaria-Ambientale

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing, while large tolerance entails excess bias. We consider an approach that uses a relatively large tolerance for the Markov chain Monte Carlo sampler, to ensure sufficient mixing, and post-processes the output to obtain estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
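A minimal sketch of the inflated-tolerance-plus-post-correction idea, under simplifying assumptions (toy Gaussian-mean model, uniform ABC kernel, symmetric random-walk proposal); it illustrates the general strategy, not the paper's adaptive algorithm or its confidence-interval construction.

```python
# ABC-MCMC run with a deliberately large tolerance, then post-corrected to finer
# tolerances by keeping only the stored states whose recorded distance is small enough.
import numpy as np

rng = np.random.default_rng(2)
y_obs = rng.normal(1.0, 1.0, size=50)
s_obs = y_obs.mean()                                   # observed summary statistic

def simulate_summary(theta):
    return rng.normal(theta, 1.0, size=50).mean()      # summary of one pseudo-data set

delta = 0.5                                            # inflated tolerance, chosen for mixing
theta = 0.0
dist = abs(simulate_summary(theta) - s_obs)
while dist > delta:                                    # crude initialisation inside the tolerance
    theta = rng.normal()
    dist = abs(simulate_summary(theta) - s_obs)

chain, dists = [], []
for _ in range(20000):
    prop = theta + 0.5 * rng.normal()                  # symmetric random-walk proposal
    d_prop = abs(simulate_summary(prop) - s_obs)
    log_prior_ratio = -0.5 * (prop ** 2 - theta ** 2)  # standard-normal prior on theta
    if d_prop <= delta and np.log(rng.random()) < log_prior_ratio:
        theta, dist = prop, d_prop
    chain.append(theta); dists.append(dist)
chain, dists = np.array(chain), np.array(dists)

for eps in (0.5, 0.25, 0.1):                           # post-correction to finer tolerances
    keep = dists <= eps
    print(f"eps={eps}: posterior mean ~ {chain[keep].mean():.3f} ({keep.mean():.1%} of draws kept)")
```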

Approximate Bayesian computation; Tolerance choice; Adaptive algorithm; Algorithms; Markov chain; Markov chains; Markov chain Monte Carlo; Monte Carlo methods; Bayesian methods; Importance sampling; Confidence interval; Mixing (mathematics); Inference; Estimator; Probabilistic logic; Mathematics; Statistics and Probability; General Mathematics; Applied Mathematics; Agricultural and Biological Sciences (miscellaneous); General Agricultural and Biological Sciences; Statistics, Probability and Uncertainty; Statistics - Computation (stat.CO); FOS: Computer and information sciences

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
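A toy illustration (my own, not the authors' code) of the compression idea: each group of importance-weighted samples is summarised by one resampled representative carrying the group's average weight, and the compressed particles still give a sensible self-normalised estimate.

```python
# Compress each group of weighted samples into a single (representative, weight) pair
# and compare the resulting estimate of E[x] with the full importance-sampling estimate.
import numpy as np

rng = np.random.default_rng(3)
log_target = lambda x: -0.5 * (x - 2.0) ** 2                    # unnormalised N(2, 1) target
log_proposal = lambda x: -0.5 * (x / 3.0) ** 2 - np.log(3.0)    # N(0, 9) proposal

def group(m):
    x = rng.normal(0.0, 3.0, size=m)
    w = np.exp(log_target(x) - log_proposal(x))                 # standard IS weights
    rep = rng.choice(x, p=w / w.sum())                          # resample one representative
    return x, w, rep, w.mean()                                  # group summary: representative + avg weight

groups = [group(100) for _ in range(50)]
full_x = np.concatenate([g[0] for g in groups]); full_w = np.concatenate([g[1] for g in groups])
reps = np.array([g[2] for g in groups]); gw = np.array([g[3] for g in groups])

print("full IS estimate of E[x]       :", np.average(full_x, weights=full_w))
print("compressed (group-level) result:", np.average(reps, weights=gw))
```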

Multiple-try Metropolis; Monte Carlo method; Markov chain; Markov chain Monte Carlo; Importance sampling; Resampling; Particle filter; Posterior probability; Signal Processing; Digital Signal Processing; Computer science; Computer Vision and Pattern Recognition; Artificial Intelligence; Electrical and Electronic Engineering; Applied Mathematics; Computational Theory and Mathematics; Statistics, Probability and Uncertainty; Algorithm; Machine Learning (stat.ML); Machine Learning (cs.LG); Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); Computational Engineering, Finance, and Science (cs.CE); Computer Science [cs]/Signal and Image Processing [INFO.INFO-TS]; Engineering Sciences [physics]/Signal and Image processing [SPI.SIGNAL]; FOS: Computer and information sciences

A Review of Multiple Try MCMC algorithms for Signal Processing

2018

Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference requires the computation of a posteriori estimators, which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a t…
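For reference, a basic Multiple Try Metropolis step with an independent proposal looks roughly as follows (a standard textbook variant with toy target and proposal choices, not any specific scheme from the review):

```python
# One MTM step: draw k tries, pick one proportionally to its importance weight,
# build a reference set around it, and accept with the generalised MH ratio.
import numpy as np

rng = np.random.default_rng(4)
log_target = lambda x: -0.5 * ((x - 1.0) / 0.7) ** 2         # unnormalised N(1, 0.7^2)
log_q = lambda x: -0.5 * (x / 3.0) ** 2 - np.log(3.0)        # independent N(0, 9) proposal
log_w = lambda x: log_target(x) - log_q(x)                   # importance weights

def mtm_step(x, k=5):
    ys = rng.normal(0.0, 3.0, size=k)                        # k candidate tries
    wy = np.exp(log_w(ys))
    y = rng.choice(ys, p=wy / wy.sum())                      # select a try prop. to its weight
    xs = np.append(rng.normal(0.0, 3.0, size=k - 1), x)      # reference set: k-1 new points plus x
    wx = np.exp(log_w(xs))
    return y if rng.random() < min(1.0, wy.sum() / wx.sum()) else x

x, chain = 0.0, []
for _ in range(5000):
    x = mtm_step(x)
    chain.append(x)
print("posterior mean estimate:", np.mean(chain[500:]))
```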

Multiple-try Metropolis; Monte Carlo method; Markov chain; Markov chain Monte Carlo; Bayesian inference; Estimator; Sample space; Signal processing; Algorithm; Computer science; Artificial Intelligence; Computer Vision and Pattern Recognition; Electrical and Electronic Engineering; Applied Mathematics; Computational Theory and Mathematics; Statistics, Probability and Uncertainty; Machine Learning (stat.ML); Statistics - Machine Learning; Statistics - Computation (stat.CO); FOS: Computer and information sciences

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities that become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points constructed iteratively from previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
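A heavily simplified sketch of the sticky idea (my own reduction, not the paper's algorithm): the independence proposal is a piecewise-constant interpolation of the target on a growing support set, and candidates where the interpolation fits poorly are added to that set; the mismatch test below is one simple possibility, not necessarily the one used in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
a, b = -6.0, 6.0                                        # bounded domain for the sketch
target = lambda x: np.exp(-0.5 * (x - 1.5) ** 2)        # unnormalised N(1.5, 1) target

def proposal(support):
    """Piecewise-constant (unnormalised) interpolant of the target on the support points."""
    s = np.sort(np.asarray(support))
    edges = np.concatenate(([a], (s[:-1] + s[1:]) / 2.0, [b]))
    heights = target(s)                                 # one bin per support point
    return edges, heights, heights * np.diff(edges)

def q_value(z, edges, heights):
    i = min(np.searchsorted(edges, z, side="right") - 1, len(heights) - 1)
    return heights[i]                                   # unnormalised proposal value at z

support, x, chain = [a, 0.0, b], 0.0, []
for _ in range(5000):
    edges, heights, areas = proposal(support)
    i = rng.choice(len(areas), p=areas / areas.sum())
    z = rng.uniform(edges[i], edges[i + 1])             # draw from the interpolant
    qz, qx = q_value(z, edges, heights), q_value(x, edges, heights)
    if rng.random() < min(1.0, target(z) * qx / (target(x) * qz)):
        x = z                                           # independence MH acceptance
    mismatch = abs(target(z) - qz) / max(target(z), qz)
    if rng.random() < mismatch:                         # "sticky" test: refine the support
        support.append(z)                               # where the interpolant fits poorly
    chain.append(x)
print("support size:", len(support), "| posterior mean ~", np.mean(chain[500:]))
```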

Adaptive Markov chain Monte Carlo (MCMC); Adaptive rejection Metropolis sampling (ARMS); Bayesian inference; Gibbs sampling; Metropolis-within-Gibbs; Hit and run algorithm; Monte Carlo methods; Markov chain Monte Carlo; Computational statistics; Gaussian process; Mathematical optimization; Signal processing; Statistical signal processing; Signal Processing; Hardware and Architecture; Electrical and Electronic Engineering; Mathematics; Machine Learning (stat.ML); Statistics - Computation (stat.CO); Settore SECS-P/05 - Econometria; Settore SECS-S/01 - Statistica; Settore SECS-S/03 - Statistica Economica; FOS: Computer and information sciences; EURASIP Journal on Advances in Signal Processing

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key to the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since this is not possible in the general case, auxiliary samples must be generated in order to speed up the convergence of the chain, and their information is eventually disregarded. In this work, we show that these auxiliary sample…
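For background, a plain two-block Gibbs sampler for a bivariate Gaussian is sketched below (a toy example, not the paper's recycling scheme); the paper's point is that auxiliary draws generated inside such steps can be reused rather than thrown away.

```python
# Standard Gibbs sampler for a bivariate standard Gaussian with correlation rho,
# alternating draws from the two full conditionals.
import numpy as np

rng = np.random.default_rng(6)
rho = 0.8
x, y, chain = 0.0, 0.0, []
for _ in range(10000):
    x = rng.normal(rho * y, np.sqrt(1.0 - rho ** 2))    # x | y ~ N(rho*y, 1 - rho^2)
    y = rng.normal(rho * x, np.sqrt(1.0 - rho ** 2))    # y | x ~ N(rho*x, 1 - rho^2)
    chain.append((x, y))
chain = np.array(chain[1000:])                          # drop burn-in
print("sample correlation ~", np.corrcoef(chain.T)[0, 1])
```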

Gibbs sampling; Monte Carlo method; Markov chain Monte Carlo; Slice sampling; Bayesian inference; Inference; Gaussian process; Chain rule (probability); Statistics; Mathematics; Algorithm; Signal Processing; Digital Signal Processing; Artificial Intelligence; Computer Vision and Pattern Recognition; Applied Mathematics; Computational Theory and Mathematics; Electrical and Electronic Engineering; Statistics, Probability and Uncertainty; Machine Learning (stat.ML); Machine Learning (cs.LG); Statistics - Computation (stat.CO); Computer Science [cs]/Signal and Image Processing [INFO.INFO-TS]; Engineering Sciences [physics]/Signal and Image processing [SPI.SIGNAL]; FOS: Computer and information sciences

Conditional particle filters with diffuse initial distributions

2020

Conditional particle filters (CPFs) are powerful smoothing algorithms for general nonlinear/non-Gaussian hidden Markov models. However, CPFs can be inefficient or difficult to apply with diffuse initial distributions, which are common in statistical applications. We propose a simple but generally applicable auxiliary variable method, which can be used together with the CPF in order to perform efficient inference with diffuse initial distributions. The method only requires simulatable Markov transitions that are reversible with respect to the initial distribution, which can be improper. We focus in particular on random-walk type transitions which are reversible with respect to a uniform init…
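For background only, a bootstrap particle filter for a simple random-walk state-space model with a wide (nearly diffuse) initialisation is sketched below; the conditional variant and the auxiliary-variable method of the paper are not implemented here, and all model choices are hypothetical.

```python
# Bootstrap particle filter: propagate, weight by the Gaussian observation density,
# estimate the filtering mean, and resample.
import numpy as np

rng = np.random.default_rng(7)
T, n = 100, 500
x_true = np.cumsum(rng.normal(0.0, 0.3, size=T))           # latent random walk
y = x_true + rng.normal(0.0, 0.5, size=T)                   # noisy observations

particles = rng.normal(0.0, 10.0, size=n)                   # wide (nearly diffuse) initialisation
means = []
for t in range(T):
    particles = particles + rng.normal(0.0, 0.3, size=n)    # propagate through the transition
    logw = -0.5 * ((y[t] - particles) / 0.5) ** 2            # observation log-weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    means.append(np.sum(w * particles))                      # filtering-mean estimate
    particles = rng.choice(particles, size=n, p=w)           # multinomial resampling
print("RMSE of filtering means:", np.sqrt(np.mean((np.array(means) - x_true) ** 2)))
```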

Conditional particle filter; Particle filter; Smoothing; Diffuse initialisation; Hidden Markov model; State space model; Compartment model; Bayesian inference; Adaptive Markov chain Monte Carlo; Markov chain; Markov chains; Gaussian; Random walk; Autoregressive model; Covariance; Statistical physics; Statistics; Mathematical methods; Computer science; Statistics and Probability; Theoretical Computer Science; Computational Theory and Mathematics; Statistics, Probability and Uncertainty; Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); FOS: Computer and information sciences

Unbiased Inference for Discretely Observed Hidden Markov Model Diffusions

2021

We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretisation bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, our method uses standard time-discretised approximations of diffusions, such as the Euler–Maruyama scheme. Our approach is based on particle marginal Metropolis–Hastings, a particle filter, randomised multilevel Monte Carlo, and an importance-sampling-type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretisation as the number of Markov chain iterations increases. We give conver…
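A minimal Euler–Maruyama sketch of the kind of time-discretised diffusion approximation the method builds on (an Ornstein–Uhlenbeck toy model with hypothetical parameters; the unbiased estimator itself is not implemented here):

```python
# Euler–Maruyama discretisation of dX = -theta*X dt + sigma dW, observed discretely with noise.
import numpy as np

rng = np.random.default_rng(8)
theta, sigma = 0.7, 0.5                       # hypothetical drift and diffusion parameters
T, dt = 10.0, 0.01
steps = int(T / dt)

x = np.empty(steps + 1)
x[0] = 1.0
for k in range(steps):
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.normal()

obs_idx = np.arange(0, steps + 1, 100)                        # discrete observation times
y = x[obs_idx] + rng.normal(0.0, 0.2, size=len(obs_idx))      # noisy discrete observations
print("first noisy observations:", np.round(y[:5], 3))
```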

Diffusion; Discretization; Hidden Markov model; Bayesian inference; Sequential Monte Carlo; Markov chains; Markov chain Monte Carlo; Multilevel Monte Carlo; Monte Carlo methods; Importance sampling; Particle filter; Inference; Noise; Algorithm; Mathematical models; Mathematical methods; Mathematics; Computer science; Statistics and Probability; Applied Mathematics; Discrete Mathematics and Combinatorics; Modeling and Simulation; Statistics, Probability and Uncertainty; Probability (math.PR); FOS: Mathematics; Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); MSC 65C05 (primary), 60H35, 65C35, 65C40 (secondary); SIAM/ASA Journal on Uncertainty Quantification