0000000001085832

AUTHOR

Jordan Franks

Related works from this author:

Importance sampling correction versus standard averages of reversible MCMCs in terms of the asymptotic variance

2017

We establish an ordering criterion for the asymptotic variances of two consistent Markov chain Monte Carlo (MCMC) estimators: an importance sampling (IS) estimator, based on an approximate reversible chain and subsequent IS weighting, and a standard MCMC estimator, based on an exact reversible chain. Essentially, we relax the criterion of the Peskun type covariance ordering by considering two different invariant probabilities, and obtain, in place of a strict ordering of asymptotic variances, a bound of the asymptotic variance of IS by that of the direct MCMC. Simple examples show that IS can have arbitrarily better or worse asymptotic variance than Metropolis-Hastings and delayed-acceptanc…
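The two estimators being compared can be illustrated with a small, hedged sketch. This is a toy construction for intuition, not the paper's setting: the standard normal target, the inflated-variance approximation `log_pi_hat`, the proposal scale, and the chain length are all illustrative assumptions.

```python
import math
import random

random.seed(1)

def log_pi(x):
    """Exact target: standard normal, unnormalised."""
    return -0.5 * x * x

def log_pi_hat(x):
    """Approximate target: normal with inflated variance 1.5."""
    return -0.5 * x * x / 1.5

def mh_chain(logp, n, step=1.0):
    """Random-walk Metropolis-Hastings chain of length n for density exp(logp)."""
    x, out = 0.0, []
    for _ in range(n):
        y = x + random.gauss(0.0, step)
        if math.log(random.random()) < logp(y) - logp(x):
            x = y
        out.append(x)
    return out

f = lambda x: x * x  # estimate E_pi[f(X)], which equals 1 for the standard normal

# Standard MCMC estimator: ergodic average of the exact chain.
exact = mh_chain(log_pi, 20000)
est_mcmc = sum(f(x) for x in exact) / len(exact)

# IS-corrected estimator: run the approximate chain, then reweight its output
# with self-normalised importance weights pi / pi_hat.
approx = mh_chain(log_pi_hat, 20000)
weights = [math.exp(log_pi(x) - log_pi_hat(x)) for x in approx]
est_is = sum(w * f(x) for w, x in zip(weights, approx)) / sum(weights)
```

Both estimators are consistent for E_pi[f(X)]; the abstract's point is that their asymptotic variances admit a one-sided bound rather than a strict Peskun-type ordering.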

Keywords: Markov chain Monte Carlo, importance sampling, delayed acceptance, asymptotic variance, unbiased estimator, pseudo-marginal algorithm, covariance, Monte Carlo methods, stochastic processes, estimation, numerical methods. MSC: 60J22, 65C05.

Unbiased Inference for Discretely Observed Hidden Markov Model Diffusions

2021

We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretisation bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, our method uses standard time-discretised approximations of diffusions, such as the Euler--Maruyama scheme. Our approach is based on particle marginal Metropolis--Hastings, a particle filter, randomised multilevel Monte Carlo, and importance sampling type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretisation as the number of Markov chain iterations increases. We give conver…
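The randomised multilevel Monte Carlo ingredient can be sketched with a single-term debiased estimator of Rhee–Glynn type: draw a discretisation level at random, and weight the coupled Euler–Maruyama level increment by its inverse probability. The Ornstein–Uhlenbeck SDE dX = -X dt + dW, the functional f(x) = x, and the geometric level distribution below are illustrative assumptions, not the paper's hidden Markov model setting.

```python
import math
import random

random.seed(3)

def coupled_delta(l, T=1.0, x0=1.0):
    """Euler--Maruyama level increment f(X^fine) - f(X^coarse) for
    dX = -X dt + dW with f(x) = x; fine (2**l steps) and coarse
    (2**(l-1) steps) paths share the same Brownian increments."""
    n = 2 ** l
    h = T / n
    dW = [random.gauss(0.0, math.sqrt(h)) for _ in range(n)]
    xf = x0
    for w in dW:                     # fine Euler path
        xf += -xf * h + w
    if l == 0:
        return xf                    # base level: the level-0 estimate itself
    xc = x0
    hc = 2.0 * h
    for i in range(0, n, 2):         # coarse path reuses summed increments
        xc += -xc * hc + dW[i] + dW[i + 1]
    return xf - xc

def single_term():
    """One draw of the single-term estimator with P(L = l) = 2**-(l+1)."""
    l = 0
    while random.random() < 0.5:
        l += 1
    return coupled_delta(l) / 0.5 ** (l + 1)

n_rep = 40000
est = sum(single_term() for _ in range(n_rep)) / n_rep
# est estimates E[X_1] = exp(-1) without time-discretisation bias
```

The estimator is unbiased for E[X_1] = e^{-1} even though every individual path is time-discretised, which is the sense in which the discretisation bias is removed.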

Keywords: Bayesian inference, hidden Markov model, diffusion, discretisation, sequential Monte Carlo, particle filter, Markov chain Monte Carlo, multilevel Monte Carlo, importance sampling, mathematical models. MSC: 65C05 (primary); 60H35, 65C35, 65C40 (secondary). Published in SIAM/ASA Journal on Uncertainty Quantification.

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
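A hedged toy sketch of the inflated-tolerance idea: run ABC-MCMC once at a deliberately large tolerance, store the simulated distances alongside the states, and post-correct the same output to any finer tolerance by keeping only states whose distance passes it. The normal toy model, flat prior, and tolerance values are illustrative assumptions.

```python
import random

random.seed(2)

y_obs = 0.5      # observed datum
eps_big = 1.0    # deliberately large tolerance for the sampler

def simulate(theta):
    """Model simulator: one draw from N(theta, 1)."""
    return random.gauss(theta, 1.0)

# Initialise at a state whose simulated distance passes the tolerance.
theta = 0.0
d = abs(simulate(theta) - y_obs)
while d > eps_big:
    d = abs(simulate(theta) - y_obs)

# ABC-MCMC with a flat prior and symmetric proposal: a move is accepted
# iff its freshly simulated distance is within eps_big.
chain = []
for _ in range(50000):
    prop = theta + random.gauss(0.0, 0.5)
    d_prop = abs(simulate(prop) - y_obs)
    if d_prop <= eps_big:
        theta, d = prop, d_prop
    chain.append((theta, d))

def post_corrected_mean(eps):
    """Posterior-mean estimator for a finer tolerance, from the same output."""
    kept = [t for t, dist in chain if dist <= eps]
    return sum(kept) / len(kept)

coarse = post_corrected_mean(eps_big)  # estimator at the sampler's tolerance
fine = post_corrected_mean(0.3)        # post-corrected to a finer tolerance
```

By symmetry of this toy model, both estimators target a posterior mean of 0.5; the large sampler tolerance keeps mixing healthy, while the post-correction recovers estimators for the finer tolerances.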

Keywords: approximate Bayesian computation, Markov chain Monte Carlo, tolerance choice, adaptive algorithm, importance sampling, confidence interval, estimator, mixing, algorithms.

Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo

2020

We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC, and has many advantages over DA, including a straightforward parallelisation and additional flexibility in MCMC implementation. We detail minimal conditions which ensure strong consistency of the sug…

Keywords: Markov chain Monte Carlo, importance sampling, strong consistency, estimator, hyperparameter, particle filter, Bayesian probability. Published in Scandinavian Journal of Statistics.

Markov chain Monte Carlo importance samplers for Bayesian models with intractable likelihoods

2019

Markov chain Monte Carlo (MCMC) is an approach to parameter inference in Bayesian models that is based on computing ergodic averages formed from a Markov chain targeting the Bayesian posterior probability. We consider the efficient use of an approximation within the Markov chain, with subsequent importance sampling (IS) correction of the inexact Markov chain output, leading to asymptotically exact inference. We detail convergence and central limit theorems for the resulting MCMC-IS estimators. We also consider the case where the approximate Markov chain is pseudo-marginal, requiring unbiased estimators for its approximate marginal target. Convergence results with asymptotic variance formula…
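The pseudo-marginal ingredient mentioned here can be sketched as Metropolis–Hastings in which the intractable likelihood is replaced by an unbiased Monte Carlo estimate. The toy latent-variable model below (y ~ N(z, 1) with z ~ N(theta, 1) and a flat prior) is an illustrative assumption, chosen because its exact posterior N(y, 2) is known and lets the output be checked.

```python
import math
import random

random.seed(4)

y = 1.0  # observed datum

def lik_hat(theta, m=30):
    """Unbiased estimate of p(y | theta) = ∫ N(y; z, 1) N(z; theta, 1) dz,
    formed by averaging N(y; z, 1) over m latent draws z ~ N(theta, 1)."""
    tot = 0.0
    for _ in range(m):
        z = random.gauss(theta, 1.0)
        tot += math.exp(-0.5 * (y - z) ** 2) / math.sqrt(2.0 * math.pi)
    return tot / m

# Pseudo-marginal Metropolis-Hastings with a flat prior: the likelihood
# estimate of the current state is recycled, never recomputed, on rejection.
theta, lik = 0.0, lik_hat(0.0)
samples = []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 1.0)
    lik_prop = lik_hat(prop)
    if random.random() < lik_prop / lik:
        theta, lik = prop, lik_prop
    samples.append(theta)

post_mean = sum(samples) / len(samples)  # exact posterior here is N(1, 2)
```

Recycling the current state's estimate is what makes the algorithm exact for the marginal posterior despite the noisy likelihood; the MCMC-IS construction of the abstract builds its correction on top of chains of this kind.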

Keywords: Markov chains, asymptotics, approximation, Bayesian models.