
AUTHOR

Anthony Lee

0000-0001-7765-0616

Showing 3 related works from this author.

Coupled conditional backward sampling particle filter

2020

The conditional particle filter (CPF) is a promising algorithm for general hidden Markov model smoothing. Empirical evidence suggests that the variant of the CPF with backward sampling (CBPF) performs well even with long time series. Previous theoretical results have not been able to demonstrate the improvement brought by backward sampling, whereas we provide rates showing that the CBPF can remain effective with a fixed number of particles independent of the time horizon. Our result is based on the analysis of a new coupling of two CBPFs, the coupled conditional backward sampling particle filter (CCBPF). We show that the CCBPF has good stability properties, in the sense that with a fixed number of particles, …
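Not part of the record above: the following is a minimal sketch of one sweep of a conditional particle filter with backward sampling for a generic state-space model, assuming a scalar latent state and hypothetical user-supplied model callables (sample_init, sample_trans, log_trans, log_obs). It illustrates the kind of algorithm the abstract refers to, not the coupled construction (CCBPF) analysed in the paper.

```python
import numpy as np

def cbpf_sweep(ref_traj, y, N, sample_init, sample_trans, log_trans, log_obs, rng=None):
    """One sweep of a conditional particle filter with backward sampling (sketch).

    ref_traj : (T,) reference trajectory the sweep is conditioned on.
    y        : (T,) observations.
    N        : number of particles; particle N-1 is pinned to ref_traj.
    The model enters only through user-supplied (assumed vectorised) callables:
      sample_init(n, rng)        -> n draws from the initial distribution
      sample_trans(x_prev, rng)  -> elementwise one-step transition draws
      log_trans(x_prev, x_next)  -> log transition densities
      log_obs(y_t, x)            -> log observation densities
    """
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    X = np.empty((T, N))      # particle positions (scalar state for simplicity)
    logW = np.empty((T, N))   # log importance weights

    # Forward pass: bootstrap CPF with the last particle pinned to the reference.
    X[0, :-1] = sample_init(N - 1, rng)
    X[0, -1] = ref_traj[0]
    logW[0] = log_obs(y[0], X[0])
    for t in range(1, T):
        w = np.exp(logW[t - 1] - logW[t - 1].max())
        w /= w.sum()
        anc = rng.choice(N, size=N - 1, p=w)          # multinomial resampling
        X[t, :-1] = sample_trans(X[t - 1, anc], rng)
        X[t, -1] = ref_traj[t]                        # reference particle survives
        logW[t] = log_obs(y[t], X[t])

    # Backward sampling pass: draw the output trajectory index by index.
    traj = np.empty(T)
    w = np.exp(logW[-1] - logW[-1].max())
    traj[-1] = X[-1, rng.choice(N, p=w / w.sum())]
    for t in range(T - 2, -1, -1):
        logb = logW[t] + log_trans(X[t], traj[t + 1])
        w = np.exp(logb - logb.max())
        traj[t] = X[t, rng.choice(N, p=w / w.sum())]
    return traj
```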

Keywords: conditional particle filter; backward sampling; coupling; particle filter; hidden Markov model; smoothing; Markov chains; stochastic processes; Monte Carlo methods; numerical analysis; sampling (statistics); stability (probability); series (mathematics); time horizon; rate of convergence; unbiased; applied mathematics; Statistics and Probability; Probability (math.PR); Computation (stat.CO); MSC 65C05 (primary), 60J05, 65C35, 65C40, 65C60 (secondary)

Uniform ergodicity of the iterated conditional SMC and geometric ergodicity of particle Gibbs samplers

2018

We establish quantitative bounds for rates of convergence and asymptotic variances of iterated conditional sequential Monte Carlo (i-cSMC) Markov chains and associated particle Gibbs samplers. Our main finding is that the essential boundedness of the potential functions associated with the i-cSMC algorithm provides a necessary and sufficient condition for the uniform ergodicity of the i-cSMC Markov chain, as well as quantitative bounds on its (uniformly geometric) rate of convergence. Furthermore, we show that in many applications of interest the i-cSMC Markov chain cannot even be geometrically ergodic when this essential boundedness fails. Our sufficiency and quantitative bounds rely on…
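As a minimal illustration of how an i-cSMC kernel is iterated inside a particle Gibbs sampler, here is a generic sketch. Both csmc_sweep (any conditional SMC sweep, e.g. of the kind sketched above) and sample_theta_given_traj (the model-specific full conditional for the parameters) are hypothetical callables; this is not the specific construction analysed in the paper.

```python
def particle_gibbs(y, theta0, traj0, n_iters, csmc_sweep, sample_theta_given_traj, N=100):
    """Generic particle Gibbs loop (sketch).

    csmc_sweep(traj, y, theta, N)     -> new latent trajectory (conditional SMC)
    sample_theta_given_traj(traj, y)  -> draw from the parameter full conditional
    """
    theta, traj = theta0, traj0
    samples = []
    for _ in range(n_iters):
        # i-cSMC step: refresh the latent trajectory while conditioning on the
        # current one, which keeps the joint posterior invariant.
        traj = csmc_sweep(traj, y, theta, N)
        # Gibbs step: update the parameters given the imputed trajectory.
        theta = sample_theta_given_traj(traj, y)
        samples.append((theta, traj))
    return samples
```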

Keywords: particle Gibbs; iterated conditional sequential Monte Carlo; Markov chain Monte Carlo; Metropolis-within-Gibbs; Gibbs sampling; Gibbs measure; Markov chain; particle filter; ergodicity; ergodic theory; uniform ergodicity; geometric ergodicity; rate of convergence; combinatorics; applied mathematics; Statistics and Probability; Probability (math.PR); MSC 65C40 (primary), 60J05, 65C05 (secondary)

Theoretical and methodological aspects of MCMC computations with noisy likelihoods

2018

Approximate Bayesian computation (ABC) [11, 42] is a popular method for Bayesian inference when the likelihood function is intractable or expensive to evaluate but simulation from the model is easy. The method consists of defining an alternative likelihood function which is also in general intractable, but which naturally lends itself to pseudo-marginal computations [5], making the approach of practical interest. The aim of this chapter is to show the connections of ABC Markov chain Monte Carlo with pseudo-marginal algorithms, review their existing theoretical results, and discuss how these can inform practice and hopefully lead to fruitful methodological developments.
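As a concrete illustration of the pseudo-marginal mechanism the chapter connects ABC-MCMC to, here is a minimal sketch of a pseudo-marginal Metropolis-Hastings loop. The callables log_prior and log_lik_estimator are hypothetical; the latter is assumed to return the log of a nonnegative, unbiased estimate of the likelihood, which in the ABC setting could be the fraction of simulated datasets falling within a tolerance of the observed summaries.

```python
import numpy as np

def pseudo_marginal_mh(log_prior, log_lik_estimator, theta0, n_iters, step=0.5, rng=None):
    """Pseudo-marginal random-walk Metropolis-Hastings (sketch).

    log_lik_estimator(theta, rng) returns the log of a nonnegative, unbiased
    likelihood estimate; log_prior(theta) returns the log prior density.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    log_z = log_lik_estimator(theta, rng)          # current noisy log-likelihood
    chain = [theta.copy()]
    for _ in range(n_iters):
        prop = theta + step * rng.standard_normal(theta.shape)
        log_z_prop = log_lik_estimator(prop, rng)
        # The current estimate log_z is recycled, never recomputed: this is
        # what keeps the noisy-likelihood chain targeting the intended posterior.
        log_alpha = (log_prior(prop) + log_z_prop) - (log_prior(theta) + log_z)
        if np.log(rng.uniform()) < log_alpha:
            theta, log_z = prop, log_z_prop
        chain.append(theta.copy())
    return np.array(chain)
```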

Keywords: probability theory; Bayesian methods; likelihoods; Bayesian computation; Statistics::Computation