Search results for "metropolis"

Showing 10 of 43 documents.

Can the Adaptive Metropolis Algorithm Collapse Without the Covariance Lower Bound?

2011

The Adaptive Metropolis (AM) algorithm is based on the symmetric random-walk Metropolis algorithm. The proposal distribution has the following time-dependent covariance matrix at step $n+1$: \[ S_n = \operatorname{Cov}(X_1, \dots, X_n) + \epsilon I, \] that is, the sample covariance matrix of the history of the chain plus a small constant $\epsilon > 0$ times the identity matrix $I$. The lower bound on the eigenvalues of $S_n$ induced by the factor $\epsilon I$ is theoretically convenient, but practically cumbersome, as a good value for the parameter $\epsilon$ may not always be easy to choose. This article considers variants of the AM algorithm that do not explicitly bound the eigenvalues of $S_n$ away …
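The covariance update above can be sketched in a few lines; this is a minimal illustration of the formula only, with the function name, default `eps`, and use of NumPy being illustrative choices rather than the paper's code:

```python
import numpy as np

def am_proposal_cov(history, eps=1e-6):
    # Sample covariance of the chain history plus eps * I, matching
    # S_n = Cov(X_1, ..., X_n) + eps * I from the abstract.
    # Function name and default eps are illustrative.
    history = np.asarray(history)              # shape (n, d)
    d = history.shape[1]
    return np.cov(history, rowvar=False).reshape(d, d) + eps * np.eye(d)

# proposal covariance after 100 steps of a 2-dimensional chain
rng = np.random.default_rng(0)
S = am_proposal_cov(rng.normal(size=(100, 2)))
```

Since the sample covariance is positive semidefinite, adding `eps * I` bounds every eigenvalue of the result below by `eps`, which is exactly the lower bound the paper's variants seek to remove.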

Keywords: identity matrix; upper and lower bounds; eigenvalues and eigenvectors; law of large numbers; stochastic approximation; covariance; stability; bounded function; Metropolis algorithm; adaptive Markov chain Monte Carlo; MSC: 65C40, 60J27, 93E15, 93E35; arXiv: math.ST, math.PR, stat.CO

MCMC methods to approximate conditional predictive distributions

2006

Sampling from conditional distributions is a problem often encountered in statistics when inferences are based on conditional distributions that are not available in closed form. Several Markov chain Monte Carlo (MCMC) algorithms to simulate from them are proposed. Potential problems are pointed out and some suitable modifications are suggested. Approximations based on conditioning sets are also explored. The issues are illustrated within a specific statistical tool for Bayesian model checking, and compared in an example. An example in frequentist conditional testing is also given.
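As a minimal illustration of the generic idea (not the paper's specific algorithms), a random-walk Metropolis sampler can target a conditional density known only up to a normalizing constant; all names and tuning values here are illustrative:

```python
import math
import random

def rw_metropolis(log_density, x0, n_steps=5000, step=0.8, seed=1):
    # Generic 1-d random-walk Metropolis for an unnormalized log-density.
    # A sketch of the principle only; parameter names are illustrative.
    random.seed(seed)
    x, logp = x0, log_density(x0)
    samples = []
    for _ in range(n_steps):
        y = x + random.gauss(0.0, step)
        logq = log_density(y)
        # accept with probability min(1, q/p); normalizing constant cancels
        if random.random() < math.exp(min(0.0, logq - logp)):
            x, logp = y, logq
        samples.append(x)
    return samples

# example: sample from X | X > 0 with X ~ N(0, 1) (a half-normal)
half_normal = lambda x: -0.5 * x * x if x > 0 else -math.inf
draws = rw_metropolis(half_normal, x0=1.0)
```

The key point is that the acceptance ratio only involves the conditional density up to proportionality, which is what makes MCMC usable when the conditional distribution has no closed form.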

Keywords: Markov chain; Markov chain Monte Carlo; conditional probability distribution; Bayesian inference; Metropolis–Hastings algorithm; sampling distribution; frequentist inference; Gibbs sampling

Componentwise adaptation for high dimensional MCMC

2005

We introduce a new adaptive MCMC algorithm, based on the traditional single-component Metropolis-Hastings algorithm and on our earlier adaptive Metropolis algorithm (AM). In the new algorithm, adaptation is performed component by component. The chain is no longer Markovian, but it remains ergodic. The algorithm is demonstrated to work well in varying test cases up to 1000 dimensions.
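The single-component sweep underlying this approach can be sketched as below, with fixed per-coordinate proposal scales standing in for the adapted ones (in the adaptive version each scale would be tuned from the chain's history); all names are illustrative:

```python
import math
import random

def componentwise_step(x, log_density, scales, rng=random):
    # One sweep of single-component Metropolis: each coordinate is
    # updated in turn with its own proposal scale. Names illustrative.
    x = list(x)
    logp = log_density(x)
    for i in range(len(x)):
        proposal = list(x)
        proposal[i] += rng.gauss(0.0, scales[i])
        logq = log_density(proposal)
        if rng.random() < math.exp(min(0.0, logq - logp)):
            x, logp = proposal, logq
    return x

# example: 1000 sweeps targeting a standard bivariate normal
target = lambda v: -0.5 * (v[0] ** 2 + v[1] ** 2)
random.seed(0)
state = [0.0, 0.0]
for _ in range(1000):
    state = componentwise_step(state, target, scales=[2.4, 2.4])
```

Updating one coordinate at a time keeps each proposal one-dimensional, which is what makes the scheme practical in very high dimensions.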

Keywords: mathematical optimization; Monte Carlo method; Markov process; Markov chain Monte Carlo; Metropolis–Hastings algorithm; ergodic theory; computational statistics

Adaptive Metropolis algorithm using variational Bayesian adaptive Kalman filter

2013

Markov chain Monte Carlo (MCMC) methods are powerful computational tools for analysis of complex statistical problems. However, their computational efficiency is highly dependent on the chosen proposal distribution, which is generally difficult to find. One way to solve this problem is to use adaptive MCMC algorithms which automatically tune the statistics of a proposal distribution during the MCMC run. A new adaptive MCMC algorithm, called the variational Bayesian adaptive Metropolis (VBAM) algorithm, is developed. The VBAM algorithm updates the proposal covariance matrix using the variational Bayesian adaptive Kalman filter (VB-AKF). A strong law of large numbers for the VBAM algorithm is…

Keywords: mathematical optimization; covariance matrix; Bayesian probability; rejection sampling; Markov chain Monte Carlo; Kalman filter; Metropolis–Hastings algorithm; kernel adaptive filter

Uniform ergodicity of the iterated conditional SMC and geometric ergodicity of particle Gibbs samplers

2018

We establish quantitative bounds for rates of convergence and asymptotic variances for iterated conditional sequential Monte Carlo (i-cSMC) Markov chains and associated particle Gibbs samplers. Our main findings are that the essential boundedness of potential functions associated with the i-cSMC algorithm provides necessary and sufficient conditions for the uniform ergodicity of the i-cSMC Markov chain, as well as quantitative bounds on its (uniformly geometric) rate of convergence. Furthermore, we show that in many applications of interest the i-cSMC Markov chain cannot even be geometrically ergodic if this essential boundedness does not hold. Our sufficiency and quantitative bounds rely on…

Keywords: Markov chain Monte Carlo; iterated conditional sequential Monte Carlo; particle Gibbs; particle filter; Gibbs sampling; Metropolis-within-Gibbs; uniform ergodicity; geometric ergodicity; rate of convergence; MSC: 65C40 (primary), 60J05, 65C05 (secondary)

On the stability and ergodicity of adaptive scaling Metropolis algorithms

2011

The stability and ergodicity properties of two adaptive random walk Metropolis algorithms are considered. Both algorithms adjust the scaling of the proposal distribution continuously based on the observed acceptance probability. Unlike the previously proposed forms of the algorithms, the adapted scaling parameter is not constrained within a predefined compact interval. The first algorithm is based on scale adaptation only, while the second one also incorporates covariance adaptation. A strong law of large numbers is shown to hold assuming that the target density is smooth enough and has either compact support or super-exponentially decaying tails.
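The scale-adaptation mechanism described above can be sketched as follows: the log of the proposal scale is moved toward a target acceptance probability by stochastic approximation, with no compact constraint on the adapted parameter. The gain sequence, target acceptance rate, and all names are illustrative choices, not the paper's exact specification:

```python
import math
import random

def adaptive_scale_metropolis(log_density, x0, n_steps=5000,
                              target_acc=0.44, seed=2):
    # Random-walk Metropolis whose proposal scale is adapted on the
    # log scale toward a target acceptance probability, without being
    # constrained to a compact interval. Gains/targets illustrative.
    random.seed(seed)
    x, logp = x0, log_density(x0)
    log_scale = 0.0
    samples = []
    for n in range(1, n_steps + 1):
        y = x + math.exp(log_scale) * random.gauss(0.0, 1.0)
        logq = log_density(y)
        acc = math.exp(min(0.0, logq - logp))   # acceptance probability
        if random.random() < acc:
            x, logp = y, logq
        # stochastic-approximation update with gain n^{-0.66}
        log_scale += (acc - target_acc) / n ** 0.66
        samples.append(x)
    return samples, math.exp(log_scale)

# example: standard normal target
samples, final_scale = adaptive_scale_metropolis(lambda v: -0.5 * v * v, 0.0)
```

Adapting on the log scale keeps the proposal scale positive without clipping it to an interval, which is precisely the unconstrained regime whose stability the paper analyzes.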

Keywords: stochastic approximation; law of large numbers; adaptive Markov chain Monte Carlo; Metropolis algorithm; Metropolis–Hastings algorithm; multiple-try Metropolis; scaling; covariance; random walk; ergodicity; stability; MSC: 65C40, 60J27, 93E15, 93E35

Contributed discussion on article by Pratola

2016

The author should be commended for his outstanding contribution to the literature on Bayesian regression tree models. The author introduces three innovative sampling approaches which allow for efficient traversal of the model space. In this response, we add a fourth alternative.

Keywords: model selection; Markov chain Monte Carlo (MCMC); Bayesian regression tree (BART) models; big data; birth–death process; sequential Monte Carlo; population MCMC; multiple-try Metropolis; tree traversal; continuous-time Markov process; Bayesian linear regression; Gibbs sampling

Multi-label Classification Using Stacked Hierarchical Dirichlet Processes with Reduced Sampling Complexity

2018

Nonparametric topic models based on hierarchical Dirichlet processes (HDPs) allow for the number of topics to be automatically discovered from the data. The computational complexity of standard Gibbs sampling techniques for model training is linear in the number of topics. Recently, it was reduced to be linear in the number of topics per word using a technique called alias sampling combined with Metropolis Hastings (MH) sampling. We propose a different proposal distribution for the MH step based on the observation that distributions on the upper hierarchy level change slower than the document-specific distributions at the lower level. This reduces the sampling complexity, making it linear i…

Keywords: topic model; computational complexity; latent Dirichlet allocation; Dirichlet process; multi-label classification; sampling; pattern recognition; Metropolis–Hastings algorithm; Gibbs sampling

Warsaw, a new metropolis

2003

Since the fall of the Berlin Wall, profound economic, political, and social changes have taken place in the countries of Central and Eastern Europe. Among these transformations, the emergence of new economic actors, their adaptation to the conditions of the market economy, and in particular their entry into global networks are altering, day by day, the old balance between European metropolises. Some large cities, such as Budapest or Warsaw, are following an original process of adaptation to new economic, political, and technical conditions, one that stems both from the weight of their history and from their capacity for transformation. In broad outline, this evoluti…

Keywords: Warsaw; metropolis; high-order services; transition; urban economy

Metropolises: stability vs. change

2003

In everyday language and even in some scientific work, the term “metropolis” evokes nothing more than a very large city, a focus for all that is good — and bad — about urban life. The vast literature on metropolises and metropolization, especially in Europe over the last 20 years, shows that things are anything but straightforward. Population alone is probably not a necessary condition and obviously not a sufficient condition to characterize a metropolis. So many phenomena are associated with this term that, like Lacour (1999), we may wonder whether this diversity is evidence of just how rich or just how poor the concept is. Producing a meaningful definition is indeed a challenge (...)

Keywords: metropolises; high-order services; urban history