Search results for "Statistics::Computation"

Showing 10 of 48 documents

Anti-tempered Layered Adaptive Importance Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference in signal processing, machine learning and statistics. In this work, we introduce an adaptive importance sampler which combines the benefits of the Importance Sampling (IS) and Markov Chain Monte Carlo (MCMC) approaches. Different parallel MCMC chains provide the location parameters of the proposal probability density functions (pdfs) used in an IS method. The MCMC algorithms consider a tempered version of the posterior distribution as invariant density. We also provide exhaustive theoretical support explaining why, in the presented technique, even an anti-tempering strategy (reducing the scaling of the posterior) can …
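The layered idea described here — parallel MCMC chains on a tempered target supplying the location parameters of importance-sampling proposals — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the 1-D Gaussian target, the tempering exponent, the step sizes, and the equal-weight deterministic-mixture proposal are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D target: unnormalised log-density of N(2, 1).
def log_target(x):
    return -0.5 * (x - 2.0) ** 2

def mh_chain(log_pi, x0, n, step=1.0, beta=1.0):
    """Metropolis chain whose invariant density is pi(x)**beta (tempered)."""
    xs, x = [], x0
    for _ in range(n):
        y = x + step * rng.standard_normal()
        if np.log(rng.uniform()) < beta * (log_pi(y) - log_pi(x)):
            x = y
        xs.append(x)
    return np.array(xs)

# Upper layer: parallel tempered chains supply the proposal locations.
locations = np.concatenate([mh_chain(log_target, x0, 200, beta=0.5)
                            for x0 in (-3.0, 0.0, 5.0)])

# Lower layer: importance sampling from an equal-weight Gaussian mixture
# centred at the chain states, with deterministic-mixture weights.
sigma = 1.0
samples = locations + sigma * rng.standard_normal(locations.size)
mix_pdf = np.mean(
    np.exp(-0.5 * ((samples[:, None] - locations[None, :]) / sigma) ** 2)
    / (sigma * np.sqrt(2 * np.pi)), axis=1)
log_w = log_target(samples) - np.log(mix_pdf)
w = np.exp(log_w - log_w.max())
posterior_mean = np.sum(w * samples) / np.sum(w)  # should be near 2
```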

Mathematical optimization; Rejection sampling; Slice sampling; 020206 networking & telecommunications; Markov chain Monte Carlo; 02 engineering and technology; 01 natural sciences; Statistics::Computation; Hybrid Monte Carlo; 010104 statistics & probability; symbols.namesake; Metropolis–Hastings algorithm; [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing; 0202 electrical engineering electronic engineering information engineering; symbols; Parallel tempering; 0101 mathematics; Particle filter; [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing; Importance sampling; ComputingMilieux_MISCELLANEOUS; Mathematics
researchProduct

Test problems for large-scale nonsmooth minimization

2007

Many practical optimization problems involve nonsmooth (that is, not necessarily differentiable) functions of hundreds or thousands of variables with various constraints. However, only a few large-scale academic test problems exist for the nonsmooth case, and there is no established practice for testing solvers for large-scale nonsmooth optimization. For this reason, we now collect the nonsmooth test problems used in our previous numerical experiments and also give some new problems. Namely, we give problems for unconstrained, bound constrained, and inequality constrained nonsmooth minimization.
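A representative problem of this kind is the classic MAXQ function, f(x) = max_i x_i², which is convex, scales to any dimension, and is nonsmooth wherever the maximum is attained at more than one index. The sketch below (starting point and crude subgradient step chosen for illustration, not taken from this collection) shows the function, one valid subgradient, and a few subgradient-descent steps at large scale.

```python
import numpy as np

def maxq(x):
    """MAXQ test function: f(x) = max_i x_i**2, minimised at x = 0."""
    return float(np.max(x ** 2))

def maxq_subgradient(x):
    """One valid subgradient: 2 * x_i * e_i for an index i attaining the max."""
    g = np.zeros_like(x)
    i = int(np.argmax(x ** 2))
    g[i] = 2.0 * x[i]
    return g

# A few steps of plain subgradient descent with diminishing step sizes,
# just to exercise the problem in dimension 1000.
x = np.linspace(-1.0, 1.0, 1000)
for k in range(1, 201):
    x = x - (1.0 / k) * maxq_subgradient(x)
```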

Mathematics::Optimization and Control; Statistics::Computation
researchProduct

Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo

2020

We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC, and has many advantages over DA, including a straightforward parallelisation and additional flexibility in MCMC implementation. We detail minimal conditions which ensure strong consistency of the sug…
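The two-phase structure — run MCMC on a cheap approximate marginal, then reweight toward the exact target — can be sketched in miniature. The Gaussian "exact" and "approximate" densities below are illustrative stand-ins, not the latent-variable models treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):   # exact (unnormalised) target: N(0, 1)
    return -0.5 * x ** 2

def log_approx(x):   # cheap approximate marginal: N(0.5, 1.5**2)
    return -0.5 * ((x - 0.5) / 1.5) ** 2

# Phase 1: Metropolis MCMC targeting the approximate marginal.
x, chain = 0.0, []
for _ in range(5000):
    y = x + rng.standard_normal()
    if np.log(rng.uniform()) < log_approx(y) - log_approx(x):
        x = y
    chain.append(x)
chain = np.array(chain)

# Phase 2: IS-type weighting corrects the output toward the exact target.
log_w = log_target(chain) - log_approx(chain)
w = np.exp(log_w - log_w.max())
corrected_mean = np.sum(w * chain) / np.sum(w)  # estimates E[x] = 0
naive_mean = chain.mean()                       # biased toward 0.5
```

The weighting step is embarrassingly parallel over the stored chain, which is one of the advantages over delayed-acceptance schemes noted in the abstract.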

Monte Carlo methods; Bayesian method; statistical methods; Markov chains; Markov chain Monte Carlo (MCMC); Bayesian analysis; sampling; Statistics::Computation; estimation
researchProduct

Shrinkage and spectral filtering of correlation matrices: A comparison via the Kullback-Leibler distance

2007

The problem of filtering information from large correlation matrices is of great importance in many applications. We have recently proposed the use of the Kullback-Leibler distance to measure the performance of filtering algorithms in recovering the underlying correlation matrix when the variables are described by a multivariate Gaussian distribution. Here we use the Kullback-Leibler distance to investigate the performance of filtering methods based on Random Matrix Theory and on the shrinkage technique. We also present some results on the application of the Kullback-Leibler distance to multivariate data which are non-Gaussian distributed.
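For zero-mean Gaussians the Kullback-Leibler distance between two covariance (or correlation) matrices has a closed form, which is what makes it convenient as a filtering benchmark. A minimal sketch, with an equicorrelated toy matrix and a fixed shrinkage intensity chosen purely for illustration (not the Ledoit-Wolf optimal intensity or the paper's data):

```python
import numpy as np

def kl_gaussian(sigma1, sigma2):
    """KL( N(0, sigma1) || N(0, sigma2) ) for covariance matrices."""
    n = sigma1.shape[0]
    inv2 = np.linalg.inv(sigma2)
    _, ld1 = np.linalg.slogdet(sigma1)
    _, ld2 = np.linalg.slogdet(sigma2)
    return 0.5 * (np.trace(inv2 @ sigma1) - n + ld2 - ld1)

rng = np.random.default_rng(2)
n, t = 20, 100
true_corr = 0.3 * np.ones((n, n)) + 0.7 * np.eye(n)  # equicorrelated toy

# Sample correlation matrix estimated from t Gaussian observations.
L = np.linalg.cholesky(true_corr)
data = L @ rng.standard_normal((n, t))
sample_corr = np.corrcoef(data)

# Simple linear shrinkage toward the identity, fixed intensity.
alpha = 0.5
shrunk = alpha * sample_corr + (1 - alpha) * np.eye(n)

kl_raw = kl_gaussian(true_corr, sample_corr)  # distance to the raw estimate
kl_shrunk = kl_gaussian(true_corr, shrunk)    # distance to the filtered one
```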

Physics - Physics and Society; Statistics::Theory; Statistical Finance (q-fin.ST); MathematicsofComputing_NUMERICALANALYSIS; FOS: Physical sciences; Quantitative Finance - Statistical Finance; Physics and Society (physics.soc-ph); Statistics::Computation; FOS: Economics and business; Statistics::Machine Learning; ComputingMethodologies_PATTERNRECOGNITION; Physics - Data Analysis Statistics and Probability; Statistics::Methodology; COVARIANCE-MATRIX; Data Analysis Statistics and Probability (physics.data-an)
researchProduct

Appendix C. Posterior distributions of the CR-SEM parameters (conditional on the covariates being in the model).

2016

Posterior distributions of the CR-SEM parameters (conditional on the covariates being in the model).

Physics::Medical Physics; Statistics::Methodology; Quantitative Biology::Other; Physics::Geophysics; Statistics::Computation
researchProduct

Particle Group Metropolis Methods for Tracking the Leaf Area Index

2020

Monte Carlo (MC) algorithms are widely used for Bayesian inference in statistics, signal processing, and machine learning. In this work, we introduce a Markov Chain Monte Carlo (MCMC) technique driven by a particle filter. The resulting scheme is a generalization of the so-called Particle Metropolis-Hastings (PMH) method, where a suitable Markov chain of sets of weighted samples is generated. We also introduce a marginal version for jointly inferring dynamic and static variables. The proposed algorithms outperform the corresponding standard PMH schemes, as shown by numerical experiments.
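The basic PMH construction that this work generalizes can be sketched as follows: a bootstrap particle filter supplies an unbiased likelihood estimate for a state-space model, and a Metropolis-Hastings chain over the static parameter accepts or rejects using that noisy estimate. The linear-Gaussian model, particle count, and proposal scale below are illustrative assumptions, not the paper's setup (which tracks the Leaf Area Index with group-Metropolis moves).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy state-space model: x_t = a * x_{t-1} + v_t,  y_t = x_t + e_t.
T, a_true = 50, 0.8
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.standard_normal()
y = x + 0.5 * rng.standard_normal(T)

def pf_loglik(a, n_part=200):
    """Bootstrap particle filter estimate of log p(y | a).

    The Gaussian normalising constant is dropped; it is the same for
    every a, so it cancels in the Metropolis-Hastings ratio."""
    parts = rng.standard_normal(n_part)
    ll = 0.0
    for t in range(T):
        parts = a * parts + rng.standard_normal(n_part)   # propagate
        logw = -0.5 * ((y[t] - parts) / 0.5) ** 2          # weight
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())
        parts = rng.choice(parts, size=n_part, p=w / w.sum())  # resample
    return ll

# Particle Metropolis-Hastings over the static parameter a (flat prior).
a, ll = 0.5, pf_loglik(0.5)
chain = []
for _ in range(300):
    a_prop = a + 0.1 * rng.standard_normal()
    ll_prop = pf_loglik(a_prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        a, ll = a_prop, ll_prop
    chain.append(a)
a_post_mean = np.mean(chain[100:])  # should sit near a_true = 0.8
```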

Signal processing; 010504 meteorology & atmospheric sciences; Markov chain; Generalization; Computer science; Bayesian inference; Monte Carlo method; 020206 networking & telecommunications; Markov chain Monte Carlo; 02 engineering and technology; state-space models; Tracking (particle physics); Bayesian inference; 01 natural sciences; Particle Filtering; Statistics::Computation; symbols.namesake; 0202 electrical engineering electronic engineering information engineering; symbols; Particle MCMC; Particle filter; Monte Carlo; Algorithm; 0105 earth and related environmental sciences
researchProduct

A Bayesian analysis of classical hypothesis testing

1980

The procedure of maximizing the missing information is applied to derive reference posterior probabilities for null hypotheses. The results shed further light on Lindley’s paradox and suggest that a Bayesian interpretation of classical hypothesis testing is possible by providing a one-to-one approximate relationship between significance levels and posterior probabilities.
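Lindley's paradox is easy to reproduce numerically for the standard normal-mean test, H0: μ = 0 against H1: μ ~ N(0, τ²). The prior scale τ = 1 and equal prior odds below are illustrative choices, not taken from this paper: a result that is "just significant" at the 5% level (z = 1.96) supports H0 more and more strongly as the sample size grows.

```python
import math

def posterior_prob_h0(z, n, tau=1.0, sigma=1.0, prior_h0=0.5):
    """P(H0 | data) for H0: mu = 0 vs H1: mu ~ N(0, tau^2),
    given a sample mean xbar = z * sigma / sqrt(n)."""
    xbar = z * sigma / math.sqrt(n)
    s0 = sigma ** 2 / n        # variance of xbar under H0
    s1 = tau ** 2 + s0         # marginal variance of xbar under H1
    # Bayes factor BF01 = N(xbar; 0, s0) / N(xbar; 0, s1).
    log_bf01 = 0.5 * math.log(s1 / s0) + 0.5 * xbar ** 2 * (1 / s1 - 1 / s0)
    bf01 = math.exp(log_bf01)
    return prior_h0 * bf01 / (prior_h0 * bf01 + 1 - prior_h0)

# Fixed significance level, growing n: the posterior probability of H0
# climbs toward 1 even though the p-value stays at 0.05.
probs = [posterior_prob_h0(1.96, n) for n in (10, 1000, 100000)]
```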

Statistics and Probability; Bayes factor; Bayesian inference; Statistics::Computation; Bayesian statistics; Statistics; Econometrics; Bayesian experimental design; Statistics::Methodology; Statistics Probability and Uncertainty; Bayesian linear regression; Lindley's paradox; Bayesian average; Mathematics; Statistical hypothesis testing; Trabajos de Estadistica Y de Investigacion Operativa
researchProduct

What Bayesians Expect of Each Other

1991

Abstract Our goal is to study general properties of one Bayesian's subjective beliefs about the behavior of another Bayesian's subjective beliefs. We consider two Bayesians, A and B, who have different subjective distributions for a parameter θ, and study Bayesian A's expectation of Bayesian B's posterior distribution for θ given some data Y. We show that when θ can take only two values, Bayesian A always expects Bayesian B's posterior distribution to lie between the prior distributions of A and B. Conditions are given under which a similar result holds for an arbitrary real-valued parameter θ. For a vector parameter θ we present useful expressions for the mean vector and covariance matrix …
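The two-valued case is easy to check by direct computation: for binary θ and a finite observation space, A's expectation of B's posterior is a sum over outcomes weighted by A's predictive distribution. The biased-coin likelihood below is an illustrative example, not one from the paper.

```python
def expected_posterior(p_a, p_b, lik1, lik0):
    """Bayesian A's expectation of Bayesian B's posterior P(theta=1 | Y),
    for binary theta and a finite observation space.

    p_a, p_b: A's and B's prior probabilities that theta = 1.
    lik1[y], lik0[y]: P(Y = y | theta = 1) and P(Y = y | theta = 0)."""
    total = 0.0
    for f1, f0 in zip(lik1, lik0):
        marg_a = p_a * f1 + (1 - p_a) * f0                 # A's predictive
        post_b = p_b * f1 / (p_b * f1 + (1 - p_b) * f0)    # B's posterior
        total += marg_a * post_b
    return total

# Illustrative observation model: P(Y=1 | theta=1) = 0.9, P(Y=1 | theta=0) = 0.2.
lik1, lik0 = [0.1, 0.9], [0.8, 0.2]
val = expected_posterior(p_a=0.3, p_b=0.7, lik1=lik1, lik0=lik0)
# As the abstract states, val lies between the two priors, 0.3 and 0.7.
```

When the two priors coincide (p_a = p_b), this reduces to the familiar martingale property that the expected posterior equals the prior.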

Statistics and Probability; Bayesian probability; Posterior probability; Bayesian inference; Statistics::Computation; Bayesian statistics; Statistics; Bayesian experimental design; Bayesian hierarchical modeling; Applied mathematics; Statistics Probability and Uncertainty; Bayesian linear regression; Bayesian average; Mathematics; Journal of the American Statistical Association
researchProduct

An introduction to Bayesian reference analysis: inference on the ratio of multinomial parameters

1998

This paper offers an introduction to Bayesian reference analysis, often described as the most successful method for producing non-subjective, model-based posterior distributions. The ideas are illustrated in detail with an interesting problem, the ratio of multinomial parameters, for which no model-based Bayesian analysis has been proposed. Signposts are provided to the huge related literature.

Statistics and Probability; Bayesian probability; Posterior probability; Inference; Bayesian inference; computer.software_genre; Statistics::Computation; Bayesian statistics; ComputingMethodologies_PATTERNRECOGNITION; Prior probability; Econometrics; Data mining; Bayesian linear regression; Bayesian average; computer; Mathematics; Journal of the Royal Statistical Society: Series D (The Statistician)
researchProduct

Statistical inference and Monte Carlo algorithms

1996

This review article examines a small part of the interrelationship between statistical theory and computational algorithms, especially the Gibbs sampler and the Accept-Reject algorithm. We pay particular attention to how the methodologies affect and complement each other.
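Minimal versions of the two algorithms named here are easy to write down; the targets below (a Beta(2, 2) density under a uniform envelope, and a bivariate normal with known full conditionals) are textbook illustrations, not examples from the review.

```python
import numpy as np

rng = np.random.default_rng(4)

# Accept-Reject: sample Beta(2, 2), density f(x) = 6 x (1 - x) <= 1.5,
# using a Uniform(0, 1) proposal with envelope constant M = 1.5.
def beta22_accept_reject(n):
    out = []
    while len(out) < n:
        x, u = rng.uniform(), rng.uniform()
        if u * 1.5 <= 6 * x * (1 - x):   # accept with probability f(x) / M
            out.append(x)
    return np.array(out)

# Gibbs sampler: bivariate normal with correlation rho, alternating the
# exact full conditionals x | y ~ N(rho * y, 1 - rho**2) and vice versa.
def gibbs_bvn(n, rho=0.8):
    s = np.sqrt(1 - rho ** 2)
    x = y = 0.0
    draws = np.empty((n, 2))
    for i in range(n):
        x = rho * y + s * rng.standard_normal()
        y = rho * x + s * rng.standard_normal()
        draws[i] = x, y
    return draws

ar = beta22_accept_reject(5000)   # mean should be near 0.5
gb = gibbs_bvn(5000)              # empirical correlation near 0.8
```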

Statistics and Probability; Decision theory; Monte Carlo method; Markov chain Monte Carlo; Statistics::Computation; Complement (complexity); symbols.namesake; Statistical inference; symbols; Monte Carlo method in statistical physics; Statistics Probability and Uncertainty; Statistical theory; Algorithm; Gibbs sampling; Mathematics; Test
researchProduct