Search results for "Statistics::Computation"
Showing 10 of 48 documents
"Table 31" of "Search for photonic signatures of gauge-mediated supersymmetry in 8 TeV $pp$ collisions with the ATLAS detector"
2015
The total NLO cross sections with uncertainties for the electroweak production GGM signal grid for the photon+j analysis.
Recent Advances in Bayesian Inference in Cosmology and Astroparticle Physics Thanks to the MultiNest Algorithm
2012
We present a new algorithm, called MultiNest, which is a highly efficient alternative to traditional Markov Chain Monte Carlo (MCMC) sampling of posterior distributions. MultiNest is more efficient than MCMC, can deal with highly multi-modal likelihoods and returns the Bayesian evidence (or model likelihood, the prime quantity for Bayesian model comparison) together with posterior samples. It can thus be used as an all-around Bayesian inference engine. When appropriately tuned, it also provides an exploration of the profile likelihood that is competitive with what can be obtained with dedicated algorithms.
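MultiNest itself uses ellipsoidal decomposition to draw constrained samples; as a rough illustration of the underlying nested-sampling idea only (not the MultiNest algorithm), a minimal sketch that estimates the log-evidence by repeatedly replacing the worst live point with a prior draw above its likelihood might look like this. All names and the brute-force rejection step are illustrative:

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(log_likelihood, sample_prior, n_live=50, n_iter=300):
    """Toy nested sampling: estimates the log-evidence log Z.

    New points are drawn by brute-force rejection from the prior, which is
    exactly the inefficiency MultiNest's ellipsoidal sampling avoids.
    """
    live = [sample_prior() for _ in range(n_live)]
    logl = [log_likelihood(x) for x in live]
    log_z, log_x = -math.inf, 0.0  # running evidence, log prior volume
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda j: logl[j])
        log_x_new = -(i + 1) / n_live  # expected volume shrinkage per step
        # evidence contribution of the discarded prior-volume shell
        log_w = math.log(math.exp(log_x) - math.exp(log_x_new)) + logl[worst]
        log_z = logaddexp(log_z, log_w)
        log_x = log_x_new
        threshold = logl[worst]
        while True:  # replace worst point with a prior draw above the bound
            x = sample_prior()
            if log_likelihood(x) > threshold:
                live[worst], logl[worst] = x, log_likelihood(x)
                break
    # contribution of the remaining live points
    for ll in logl:
        log_z = logaddexp(log_z, log_x - math.log(n_live) + ll)
    return log_z

# Example: Gaussian likelihood N(0, 0.5) under a U(-1, 1) prior
random.seed(0)
log_z = nested_sampling(
    log_likelihood=lambda x: -0.5 * (x / 0.5) ** 2
                             - math.log(0.5 * math.sqrt(2 * math.pi)),
    sample_prior=lambda: random.uniform(-1.0, 1.0),
)
```

The posterior samples MultiNest also returns would come from the discarded points, each carrying the shell weight computed above.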
Population Monte Carlo Schemes with Reduced Path Degeneracy
2017
Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is the so-called path degeneracy: the diversity of the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity compared to the standard approach, while keeping the same implementation structure (sample generation, weighting and resampling). The new schemes combine different weighting and resampling strategies to reduce the path degeneracy and achieve higher performance at the cost of a small increase in computational complexity. Computer si…
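The generate–weight–resample structure the abstract refers to can be sketched as follows (standard PMC with multinomial resampling, not the reduced-degeneracy variants the paper proposes; all names are illustrative):

```python
import math
import random

def norm_logpdf(x, mu, sd):
    """Log-density of a normal distribution."""
    return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))

def pmc_iteration(particles, log_target, step=0.5):
    """One standard PMC iteration: propose, weight, multinomially resample.

    Each particle proposes from a Gaussian kernel centered on itself;
    weights are target density over proposal density. The multinomial
    resampling step is the source of the path degeneracy discussed above.
    """
    proposals = [p + random.gauss(0.0, step) for p in particles]
    logw = [log_target(x) - norm_logpdf(x, p, step)
            for x, p in zip(proposals, particles)]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]  # stabilized weights
    return random.choices(proposals, weights=w, k=len(particles))

# Example: adapt a scattered population toward a standard normal target
random.seed(1)
pop = [random.uniform(-5.0, 5.0) for _ in range(500)]
for _ in range(30):
    pop = pmc_iteration(pop, lambda x: -0.5 * x * x)
mean = sum(pop) / len(pop)
```

Because `random.choices` duplicates high-weight proposals, repeated iterations tend to collapse ancestry onto a few paths, which is the degeneracy the paper's alternative weighting/resampling combinations are designed to mitigate.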
Distributed Particle Metropolis-Hastings Schemes
2018
We introduce a Particle Metropolis-Hastings algorithm driven by several parallel particle filters. The communication with the central node requires the transmission of only a set of weighted samples, one per filter. Furthermore, the marginal version of the previous scheme, called the Distributed Particle Marginal Metropolis-Hastings (DPMMH) method, is also presented. DPMMH can be used for making inference on both dynamic and static variables of interest. Ergodicity is guaranteed, and numerical simulations show the advantages of the novel schemes.
Applications and Limitations of Robust Bayesian Bounds and Type II MLE
1994
Three applications of robust Bayesian analysis and three examples of its limitations are given. The applications that are reviewed are the development of an automatic Ockham’s Razor, outlier detection, and analysis of weighted distributions. Limitations of robust Bayesian bounds are highlighted through examples that include analysis of a paranormal experiment and a hierarchical model. This last example shows a disturbing difference between actual hierarchical Bayesian analysis and robust Bayesian bounds, a difference which also arises if, instead, a Type II MLE or empirical Bayes analysis is performed.
The Effective Sample Size
2013
Model selection procedures often depend explicitly on the sample size n of the experiment. One example is the Bayesian information criterion (BIC), and another is the use of Zellner–Siow priors in Bayesian model selection. Sample size is well-defined for i.i.d. real observations, but not for vector observations or in non-i.i.d. settings; extending criteria such as BIC to such settings thus requires a definition of effective sample size that applies in those cases as well. A definition of effective sample size that applies to fairly general linear models is proposed and illustrated in a variety of situations. The definition is also used to propose a suitable ‘sc…
Group Importance Sampling for particle filtering and MCMC
2018
Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
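The basic IS mechanism this work builds on can be sketched as follows (plain self-normalized importance sampling, not the GIS compression scheme itself; names and distributions are illustrative):

```python
import math
import random

def importance_sampling(log_target, sample_proposal, log_proposal, n):
    """Self-normalized importance sampling estimate of the target mean.

    Both log-densities may be unnormalized: the normalizing constants
    cancel in the self-normalized estimator.
    """
    xs = [sample_proposal() for _ in range(n)]
    logw = [log_target(x) - log_proposal(x) for x in xs]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]  # stabilized unnormalized weights
    wsum = sum(w)
    estimate = sum(wi * xi for wi, xi in zip(w, xs)) / wsum
    return estimate, [wi / wsum for wi in w]

# Example: target N(1, 1), proposal N(0, 2); the estimate approaches 1
random.seed(1)
est, weights = importance_sampling(
    log_target=lambda x: -0.5 * (x - 1.0) ** 2,
    sample_proposal=lambda: random.gauss(0.0, 2.0),
    log_proposal=lambda x: -0.5 * (x / 2.0) ** 2,
    n=20000,
)
```

GIS-style compression would then summarize such a weighted population by a single sample and weight; the sketch above only shows the population it starts from.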
A Review of Multiple Try MCMC algorithms for Signal Processing
2018
Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference requires the computation of a posteriori estimators, which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms, which generate a Markov chain whose stationary distribution coincides with the target posterior density. In this work, we perform a t…
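One classical member of the multiple-try family, the multiple-try Metropolis of Liu, Liang and Wong with a symmetric Gaussian proposal and weight function w(y, x) = π(y), can be sketched as follows (a minimal illustration, not any specific variant surveyed in the review):

```python
import math
import random

def multiple_try_metropolis(log_target, x0, n_samples, k=5, step=1.0):
    """Multiple-try Metropolis with a symmetric Gaussian proposal.

    Draws k candidates, selects one proportionally to the target density,
    then balances the move with a reference set built around the winner.
    """
    x = x0
    out = []
    for _ in range(n_samples):
        # draw k candidates around the current state
        ys = [x + random.gauss(0.0, step) for _ in range(k)]
        wy = [math.exp(log_target(y)) for y in ys]
        y = random.choices(ys, weights=wy, k=1)[0]
        # reference set: k - 1 draws around the winner, plus the current x
        refs = [y + random.gauss(0.0, step) for _ in range(k - 1)] + [x]
        wx = [math.exp(log_target(z)) for z in refs]
        # generalized acceptance ratio of Liu, Liang and Wong
        if random.random() < min(1.0, sum(wy) / sum(wx)):
            x = y
        out.append(x)
    return out

# Example: sample a standard normal target
random.seed(2)
chain = multiple_try_metropolis(lambda v: -0.5 * v * v, x0=0.0,
                                n_samples=5000)
mean = sum(chain) / len(chain)
```

Relative to plain Metropolis-Hastings, the k candidates buy a higher acceptance rate at the price of extra target evaluations per iteration, which is the trade-off such reviews typically quantify.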
Metropolis Sampling
2017
Monte Carlo (MC) sampling methods are widely applied in Bayesian inference, system simulation and optimization problems. The Markov Chain Monte Carlo (MCMC) algorithms are a well-known class of MC methods that generate a Markov chain with the desired invariant distribution. In this document, we focus on the Metropolis-Hastings (MH) sampler, which can be considered the atom of MCMC techniques, introducing the basic notions and different properties. We describe in detail all the elements involved in the MH algorithm and the most relevant variants. Several improvements and recent extensions proposed in the literature are also briefly discussed, providing a quick but exhaustive overvie…
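The core MH step described here admits a very short sketch (random-walk variant with a symmetric Gaussian proposal, so the proposal densities cancel in the acceptance ratio; names are illustrative):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    x = x0
    log_px = log_target(x)
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        log_pp = log_target(proposal)
        # accept with probability min(1, pi(proposal) / pi(x))
        if log_pp >= log_px or random.random() < math.exp(log_pp - log_px):
            x, log_px = proposal, log_pp
        samples.append(x)
    return samples

# Example: sample a standard normal from its log-density up to a constant
random.seed(0)
chain = metropolis_hastings(lambda v: -0.5 * v * v, x0=0.0, n_samples=5000)
mean = sum(chain) / len(chain)
```

Only the ratio of target densities is needed, which is why MH works with unnormalized posteriors; the variants the document surveys mostly differ in how the proposal is chosen and adapted.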
The Recycling Gibbs sampler for efficient learning
2018
Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key to the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since this is not possible in general, speeding up the convergence of the chain requires generating auxiliary samples whose information is eventually disregarded. In this work, we show that these auxiliary sample…
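For a case where the full conditionals are available in closed form, the standard (non-recycling) Gibbs sampler the abstract starts from can be sketched as follows, here for a bivariate standard normal with correlation rho, whose conditionals are N(rho·other, 1 − rho²); the example is illustrative, not taken from the paper:

```python
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    Alternates draws from the two full conditionals,
    each of which is N(rho * other, 1 - rho**2).
    """
    x, y = 0.0, 0.0
    sd = (1.0 - rho * rho) ** 0.5
    samples = []
    for i in range(n_samples + burn_in):
        x = random.gauss(rho * y, sd)  # draw x | y
        y = random.gauss(rho * x, sd)  # draw y | x
        if i >= burn_in:
            samples.append((x, y))
    return samples

# Example: the retained draws should exhibit correlation near 0.8
random.seed(3)
draws = gibbs_bivariate_normal(rho=0.8, n_samples=5000)
```

When such exact conditional draws are unavailable, auxiliary samples (e.g. from MH steps within Gibbs) come into play, and recycling them is the point of the paper.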