Search results for "Statistics::Computation"

Showing 10 of 48 documents

"Table 31" of "Search for photonic signatures of gauge-mediated supersymmetry in 8 TeV $pp$ collisions with the ATLAS detector"

2015

The total NLO cross sections with uncertainties for the electroweak production GGM signal grid for the photon+j analysis.

8000.0 · Physics::Instrumentation and Detectors · High Energy Physics::Phenomenology · Integrated Cross Section · SUSY · Cross Section · SIG · Statistics::Computation · P P --> CHARGINO1 CHARGINO1 X · P P --> NEUTRALINO2 NEUTRALINO2 X · Inclusive · P P --> CHARGINO1 NEUTRALINO2 X · Proton-Proton Scattering · High Energy Physics::Experiment · Supersymmetry
researchProduct

Recent Advances in Bayesian Inference in Cosmology and Astroparticle Physics Thanks to the MultiNest Algorithm

2012

We present a new algorithm, called MultiNest, which is a highly efficient alternative to traditional Markov Chain Monte Carlo (MCMC) sampling of posterior distributions. MultiNest is more efficient than MCMC, can deal with highly multi-modal likelihoods and returns the Bayesian evidence (or model likelihood, the prime quantity for Bayesian model comparison) together with posterior samples. It can thus be used as an all-around Bayesian inference engine. When appropriately tuned, it also provides an exploration of the profile likelihood that is competitive with what can be obtained with dedicated algorithms.
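MultiNest is an implementation of nested sampling, which accumulates the evidence Z by peeling away prior mass in shells of increasing likelihood. A toy one-dimensional sketch of that idea follows; the Gaussian likelihood and uniform prior are illustrative assumptions, and the constrained prior draws use naive rejection sampling, which is exactly the inefficiency MultiNest's ellipsoidal sampling is designed to avoid:

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(loglike, prior_draw, n_live=50, n_iter=400, seed=0):
    """Toy nested sampling estimate of the log-evidence log Z."""
    rng = random.Random(seed)
    live = [prior_draw(rng) for _ in range(n_live)]
    logL = [loglike(x) for x in live]
    logZ = -math.inf
    # shell widths: X_i = exp(-i/n_live), w_i = X_i - X_{i+1}
    log_shell = math.log(1.0 - math.exp(-1.0 / n_live))
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda j: logL[j])
        # evidence increment: L_worst * (shell of prior mass)
        logZ = logaddexp(logZ, logL[worst] - i / n_live + log_shell)
        # replace the worst point by a prior draw with L > L_worst
        # (rejection sampling: fine in 1-D, hopeless in high dimension)
        threshold = logL[worst]
        while True:
            x = prior_draw(rng)
            if loglike(x) > threshold:
                break
        live[worst], logL[worst] = x, loglike(x)
    # remaining prior mass, spread over the surviving live points
    log_rest = -n_iter / n_live - math.log(n_live)
    for l in logL:
        logZ = logaddexp(logZ, l + log_rest)
    return logZ

# Gaussian likelihood, uniform prior on [-5, 5]; true Z is close to 0.1
loglike = lambda x: -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)
prior_draw = lambda rng: rng.uniform(-5.0, 5.0)
```

For this toy problem the analytic evidence is roughly 0.1 (the prior density 1/10 times the Gaussian mass inside the box), so the estimate is easy to sanity-check.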

Astroparticle physics · Physics · Posterior probability · Sampling (statistics) · Markov chain Monte Carlo · Bayesian evidence · Bayesian inference · Cosmology · Prime (order theory) · Statistics::Computation · symbols.namesake · Settore FIS/05 - Astronomia e Astrofisica · symbols · Statistics::Methodology · Algorithm · Computer Science::Databases
researchProduct

Population Monte Carlo Schemes with Reduced Path Degeneracy

2017

Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is so-called path degeneracy: the diversity of the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity compared to the standard approach, while keeping the same implementation structure (sample generation, weighting and resampling). The new schemes combine different weighting and resampling strategies to reduce the path degeneracy and achieve higher performance at the cost of a small increase in computational complexity. Computer si…
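The generate/weight/resample loop the abstract refers to can be sketched as a basic PMC iteration. The Gaussian target, the proposal scale, and the use of deterministic-mixture weights are illustrative assumptions; the paper's schemes differ precisely in how the weighting and resampling steps are combined:

```python
import math
import random

def log_gauss(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def pmc(logtarget, init_centers, n_iter=20, sigma=1.0, seed=1):
    """Basic PMC loop: sample generation, weighting, resampling."""
    rng = random.Random(seed)
    centers = list(init_centers)
    n = len(centers)
    for _ in range(n_iter):
        # 1) generation: one draw from a Gaussian proposal at each center
        samples = [rng.gauss(c, sigma) for c in centers]
        # 2) weighting: deterministic-mixture denominator q(x) = (1/n) sum_j N(x; c_j, sigma)
        logw = []
        for x in samples:
            lqs = [log_gauss(x, c, sigma) for c in centers]
            m = max(lqs)
            logq = m + math.log(sum(math.exp(l - m) for l in lqs)) - math.log(n)
            logw.append(logtarget(x) - logq)
        m = max(logw)
        w = [math.exp(l - m) for l in logw]
        # 3) resampling: the resampled population becomes the next set of centers
        centers = rng.choices(samples, weights=w, k=n)
    # self-normalized estimate of the target mean from the final population
    return sum(wi * x for wi, x in zip(w, samples)) / sum(w)

# illustrative target: Gaussian with mean 3 (log-density up to a constant)
logtarget = lambda x: -0.5 * (x - 3.0) ** 2
centers0 = [0.1 * i - 5.0 for i in range(100)]   # spread-out initial population
```

The path degeneracy the paper addresses shows up in step 3: multinomial resampling tends to duplicate a few high-weight samples, collapsing the diversity of `centers`.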

Computational complexity theory · Monte Carlo method · Approximation algorithm · 020206 networking & telecommunications · 02 engineering and technology · 01 natural sciences · Statistics::Computation · Weighting · 010104 statistics & probability · symbols.namesake · [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing · Gaussian noise · Resampling · Path (graph theory) · 0202 electrical engineering electronic engineering information engineering · symbols · 0101 mathematics · Degeneracy (mathematics) · Algorithm · [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing · ComputingMilieux_MISCELLANEOUS
researchProduct

Distributed Particle Metropolis-Hastings Schemes

2018

We introduce a Particle Metropolis-Hastings algorithm driven by several parallel particle filters. The communication with the central node requires the transmission of only a set of weighted samples, one per filter. Furthermore, the marginal version of the previous scheme, called Distributed Particle Marginal Metropolis-Hastings (DPMMH) method, is also presented. DPMMH can be used for making inference on both dynamic and static variables of interest. The ergodicity is guaranteed, and numerical simulations show the advantages of the novel schemes.
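Each of the parallel filters in such a scheme is a standard particle filter. A minimal bootstrap filter for a linear-Gaussian state-space model is sketched below; the model and its parameters are illustrative assumptions, and the unbiased log-likelihood estimate it returns is the quantity a Particle Metropolis-Hastings chain would consume:

```python
import math
import random

def bootstrap_pf(ys, n_part=500, a=0.9, q=0.5, r=0.5, seed=2):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0, q^2), y_t = x_t + N(0, r^2).
    Returns filtered means and an unbiased estimate of the log-likelihood."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_part)]
    loglik = 0.0
    means = []
    const = math.log(r * math.sqrt(2.0 * math.pi))
    for y in ys:
        # propagate: the state transition is the proposal in a bootstrap filter
        parts = [a * x + rng.gauss(0.0, q) for x in parts]
        # weight by the observation density N(y; x, r^2)
        logw = [-0.5 * ((y - x) / r) ** 2 for x in parts]
        m = max(logw)
        w = [math.exp(l - m) for l in logw]
        s = sum(w)
        loglik += m + math.log(s / n_part) - const
        means.append(sum(wi * x for wi, x in zip(w, parts)) / s)
        # multinomial resampling
        parts = rng.choices(parts, weights=w, k=n_part)
    return means, loglik

# simulate data from the same model for a quick sanity check
rng = random.Random(7)
x_true, ys = [], []
x = 0.0
for _ in range(100):
    x = 0.9 * x + rng.gauss(0.0, 0.5)
    x_true.append(x)
    ys.append(x + rng.gauss(0.0, 0.5))
```

In the distributed setting of the paper, each node would run one such filter and transmit only a single weighted sample per filter to the central node.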

Computer science · Monte Carlo method · Ergodicity · 020206 networking & telecommunications · 02 engineering and technology · Filter (signal processing) · Bayesian inference · Statistics::Computation · Set (abstract data type) · Metropolis–Hastings algorithm · [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing · Transmission (telecommunications) · 0202 electrical engineering electronic engineering information engineering · 020201 artificial intelligence & image processing · Particle filter · [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing · Algorithm · ComputingMilieux_MISCELLANEOUS · 2018 IEEE Statistical Signal Processing Workshop (SSP)
researchProduct

Applications and Limitations of Robust Bayesian Bounds and Type II MLE

1994

Three applications of robust Bayesian analysis and three examples of its limitations are given. The applications that are reviewed are the development of an automatic Ockham’s Razor, outlier detection, and analysis of weighted distributions. Limitations of robust Bayesian bounds are highlighted through examples that include analysis of a paranormal experiment and a hierarchical model. This last example shows a disturbing difference between actual hierarchical Bayesian analysis and robust Bayesian bounds, a difference which also arises if, instead, a Type II MLE or empirical Bayes analysis is performed.

Computer science · business.industry · Bayesian probability · Machine learning · computer.software_genre · Hierarchical database model · Statistics::Computation · Bayesian robustness · Robust Bayesian analysis · Prior probability · Anomaly detection · Artificial intelligence · Bayes analysis · business · computer
researchProduct

The Effective Sample Size

2013

Model selection procedures often depend explicitly on the sample size n of the experiment. One example is the Bayesian information criterion (BIC) and another is the use of Zellner–Siow priors in Bayesian model selection. Sample size is well-defined if one has i.i.d. real observations, but is not well-defined for vector observations or in non-i.i.d. settings; extensions of criteria such as BIC to such settings thus require a definition of effective sample size that applies also in such cases. A definition of effective sample size that applies to fairly general linear models is proposed and illustrated in a variety of situations. The definition is also used to propose a suitable ‘sc…
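The explicit dependence of BIC on n can be seen in a toy comparison: BIC = k·ln(n) − 2·max-log-likelihood, so the penalty for an extra parameter grows with the sample size. The two Gaussian models below are illustrative assumptions, not the paper's examples:

```python
import math

def gaussian_bic(data, mu=None):
    """BIC = k*ln(n) - 2*max_loglik for a Gaussian model.
    mu=None fits the mean (k=2: mean + variance); a fixed mu gives k=1."""
    n = len(data)
    if mu is None:
        mu_hat, k = sum(data) / n, 2
    else:
        mu_hat, k = mu, 1
    var = sum((x - mu_hat) ** 2 for x in data) / n       # MLE of the variance
    max_loglik = -0.5 * n * (math.log(2.0 * math.pi * var) + 1.0)
    return k * math.log(n) - 2.0 * max_loglik
```

On exactly zero-mean data the two maximized likelihoods coincide, so the BIC difference between the free-mean and fixed-mean models is precisely ln(n), i.e. the fixed-mean model wins by exactly the sample-size-dependent penalty. This is the term whose meaning becomes unclear once n itself is ill-defined.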

Deviance information criterion · Economics and Econometrics · Bayesian information criterion · Sample size determination · Model selection · Prior probability · Statistics · Linear model · Bayesian inference · Algorithm · Selection (genetic algorithm) · Statistics::Computation · Mathematics · Econometric Reviews
researchProduct

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
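A minimal self-normalized importance sampler, together with a sketch of the kind of compression GIS studies: summarizing a whole population by one resampled particle that carries an aggregate of the unnormalized weights. The target, proposal, and the choice of aggregate (here, the average weight) are illustrative assumptions, not the paper's exact construction:

```python
import math
import random

def snis(logtarget, draw, logproposal, n=5000, seed=3):
    """Self-normalized importance sampling estimate of the target mean."""
    rng = random.Random(seed)
    xs = [draw(rng) for _ in range(n)]
    logw = [logtarget(x) - logproposal(x) for x in xs]
    m = max(logw)
    w = [math.exp(l - m) for l in logw]          # normalizing constants cancel
    est = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return est, xs, w

def compress_group(xs, w, rng):
    """One weighted 'summary' sample for a whole population:
    a resampled particle carrying the average unnormalized weight
    (an assumed aggregation rule for illustration)."""
    x_tilde = rng.choices(xs, weights=w, k=1)[0]
    return x_tilde, sum(w) / len(w)

# target N(2, 1), proposal N(0, 2); constants omitted since they cancel in SNIS
logtarget = lambda x: -0.5 * (x - 2.0) ** 2
draw = lambda rng: rng.gauss(0.0, 2.0)
logproposal = lambda x: -0.5 * (x / 2.0) ** 2
```

The point of such a compression is that downstream algorithms (e.g. distributed particle filters or MCMC) can exchange one weighted sample per group instead of the full population.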

FOS: Computer and information sciences · Computer Science - Machine Learning · Computer science · Posterior probability · Monte Carlo method · Machine Learning (stat.ML) · 02 engineering and technology · Multiple-try Metropolis · Statistics - Computation · 01 natural sciences · Machine Learning (cs.LG) · Computational Engineering Finance and Science (cs.CE) · Methodology (stat.ME) · 010104 statistics & probability · symbols.namesake · [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing · Statistics - Machine Learning · Artificial Intelligence · Resampling · 0202 electrical engineering electronic engineering information engineering · 0101 mathematics · Electrical and Electronic Engineering · Computer Science - Computational Engineering Finance and Science · Statistics - Methodology · Computation (stat.CO) · ComputingMilieux_MISCELLANEOUS · Markov chain · Applied Mathematics · 020206 networking & telecommunications · Markov chain Monte Carlo · Statistics::Computation · Computational Theory and Mathematics · Signal Processing · symbols · Computer Vision and Pattern Recognition · Statistics Probability and Uncertainty · Particle filter · [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing · Algorithm · Importance sampling · Digital Signal Processing
researchProduct

A Review of Multiple Try MCMC algorithms for Signal Processing

2018

Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference needs the computation of a posteriori estimators which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a t…
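The multiple-try idea the title refers to can be sketched in its simplest form: propose several candidates per step, select one proportionally to its weight, and accept with a generalized ratio computed from a reference set. The symmetric Gaussian proposal, the use of the target density as the weight function, and the standard-normal target are illustrative assumptions:

```python
import math
import random

def mtm_step(x, logtarget, rng, k=5, sigma=1.0):
    """One Multiple-Try Metropolis step with a symmetric Gaussian proposal
    and candidate weights equal to the target density."""
    ys = [x + rng.gauss(0.0, sigma) for _ in range(k)]
    wy = [math.exp(logtarget(y)) for y in ys]
    y = rng.choices(ys, weights=wy, k=1)[0]       # select one candidate
    # reference set: k-1 fresh draws around the selected candidate, plus x itself
    refs = [y + rng.gauss(0.0, sigma) for _ in range(k - 1)] + [x]
    wx = [math.exp(logtarget(z)) for z in refs]
    alpha = min(1.0, sum(wy) / sum(wx))           # generalized acceptance ratio
    return y if rng.random() < alpha else x

rng = random.Random(6)
logtarget = lambda z: -0.5 * z * z                # standard normal, up to a constant
x, chain = 3.0, []
for i in range(20000):
    x = mtm_step(x, logtarget, rng)
    if i >= 1000:                                 # discard burn-in
        chain.append(x)
```

Compared to plain Metropolis-Hastings, each step costs k target evaluations (plus the reference set) in exchange for better exploration per step; the trade-offs between the many MTM variants are exactly what such a review surveys.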

FOS: Computer and information sciences · Computer science · Monte Carlo method · Machine Learning (stat.ML) · 02 engineering and technology · Multiple-try Metropolis · Bayesian inference · 01 natural sciences · Statistics - Computation · 010104 statistics & probability · symbols.namesake · Artificial Intelligence · Statistics - Machine Learning · 0202 electrical engineering electronic engineering information engineering · 0101 mathematics · Electrical and Electronic Engineering · Computation (stat.CO) · Signal processing · Markov chain · Applied Mathematics · Estimator · 020206 networking & telecommunications · Markov chain Monte Carlo · Statistics::Computation · Computational Theory and Mathematics · Signal Processing · symbols · Sample space · Computer Vision and Pattern Recognition · Statistics Probability and Uncertainty · Algorithm
researchProduct

Metropolis Sampling

2017

Monte Carlo (MC) sampling methods are widely applied in Bayesian inference, system simulation and optimization problems. The Markov Chain Monte Carlo (MCMC) algorithms are a well-known class of MC methods which generate a Markov chain with the desired invariant distribution. In this document, we focus on the Metropolis-Hastings (MH) sampler, which can be considered as the atom of the MCMC techniques, introducing the basic notions and different properties. We describe in detail all the elements involved in the MH algorithm and the most relevant variants. Several improvements and recent extensions proposed in the literature are also briefly discussed, providing a quick but exhaustive overvie…
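As a companion to the description, a minimal random-walk MH sampler; the standard-normal target and the proposal step size are illustrative assumptions:

```python
import math
import random

def rw_metropolis(logtarget, x0, n_steps, sigma=1.0, seed=4):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x, logp = x0, logtarget(x0)
    chain = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, sigma)            # symmetric proposal: q terms cancel
        logpp = logtarget(xp)
        # accept with probability min(1, pi(x') / pi(x))
        if rng.random() < math.exp(min(0.0, logpp - logp)):
            x, logp = xp, logpp
        chain.append(x)                           # rejected moves repeat the state
    return chain

# illustrative target: standard normal, log-density up to a constant
chain = rw_metropolis(lambda z: -0.5 * z * z, x0=3.0, n_steps=20000)
```

Note that the chain records the current state even after a rejection; dropping rejected iterations would bias the invariant distribution.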

FOS: Computer and information sciences · Machine Learning (stat.ML) · 020206 networking & telecommunications · 02 engineering and technology · Statistics - Computation · 01 natural sciences · Statistics::Computation · Methodology (stat.ME) · 010104 statistics & probability · Statistics - Machine Learning · 0202 electrical engineering electronic engineering information engineering · 0101 mathematics · Computation (stat.CO) · Statistics - Methodology · Wiley StatsRef: Statistics Reference Online
researchProduct

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics, employed to draw samples from complicated high-dimensional posterior distributions. The key point for the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since this is not possible in the general case, auxiliary samples must be generated to speed up the convergence of the chain, and their information is eventually disregarded. In this work, we show that these auxiliary sample…
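The full-conditional structure the abstract refers to is clearest in the classic bivariate-Gaussian example, where both conditionals are available in closed form; the correlation value below is an illustrative assumption:

```python
import math
import random

def gibbs_bivariate(rho=0.8, n_samples=20000, seed=5):
    """Gibbs sampler for a zero-mean bivariate Gaussian with correlation rho.
    Full conditionals: x | y ~ N(rho*y, 1 - rho^2), and symmetrically for y."""
    rng = random.Random(seed)
    x = y = 0.0
    s = math.sqrt(1.0 - rho * rho)
    xs, ys = [], []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, s)   # draw x from its full conditional
        y = rng.gauss(rho * x, s)   # then y from its full conditional
        xs.append(x)
        ys.append(y)
    return xs, ys
```

When the full conditionals are not available in closed form, each of these `rng.gauss` draws has to be replaced by an inner sampling routine whose auxiliary draws are normally thrown away, which is the waste the Recycling Gibbs sampler targets.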

FOS: Computer and information sciences · Monte Carlo method · Slice sampling · Inference · Machine Learning (stat.ML) · 02 engineering and technology · Bayesian inference · Statistics - Computation · 01 natural sciences · Machine Learning (cs.LG) · 010104 statistics & probability · symbols.namesake · [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing · Statistics - Machine Learning · Artificial Intelligence · Statistics · 0202 electrical engineering electronic engineering information engineering · 0101 mathematics · Electrical and Electronic Engineering · Gaussian process · Computation (stat.CO) · ComputingMilieux_MISCELLANEOUS · Mathematics · Chain rule (probability) · Applied Mathematics · 020206 networking & telecommunications · Markov chain Monte Carlo · Statistics::Computation · Computer Science - Learning · Computational Theory and Mathematics · Signal Processing · symbols · Computer Vision and Pattern Recognition · Statistics Probability and Uncertainty · Algorithm · [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing · Gibbs sampling · Digital Signal Processing
researchProduct