Search results for "bayesian"

Showing 10 of 604 documents

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities that become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures over a set of support points constructed iteratively from previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
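
A minimal sketch of the idea, assuming a one-dimensional bounded target and substituting a crude piecewise-constant proposal for the paper's interpolation schemes; the function name `sticky_imh` and the support-point update test are illustrative, not the authors' implementation:

```python
import numpy as np

def sticky_imh(log_target, bounds, n_iter, add_threshold=1.0, rng=None):
    """Independent Metropolis-Hastings with an adaptive piecewise-constant
    proposal built from a growing set of support points (a simplified
    stand-in for the interpolation-based proposals in the paper)."""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = bounds
    support = [lo, 0.5 * (lo + hi), hi]          # initial support points
    x, chain = 0.5 * (lo + hi), []
    for _ in range(n_iter):
        knots = np.array(sorted(support))
        mids = 0.5 * (knots[:-1] + knots[1:])
        widths = np.diff(knots)
        heights = np.exp([log_target(m) for m in mids])
        probs = heights * widths
        probs /= probs.sum()

        k = rng.choice(len(probs), p=probs)      # pick an interval ...
        xp = rng.uniform(knots[k], knots[k + 1]) # ... then a point within it

        def log_q(z):                            # proposal log-density
            j = min(np.searchsorted(knots, z, side="right") - 1, len(probs) - 1)
            return np.log(probs[j] / widths[j])

        log_alpha = log_target(xp) - log_target(x) + log_q(x) - log_q(xp)
        if np.log(rng.uniform()) < log_alpha:
            x = xp
        # crude "sticky" update test: add a support point where the
        # proposal still fits the target poorly
        if abs(log_target(xp) - log_q(xp)) > add_threshold and len(support) < 200:
            support.append(xp)
        chain.append(x)
    return np.array(chain)

samples = sticky_imh(lambda z: -0.5 * z**2, bounds=(-6, 6), n_iter=5000)
```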

Keywords: adaptive Markov chain Monte Carlo (MCMC); adaptive rejection Metropolis sampling (ARMS); Bayesian inference; Gibbs sampling; hit-and-run algorithm; Metropolis-within-Gibbs; Monte Carlo methods; computational statistics; Gaussian processes; statistical signal processing
Published in: EURASIP Journal on Advances in Signal Processing
researchProduct

Bayesian Unification of Gradient and Bandit-based Learning for Accelerated Global Optimisation

2017

Bandit-based optimisation has a remarkable advantage over gradient-based approaches due to its global perspective, which eliminates the danger of getting stuck at local optima. However, for continuous optimisation problems or problems with a large number of actions, bandit-based approaches can be hindered by slow learning. Gradient-based approaches, on the other hand, navigate quickly in high-dimensional continuous spaces through local optimisation, following the gradient in fine-grained steps. Yet, apart from being susceptible to local optima, these schemes are less suited for online learning due to their reliance on extensive trial-and-error before the optimum can be identified. In this…
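
The trade-off the abstract describes can be seen in a toy comparison. This is not the paper's unified method, only a hedged illustration: finite-difference gradient ascent stalls at a nearby local optimum, while a Gaussian Thompson-sampling bandit over a discretised action space explores globally but needs many more samples:

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.5 * np.sin(7 * x)   # multimodal objective on [0, 3]

# Gradient ascent: fast local moves, but can stall at a local optimum.
x, lr = 0.3, 0.05
for _ in range(200):
    g = (f(x + 1e-4) - f(x - 1e-4)) / 2e-4          # finite-difference gradient
    x += lr * g
print("gradient ascent ends at", round(x, 3), "value", round(f(x), 3))

# Thompson-sampling bandit with Gaussian posteriors, one arm per action:
# global exploration, but slower learning per sample.
arms = np.linspace(0, 3, 30)
mu, prec = np.zeros(30), np.full(30, 1e-2)           # prior mean and precision
lam = 100.0                                          # known reward precision (sd 0.1)
for _ in range(2000):
    a = np.argmax(rng.normal(mu, 1 / np.sqrt(prec))) # sample each arm's posterior
    r = f(arms[a]) + rng.normal(0, 0.1)              # noisy reward
    mu[a] = (prec[a] * mu[a] + lam * r) / (prec[a] + lam)  # conjugate update
    prec[a] += lam
print("bandit's best arm at", round(arms[np.argmax(mu)], 3))
```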

Keywords: mathematical optimization; artificial intelligence; Bayesian probability; machine learning; Gaussian processes; local optima; margin (machine learning); algorithm design; linear approximation
researchProduct

A Bayesian Multilevel Random-Effects Model for Estimating Noise in Image Sensors

2020

Sensor noise sources cause differences in the signal recorded across pixels in a single image and across multiple images. This paper presents a Bayesian approach to decomposing and characterizing the sensor noise sources involved in imaging with digital cameras. A Bayesian probabilistic model, based on the (theoretical) model for noise sources in image sensing, is fitted to a set of time series of images with different reflectances and wavelengths under controlled lighting conditions. The image sensing model is complex, with several interacting components dependent on reflectance and wavelength. The properties of the Bayesian approach of defining conditional dependencies among parame…
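
A hedged sketch of a much-simplified random-effects decomposition on synthetic flat-scene data, separating fixed-pattern (per-pixel) from temporal (per-frame) noise; PyMC is an assumed tool here, and the paper's full model additionally conditions on reflectance and wavelength:

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
P, T = 50, 20                                   # pixels, frames (synthetic data)
b_true = rng.normal(0, 2.0, P)                  # fixed-pattern per-pixel offsets
y = 100 + b_true[:, None] + rng.normal(0, 1.0, (P, T))   # frames of a flat scene

with pm.Model():
    mu = pm.Normal("mu", 100, 50)               # mean signal level
    sigma_fpn = pm.HalfNormal("sigma_fpn", 5)   # pixel-to-pixel spread
    sigma_temp = pm.HalfNormal("sigma_temp", 5) # frame-to-frame spread
    b = pm.Normal("b", 0, sigma_fpn, shape=P)   # per-pixel random effects
    pm.Normal("y", mu + b[:, None], sigma_temp, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

post = idata.posterior
print(float(post["sigma_fpn"].mean()), float(post["sigma_temp"].mean()))
```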

Keywords: Bayesian inference; image sensors; noise; random-effects model; statistical model; mean squared error; pixels; pattern recognition; image and video processing; computer vision; signal processing
researchProduct

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key to the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since this is not possible in general, auxiliary samples must be generated to speed up the convergence of the chain, and their information is eventually disregarded. In this work, we show that these auxiliary sample…
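
The recycling idea reduced to its simplest instance, as a sketch rather than the authors' code: in a two-stage Gibbs sweep for a standard bivariate Gaussian, the half-updated intermediate states are also valid draws from the target at stationarity, so they can enter the estimator instead of being discarded:

```python
import numpy as np

rng = np.random.default_rng(2)
rho = 0.9                        # target: zero-mean bivariate Gaussian, corr rho
s = np.sqrt(1 - rho**2)

x1, x2 = 0.0, 0.0
standard, recycled = [], []
for _ in range(5000):
    x1 = rng.normal(rho * x2, s)    # draw from p(x1 | x2)
    recycled.append((x1, x2))       # recycle the half-updated state ...
    x2 = rng.normal(rho * x1, s)    # draw from p(x2 | x1)
    recycled.append((x1, x2))       # ... as well as the full-sweep state
    standard.append((x1, x2))       # standard Gibbs keeps only the full sweep

h = lambda xs: np.mean([u**2 + v**2 for u, v in xs])   # E[x1^2 + x2^2] = 2
print("standard:", round(h(standard), 3), " recycled:", round(h(recycled), 3))
```

Both averages are consistent; the recycled one simply uses twice as many (correlated) samples at no extra sampling cost.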

Keywords: Monte Carlo methods; Markov chain Monte Carlo; Gibbs sampling; slice sampling; Bayesian inference; Gaussian processes; chain rule (probability); machine learning; signal and image processing
Published in: Digital Signal Processing
researchProduct

Conditional particle filters with diffuse initial distributions

2020

Conditional particle filters (CPFs) are powerful smoothing algorithms for general nonlinear/non-Gaussian hidden Markov models. However, CPFs can be inefficient or difficult to apply with diffuse initial distributions, which are common in statistical applications. We propose a simple but generally applicable auxiliary variable method, which can be used together with the CPF in order to perform efficient inference with diffuse initial distributions. The method only requires simulatable Markov transitions that are reversible with respect to the initial distribution, which can be improper. We focus in particular on random-walk type transitions which are reversible with respect to a uniform init…
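
For orientation, a minimal sweep of a standard CPF for a scalar linear-Gaussian model; note it uses a stationary (not diffuse) initial law and omits the paper's auxiliary-variable machinery, so it illustrates only the baseline algorithm being extended:

```python
import numpy as np

def cpf_sweep(y, x_ref, N=100, a=0.9, q=1.0, r=1.0, rng=None):
    """One conditional particle filter sweep for x_t = a*x_{t-1} + N(0,q),
    y_t = x_t + N(0,r). Particle N-1 is clamped to the reference path x_ref;
    a new reference path is returned by ancestral tracing."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    X = np.zeros((T, N))
    A = np.zeros((T, N), dtype=int)
    X[0] = rng.normal(0.0, np.sqrt(q / (1 - a**2)), N)   # stationary initial law
    X[0, -1] = x_ref[0]
    W = None
    for t in range(T):
        if t > 0:
            A[t, :-1] = rng.choice(N, N - 1, p=W)        # resample ancestors
            A[t, -1] = N - 1                             # keep reference lineage
            X[t, :-1] = rng.normal(a * X[t - 1, A[t, :-1]], np.sqrt(q))
            X[t, -1] = x_ref[t]
        logw = -0.5 * (y[t] - X[t]) ** 2 / r             # bootstrap log-weights
        W = np.exp(logw - logw.max())
        W /= W.sum()
    k = rng.choice(N, p=W)                               # trace back one path
    path = np.zeros(T)
    for t in range(T - 1, -1, -1):
        path[t] = X[t, k]
        k = A[t, k]
    return path
```

Iterating `x_ref = cpf_sweep(y, x_ref)` yields a Markov chain of trajectories targeting the smoothing distribution.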

Keywords: Bayesian inference; Markov chains; hidden Markov models; state-space models; conditional particle filter; particle filter; smoothing; diffuse initialisation; adaptive Markov chain Monte Carlo; random walk; autoregressive model; compartment model; covariance; statistics
researchProduct

Unbiased Inference for Discretely Observed Hidden Markov Model Diffusions

2021

We develop a Bayesian inference method, free of discretisation bias, for diffusions observed discretely and with noise. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, it uses standard time-discretised approximations of diffusions, such as the Euler--Maruyama scheme. Our approach is based on particle marginal Metropolis--Hastings, a particle filter, randomised multilevel Monte Carlo, and an importance-sampling-type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without bias from the time discretisation as the number of Markov chain iterations increases. We give conver…
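
A sketch of the Euler--Maruyama discretisation the method builds on, the biased approximation whose bias the randomised multilevel construction removes in the limit; the model and step count below are arbitrary illustrations:

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, T, n_steps, rng=None):
    """Simulate one time-discretised path of dX = drift(X) dt + diffusion(X) dW."""
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))                # Brownian increment
        x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dW
    return x

# Ornstein--Uhlenbeck example: dX = -0.5 X dt + dW
path = euler_maruyama(lambda x: -0.5 * x, lambda x: 1.0, x0=2.0, T=10.0, n_steps=1000)
```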

Keywords: Bayesian inference; diffusions; discretisation; hidden Markov models; sequential Monte Carlo; Markov chain Monte Carlo; multilevel Monte Carlo; importance sampling; particle filter; noise; Markov chains; probability
Published in: SIAM/ASA Journal on Uncertainty Quantification
researchProduct

Estimation of causal effects with small data in the presence of trapdoor variables

2021

We consider the problem of estimating causal effects of interventions from observational data when well-known back-door and front-door adjustments are not applicable. We show that when an identifiable causal effect is subject to an implicit functional constraint that is not deducible from conditional independence relations, the estimator of the causal effect can exhibit bias in small samples. This bias is related to variables that we call trapdoor variables. We use simulated data to study different strategies to account for trapdoor variables and suggest how the related trapdoor bias might be minimized. The importance of trapdoor variables in causal effect estimation is illustrated with rea…
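
For contrast with the trapdoor setting, here is the familiar back-door adjustment on simulated data, i.e. the case the abstract says is not at issue; the numbers and causal structure are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
z = rng.binomial(1, 0.5, n)                    # observed confounder
x = rng.binomial(1, 0.2 + 0.6 * z)             # treatment depends on z
y = 1.0 * x + 2.0 * z + rng.normal(0, 1, n)    # true causal effect of x is 1.0

naive = y[x == 1].mean() - y[x == 0].mean()    # biased by the confounder

# back-door adjustment: E[y | do(x)] = sum_z E[y | x, z] P(z)
adj = sum(
    (y[(x == 1) & (z == v)].mean() - y[(x == 0) & (z == v)].mean()) * (z == v).mean()
    for v in (0, 1)
)
print(f"naive: {naive:.2f}  adjusted: {adj:.2f}  (truth: 1.00)")
```

Trapdoor variables arise precisely when no such adjustment set is available and the effect must be identified through a functional constraint instead.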

Keywords: causality; bias; Bayesian estimation; Bayesian probability; estimation; estimators; identifiability; functional constraints; conditional independence; small data; observational studies; econometrics
researchProduct

Bayesian inference for the extremal dependence

2016

A simple approach for modeling multivariate extremes is to consider the vector of component-wise maxima and their max-stable distributions. The extremal dependence can be inferred by estimating the angular measure or, alternatively, the Pickands dependence function. We propose a nonparametric Bayesian model that allows, in the bivariate case, the simultaneous estimation of both functional representations through the use of polynomials in the Bernstein form. The constraints required to provide a valid extremal dependence are addressed in a straightforward manner, by placing a prior on the coefficients of the Bernstein polynomials which gives probability one to the set of valid functions. The…
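
A small sketch of the Bernstein-form representation together with a numerical check of the validity constraints; the paper instead enforces these constraints through the prior on the coefficients, and the coefficient values below are arbitrary:

```python
import numpy as np
from math import comb

def pickands_bernstein(t, beta):
    """Pickands dependence function in Bernstein form:
    A(t) = sum_k beta_k * C(n,k) * t^k * (1-t)^(n-k)."""
    n = len(beta) - 1
    t = np.asarray(t, dtype=float)
    return sum(b * comb(n, k) * t**k * (1 - t)**(n - k) for k, b in enumerate(beta))

def is_valid(beta, grid=np.linspace(0, 1, 201)):
    """Numerically check that A is a valid Pickands function:
    A(0) = A(1) = 1, max(t, 1-t) <= A(t) <= 1, and convexity."""
    A = pickands_bernstein(grid, beta)
    endpoints = np.isclose(A[0], 1) and np.isclose(A[-1], 1)
    bounds = np.all(A <= 1 + 1e-9) and np.all(A >= np.maximum(grid, 1 - grid) - 1e-9)
    convex = np.all(np.diff(A, 2) >= -1e-9)
    return endpoints and bounds and convex

beta = [1.0, 0.85, 0.8, 0.85, 1.0]      # symmetric coefficients
print(is_valid(beta), pickands_bernstein(0.5, beta))   # A(1/2) < 1: dependence
```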

Keywords: extremal dependence; generalised extreme value distribution; angular measure; max-stable distributions; Bernstein polynomials; Bayesian nonparametrics; Bayesian inference; nonparametric statistics; trans-dimensional MCMC; Markov chain Monte Carlo; exchange rates; maxima
Published in: Electronic Journal of Statistics
researchProduct

Bayesian Checking of the Second Levels of Hierarchical Models

2007

Hierarchical models are increasingly used in many applications. Along with this increased use comes a desire to investigate whether the model is compatible with the observed data. Bayesian methods are well suited to eliminate the many (nuisance) parameters in these complicated models; in this paper we investigate Bayesian methods for model checking. Since we contemplate model checking as a preliminary, exploratory analysis, we concentrate on objective Bayesian methods in which careful specification of an informative prior distribution is avoided. Numerous examples are given and different proposals are investigated and critically compared.
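
One of the checks under comparison, the plain posterior predictive p-value, in a deliberately misspecified toy model; the paper also studies partial posterior predictive and related proposals, which this sketch does not implement:

```python
import numpy as np

rng = np.random.default_rng(4)
y = rng.normal(0, 3, size=40)            # observed data; the model assumes sd = 1

# Model: y_i ~ N(theta, 1) with a flat prior  =>  theta | y ~ N(ybar, 1/n)
n, ybar = len(y), y.mean()
stat = lambda d: d.std()                 # discrepancy aimed at the variance

# simulate replicate data sets from the posterior predictive and compare
# their discrepancy with the observed one
p = np.mean([
    stat(rng.normal(rng.normal(ybar, 1 / np.sqrt(n)), 1, size=n)) >= stat(y)
    for _ in range(5000)
])
print(f"posterior predictive p-value: {p:.3f}")   # near 0 or 1 flags misfit
```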

Keywords: model checking; model criticism; conflict; objective Bayesian methods; empirical Bayes; posterior predictive; partial posterior predictive; p-values; prior probability; exploratory analysis
researchProduct

Efficient Bayesian generalized linear models with time-varying coefficients : The walker package in R

2020

The R package walker extends standard Bayesian generalized linear models to the case where the effects of the explanatory variables can vary in time. This allows one, for example, to model interventions, such as changes in tax policy, whose effect increases gradually over time. The Markov chain Monte Carlo algorithms powering the Bayesian inference are based on Hamiltonian Monte Carlo, as provided by the Stan software, using a state space representation of the model to marginalise over the regression coefficients for efficient low-dimensional sampling.
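
walker itself is an R package; the sketch below illustrates, in Python, the marginalisation the abstract describes: a Kalman filter integrates out the time-varying coefficients, leaving only a few standard deviations to be sampled (by Stan/HMC in the real package). The function and priors here are assumptions for illustration, not the walker API:

```python
import numpy as np

def kalman_loglik(y, X, sigma_y, sigma_b):
    """Marginal log-likelihood of y_t = X_t @ beta_t + e_t with random-walk
    coefficients beta_t = beta_{t-1} + w_t, the betas integrated out by a
    Kalman filter so only (sigma_y, sigma_b) need low-dimensional sampling."""
    T, k = X.shape
    m = np.zeros(k)                        # prior mean on beta_0
    P = np.eye(k) * 1e3                    # vague prior covariance on beta_0
    Q = np.eye(k) * sigma_b**2             # random-walk increment covariance
    ll = 0.0
    for t in range(T):
        P = P + Q                          # predict the random-walk coefficients
        S = X[t] @ P @ X[t] + sigma_y**2   # one-step prediction variance of y_t
        v = y[t] - X[t] @ m                # one-step prediction error
        ll += -0.5 * (np.log(2 * np.pi * S) + v**2 / S)
        K = P @ X[t] / S                   # Kalman gain
        m = m + K * v                      # filtered mean
        P = P - np.outer(K, X[t] @ P)      # filtered covariance
    return ll
```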

Keywords: Bayesian inference; time series; time-varying regression; linear models; Markov chains; Markov chain Monte Carlo; Monte Carlo methods; regression analysis; R (language)
researchProduct