Search results for "Markov"

Showing 10 of 628 documents

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample which compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
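
As a rough illustration of the premise (not the paper's GIS estimators), the sketch below runs standard self-normalised importance sampling and then compresses the weighted population into a single weighted sample; the target, proposal, and compression rule are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unnormalised log-target and proposal -- assumptions, not from the paper.
log_target = lambda x: -0.5 * (x - 2.0) ** 2           # N(2, 1), up to a constant
proposal_sample = lambda n: rng.normal(0.0, 3.0, n)     # N(0, 3^2)
log_proposal = lambda x: -0.5 * (x / 3.0) ** 2 - np.log(3.0)   # also up to a constant

# Standard importance sampling: weighted samples approximate posterior expectations.
x = proposal_sample(1000)
logw = log_target(x) - log_proposal(x)
w = np.exp(logw - logw.max())
w /= w.sum()
post_mean = np.sum(w * x)                               # self-normalised IS estimate of E[x]

# One possible "group" compression: replace the whole population with a single weighted
# sample whose weight carries the total (unnormalised) mass and whose value is the
# weighted mean of the population.
group_weight = np.exp(logw).sum() / len(x)
group_sample = post_mean
print(post_mean, group_weight, group_sample)
```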

FOS: Computer and information sciences; Machine Learning (stat.ML, cs.LG); Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); Computational Engineering, Finance and Science (cs.CE); importance sampling; particle filter; resampling; multiple-try Metropolis; Markov chain; Markov chain Monte Carlo; Monte Carlo method; posterior probability; signal processing; digital signal processing; algorithm

Expanding the Active Inference Landscape: More Intrinsic Motivations in the Perception-Action Loop

2018

Active inference is an ambitious theory that treats perception, inference and action selection of autonomous agents under the heading of a single principle. It suggests biologically plausible explanations for many cognitive phenomena, including consciousness. In active inference, action selection is driven by an objective function that evaluates possible future actions with respect to current, inferred beliefs about the world. Active inference at its core is independent from extrinsic rewards, resulting in a high level of robustness across e.g. different environments or agent morphologies. In the literature, paradigms that share this independence have been summarised under the notion of in…

FOS: Computer and information sciences; Artificial Intelligence (cs.AI); Systems and Control (eess.SY); active inference; free energy principle; perception-action loop; intrinsic motivation; empowerment; predictive information; universal reinforcement learning; variational inference; action selection; partially observable Markov decision process; biological plausibility; cognitive science; robotics and AI

A Review of Multiple Try MCMC algorithms for Signal Processing

2018

Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference needs the computation of a posteriori estimators which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a t…
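
As a concrete reference point for the class of algorithms reviewed, here is a minimal sketch of one Multiple-Try Metropolis step with a symmetric Gaussian random-walk proposal and a unit weight function; the bimodal target and all tuning values are illustrative assumptions, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative unnormalised target: a bimodal Gaussian mixture (an assumption for the demo).
def target(x):
    return 0.5 * np.exp(-0.5 * (x + 3) ** 2) + 0.5 * np.exp(-0.5 * (x - 3) ** 2)

def mtm_step(x, k=5, sigma=2.0):
    """One Multiple-Try Metropolis step with a symmetric Gaussian random-walk proposal."""
    # 1) Draw k candidates around the current state and weight them; with a symmetric
    #    proposal and lambda = 1, the MTM weight reduces to w(y, x) = target(y) * q(x | y).
    y = x + sigma * rng.normal(size=k)
    w_y = target(y) * np.exp(-0.5 * ((x - y) / sigma) ** 2)
    # 2) Select one candidate proportionally to its weight.
    j = rng.choice(k, p=w_y / w_y.sum())
    y_sel = y[j]
    # 3) Draw k-1 reference points around the selected candidate, plus the current state.
    x_ref = np.append(y_sel + sigma * rng.normal(size=k - 1), x)
    w_x = target(x_ref) * np.exp(-0.5 * ((y_sel - x_ref) / sigma) ** 2)
    # 4) Generalised Metropolis-Hastings acceptance ratio.
    alpha = min(1.0, w_y.sum() / w_x.sum())
    return y_sel if rng.random() < alpha else x

chain = [0.0]
for _ in range(5000):
    chain.append(mtm_step(chain[-1]))
print(np.mean(chain), np.std(chain))
```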

FOS: Computer and information sciences; Machine Learning (stat.ML); Statistics - Computation (stat.CO); multiple-try Metropolis; Markov chain; Markov chain Monte Carlo; Monte Carlo method; Bayesian inference; estimator; sample space; signal processing; algorithm

On resampling schemes for particle filters with weakly informative observations

2022

We consider particle filters with weakly informative observations (or 'potentials') relative to the latent state dynamics. The particular focus of this work is on particle filters used to approximate time-discretisations of continuous-time Feynman-Kac path integral models, a scenario that naturally arises when addressing filtering and smoothing problems in continuous time, but our findings are indicative of weakly informative settings beyond this context too. We study the performance of different resampling schemes, such as systematic resampling, SSP (Srinivasan sampling process) and stratified resampling, as the time-discretisation becomes finer and also identify their continuous-time l…
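
For orientation, two of the resampling schemes named in the abstract, systematic and stratified resampling, admit very short textbook implementations (the SSP scheme is more involved and omitted here); the sketch below is a generic illustration, not the paper's code.

```python
import numpy as np

def systematic_resampling(weights, rng):
    """Systematic resampling: one uniform offset, N evenly spaced points on the CDF."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                      # guard against round-off
    return np.searchsorted(cumsum, positions)

def stratified_resampling(weights, rng):
    """Stratified resampling: one independent uniform per stratum [i/N, (i+1)/N)."""
    n = len(weights)
    positions = (rng.random(n) + np.arange(n)) / n
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0
    return np.searchsorted(cumsum, positions)

rng = np.random.default_rng(2)
w = rng.random(10)                        # illustrative particle weights
w /= w.sum()
print(systematic_resampling(w, rng))
print(stratified_resampling(w, rng))
```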

FOS: Computer and information sciences; FOS: Mathematics; Probability (math.PR); Statistics - Computation (stat.CO); Methodology (stat.ME); hidden Markov model; particle filter; resampling; sampling; Feynman-Kac model; path integral; Markov chains; stochastic processes; numerical analysis; statistical models; MSC: 65C35 (primary), 65C05, 65C60, 60J25 (secondary)

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities which become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points which is constructed iteratively based on previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
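
A heavily simplified sketch of the general idea (not the paper's exact proposal construction or support-point test): an independent Metropolis-Hastings sampler whose proposal is a piecewise-constant interpolation of the target over a growing set of support points, with an illustrative rule that adds a tried point wherever the interpolation is poor.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative unnormalised target on a bounded interval (assumptions for the sketch).
target = lambda x: np.exp(-0.5 * ((x - 1.0) / 0.7) ** 2)
a, b = -5.0, 5.0

def hat(x, s, heights):
    """Piecewise-constant approximation of the target built from the support points."""
    i = np.clip(np.searchsorted(s, x) - 1, 0, len(heights) - 1)
    return heights[i]

support = [a, -1.0, 0.0, 2.0, b]
x, chain = 0.0, []
for _ in range(3000):
    s = np.sort(np.array(support))
    heights = np.maximum(target(s[:-1]), target(s[1:]))      # simple interpolation rule
    probs = heights * np.diff(s)
    probs = probs / probs.sum()
    # Independent proposal: pick an interval by mass, then sample uniformly within it.
    i = rng.choice(len(probs), p=probs)
    z = rng.uniform(s[i], s[i + 1])
    # Independent MH acceptance; the proposal is proportional to hat(), so constants cancel.
    alpha = min(1.0, (target(z) * hat(x, s, heights)) / (target(x) * hat(z, s, heights)))
    x = z if rng.random() < alpha else x
    chain.append(x)
    # Illustrative "sticky" update test: enlarge the support set where the interpolation
    # mismatches the target, so the proposal moves towards the target as iterations
    # increase while the growth of the support set stays controlled.
    mismatch = abs(target(z) - hat(z, s, heights)) / max(target(z), hat(z, s, heights))
    if rng.random() < mismatch:
        support.append(z)

print(np.mean(chain), len(support))
```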

FOS: Computer and information sciences; Machine Learning (stat.ML); Statistics - Computation (stat.CO); adaptive Markov chain Monte Carlo (MCMC); adaptive rejection Metropolis sampling (ARMS); Gibbs sampling; Metropolis-within-Gibbs; hit-and-run algorithm; Bayesian inference; Gaussian process; Monte Carlo methods; computational statistics; statistical signal processing; mathematical optimization; EURASIP Journal on Advances in Signal Processing

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key point for the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since this is not possible in the general case, auxiliary samples must be generated to speed up the convergence of the chain, and their information is eventually disregarded. In this work, we show that these auxiliary sample…
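
For reference, the mechanism the abstract builds on, drawing each coordinate from its full conditional, looks as follows for a bivariate Gaussian target (an illustrative case where the conditionals are tractable); when they are not, inner Metropolis or rejection steps produce the auxiliary draws that a plain sampler throws away.

```python
import numpy as np

rng = np.random.default_rng(4)

# Standard Gibbs sampler for a bivariate Gaussian with correlation rho (illustrative target).
rho, n_iter = 0.8, 5000
x = np.zeros(2)
samples = np.empty((n_iter, 2))
for t in range(n_iter):
    # Draw each coordinate from its full conditional given the other coordinate.
    x[0] = rng.normal(rho * x[1], np.sqrt(1 - rho ** 2))
    x[1] = rng.normal(rho * x[0], np.sqrt(1 - rho ** 2))
    samples[t] = x

print(samples.mean(axis=0), np.corrcoef(samples.T)[0, 1])
```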

FOS: Computer and information sciences; Machine Learning (stat.ML, cs.LG); Statistics - Computation (stat.CO); Gibbs sampling; slice sampling; Markov chain Monte Carlo; Monte Carlo method; Bayesian inference; Gaussian process; chain rule (probability); signal processing; digital signal processing; algorithm

Quadratic speedup for finding marked vertices by quantum walks

2020

A quantum walk algorithm can detect the presence of a marked vertex on a graph quadratically faster than the corresponding random walk algorithm (Szegedy, FOCS 2004). However, quantum algorithms that actually find a marked element quadratically faster than a classical random walk were only known for the special case when the marked set consists of just a single vertex, or in the case of some specific graphs. We present a new quantum algorithm for finding a marked vertex in any graph, with any set of marked vertices, that is (up to a log factor) quadratically faster than the corresponding classical random walk.

FOS: Computer and information sciences; FOS: Physical sciences; FOS: Mathematics; Quantum Physics (quant-ph); Data Structures and Algorithms (cs.DS); Probability (math.PR); quantum walks; quantum search; quantum algorithms; search by random walk; random walk; Markov chains; speedup; combinatorics

Combining Markov Random Fields and Convolutional Neural Networks for Image Synthesis

2016

This paper studies a combination of generative Markov random field (MRF) models and discriminatively trained deep convolutional neural networks (dCNNs) for synthesizing 2D images. The generative MRF acts on the higher levels of a dCNN feature pyramid, controlling the image layout at an abstract level. We apply the method to both photographic and non-photo-realistic (artwork) synthesis tasks. The MRF regularizer prevents over-excitation artifacts and reduces implausible feature mixtures common to previous dCNN inversion approaches, permitting the synthesis of photographic content with increased visual plausibility. Unlike standard MRF-based texture synthesis, the combined system can both match and adap…
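
A rough numpy sketch of the MRF ingredient described here: every patch of the synthesised feature map is matched to its nearest style patch by normalised cross-correlation and penalised by the squared distance to that match. The random feature maps, the patch size, and the absence of the content and smoothness terms are simplifications made for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def extract_patches(feat, k=3):
    """All k x k patches from a (C, H, W) feature map, flattened to rows."""
    c, h, w = feat.shape
    out = []
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            out.append(feat[:, i:i + k, j:j + k].ravel())
    return np.stack(out)

def mrf_loss(synth_feat, style_feat, k=3):
    """MRF term: each synthesis patch is matched to its nearest style patch
    (by normalised cross-correlation) and penalised by the squared distance to it."""
    p_s = extract_patches(synth_feat, k)
    p_t = extract_patches(style_feat, k)
    p_s_n = p_s / np.linalg.norm(p_s, axis=1, keepdims=True)
    p_t_n = p_t / np.linalg.norm(p_t, axis=1, keepdims=True)
    nn = np.argmax(p_s_n @ p_t_n.T, axis=1)          # nearest-neighbour style patch indices
    return np.sum((p_s - p_t[nn]) ** 2)

# Illustrative stand-ins for dCNN feature maps (e.g. a mid-level convolutional layer).
synth = rng.normal(size=(8, 10, 10))
style = rng.normal(size=(8, 12, 12))
print(mrf_loss(synth, style))
```

In the paper this term is combined with a content objective and optimised over the image by backpropagating through the network; the sketch only shows the patch-matching loss itself.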

FOS: Computer and information sciences; Computer Vision and Pattern Recognition (cs.CV); Markov random field; Markov chain; random field; convolutional neural network; artificial neural network; texture synthesis; iterative reconstruction; pattern recognition; computer vision; artificial intelligence; 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)

Conditional particle filters with diffuse initial distributions

2020

Conditional particle filters (CPFs) are powerful smoothing algorithms for general nonlinear/non-Gaussian hidden Markov models. However, CPFs can be inefficient or difficult to apply with diffuse initial distributions, which are common in statistical applications. We propose a simple but generally applicable auxiliary variable method, which can be used together with the CPF in order to perform efficient inference with diffuse initial distributions. The method only requires simulatable Markov transitions that are reversible with respect to the initial distribution, which can be improper. We focus in particular on random-walk type transitions which are reversible with respect to a uniform init…
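
The key requirement quoted in the abstract, a Markov transition reversible with respect to the (possibly improper) initial distribution, is satisfied for example by a symmetric random-walk kernel; the snippet below only checks this symmetry numerically and does not reproduce the paper's auxiliary variable construction.

```python
import numpy as np

# A random-walk transition x' = x + eps with symmetric noise density g is reversible with
# respect to a uniform (possibly improper) initial distribution, because its transition
# density k(x, x') = g(x' - x) satisfies k(x, x') = k(x', x) whenever g(d) = g(-d).
g = lambda d, s=1.0: np.exp(-0.5 * (d / s) ** 2) / (s * np.sqrt(2 * np.pi))
x, xp = 0.3, -1.7
print(np.isclose(g(xp - x), g(x - xp)))   # detailed balance w.r.t. the uniform density
```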

FOS: Computer and information sciences; Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); conditional particle filter; particle filter; diffuse initialisation; hidden Markov model; state space model; smoothing; Bayesian inference; adaptive Markov chain Monte Carlo; Markov chains; random walk; autoregressive model; compartment model; covariance; statistical physics; mathematical methods

Unbiased Inference for Discretely Observed Hidden Markov Model Diffusions

2021

We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretisation bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, our method uses standard time-discretised approximations of diffusions, such as the Euler-Maruyama scheme. Our approach is based on particle marginal Metropolis-Hastings, a particle filter, randomised multilevel Monte Carlo, and an importance sampling type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretisation as the number of Markov chain iterations increases. We give conver…
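
The time-discretised approximation the method starts from can be as simple as the Euler-Maruyama scheme sketched below for an illustrative Ornstein-Uhlenbeck diffusion with noisy discrete observations; the debiasing machinery (particle marginal Metropolis-Hastings, randomised multilevel Monte Carlo, and the importance sampling correction) is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)

def euler_maruyama(x0, drift, sigma, t_end, n_steps, rng):
    """Euler-Maruyama time discretisation of dX_t = drift(X_t) dt + sigma(X_t) dW_t."""
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = x[k] + drift(x[k]) * dt + sigma(x[k]) * np.sqrt(dt) * rng.normal()
    return x

# Illustrative Ornstein-Uhlenbeck diffusion, observed discretely with Gaussian noise.
drift = lambda x: -0.5 * x
sigma = lambda x: 1.0
path = euler_maruyama(1.0, drift, sigma, t_end=10.0, n_steps=1000, rng=rng)
obs = path[::100] + 0.2 * rng.normal(size=11)     # noisy observations at unit times
print(obs)
```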

FOS: Computer and information sciences; FOS: Mathematics; Probability (math.PR); Statistics - Computation (stat.CO); Methodology (stat.ME); hidden Markov model; diffusion; discretization; sequential Monte Carlo; Markov chain Monte Carlo; multilevel Monte Carlo; particle filter; importance sampling; Bayesian inference; noise; MSC: 65C05 (primary), 60H35, 65C35, 65C40 (secondary); SIAM/ASA Journal on Uncertainty Quantification