Search results for "Bay"

Showing 10 of 1,187 documents

Efficient linear fusion of partial estimators

2018

Abstract Many signal processing applications require performing statistical inference on large datasets, where computational and/or memory restrictions become an issue. In this big data setting, computing an exact global centralized estimator is often either unfeasible or impractical. Hence, several authors have considered distributed inference approaches, where the data are divided among multiple workers (cores, machines or a combination of both). The computations are then performed in parallel and the resulting partial estimators are finally combined to approximate the intractable global estimator. In this paper, we focus on the scenario where no communication exists among the workers, de…
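
As a rough illustration of the no-communication setting the abstract describes, the sketch below fuses partial sample means computed on independent data shards using inverse-variance weights, a standard linear MMSE-style fusion rule. This is not the paper's specific method; the data, shard count, and weighting choice are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full dataset: noisy scalar observations of an unknown parameter theta.
theta = 2.0
data = theta + rng.normal(0.0, 1.0, size=10_000)

# The data are divided among workers with no communication between them;
# each worker returns only its partial estimator and a variance estimate.
shards = np.array_split(data, 8)
means = np.array([s.mean() for s in shards])
variances = np.array([s.var(ddof=1) / len(s) for s in shards])

# Linear fusion with inverse-variance weights (an illustrative choice;
# the paper studies more general fusion rules).
weights = (1.0 / variances) / np.sum(1.0 / variances)
fused = float(np.sum(weights * means))
central = float(data.mean())
print(fused, central)  # the fused estimator closely tracks the centralized one
```

With equal-sized shards the weights are nearly uniform, so the fused estimate essentially recovers the global sample mean without any worker ever seeing the full dataset.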

Computer science; Bayesian probability; Inference; Asymptotic distribution; 02 engineering and technology; 01 natural sciences; 010104 statistics & probability; [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing; Artificial Intelligence; 0202 electrical engineering, electronic engineering, information engineering; Statistical inference; Fusion rules; 0101 mathematics; Electrical and Electronic Engineering; ComputingMilieux_MISCELLANEOUS; Minimum mean square error; Applied Mathematics; Constrained optimization; Estimator; 020206 networking & telecommunications; Computational Theory and Mathematics; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing; Algorithm; Digital Signal Processing
researchProduct

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors combine optimally all of the information about the unknowns in the observations with the information that is present in their …
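
The core adaptive importance sampling loop the abstract alludes to can be sketched in a few lines: sample from a proposal, weight by target over proposal, then move the proposal toward the weighted sample moments. The target, initial proposal, and adaptation rule below are all toy choices for illustration, not the specific schemes surveyed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: unnormalized Gaussian density centered at 3 with unit scale.
def target(x):
    return np.exp(-0.5 * (x - 3.0) ** 2)

def proposal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

mu, sigma = 0.0, 2.0                      # deliberately misplaced initial proposal
for _ in range(20):                       # adaptation iterations
    x = rng.normal(mu, sigma, size=500)   # sample from the current proposal
    w = target(x) / proposal_pdf(x, mu, sigma)
    w /= w.sum()                          # self-normalized importance weights
    mu = float(np.sum(w * x))             # moment matching: move toward the target
    sigma = max(float(np.sqrt(np.sum(w * (x - mu) ** 2))), 0.1)

print(mu, sigma)  # proposal adapts toward the target's mean (3) and scale (1)
```

After a few iterations the proposal concentrates on the target, which is exactly what makes the importance weights well behaved.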

Computer science; Bayesian probability; Posterior probability; Inference; 02 engineering and technology; Machine learning; computer.software_genre; 01 natural sciences; 010104 statistics & probability; Multidimensional signal processing; [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing; Prior probability; 0202 electrical engineering, electronic engineering, information engineering; 0101 mathematics; Electrical and Electronic Engineering; ComputingMilieux_MISCELLANEOUS; business.industry; Applied Mathematics; 020206 networking & telecommunications; Approximate inference; Signal Processing; Probability distribution; Artificial intelligence; business; Algorithm; computer; [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing; Importance sampling

A New Simple Computational Method of Simultaneous Constructing and Comparing Confidence Intervals of Shortest Length and Equal Tails for Making Effic…

2021

A confidence interval is a range of values that provides the user with useful information about how accurately a statistic estimates a parameter. In the present paper, a new simple computational method is proposed for simultaneously constructing and comparing confidence intervals of shortest length and equal tails in order to make efficient decisions under parametric uncertainty. This unified computational method provides intervals in several situations that previously required separate analysis using more advanced methods and tables for numerical solutions. In contrast to the Bayesian approach, the proposed approach does not depend on the choice of priors and is a novelty in the theory of st…
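
The distinction between equal-tails and shortest-length intervals is easy to see numerically in the classical chi-square pivot for a normal variance, where the two genuinely differ. This is only the textbook example, not the paper's unified method; n, s², and α below are arbitrary.

```python
import numpy as np
from scipy import stats

# Pivot: (n-1) s^2 / sigma^2 ~ chi2(n-1). Any a < b with
# P(a <= chi2 <= b) = 1 - alpha gives the interval [df*s2/b, df*s2/a].
n, s2, alpha = 15, 4.0, 0.05
df = n - 1

# Equal-tails interval: alpha/2 probability cut from each tail.
lo_et = df * s2 / stats.chi2.ppf(1 - alpha / 2, df)
hi_et = df * s2 / stats.chi2.ppf(alpha / 2, df)

# Shortest-length interval: grid search over how the tail mass alpha is split.
best_len, best = np.inf, None
for t in np.linspace(1e-4, alpha - 1e-4, 2000):
    a = stats.chi2.ppf(t, df)
    b = stats.chi2.ppf(t + 1 - alpha, df)
    length = df * s2 * (1.0 / a - 1.0 / b)
    if length < best_len:
        best_len, best = length, (df * s2 / b, df * s2 / a)

print(hi_et - lo_et, best_len)  # the shortest interval is strictly shorter
```

Because the chi-square distribution is skewed, splitting the tail mass unequally shortens the interval while keeping the same coverage.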

Computer science; Bayesian probability; Prior probability; Probability distribution; Quantile function; Pivotal quantity; Algorithm; Confidence interval; Parametric statistics; Quantile

The impact of sample reduction on PCA-based feature extraction for supervised learning

2006

"The curse of dimensionality" is pertinent to many learning algorithms, and it denotes the drastic rise of computational complexity and classification error in high dimensions. In this paper, different feature extraction (FE) techniques are analyzed as means of dimensionality reduction, and constructive induction with respect to the performance of Naive Bayes classifier. When a data set contains a large number of instances, some sampling approach is applied to address the computational complexity of FE and classification processes. The main goal of this paper is to show the impact of sample reduction on the process of FE for supervised learning. In our study we analyzed the conventional PC…
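
The pipeline the abstract outlines, fitting PCA on a reduced sample, then classifying with Naive Bayes in the extracted feature space, can be sketched as follows. The synthetic data, dimensions, and sample fraction are invented for the example, and the minimal Gaussian NB here stands in for whatever implementation the paper used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional data: two classes separated in 5 of 50 dimensions.
n, d = 1000, 50
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, d))
X[:, :5] += 2.0 * y[:, None]

def pca_fit(Xs, k):
    """Principal directions estimated from a (possibly reduced) sample Xs."""
    mu = Xs.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xs - mu, full_matrices=False)
    return mu, Vt[:k].T

def gaussian_nb(Xtr, ytr, Xte):
    """Minimal Gaussian Naive Bayes: independent normal features per class."""
    params = []
    for c in (0, 1):
        Z = Xtr[ytr == c]
        params.append((Z.mean(axis=0), Z.var(axis=0) + 1e-9, len(Z) / len(Xtr)))
    preds = []
    for x in Xte:
        scores = [np.sum(-0.5 * np.log(2 * np.pi * v) - (x - m) ** 2 / (2 * v))
                  + np.log(p) for m, v, p in params]
        preds.append(int(np.argmax(scores)))
    return np.array(preds)

tr, te = np.arange(n) < 700, np.arange(n) >= 700
# Sample reduction: fit PCA on only 10% of the training instances,
# then classify everything in the reduced feature space.
idx = rng.choice(np.where(tr)[0], 70, replace=False)
mu, V = pca_fit(X[idx], k=10)
acc = np.mean(gaussian_nb((X[tr] - mu) @ V, y[tr], (X[te] - mu) @ V) == y[te])
print(acc)  # accuracy stays high even with PCA fit on the reduced sample
```

On well-separated data the principal directions are stable enough that even a 10% subsample suffices for the FE step, which is the kind of effect the paper quantifies.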

Computer science; Covariance matrix; business.industry; Dimensionality reduction; Feature extraction; Supervised learning; Nonparametric statistics; Sampling (statistics); Pattern recognition; Stratified sampling; Naive Bayes classifier; Sample size determination; Artificial intelligence; business; Eigenvalues and eigenvectors; Parametric statistics; Curse of dimensionality; Proceedings of the 2006 ACM symposium on Applied computing

A Bayesian unified framework for risk estimation and cluster identification in small area health data analysis.

2020

Many statistical models have been proposed to analyse small area disease data with the aim of describing spatial variation in disease risk. In this paper, we propose a Bayesian hierarchical model that simultaneously allows for risk estimation and cluster identification. Our model formulation assumes that there is an unknown number of risk classes and small areas are assigned to a risk class by means of independent allocation variables. Therefore, areas within each cluster are assumed to share a common risk but they may be geographically separated. The posterior distribution of the parameter representing the number of risk classes is estimated using a novel procedure that combines its prior …
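
The allocation-variable idea in this abstract, areas assigned to a shared risk class regardless of geography, can be sketched with a toy Gibbs sampler for Poisson counts. The number of classes is fixed here for brevity, whereas the paper treats it as unknown; all data and hyperparameters are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy small-area data: expected counts E and observed counts y generated
# from two latent risk classes (all numbers here are illustrative).
n = 60
E = rng.uniform(20.0, 60.0, size=n)
true_risk = np.where(np.arange(n) < n // 2, 0.7, 1.8)
y = rng.poisson(E * true_risk)

K = 2                            # fixed here; the paper estimates this number
r = np.array([0.5, 2.0])         # initial class-specific relative risks
z = rng.integers(0, K, size=n)   # allocation variables (area -> risk class)

for _ in range(500):             # Gibbs sweeps
    # 1) Allocations: P(z_i = k) proportional to Poisson(y_i | E_i * r_k).
    logp = y[:, None] * np.log(E[:, None] * r[None, :]) - E[:, None] * r[None, :]
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=pi) for pi in p])
    # 2) Class risks: Gamma(1, 1) prior, conjugate to the Poisson likelihood.
    for k in range(K):
        m = z == k
        r[k] = rng.gamma(1.0 + y[m].sum(), 1.0 / (1.0 + E[m].sum()))

print(np.sort(r))  # draws concentrate near the true risks, 0.7 and 1.8
```

Because areas in one class need not be contiguous, the sampler naturally groups geographically separated areas that share a common risk, which is the clustering behaviour the model formalizes.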

Computer science; Epidemiology; Pathology and Laboratory Medicine; 01 natural sciences; Geographical locations; 010104 statistics & probability; Chickenpox; Mathematical and Statistical Techniques; Statistics; Medicine and Health Sciences; Public and Occupational Health; 0303 health sciences; Multidisciplinary; Simulation and Modeling; QR; Europe; Identification (information); Medical Microbiology; Small-Area Analysis; Viral Pathogens; Viruses; Physical Sciences; Medicine; Pathogens; Algorithms; Research Article; Herpesviruses; Science; Bayesian probability; Posterior probability; Bayesian Method; Disease Surveillance; Disease cluster; Research and Analysis Methods; Risk Assessment; Microbiology; Varicella Zoster Virus; 03 medical and health sciences; Risk class; Prior probability; Covariate; Bayesian hierarchical modeling; Humans; European Union; 0101 mathematics; Microbial Pathogens; 030304 developmental biology; Biology and life sciences; Organisms; Statistical model; Bayes Theorem; Probability Theory; Probability Distribution; Marginal likelihood; Convolution; Spain; People and places; DNA viruses; Mathematical Functions; Mathematics; PloS one

Temporal Binding in Multisensory and Motor-Sensory Contexts: Toward a Unified Model

2021

Our senses receive a manifold of sensory signals at any given moment in our daily lives. For a coherent and unified representation of information and precise motor control, our brain needs to temporally bind the signals emanating from a common causal event and segregate others. Traditionally, different mechanisms were proposed for the temporal binding phenomenon in multisensory and motor-sensory contexts. This paper reviews the literature on the temporal binding phenomenon in both multisensory and motor-sensory contexts and suggests future research directions for advancing the field. Moreover, by critically evaluating the recent literature, this paper suggests that common computational prin…

Computer science; Mini Review; Event (relativity); Sensory system; 050105 experimental psychology; lcsh:RC321-571; 03 medical and health sciences; Behavioral Neuroscience; 0302 clinical medicine; temporal binding; Phenomenon; 0501 psychology and cognitive sciences; causal inference; lcsh:Neurosciences. Biological psychiatry. Neuropsychiatry; motor-sensory; Bayesian models; Biological Psychiatry; Uncertainty reduction theory; Cognitive science; 05 social sciences; Representation (systemics); Motor control; Human Neuroscience; Unified Model; multisensory; Psychiatry and Mental health; Neuropsychology and Physiological Psychology; Neurology; Causal inference; precision; 030217 neurology & neurosurgery; Frontiers in Human Neuroscience

Distributed Particle Metropolis-Hastings Schemes

2018

We introduce a Particle Metropolis-Hastings algorithm driven by several parallel particle filters. The communication with the central node requires the transmission of only a set of weighted samples, one per filter. Furthermore, the marginal version of this scheme, called Distributed Particle Marginal Metropolis-Hastings (DPMMH), is also presented. DPMMH can be used for making inference on both dynamic and static variables of interest. Ergodicity is guaranteed, and numerical simulations show the advantages of the novel schemes.
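
The building block behind such schemes, a Metropolis-Hastings chain on a static parameter driven by a particle filter's likelihood estimate, can be sketched with a single bootstrap filter (the distributed, multi-filter version in the paper is more elaborate). The state-space model, particle count, and proposal scale below are toy choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: x_t = a x_{t-1} + u_t,  y_t = x_t + v_t,  u, v ~ N(0, 1).
a_true, T = 0.8, 100
x, ys = 0.0, []
for _ in range(T):
    x = a_true * x + rng.normal()
    ys.append(x + rng.normal())

def pf_loglik(a, N=200):
    """Bootstrap particle filter estimate of log p(y_{1:T} | a)."""
    parts, ll = np.zeros(N), 0.0
    for y in ys:
        parts = a * parts + rng.normal(size=N)            # propagate
        logw = -0.5 * (y - parts) ** 2                    # likelihood, up to a constant
        m = logw.max()
        ll += m + np.log(np.mean(np.exp(logw - m)))       # stabilized log-mean-exp
        w = np.exp(logw - m)
        parts = parts[rng.choice(N, size=N, p=w / w.sum())]  # resample
    return ll

# Metropolis-Hastings on the static parameter a, using the filter's
# noisy likelihood estimate (flat prior, random-walk proposal).
a, ll = 0.2, pf_loglik(0.2)
chain = []
for _ in range(400):
    a_prop = a + 0.1 * rng.normal()
    ll_prop = pf_loglik(a_prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        a, ll = a_prop, ll_prop
    chain.append(a)

print(np.mean(chain[150:]))  # posterior mean, typically near a_true = 0.8
```

The constant terms of the Gaussian likelihood cancel in the acceptance ratio, and because the particle filter's likelihood estimate is unbiased the chain targets the correct posterior, the key property the abstract's ergodicity guarantee rests on.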

Computer science; Monte Carlo method; Ergodicity; 020206 networking & telecommunications; 02 engineering and technology; Filter (signal processing); Bayesian inference; Statistics::Computation; Set (abstract data type); Metropolis–Hastings algorithm; [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing; Transmission (telecommunications); 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing; Particle filter; [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing; Algorithm; ComputingMilieux_MISCELLANEOUS; 2018 IEEE Statistical Signal Processing Workshop (SSP)

Group Metropolis Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing and machine learning. Two well-known class of MC methods are the Importance Sampling (IS) techniques and the Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce the Group Importance Sampling (GIS) framework where different sets of weighted samples are properly summarized with one summary particle and one summary weight. GIS facilitates the design of novel efficient MC techniques. For instance, we present the Group Metropolis Sampling (GMS) algorithm which produces a Markov chain of sets of weighted samples. GMS in general outperforms other multiple try schemes…
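
The compression idea described above, replacing a set of weighted samples with one summary particle and one summary weight, can be demonstrated with a plain importance-sampling estimator. The target, proposal, and group sizes are invented; this shows only the summarization property, not the full GMS chain.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy target: unnormalized N(1, 1); toy proposal: N(0, 2^2).
target = lambda x: np.exp(-0.5 * (x - 1.0) ** 2)
prop_pdf = lambda x: np.exp(-0.5 * (x / 2.0) ** 2) / (2.0 * np.sqrt(2.0 * np.pi))

summaries = []
for _ in range(2000):
    x = rng.normal(0.0, 2.0, size=10)       # one group of weighted samples
    w = target(x) / prop_pdf(x)             # importance weights
    W = w.mean()                            # one summary weight per group
    s = rng.choice(x, p=w / w.sum())        # one summary particle per group
    summaries.append((s, W))

S, W = map(np.array, zip(*summaries))
est = np.sum(W * S) / np.sum(W)             # self-normalized estimate of E[x]
print(est)  # close to the target mean, 1.0
```

Each group of ten samples is reduced to a single (particle, weight) pair, yet the estimator built from the summaries remains consistent, which is what makes the summarization useful inside MCMC schemes like GMS.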

Computer science; Monte Carlo method; Markov process; Slice sampling; Probability density function; 02 engineering and technology; Multiple-try Metropolis; Bayesian inference; Machine learning; computer.software_genre; 01 natural sciences; Hybrid Monte Carlo; 010104 statistics & probability; symbols.namesake; [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing; 0202 electrical engineering, electronic engineering, information engineering; 0101 mathematics; ComputingMilieux_MISCELLANEOUS; Markov chain; business.industry; Rejection sampling; Sampling (statistics); 020206 networking & telecommunications; Markov chain Monte Carlo; Metropolis–Hastings algorithm; symbols; Monte Carlo method in statistical physics; Monte Carlo integration; Artificial intelligence; business; Particle filter; [SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing; computer; Algorithm; Importance sampling; Monte Carlo molecular modeling

Additive noise and multiplicative bias as disclosure limitation techniques for continuous microdata: A simulation study

2004

This paper focuses on a combination of two disclosure limitation techniques, additive noise and multiplicative bias, and studies their efficacy in protecting the confidentiality of continuous microdata. A Bayesian intruder model is extensively simulated in order to assess the performance of these disclosure limitation techniques as a function of key parameters such as the variability among profiles in the original data, the amount of the users' prior information, and the amount of bias and noise introduced in the data. The results of the simulation offer insight into the degree of vulnerability of data on continuous random variables and suggest some guidelines for effective protection measures.
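
The masking mechanism the abstract combines, multiplicative bias plus additive noise, amounts to releasing z = b·x + e instead of x. The sketch below illustrates the utility/protection trade-off on synthetic data; the data distribution and the bias and noise levels are purely illustrative, not those studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Original continuous microdata (e.g. incomes); parameters are illustrative.
x = rng.lognormal(mean=10.0, sigma=0.5, size=5000)

b = 1.1                                          # multiplicative bias
e = rng.normal(0.0, 0.2 * x.std(), size=x.size)  # additive noise
z = b * x + e                                    # released (masked) microdata

# Utility: a user who knows the bias can still recover aggregates well...
rel_mean_err = abs(z.mean() / b - x.mean()) / x.mean()
# ...but individual records stay perturbed, which provides the protection.
rel_record_err = float(np.mean(np.abs(z / b - x) / x))
print(rel_mean_err, rel_record_err)
```

The noise averages out in aggregate statistics but not at the record level, so an intruder trying to re-identify an individual profile faces much larger uncertainty than an analyst computing population quantities.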

Computer science; Multiplicative function; Bayesian probability; General Engineering; computer.software_genre; Computer Science Applications; Original data; Computational Mathematics; Microdata (HTML); Simulated data; Confidentiality; Data mining; Random variable; computer; Prior information

Incorporating Uncertainties into Traffic Simulators

2007

Computer science; Real-time computing; Posterior probability; Errors-in-variables models; Hierarchical network model; Traffic generation model; Telecommunications network; Variable-order Bayesian network; Simulation; Network simulation; Network traffic simulation