Search results for "Sampling"

Showing 10 of 375 documents

Importance sampling for Lambda-coalescents in the infinitely many sites model

2011

We present and discuss new importance sampling schemes for the approximate computation of the sample probability of observed genetic types in the infinitely many sites model from population genetics. More specifically, we extend the 'classical framework', where genealogies are assumed to be governed by Kingman's coalescent, to the more general class of Lambda-coalescents, and further develop Hobolth et al.'s (2008) idea of deriving importance sampling schemes based on 'compressed genetrees'. The resulting schemes extend earlier work by Griffiths and Tavaré (1994), Stephens and Donnelly (2000), Birkner and Blath (2008) and Hobolth et al. (2008). We conclude with a performance comparison o…

Class (set theory), Computation, Sample (statistics), MSC: 62F99 (Primary), 62P10, 92D10, 92D20 (Secondary), Lambda, Sampling Studies, Coalescent theory, Molecular Evolution, Gene Frequency, FOS: Mathematics, Quantitative Biology - Populations and Evolution (q-bio.PE), Animals, Ecology, Evolution, Behavior and Systematics, Mathematics, Discrete mathematics, Genetic Models, BETA (programming language), Probability (math.PR), Markov Chains, Population Genetics, Performance comparison, FOS: Biological sciences, Mutation, Monte Carlo Method, Importance sampling
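Self-normalized importance sampling, the common thread of the schemes above, can be sketched generically. The toy target below (a standard normal, estimated through a wider Gaussian proposal) is illustrative only, not the genealogical sampler of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target density, known only up to a constant (self-normalized IS tolerates this).
def target(x):
    return np.exp(-0.5 * x**2)

# Proposal: a wider normal that is easy to sample from.
scale = 2.0
x = rng.normal(0.0, scale, size=100_000)
proposal_pdf = np.exp(-0.5 * (x / scale)**2) / scale

# Self-normalized importance weights.
w = target(x) / proposal_pdf
w /= w.sum()

# Weighted estimate of E[X^2] (true value 1 under the standard normal).
est = np.sum(w * x**2)
```

The quality of `est` degrades as the mismatch between target and proposal grows, which is exactly why the choice of proposal (here, the genealogical scheme) matters.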

Thompson Sampling for Dynamic Multi-armed Bandits

2011

The importance of multi-armed bandit (MAB) problems is on the rise due to their recent application in a large variety of areas such as online advertising, news article selection, wireless networks, and medical trials, to name a few. The most common assumption made when solving such MAB problems is that the unknown reward probability θ_k of each bandit arm k is fixed. However, this assumption rarely holds in practice, simply because real-life problems often involve underlying processes that are dynamically evolving. In this paper, we model problems where reward probabilities θ_k are drifting, and introduce a new method called Dynamic Thompson Sampling (DTS) that facilitates Order St…

Computer Science - Machine Learning, Mathematical optimization, Computer science, Order statistic, Bayesian probability, Sampling (statistics), Regret, Artificial intelligence, Thompson sampling, Random variable, Selection (genetic algorithm)
2011 10th International Conference on Machine Learning and Applications and Workshops
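The idea of keeping a Thompson-sampling posterior responsive to drifting rewards can be sketched with a capped Beta–Bernoulli update. The cap `C`, the fixed arm probabilities, and the horizon below are illustrative assumptions, not the paper's exact experimental setup:

```python
import numpy as np

rng = np.random.default_rng(1)

true_p = [0.3, 0.7]        # hidden reward rates (fixed here for brevity; drifting in the paper)
alpha = [1.0, 1.0]         # Beta posterior parameters per arm
beta = [1.0, 1.0]
C = 100.0                  # cap on effective observation count: the "forgetting" knob

pulls = [0, 0]
for _ in range(2000):
    # Thompson step: sample a plausible reward rate per arm, play the best.
    samples = [rng.beta(a, b) for a, b in zip(alpha, beta)]
    k = int(np.argmax(samples))
    r = float(rng.random() < true_p[k])
    pulls[k] += 1
    # Bayesian update, then rescale so alpha+beta never exceeds C,
    # which bounds the posterior's memory and keeps it adaptable to drift.
    alpha[k] += r
    beta[k] += 1.0 - r
    if alpha[k] + beta[k] > C:
        shrink = C / (alpha[k] + beta[k])
        alpha[k] *= shrink
        beta[k] *= shrink
```

With an uncapped update the posterior concentrates and stops reacting; the cap trades a little asymptotic precision for responsiveness.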

Seismic evaluation of ordinary RC buildings retrofitted with externally bonded FRPs using a reliability-based approach

2020

Despite the extensive literature on reinforced concrete (RC) members retrofitted with fiber-reinforced polymer (FRP) composites, few studies have employed a reliability-based approach to evaluate the seismic performance of RC buildings in terms of their collapse capacity and ductility. In this study, the performance of a poorly confined RC building structure is investigated for different FRP retrofitting schemes using different configurations and combinations of wrapping and flange-bonded FRPs, as two well-established techniques. A nonlinear pushover analysis is then implemented with a computational reliability analysis based on Latin Hypercube Sampling (LHS) to deter…

Computer science, Retrofitting, RC buildings, [PHYS.MECA.SOLID] Physics/Mechanics/Solid mechanics, Collapse capacity, Ductility, Reliability (statistics), Civil and Structural Engineering, Probabilistic logic, Failure mode, Structural engineering, Fibre-reinforced plastic, Latin hypercube sampling, Ceramics and Composites, Material properties, Failure mode and effects analysis, FRP
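Latin Hypercube Sampling itself is simple to sketch: each dimension is split into n equal-probability strata, and each stratum is hit exactly once per dimension. This is a generic sketch of the sampling step only, unrelated to the FRP reliability model:

```python
import numpy as np

rng = np.random.default_rng(2)

def latin_hypercube(n, d, rng):
    """n points in [0,1)^d with one point per equal-probability stratum per dimension."""
    # Row i of u lies in the stratum [i/n, (i+1)/n) in every dimension.
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    # Shuffle each column independently so the dimensions are decoupled.
    for j in range(d):
        rng.shuffle(u[:, j])
    return u

samples = latin_hypercube(10, 2, rng)
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each marginal, which is why LHS is popular when each model evaluation (here, a nonlinear pushover analysis) is expensive.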

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors combine optimally all of the information about the unknowns in the observations with the information that is present in their …

Computer science, Bayesian probability, Posterior probability, Inference, Machine learning, Multidimensional signal processing, [INFO.INFO-TS] Computer Science/Signal and Image Processing, Prior probability, Electrical and Electronic Engineering, Applied Mathematics, Approximate inference, Signal Processing, Probability distribution, Artificial intelligence, Algorithm, [SPI.SIGNAL] Engineering Sciences/Signal and Image processing, Importance sampling
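A minimal adaptive importance sampler of the kind surveyed here repeatedly fits a Gaussian proposal to the weighted moments of its own samples. Moment matching is only one of several adaptation rules the survey covers; the target and initialization below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Target: N(3, 1), known only up to a normalizing constant.
def target(x):
    return np.exp(-0.5 * (x - 3.0)**2)

mu, sigma = 0.0, 3.0   # deliberately poor initial proposal
for _ in range(5):
    # Sample from the current proposal and weight against the target.
    x = rng.normal(mu, sigma, size=5000)
    q = np.exp(-0.5 * ((x - mu) / sigma)**2) / sigma
    w = target(x) / q
    w /= w.sum()
    # Adapt the proposal to the weighted sample's mean and spread.
    mu = np.sum(w * x)
    sigma = np.sqrt(np.sum(w * (x - mu)**2))
```

After a few iterations the proposal tracks the target, shrinking the estimator variance that a fixed mismatched proposal would incur.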

The impact of sample reduction on PCA-based feature extraction for supervised learning

2006

"The curse of dimensionality" is pertinent to many learning algorithms, and denotes the drastic rise of computational complexity and classification error in high dimensions. In this paper, different feature extraction (FE) techniques are analyzed as means of dimensionality reduction and constructive induction, with respect to the performance of the Naive Bayes classifier. When a data set contains a large number of instances, a sampling approach is applied to address the computational complexity of the FE and classification processes. The main goal of this paper is to show the impact of sample reduction on the process of FE for supervised learning. In our study we analyzed the conventional PC…

Computer science, Covariance matrix, Dimensionality reduction, Feature extraction, Supervised learning, Nonparametric statistics, Sampling (statistics), Pattern recognition, Stratified sampling, Naive Bayes classifier, Sample size determination, Artificial intelligence, Eigenvalues and eigenvectors, Parametric statistics, Curse of dimensionality
Proceedings of the 2006 ACM symposium on Applied computing
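The core question — can usable principal axes be learned from a reduced sample? — can be sketched with an eigen-decomposition of a subsample covariance matrix. The synthetic low-rank data below is an assumption for illustration, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic 50-dimensional data with intrinsic dimension 3 plus small noise.
n, d, k = 2000, 50, 3
basis = rng.normal(size=(k, d))
X = rng.normal(size=(n, k)) @ basis + 0.01 * rng.normal(size=(n, d))

# Estimate principal axes from a 10% random sample only.
idx = rng.choice(n, size=200, replace=False)
mean = X[idx].mean(axis=0)
Xs = X[idx] - mean
cov = Xs.T @ Xs / (len(idx) - 1)          # sample covariance matrix
vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
components = vecs[:, ::-1][:, :k]          # top-k eigenvectors

# Project the full data set with axes learned from the reduced sample.
Z = (X - mean) @ components
```

When the leading eigenvalues dominate, the subsample covariance recovers nearly the same subspace as the full data, which is the effect the paper quantifies for classification.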

Register data in sample allocations for small-area estimation

2018

Inadequate control of sample sizes may occur in surveys that use stratified sampling and small-area estimation, when the overall sample size is small or auxiliary information is insufficiently used; very small sample sizes are then possible for some areas. The proposed allocation, based on multi-objective optimization, uses a small-area model and estimation method together with annually collected empirical data. The assessment of its performance at the area and at the population levels is based on design-based sample simulations. Five previously developed allocations serve as references. The model-based estimator is more accurate than the design-based Horvitz–Thompson estimator and t…

Computer science, General Mathematics, Geography, Planning and Development, Population, Sample (statistics), Small area estimation, model-based EBLUP, Sampling design, Statistics, registers (rekisterit), sampling (otanta), Demography, Estimation, trade registers (kaupparekisterit), auxiliary and proxy data, Estimator, trade-off between areas and population, multi-objective optimization (monitavoiteoptimointi), Stratified sampling, allocation (kohdentaminen), Sample size determination, General Agricultural and Biological Sciences, performance
Mathematical Population Studies
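For reference, the classical Neyman allocation that stratified designs of this kind are usually compared against assigns sample sizes in proportion to stratum size times stratum spread. The strata below are hypothetical; this is not the paper's multi-objective allocation:

```python
# Neyman allocation: n_h proportional to N_h * S_h.
N = [1000, 3000, 500]     # stratum population sizes (hypothetical)
S = [2.0, 1.0, 4.0]       # stratum standard deviations (hypothetical)
n_total = 300             # overall sample size to distribute

weights = [Nh * Sh for Nh, Sh in zip(N, S)]
alloc = [round(n_total * w / sum(weights)) for w in weights]
```

Its weakness for small-area work is visible here: an area that is small and homogeneous can receive a tiny allocation, which is exactly the failure mode a model-assisted multi-objective allocation is designed to avoid.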

Adaptive Population Importance Samplers: A General Perspective

2016

Importance sampling (IS) is a well-known Monte Carlo method, widely used to approximate a distribution of interest using a random measure composed of a set of weighted samples generated from another proposal density. Since the performance of the algorithm depends on the mismatch between the target and the proposal densities, a set of proposals is often iteratively adapted in order to reduce the variance of the resulting estimator. In this paper, we review several well-known adaptive population importance samplers, providing a unified common framework and classifying them according to the nature of their estimation and adaptive procedures. Furthermore, we interpret the underlying motivation …

Computer science, Mathematics (Matemáticas), Monte Carlo method, Population, Machine learning, [INFO.INFO-TS] Computer Science/Signal and Image Processing, Estimator, Statistical classification, Random measure, Monte Carlo integration, Data mining, Artificial intelligence, Particle filter, [SPI.SIGNAL] Engineering Sciences/Signal and Image processing, Importance sampling
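A population-based adaptive importance sampler in this family can be sketched as iterated sampling, weighting, and resampling of a set of proposal locations, in the spirit of population Monte Carlo. This is a generic sketch under that assumption, not any specific scheme from the review:

```python
import numpy as np

rng = np.random.default_rng(5)

def target(x):
    return np.exp(-0.5 * (x - 4.0)**2)   # N(4, 1) up to a constant

# A population of Gaussian proposals with fixed width and adaptive locations.
mus = rng.normal(0.0, 1.0, size=20)
sigma = 1.0
for _ in range(10):
    x = rng.normal(mus, sigma)                       # one draw per proposal
    q = np.exp(-0.5 * ((x - mus) / sigma)**2) / sigma
    w = target(x) / q
    w /= w.sum()
    # Resample: promising locations spawn the next generation of proposals.
    mus = x[rng.choice(len(x), size=len(x), p=w)]
```

The review's classification axis — how the weights are computed and how the population is adapted — corresponds to the two commented steps in the loop.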

Group Metropolis Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing and machine learning. Two well-known classes of MC methods are the Importance Sampling (IS) techniques and the Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce the Group Importance Sampling (GIS) framework, where different sets of weighted samples are properly summarized with one summary particle and one summary weight. GIS facilitates the design of novel efficient MC techniques. For instance, we present the Group Metropolis Sampling (GMS) algorithm, which produces a Markov chain of sets of weighted samples. GMS in general outperforms other multiple try schemes…

Computer science, Monte Carlo method, Markov process, Slice sampling, Probability density function, Multiple-try Metropolis, Bayesian inference, Machine learning, Hybrid Monte Carlo, [INFO.INFO-TS] Computer Science/Signal and Image Processing, Markov chain, Rejection sampling, Sampling (statistics), Markov chain Monte Carlo, Metropolis–Hastings algorithm, Monte Carlo method in statistical physics, Monte Carlo integration, Artificial intelligence, Particle filter, [SPI.SIGNAL] Engineering Sciences/Signal and Image processing, Algorithm, Importance sampling, Monte Carlo molecular modeling
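GMS builds on the Metropolis accept/reject step, so for background it helps to see a plain random-walk Metropolis–Hastings chain. The Gaussian target below is illustrative; the paper's summary-particle machinery is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(6)

def log_target(x):
    return -0.5 * (x - 2.0)**2          # N(2, 1) up to a constant

x, chain = 0.0, []
for _ in range(20000):
    prop = x + rng.normal(0.0, 1.0)     # symmetric random-walk proposal
    # Metropolis accept/reject based on the target ratio.
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        x = prop
    chain.append(x)

chain = np.array(chain[5000:])          # discard burn-in
```

In GMS, the scalar state `x` is replaced by a whole set of weighted samples compressed into one summary particle and weight, but the accept/reject skeleton is the same.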

Recycling Gibbs sampling

2017

Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning and statistics. The key point for the successful application of the Gibbs sampler is the ability to draw samples from the full-conditional probability density functions efficiently. In the general case this is not possible, so in order to speed up the convergence of the chain, it is required to generate auxiliary samples. However, such intermediate information is finally disregarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency with no extra cost. Theoretical and exhaustive numerical co…

Computer science, Monte Carlo method, Slice sampling, Markov process, Probability density function, Machine learning, Hybrid Monte Carlo, [INFO.INFO-TS] Computer Science/Signal and Image Processing, Rejection sampling, Estimator, Markov chain Monte Carlo, Artificial intelligence, [SPI.SIGNAL] Engineering Sciences/Signal and Image processing, Algorithm, Gibbs sampling
2017 25th European Signal Processing Conference (EUSIPCO)
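A minimal Gibbs sampler draws each coordinate from its full conditional in turn; for a bivariate Gaussian the conditionals are available in closed form. This is an illustrative baseline without the paper's sample-recycling step:

```python
import numpy as np

rng = np.random.default_rng(7)

# Bivariate normal: zero means, unit variances, correlation rho.
rho = 0.8
x, y = 0.0, 0.0
xs = []
for _ in range(20000):
    # Full conditionals of a bivariate Gaussian are themselves Gaussian:
    # x | y ~ N(rho*y, 1 - rho^2), and symmetrically for y | x.
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    xs.append(x)

xs = np.array(xs[2000:])                # discard burn-in
```

When the conditionals are not available in closed form, each step needs auxiliary draws; the paper's point is that those auxiliary draws can be folded back into the estimator instead of being thrown away.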

Theoretical Foundations of the Monte Carlo Method and Its Applications in Statistical Physics

2002

In this chapter we first introduce the basic concepts of Monte Carlo sampling, give some details on how Monte Carlo programs need to be organized, and then proceed to the interpretation and analysis of Monte Carlo results.

Computer science, Monte Carlo method, Thermodynamic limit, Periodic boundary conditions, Monte Carlo method in statistical physics, Ising model, Statistical physics, Importance sampling, Monte Carlo molecular modeling, Interpretation (model theory)
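The basic Monte Carlo estimate such a chapter starts from is the sample mean of the integrand at uniformly drawn points; the choice of integrand below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(8)

# Plain Monte Carlo: the mean of f at uniform points estimates the integral.
# Here f(x) = e^x on [0, 1], whose exact integral is e - 1 ≈ 1.7183.
u = rng.random(1_000_000)
est = np.exp(u).mean()
```

The statistical error shrinks as 1/sqrt(N) regardless of dimension, which is what makes the method attractive in statistical physics, and importance sampling (weighting configurations by their Boltzmann factor) is the standard refinement.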