Search results for "Probability and Uncertainty"

Showing 10 of 578 documents.

Past price “memory” in the housing market: testing the performance of different spatio-temporal specifications.

2017

Abstract: Recent methodological developments provide a way to incorporate the temporal dimension when accounting for spatial effects in hedonic pricing. Weight matrices should decompose the spatial effects into two distinct components: bidirectional contemporaneous spatial connections, and unidirectional spatio-temporal effects from past transactions. Our iterative estimation approach explicitly analyses the role of time in price determination. The results show that both spatio-temporal components should be included in the model specification; past transaction information stops contributing to price determination after eight months; and limited temporal friction is exhibited within this period. T…
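As a rough illustration of the decomposition described above, the following sketch builds two weight matrices from transaction coordinates and time periods: a symmetric contemporaneous component and a past-only, unidirectional spatio-temporal component. The data, distance cutoff, and inverse-distance weighting are all hypothetical, not the paper's estimator.

```python
import numpy as np

def st_weights(coords, periods, cutoff=2.0):
    """Split inverse-distance weights into a contemporaneous (bidirectional)
    component and a spatio-temporal (past-only) component.
    Illustrative only; thresholds and weighting scheme are arbitrary."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    with np.errstate(divide="ignore"):
        w = np.where((d > 0) & (d < cutoff), 1.0 / d, 0.0)
    same = periods[:, None] == periods[None, :]
    past = periods[:, None] > periods[None, :]  # row i influenced by earlier j
    W_s = w * same   # contemporaneous: symmetric across transactions
    W_t = w * past   # spatio-temporal: unidirectional, from past to present
    return W_s, W_t

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.5]])
periods = np.array([1, 1, 2])
W_s, W_t = st_weights(coords, periods)
```

A truncation rule on `past` (e.g. dropping links older than eight months) would implement the paper's finding that past information stops contributing after that horizon.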

Keywords: Estimation; Spatial weight matrix; Spatial econometrics; Spatio-temporal; STAR; SAR; Hedonic pricing; Housing market; Specification; Database transaction; Dimension (data warehouse); Econometrics; Finance; Economics and business; Computer science; Social sciences; Geography, Planning and Development; Earth and Planetary Sciences (miscellaneous); General Economics, Econometrics and Finance; Statistics, Probability and Uncertainty

ISARIC-COVID-19 dataset: A Prospective, Standardized, Global Dataset of Patients Hospitalized with COVID-19

2022

The International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC) COVID-19 dataset is one of the largest international databases of prospectively collected clinical data on people hospitalized with COVID-19. This dataset was compiled during the COVID-19 pandemic by a network of hospitals that collect data using the ISARIC-World Health Organization Clinical Characterization Protocol and data tools. The database includes data from more than 705,000 patients, collected in more than 60 countries and 1,500 centres worldwide. Patient data are available from acute hospital admissions with COVID-19 and outpatient follow-ups. The data include signs and symptoms, pre-existing como…

Keywords: Epidemiology; Infectious diseases; COVID-19; SARS-CoV-2; Pandemics; Hospitalization; Prospective studies; Humans; Viral infection; Statistics and Probability; Statistics, Probability and Uncertainty; Computer Science Applications; Information Systems; Information Security and Reliability; Library and Information Sciences; Education; Sociology; Mathematics; Social Sciences and Humanities; Natural Sciences; Physical Sciences; Engineering and Technology; SDG 3 - Good Health and Well-being

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
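The two-stage idea can be sketched on a toy Gaussian model. This sketch uses a crude rejection-style re-selection at the finer tolerance rather than the paper's post-corrected estimators, and assumes a flat prior with a symmetric random-walk proposal; all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
y_obs = 1.0                                   # observed summary statistic

def simulate(theta):
    """Toy intractable-likelihood model: y ~ N(theta, 1)."""
    return theta + rng.normal()

# Stage 1: ABC-MCMC with an inflated tolerance so the chain mixes well.
# Flat prior + symmetric proposal: accept iff the simulated summary
# falls within the (large) tolerance.
eps_big = 2.0
theta = 0.0
dist = abs(simulate(theta) - y_obs)
chain = []
for _ in range(5000):
    prop = theta + 0.5 * rng.normal()         # random-walk proposal
    d = abs(simulate(prop) - y_obs)
    if d < eps_big:
        theta, dist = prop, d
    chain.append((theta, dist))

# Stage 2: post-correction: re-select the stored draws at a finer tolerance.
eps_fine = 0.5
post = [t for t, d in chain if d < eps_fine]
post_mean = sum(post) / len(post)
```

Varying `eps_fine` over a grid yields estimators for a whole range of finer tolerances from a single chain, which is the attraction of the approach.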

Keywords: FOS: Computer and information sciences; Statistics and Probability; Approximate Bayesian computation; Tolerance choice; Markov chain Monte Carlo; Markov chains; Monte Carlo methods; Bayesian methods; Importance sampling; Adaptive algorithm; Algorithms; Mixing (mathematics); Inference; Estimator; Confidence interval; Probabilistic logic; Statistics - Computation (stat.CO); Applied Mathematics; General Mathematics; Agricultural and Biological Sciences (miscellaneous); General Agricultural and Biological Sciences; Statistics, Probability and Uncertainty

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
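The core importance-sampling mechanics, and the compression of a weighted population into a single weighted sample, can be sketched on a toy target. This is a minimal illustration of the idea, not the full GIS theory; target and proposal are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def target_logpdf(x):
    """Unnormalized log-target: standard normal."""
    return -0.5 * x**2

# Population of IS samples from a broad proposal q = N(0, 2^2).
x = rng.normal(scale=2.0, size=1000)
logw = target_logpdf(x) - (-0.5 * (x / 2.0) ** 2 - np.log(2.0))
w = np.exp(logw - logw.max())                 # stabilized unnormalized weights

# Group compression: one representative sample plus one group weight
# summarizes the whole weighted population.
W_group = w.sum()                              # total (unnormalized) weight
x_group = x[rng.choice(len(x), p=w / w.sum())]  # resample one member

# Self-normalized IS estimate of the posterior mean from the full population.
est = np.sum(w * x) / w.sum()
```

Several such `(x_group, W_group)` pairs from different populations can then be combined as if they were ordinary weighted samples, which is the mechanism the paper analyses.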

Keywords: FOS: Computer and information sciences; Machine Learning (cs.LG, stat.ML); Multiple-try Metropolis; Markov chain; Markov chain Monte Carlo; Monte Carlo method; Posterior probability; Resampling; Particle filter; Importance sampling; Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); Computational Engineering, Finance and Science (cs.CE); Signal and Image Processing; Artificial Intelligence; Electrical and Electronic Engineering; Applied Mathematics; Computational Theory and Mathematics; Signal Processing; Digital Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Algorithm

Deep Importance Sampling based on Regression for Model Inversion and Emulation

2021

Understanding systems by forward and inverse modeling is a recurrent topic of research in many domains of science and engineering. In this context, Monte Carlo methods have been widely used as powerful tools for numerical inference and optimization. They require the choice of a suitable proposal density, which is crucial for their performance. For this reason, several adaptive importance sampling (AIS) schemes have been proposed in the literature. Here we present an AIS framework called Regression-based Adaptive Deep Importance Sampling (RADIS). In RADIS, the key idea is the adaptive construction via regression of a non-parametric proposal density (i.e., an emulator), which mimics the posteri…
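RADIS itself builds its proposal by regression; as a simpler stand-in, the following sketch shows the generic AIS loop it extends, with a Gaussian proposal whose moments are re-fitted to the weighted samples at each iteration. The target and all settings here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def target_logpdf(x):
    """Unnormalized log-target: N(3, 1)."""
    return -0.5 * (x - 3.0) ** 2

# Generic adaptive importance sampling: at each iteration the proposal
# is re-fitted to the current weighted samples (RADIS instead builds a
# non-parametric emulator of the target by regression).
mu, sigma = 0.0, 5.0
for _ in range(20):
    x = rng.normal(mu, sigma, size=500)
    logq = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
    w = np.exp(target_logpdf(x) - logq - (target_logpdf(x) - logq).max())
    w /= w.sum()
    mu = np.sum(w * x)                                   # adapt location
    sigma = max(np.sqrt(np.sum(w * (x - mu) ** 2)), 0.1)  # adapt spread
```

After a few iterations the proposal concentrates on the target, which is exactly the behaviour a good emulator-based proposal should reproduce more flexibly.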

Keywords: FOS: Computer and information sciences; Machine Learning (cs.LG, stat.ML); Importance sampling; Adaptive regression; Monte Carlo method; Posterior probability; Bayesian inference; Inference; Surrogate model; Emulation; Model inversion; Gaussian process; Remote sensing; Statistics - Computation (stat.CO); Artificial Intelligence; Electrical and Electronic Engineering; Applied Mathematics; Computational Theory and Mathematics; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Algorithm

A Review of Multiple Try MCMC algorithms for Signal Processing

2018

Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference needs the computation of a posteriori estimators which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a t…
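One member of this class, the multiple-try Metropolis algorithm reviewed in the paper, can be sketched as follows for a symmetric random-walk proposal and weights w(y) = π(y). The toy standard-normal target and all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def pi(x):
    """Unnormalized target density: standard normal."""
    return np.exp(-0.5 * x**2)

def mtm_step(x, k=5, scale=2.0):
    """One multiple-try Metropolis step (symmetric proposal, w(y) = pi(y))."""
    y = x + rng.normal(scale=scale, size=k)             # k trial points
    wy = pi(y)
    j = rng.choice(k, p=wy / wy.sum())                  # select one trial
    xref = y[j] + rng.normal(scale=scale, size=k - 1)   # reference points
    # Generalized acceptance ratio balances trial and reference weights.
    if rng.random() < min(1.0, wy.sum() / (pi(xref).sum() + pi(x))):
        return y[j]
    return x

x, chain = 0.0, []
for _ in range(5000):
    x = mtm_step(x)
    chain.append(x)
```

Drawing several trials per step costs more target evaluations but typically improves acceptance and exploration, which is the trade-off the review examines across MTM variants.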

Keywords: FOS: Computer and information sciences; Machine Learning (cs.LG, stat.ML); Monte Carlo method; Multiple-try Metropolis; Bayesian inference; Markov chain; Markov chain Monte Carlo; Estimator; Sample space; Signal processing; Statistics - Computation (stat.CO); Artificial Intelligence; Electrical and Electronic Engineering; Applied Mathematics; Computational Theory and Mathematics; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Algorithm

On resampling schemes for particle filters with weakly informative observations

2022

We consider particle filters with weakly informative observations (or 'potentials') relative to the latent state dynamics. The particular focus of this work is on particle filters used to approximate time-discretisations of continuous-time Feynman–Kac path integral models -- a scenario that naturally arises when addressing filtering and smoothing problems in continuous time -- but our findings are indicative of weakly informative settings beyond this context too. We study the performance of different resampling schemes, such as systematic resampling, SSP (Srinivasan sampling process) and stratified resampling, as the time-discretisation becomes finer, and also identify their continuous-time l…
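Systematic resampling, one of the schemes compared, can be sketched in a few lines; this is a generic textbook implementation, not the paper's code.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: a single uniform draw shifted to n evenly
    spaced positions, mapped through the cumulative weights."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(weights / np.sum(weights))
    return np.searchsorted(cumsum, positions)

rng = np.random.default_rng(4)
w = np.array([0.1, 0.2, 0.3, 0.4])
idx = systematic_resample(w, rng)   # ancestor indices for the new particles
```

Compared with multinomial resampling, only one uniform random number is consumed per resampling step, which reduces the variance injected by resampling; how such low-variance schemes behave as the time-discretisation is refined is the question the paper studies.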

Keywords: FOS: Computer and information sciences; FOS: Mathematics; Hidden Markov model; Particle filter; Resampling; Path integral; Feynman–Kac model; Markov chains; Stochastic processes; Statistical models; Sampling; Numerical analysis; Statistics and Probability; Probability (math.PR); Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); MSC: 65C35 (primary), 65C05, 65C60, 60J25 (secondary); Statistics, Probability and Uncertainty

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key to the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full conditional probability density functions. Since this is not possible in the general case, it is necessary to generate auxiliary samples, whose information is eventually disregarded, in order to speed up the convergence of the chain. In this work, we show that these auxiliary sample…
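The basic Gibbs mechanics, alternating draws from the full conditionals, can be sketched for a bivariate Gaussian where both conditionals are available in closed form. This is a toy example; the paper's point concerns the harder case where direct conditional sampling is not possible and auxiliary samples are generated.

```python
import numpy as np

rng = np.random.default_rng(5)
rho = 0.8                       # correlation of a standard bivariate normal

# Full conditionals of the standard bivariate normal with correlation rho:
#   x | y ~ N(rho * y, 1 - rho^2)   and   y | x ~ N(rho * x, 1 - rho^2)
sd = np.sqrt(1.0 - rho**2)
x, y, samples = 0.0, 0.0, []
for _ in range(10000):
    x = rng.normal(rho * y, sd)     # draw x from its full conditional
    y = rng.normal(rho * x, sd)     # draw y from its full conditional
    samples.append((x, y))

xs, ys = np.array(samples).T
```

The chain's empirical moments recover the target's mean and correlation; recycling, as proposed in the paper, would additionally keep intermediate draws that a standard sampler discards.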

Keywords: FOS: Computer and information sciences; Machine Learning (cs.LG, stat.ML); Monte Carlo method; Gibbs sampling; Slice sampling; Markov chain Monte Carlo; Bayesian inference; Inference; Gaussian process; Chain rule (probability); Signal and Image Processing; Statistics - Computation (stat.CO); Artificial Intelligence; Statistics; Electrical and Electronic Engineering; Applied Mathematics; Computational Theory and Mathematics; Signal Processing; Digital Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Algorithm

Multispectral image denoising with optimized vector non-local mean filter

2016

Many applications rely on high-quality images to perform their tasks well. However, noise is an unavoidable issue in most applications and works against this objective. It is therefore essential to develop techniques that attenuate the impact of noise while maintaining the integrity of the relevant information in images. In this work, we propose to extend the Non-Local Means (NLM) filter to the vector case and apply it to denoising multispectral images. The objective is to benefit from the additional information brought by multispectral imaging systems. The NLM filter exploits the redundancy of information in an image to remove noise. A …
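The patch-similarity weighting at the heart of NLM can be sketched in one dimension. This is a minimal scalar illustration with arbitrary parameters; the paper's contribution is the vector extension to multispectral data and its optimization.

```python
import numpy as np

def nlm_1d(signal, patch=3, h=0.5):
    """Minimal 1-D non-local means: each sample is replaced by a weighted
    average of all samples, with weights given by patch similarity."""
    n = len(signal)
    pad = np.pad(signal, patch, mode="reflect")
    patches = np.array([pad[i:i + 2 * patch + 1] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        d2 = np.mean((patches - patches[i]) ** 2, axis=1)  # patch distances
        w = np.exp(-d2 / h**2)                             # similarity weights
        out[i] = np.sum(w * signal) / np.sum(w)
    return out

rng = np.random.default_rng(6)
clean = np.sign(np.sin(np.linspace(0, 4 * np.pi, 200)))  # piecewise signal
noisy = clean + 0.3 * rng.normal(size=200)
denoised = nlm_1d(noisy)
```

Because weights come from patch similarity rather than spatial proximity, samples average with all structurally similar regions of the signal; the vector extension compares whole multispectral patch vectors instead of scalar patches.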

Keywords: FOS: Computer and information sciences; FOS: Mathematics; Multispectral imaging systems; Multispectral images; Image denoising; Noise reduction; Non-local means; Vector non-local mean filter; Bilateral filter; Median filter; Filter (signal processing); Bandpass filters; Additive white Gaussian noise; White noise; Gaussian noise (electronics); Stein's unbiased risk estimator (SURE); Unbiased risk estimator; Parameter estimation; Optimization framework; Pixels; Illumination; Restoration; Reconstruction; Fusion; Face recognition; Pattern recognition; Computer vision; Computer Vision and Pattern Recognition (cs.CV); Numerical Analysis (math.NA); Engineering Sciences (physics); Electronics; Artificial Intelligence; Applied Mathematics; Computational Theory and Mathematics; Signal Processing; Electrical and Electronic Engineering; Statistics, Probability and Uncertainty; Algorithms

The FLUXCOM ensemble of global land-atmosphere energy fluxes

2019

Although a key driver of Earth’s climate system, global land-atmosphere energy fluxes are poorly constrained. Here we use machine learning to merge energy flux measurements from FLUXNET eddy covariance towers with remote sensing and meteorological data to estimate global gridded net radiation, latent and sensible heat and their uncertainties. The resulting FLUXCOM database comprises 147 products in two setups: (1) 0.0833° resolution using MODIS remote sensing data (RS) and (2) 0.5° resolution using remote sensing and meteorological data (RS + METEO). Within each setup we use a full factorial design across machine learning methods, forcing datasets and energy balance closure corrections. For…

Keywords: FOS: Computer and information sciences; FOS: Physical sciences; Machine Learning (cs.LG, stat.ML); Data Descriptor; Meteorology; Atmospheric and Oceanic Physics (physics.ao-ph); Energy balance; Energy flux; Eddy covariance; FLUXNET; Evapotranspiration; Latent heat; Sensible heat; Hydrology; Climate sciences; Environmental science; Metadata; SDG 7 - Clean Energy; SDG 13 - Climate Action; Statistics and Probability; Statistics, Probability and Uncertainty; Library and Information Sciences; Computer Science Applications; Information Systems; Education; Scientific Data