Search results for "Resampling"

Showing 10 of 40 documents

Effect of raster resolution and polygon-conversion algorithm on landslide susceptibility mapping

2016

The choice of the proper resolution in landslide susceptibility mapping is an issue worth considering. If, on the one hand, a coarse spatial resolution may describe the terrain morphologic properties with low accuracy, on the other hand, at very fine resolutions, some of the DEM-derived morphometric factors may hold an excess of detail. Moreover, landslide inventory maps are represented through geospatial vector data structures, so a vector-to-raster conversion procedure is required. This work investigates the effects of raster resolution on susceptibility mapping in conjunction with the use of different vector-to-raster conversion algorithms. The Artificial Neural Network t…
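The resolution effect described above can be seen in a minimal vector-to-raster sketch: a centre-point rule flags a cell when its centre falls inside the polygon, so coarser cells smooth away boundary detail. The rule choice and all names here are illustrative assumptions, not the paper's algorithm.

```python
def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test for a point against a closed polygon."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray at y
            xi = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xi:
                inside = not inside
    return inside

def rasterize(poly, cell, xmin, ymin, ncols, nrows):
    """Centre-point vector-to-raster conversion: a cell is True when its
    centre lies inside the polygon; larger `cell` values lose boundary
    detail, which is the resolution effect under study."""
    return [[point_in_polygon(xmin + (c + 0.5) * cell,
                              ymin + (r + 0.5) * cell, poly)
             for c in range(ncols)]
            for r in range(nrows)]
```

At `cell = 1.0`, a 2 × 2 square polygon covers exactly four cells of a 4 × 4 grid; halving the cell size would trace its boundary more faithfully.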

Artificial neural network; Grid-cell size; Landslide susceptibility mapping; Resampling; Vector-to-raster conversion; Geospatial analysis; Terrain; Image resolution; Polygon; Raster graphics; Algorithm; Ecological Modeling; Environmental Engineering; Software; Settore ICAR/02 - Costruzioni Idrauliche E Marittime E Idrologia; Environmental Modelling & Software
researchProduct

Population Monte Carlo Schemes with Reduced Path Degeneracy

2017

Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is so-called path degeneracy: the diversity of the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity, compared to the standard approach, while keeping the same implementation structure (sample generation, weighting and resampling). The new schemes combine different weighting and resampling strategies to reduce path degeneracy and achieve higher performance at the cost of a small additional computational overhead. Computer si…
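The sample-weight-resample structure mentioned above can be sketched as one standard PMC iteration with Gaussian proposals; the resampling step shown at the end is the one whose degeneracy the paper's schemes address. The Gaussian-proposal choice and all names are illustrative assumptions, not the paper's schemes.

```python
import math
import random

def pmc_step(proposals, target_logpdf, n_per_proposal, rng):
    """One Population Monte Carlo iteration: sample from each Gaussian
    proposal (mean, std), importance-weight against the target, then
    resample new proposal locations from the weighted population."""
    samples, logw = [], []
    for mean, std in proposals:
        for _ in range(n_per_proposal):
            x = rng.gauss(mean, std)
            # log proposal density, up to a constant shared by all samples
            log_q = -0.5 * ((x - mean) / std) ** 2 - math.log(std)
            samples.append(x)
            logw.append(target_logpdf(x) - log_q)
    shift = max(logw)                        # stabilise the exponentials
    w = [math.exp(l - shift) for l in logw]
    # multinomial resampling of the next proposal means
    new_means = rng.choices(samples, weights=w, k=len(proposals))
    return [(mu, std) for mu, (_, std) in zip(new_means, proposals)]
```

Iterating this from badly placed proposals pulls the population toward the target's high-probability region.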

Computational complexity theory; Monte Carlo method; Approximation algorithm; Weighting; Gaussian noise; Resampling; Path (graph theory); Degeneracy (mathematics); Algorithm; Signal and Image Processing
researchProduct

Modeling Snow Dynamics Using a Bayesian Network

2015

In this paper we propose a novel snow accumulation and melt model, formulated as a Dynamic Bayesian Network (DBN). We encode uncertainty explicitly and train the DBN using Monte Carlo analysis, carried out with a deterministic hydrology model under a wide range of plausible parameter configurations. The trained DBN was tested against field observations of snow water equivalent (SWE). The results indicate that our DBN can be used to reason about uncertainty without resampling from the deterministic model. In brief, the DBN's ability to reproduce the mean of the observations was similar to what could be obtained with the deterministic hydrology model, but with a more realistic repre…

Computer science; Resampling; Monte Carlo method; Range (statistics); Bayesian network; Snow; Representation (mathematics); Algorithm; Field (computer science); Dynamic Bayesian network; Simulation
researchProduct

Measuring the Spatial Homogeneity in Corneal Endotheliums by Means of a Randomization Test

1999

Quantification of the regularity of cell sizes and the spatial arrangement of cells in corneal endotheliums is of great importance in stress situations such as cataract surgery, corneal transplantation or implantation of intra-ocular lenses. A new index of the regularity of the spatial distribution of cell sizes in corneal endotheliums is proposed. The corneal endothelium is described by means of a spatial marked point pattern (the cell centroids marked with the cell areas). The hypothesis of no dependence between marks and locations is tested by a Monte Carlo test. The new index is the p-value of the test validating the hypothesis. Pairs of endotheliums from different eyes of the …
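The Monte Carlo test of mark-location independence works by shuffling the marks over the fixed point pattern and ranking the observed statistic in the permutation distribution; the p-value is then the index of regularity. A generic sketch follows, with the statistic and all names assumed rather than taken from the paper:

```python
import random

def mc_randomization_test(points, marks, statistic, n_perm=999, seed=1):
    """Monte Carlo randomization test of independence between marks and
    locations: shuffle marks over fixed points and return the one-sided
    Monte Carlo p-value of the observed statistic."""
    rng = random.Random(seed)
    observed = statistic(points, marks)
    exceed = 1                       # the observed arrangement counts
    for _ in range(n_perm):
        shuffled = marks[:]
        rng.shuffle(shuffled)
        if statistic(points, shuffled) >= observed:
            exceed += 1
    return exceed / (n_perm + 1)
```

With marks that grow with the x-coordinate and a statistic sensitive to that trend, the p-value is small; under independence it is roughly uniform on (0, 1).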

Corneal endothelium; Coefficient of variation; Centroid; Spatial distribution; Eye diseases; Monte Carlo test; Resampling; Statistics; Medicine; Spatial homogeneity; Corneal transplantation; Mathematics
researchProduct

Residual-based block bootstrap for cointegration testing

2010

We propose a new testing procedure to determine the rank of cointegration. The method is based on a nonparametric resampling procedure, the so-called Residual-Based Block Bootstrap (RBB), developed by Paparoditis and Politis (2003) in the context of unit root testing. Through Monte Carlo experiments we show that, in small samples, the RBB cointegration test has good power properties relative to two other well-known tests for cointegration: the Augmented Dickey–Fuller (ADF) test applied to the residuals of a cointegrating regression, and Johansen's maximum eigenvalue test. Likewise, this article looks at the influence of the correlation of the ‘X’ variables …
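The core idea of block-bootstrapping residuals is to resample contiguous blocks so that short-run serial dependence survives the resampling. The following is a generic moving-block sketch of that idea, not Paparoditis and Politis's exact RBB scheme:

```python
import random

def moving_block_bootstrap(residuals, block_len, rng):
    """Build a bootstrap series by concatenating randomly chosen
    overlapping blocks of the residual series, preserving dependence
    within each block; truncated to the original length."""
    n = len(residuals)
    out = []
    while len(out) < n:
        start = rng.randrange(n - block_len + 1)
        out.extend(residuals[start:start + block_len])
    return out[:n]
```

In a bootstrap test, the statistic is recomputed on many such series to tabulate its null distribution.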

Economics and Econometrics; Cointegration; Resampling; Monte Carlo method; Statistics; Econometrics; Nonparametric statistics; Residual; Johansen test; Regression; Mathematics; Applied Economics Letters
researchProduct

A Note on Resampling the Integration Across the Correlation Integral with Alternative Ranges

2003

This paper reconsiders the nonlinearity test proposed by Kočenda (Kočenda, E. (2001). An alternative to the BDS test: integration across the correlation integral. Econometric Reviews 20:337–351). When the analyzed series is non-Gaussian, the empirical rejection rates can be much larger than the nominal size. In this context, the necessity of tabulating the empirical distribution of the statistic each time the test is computed is stressed. To that end, simple random permutation works reasonably well. This paper also shows, through Monte Carlo experiments, that Kočenda's test can be more powerful than the Brock et al. (Brock, W., Dechert, D., Scheinkman, J., LeBar…
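Tabulating the empirical distribution by random permutation, as the note recommends, amounts to recomputing the statistic on shuffled copies of the series and reading off an empirical critical value. A minimal sketch, with the interface and all names assumed for illustration:

```python
import random

def permutation_critical_value(series, statistic, alpha=0.05,
                               n_perm=499, seed=7):
    """Tabulate the empirical null distribution of `statistic` by
    randomly permuting the series; return the upper-alpha critical
    value of the permutation distribution."""
    rng = random.Random(seed)
    null = []
    for _ in range(n_perm):
        shuffled = series[:]
        rng.shuffle(shuffled)
        null.append(statistic(shuffled))
    null.sort()
    return null[int((1.0 - alpha) * n_perm)]
```

A strongly dependent series then rejects: its observed statistic exceeds the critical value tabulated from permuted (hence dependence-free) copies.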

Economics and Econometrics; Correlation dimension; Resampling; Monte Carlo method; Econometrics; Correlation integral; Random permutation; Empirical distribution function; Statistic; Mathematics; Econometric Reviews
researchProduct

PACo: a novel procrustes application to cophylogenetic analysis.

2013

We present Procrustean Approach to Cophylogeny (PACo), a novel statistical tool to test for congruence between phylogenetic trees, or between phylogenetic distance matrices of associated taxa. Unlike previous tests, PACo evaluates the dependence of one phylogeny upon the other. This makes it especially appropriate for testing the classical coevolutionary model, which assumes that parasites spending part of their life in or on their hosts track their hosts' phylogeny. The new method does not require fully resolved phylogenies and allows for multiple host-parasite associations. PACo produces a Procrustes superimposition plot enabling a graphical assessment of the fit of the parasite phyloge…
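The record's keywords mention jackknife resampling, which in this setting is used to gauge how much each host-parasite link contributes to the overall fit. A generic leave-one-out jackknife for the bias and standard error of any statistic looks as follows (interface and names assumed; this is the resampling principle, not PACo's implementation):

```python
def jackknife(values, estimator):
    """Leave-one-out jackknife: recompute `estimator` with each value
    removed, then return the jackknife bias estimate and standard error."""
    n = len(values)
    full = estimator(values)
    loo = [estimator(values[:i] + values[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    bias = (n - 1) * (mean_loo - full)
    se = ((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo)) ** 0.5
    return bias, se
```

For the sample mean the jackknife bias is zero and the standard error reduces to the familiar s/√n.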

Evolutionary Processes; Parasites; Zoology; Biology; Mathematical models; Animal Phylogenetics; Biostatistics; Forms of Evolution; Statistical power; Plot (graphics); Host-Parasite Interactions; Molecular Evolution; Congruence (geometry); Statistics; Computer Simulation; Phylogeny; Evolutionary Biology; Phylogenetic tree; Confidence interval; Phylogenetics; Parasitology; Jackknife resampling; Mathematics; Software; Coevolution; Type I and type II errors; PLoS ONE
researchProduct

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementation by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
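The compression idea can be sketched as follows: each group of weighted samples is summarised by one representative, drawn by resampling within the group, carrying the group's average unnormalised weight. This is one proper-weighting choice that illustrates the GIS idea under stated assumptions; it is not the paper's full analysis.

```python
import random

def compress_group(samples, weights, rng):
    """Compress a population of weighted samples into a single weighted
    sample: resample one representative within the group and attach the
    group's average unnormalised weight."""
    rep = rng.choices(samples, weights=weights, k=1)[0]
    return rep, sum(weights) / len(weights)
```

Self-normalised estimates built from many compressed groups then approximate the same posterior expectation as the full weighted population.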

Computer and information sciences; Machine Learning; Computer science; Posterior probability; Monte Carlo method; Multiple-try Metropolis; Statistics - Computation; Statistics - Methodology; Artificial Intelligence; Resampling; Electrical and Electronic Engineering; Computational Engineering, Finance and Science; Markov chain; Applied Mathematics; Markov chain Monte Carlo; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Particle filter; Importance sampling; Digital Signal Processing
researchProduct

On resampling schemes for particle filters with weakly informative observations

2022

We consider particle filters with weakly informative observations (or 'potentials') relative to the latent state dynamics. The particular focus of this work is on particle filters to approximate time-discretisations of continuous-time Feynman--Kac path integral models -- a scenario that naturally arises when addressing filtering and smoothing problems in continuous time -- but our findings are indicative of weakly informative settings beyond this context too. We study the performance of different resampling schemes, such as systematic resampling, SSP (Srinivasan sampling process) and stratified resampling, as the time-discretisation becomes finer and also identify their continuous-time l…
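Two of the schemes compared here are textbook-standard and easy to sketch. Both map a grid of positions in [0, 1) through the cumulative weights: systematic resampling uses a single uniform draw shifted across all strata, while stratified resampling draws one independent uniform per stratum. Weights are assumed normalised; names are illustrative.

```python
import random

def _inverse_cdf(positions, weights):
    """Map sorted positions in [0, 1) through the cumulative weights,
    returning the selected particle indices."""
    indices, cum, j = [], weights[0], 0
    for u in positions:
        while u > cum:
            j += 1
            cum += weights[j]
        indices.append(j)
    return indices

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw, evenly spaced pointers."""
    n = len(weights)
    u0 = rng.random()
    return _inverse_cdf([(i + u0) / n for i in range(n)], weights)

def stratified_resample(weights, rng):
    """Stratified resampling: an independent uniform draw per stratum."""
    n = len(weights)
    return _inverse_cdf([(i + rng.random()) / n for i in range(n)], weights)
```

Both are low-variance schemes: each particle's offspring count stays close to n times its weight, which is why their behaviour under fine time-discretisation is worth comparing.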

Hidden Markov model; Particle filter; Statistics and Probability; Probability (math.PR); Markov chains; Statistics - Computation; Statistics - Methodology; Resampling; Sampling; Numerical analysis; Feynman–Kac model; Statistical models; Path integral; Stochastic processes; MSC: primary 65C35, secondary 65C05, 65C60, 60J25
researchProduct

Sensitivity of the C-band SRTM DEM Vertical Accuracy to Terrain Characteristics and Spatial Resolution

2008

This work reports the results of a careful regional analysis of the SRTM DEM (Shuttle Radar Topography Mission – Digital Elevation Model) vertical accuracy as a function of both topography and Land-Use/Land Cover (LULC). Absolute vertical errors appear LULC-dependent, with some values greater than the stated accuracy of the SRTM dataset, mostly over forested areas. The results show that the structure of the errors is well modeled by a cosine power n of the local incidence angle (θ_loc). SRTM quality is further assessed using slope and topographical similarity indexes. The results show a lower relative accuracy on slope with a R² = 0.5 and a moderate agreement (Kappa ≈ 0.4) between SRTM- and …
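A cosine-power error model of the kind reported above, err ≈ a · cos(θ_loc)^n, can be fitted by ordinary least squares in log space, since log err = log a + n · log cos θ. The sketch below assumes this fitting approach and its variable names; the study does not state how its model was estimated.

```python
import math

def fit_cosine_power(theta, err):
    """Fit err = a * cos(theta)**n by simple linear regression of
    log(err) on log(cos(theta)); returns the pair (a, n)."""
    x = [math.log(math.cos(t)) for t in theta]
    y = [math.log(e) for e in err]
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    n = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return math.exp(ybar - n * xbar), n
```

On noise-free synthetic data the fit recovers the generating coefficients exactly.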

Geography; Resampling; Interferometric synthetic aperture radar; Elevation; Terrain; Shuttle Radar Topography Mission; Land cover; Digital elevation model; Image resolution; Remote sensing
researchProduct