Search results for "Resampling"
Showing 10 of 40 documents
Effect of raster resolution and polygon-conversion algorithm on landslide susceptibility mapping
2016
The choice of a proper resolution in landslide susceptibility mapping is an issue worth considering. If, on the one hand, a coarse spatial resolution may describe the terrain morphologic properties with low accuracy, on the other hand, at very fine resolutions, some of the DEM-derived morphometric factors may hold an excess of detail. Moreover, landslide inventory maps are represented through geospatial vector data structures, so a vector-to-raster conversion procedure is required. This work investigates the effects of raster resolution on susceptibility mapping in conjunction with the use of different vector-to-raster conversion algorithms. The Artificial Neural Network t…
Population Monte Carlo Schemes with Reduced Path Degeneracy
2017
Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is so-called path degeneracy: the diversity of the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity, compared to the standard approach, while keeping the same implementation structure (sample generation, weighting and resampling). The new schemes combine different weighting and resampling strategies to reduce path degeneracy and achieve higher performance at a low additional computational cost. Computer si…
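The path degeneracy the abstract describes can be seen directly in the standard multinomial resampling step: after resampling, many samples share a small set of ancestors. The sketch below is illustrative only (the function name `multinomial_resample` and the heavy-tailed toy weights are assumptions, not the paper's scheme):

```python
import numpy as np

rng = np.random.default_rng(0)

def multinomial_resample(samples, weights, rng):
    """Standard multinomial resampling: draw N indices with probability
    proportional to the weights, then select those samples."""
    n = len(samples)
    idx = rng.choice(n, size=n, p=weights / weights.sum())
    return samples[idx], idx

# Population of N weighted samples; heavy-tailed weights exaggerate degeneracy.
n = 1000
samples = rng.normal(size=n)
weights = rng.exponential(size=n) ** 3

resampled, idx = multinomial_resample(samples, weights, rng)
# Path degeneracy: only a fraction of distinct ancestors survive resampling.
n_ancestors = len(np.unique(idx))
```

Counting `n_ancestors` after each iteration is a simple way to monitor how much diversity a given weighting/resampling combination preserves.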
Modeling Snow Dynamics Using a Bayesian Network
2015
In this paper we propose a novel snow accumulation and melt model, formulated as a Dynamic Bayesian Network (DBN). We encode uncertainty explicitly and train the DBN using Monte Carlo analysis, carried out with a deterministic hydrology model under a wide range of plausible parameter configurations. The trained DBN was tested against field observations of snow water equivalent (SWE). The results indicate that our DBN can be used to reason about uncertainty without resampling from the deterministic model. In brief, the DBN's ability to reproduce the mean of the observations was similar to what could be obtained with the deterministic hydrology model, but with a more realistic repre…
Measuring the Spatial Homogeneity in Corneal Endotheliums by Means of a Randomization Test
1999
Quantification of the regularity of cell sizes and the spatial arrangement of cells in corneal endotheliums is of great importance in stress situations such as cataract surgery, corneal transplantation or implantation of intra-ocular lenses. A new index of regularity of the spatial distribution of cell sizes in corneal endotheliums is proposed. The corneal endothelium is described by means of a spatial marked point pattern (the cell centroids marked with the cell areas). The hypothesis of no dependency between marks and locations is tested by a Monte Carlo test. The new index is the p-value of the test validating the hypothesis. Pairs of endotheliums from different eyes of the …
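A Monte Carlo test of mark–location independence can be run by permuting the marks over the fixed point locations. The sketch below is a simplified stand-in, not the paper's index: the statistic (correlation between each cell's area and its nearest-neighbour distance) and the function name `mark_location_pvalue` are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def mark_location_pvalue(points, marks, n_perm=499, rng=rng):
    """Monte Carlo permutation test of independence between marks and
    locations. Statistic: |correlation| between each point's mark and its
    nearest-neighbour distance (a simple proxy for local arrangement)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)  # nearest-neighbour distance per point

    def stat(m):
        return abs(np.corrcoef(m, nn)[0, 1])

    observed = stat(marks)
    exceed = sum(stat(rng.permutation(marks)) >= observed for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)

# Toy endothelium: random centroids with areas independent of position.
pts = rng.uniform(size=(60, 2))
areas = rng.normal(1.0, 0.1, size=60)
p = mark_location_pvalue(pts, areas)
```

Under independence (as in this toy data) the p-value is approximately uniform on (0, 1]; dependence between cell size and crowding would push it toward zero.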
Residual-based block bootstrap for cointegration testing
2010
We propose a new testing procedure to determine the rank of cointegration. This new method is based on a nonparametric resampling procedure, the Residual-Based Block Bootstrap (RBB), developed by Paparoditis and Politis (2003) in the context of unit root testing. Through Monte Carlo experiments we show that, in small samples, the RBB cointegration test has good power properties relative to two other well-known cointegration tests: the Augmented Dickey–Fuller (ADF) test, applied to the residuals of a cointegrating regression, and Johansen's maximum eigenvalue test. Likewise, this article looks at the influence of the correlation of the ‘X’ variables …
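The core RBB idea, resampling regression residuals in contiguous blocks so that short-run dependence is preserved, can be sketched as follows. This is a simplified illustration of the block-resampling step only, under assumed toy data; it omits the recentring, block-length selection and test construction detailed by Paparoditis and Politis (2003):

```python
import numpy as np

rng = np.random.default_rng(2)

def block_bootstrap(resid, block_len, rng):
    """Resample a residual series in contiguous overlapping blocks,
    preserving dependence within each block."""
    n = len(resid)
    starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
    pieces = [resid[s:s + block_len] for s in starts]
    return np.concatenate(pieces)[:n]

# Toy cointegrating regression y_t = beta * x_t + u_t with an I(1) regressor.
x = np.cumsum(rng.normal(size=300))
u = rng.normal(size=300)
y = 2.0 * x + u
beta_hat = np.polyfit(x, y, 1)[0]          # OLS slope of the cointegrating fit
resid = y - beta_hat * x
resid_star = block_bootstrap(resid - resid.mean(), block_len=10, rng=rng)
```

Repeating the resampling step many times yields a bootstrap distribution for the test statistic computed on each pseudo-series.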
A Note on Resampling the Integration Across the Correlation Integral with Alternative Ranges
2003
This paper reconsiders the nonlinearity test proposed by Kočenda (Kočenda, E. (2001). An alternative to the BDS test: integration across the correlation integral. Econometric Reviews 20:337–351). When the analyzed series is non‐Gaussian, the empirical rejection rates can be much larger than the nominal size. In this context, the necessity of tabulating the empirical distribution of the statistic each time the test is computed is stressed. To that end, simple random permutation works reasonably well. This paper also shows, through Monte Carlo experiments, that Kočenda's test can be more powerful than the Brock et al. (Brock, W., Dechert, D., Scheinkman, J., LeBar…
PACo: A novel Procrustes application to cophylogenetic analysis
2013
We present Procrustean Approach to Cophylogeny (PACo), a novel statistical tool to test for congruence between phylogenetic trees, or between phylogenetic distance matrices of associated taxa. Unlike previous tests, PACo evaluates the dependence of one phylogeny upon the other. This makes it especially appropriate to test the classical coevolutionary model that assumes that parasites that spend part of their life in or on their hosts track the phylogeny of their hosts. The new method does not require fully resolved phylogenies and allows for multiple host-parasite associations. PACo produces a Procrustes superimposition plot enabling a graphical assessment of the fit of the parasite phyloge…
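The global-fit statistic behind a Procrustes superimposition can be sketched with the classical result that, after centring and unit-norm scaling, the residual sum of squares under the optimal rotation is m² = 1 − (Σσᵢ)², with σᵢ the singular values of the cross-product matrix. This is a generic Procrustes sketch, not PACo itself (the function name `procrustes_m2` is assumed, and PACo's ordination of phylogenetic distance matrices and its permutation test are omitted):

```python
import numpy as np

def procrustes_m2(X, Y):
    """Residual sum of squares m^2 after optimally translating, scaling
    and rotating configuration Y onto X; 0 means perfect congruence."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Xc = Xc / np.linalg.norm(Xc)          # unit Frobenius norm
    Yc = Yc / np.linalg.norm(Yc)
    s = np.linalg.svd(Xc.T @ Yc, compute_uv=False)
    return 1.0 - s.sum() ** 2

# Toy check: a rotated, scaled, shifted copy is perfectly congruent.
rng = np.random.default_rng(5)
host = rng.normal(size=(20, 2))
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
parasite = 1.5 * host @ R + np.array([3.0, 1.0])
m2 = procrustes_m2(host, parasite)
```

In a cophylogenetic setting, small m² indicates that one configuration (e.g. parasite positions) tracks the other (host positions); significance is then assessed by permutation.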
Group Importance Sampling for particle filtering and MCMC
2018
Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
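One simple way to compress a population of weighted samples into a single weighted sample is to keep the self-normalised weighted mean together with the total weight. This is only a minimal sketch of the compression idea the abstract mentions, not the GIS assignment itself (`compress_group` and the N(0,1)-target/N(0,2)-proposal toy problem are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

def compress_group(samples, weights):
    """Replace a group of weighted samples by one representative:
    the weighted mean, carrying the group's summed weight."""
    w = np.asarray(weights, dtype=float)
    x_group = np.average(samples, weights=w, axis=0)
    w_group = w.sum()
    return x_group, w_group

# Importance sampling for a N(0,1) target using a wider N(0,2) proposal.
xs = rng.normal(0.0, 2.0, size=5000)
# Unnormalised log-weights: log target density minus log proposal density.
log_w = -0.5 * xs**2 - (-0.5 * (xs / 2.0) ** 2 - np.log(2.0))
w = np.exp(log_w - log_w.max())           # stabilise before exponentiating
x_g, w_g = compress_group(xs, w)
```

The compressed pair (x_g, w_g) reproduces the group's self-normalised IS estimate of the mean while occupying the storage of one sample, which is the practical appeal in particle filtering and MCMC settings.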
On resampling schemes for particle filters with weakly informative observations
2022
We consider particle filters with weakly informative observations (or `potentials') relative to the latent state dynamics. The particular focus of this work is on particle filters to approximate time-discretisations of continuous-time Feynman--Kac path integral models -- a scenario that naturally arises when addressing filtering and smoothing problems in continuous time -- but our findings are indicative of weakly informative settings beyond this context too. We study the performance of different resampling schemes, such as systematic resampling, SSP (Srinivasan sampling process) and stratified resampling, as the time-discretisation becomes finer and also identify their continuous-time l…
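Two of the resampling schemes named in the abstract differ only in how the uniform variates are drawn: stratified resampling draws one uniform per stratum [i/N, (i+1)/N), while systematic resampling shares a single uniform offset across all strata. A minimal sketch of both (the function names are assumptions; SSP resampling is omitted):

```python
import numpy as np

def stratified_resample(weights, rng):
    """Stratified resampling: one independent uniform draw per stratum."""
    n = len(weights)
    u = (np.arange(n) + rng.uniform(size=n)) / n
    return np.searchsorted(np.cumsum(weights / weights.sum()), u)

def systematic_resample(weights, rng):
    """Systematic resampling: a single uniform offset shared by all strata."""
    n = len(weights)
    u = (np.arange(n) + rng.uniform()) / n
    return np.searchsorted(np.cumsum(weights / weights.sum()), u)

rng = np.random.default_rng(4)
w = rng.exponential(size=8)
idx_str = stratified_resample(w, rng)
idx_sys = systematic_resample(w, rng)
```

Both schemes return sorted ancestor indices and have lower resampling variance than multinomial resampling; their behaviour under fine time-discretisations is what the paper analyses.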
Sensitivity of the C-band SRTM DEM Vertical Accuracy to Terrain Characteristics and Spatial Resolution
2008
This work reports the results of a careful regional analysis of the SRTM DEM (Shuttle Radar Topography Mission – Digital Elevation Model) vertical accuracy as a function of both topography and Land-Use/Land Cover (LULC). Absolute vertical errors appear LULC-dependent, with some values greater than the stated accuracy of the SRTM dataset, mostly over forested areas. The results show that the structure of the errors is well modeled by a cosine power n of the local incidence angle (θloc). SRTM quality is further assessed using slope and topographical similarity indexes. The results show a lower relative accuracy on slope, with an R2 = 0.5 and a moderate agreement (Kappa ≈ 0.4) between SRTM- and …