Search results for "interval"
Showing 10 of 1703 documents
PACo: a novel Procrustes application to cophylogenetic analysis.
2013
We present Procrustean Approach to Cophylogeny (PACo), a novel statistical tool to test for congruence between phylogenetic trees, or between phylogenetic distance matrices of associated taxa. Unlike previous tests, PACo evaluates the dependence of one phylogeny upon the other. This makes it especially appropriate to test the classical coevolutionary model that assumes that parasites that spend part of their life in or on their hosts track the phylogeny of their hosts. The new method does not require fully resolved phylogenies and allows for multiple host-parasite associations. PACo produces a Procrustes superimposition plot enabling a graphical assessment of the fit of the parasite phyloge…
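The superimposition idea behind PACo can be illustrated with a drastically simplified, rotation-only 2-D Procrustes fit (real Procrustes analysis also handles translation, scaling, and reflection, and PACo operates on ordinations of phylogenetic distance matrices; the point sets below are invented):

```python
# Toy rotation-only Procrustes fit: find the rotation of Y that best matches X
# (both pre-centred) and report the residual sum of squares m^2, the kind of
# goodness-of-fit statistic that Procrustes-based tests are built on.
import math

def procrustes_residual(X, Y):
    """Best-fit rotation of Y onto X; returns the residual sum of squares."""
    # Optimal angle maximises sum x_i . R(theta) y_i = A cos(theta) + B sin(theta).
    A = sum(x1 * y1 + x2 * y2 for (x1, x2), (y1, y2) in zip(X, Y))
    B = sum(x2 * y1 - x1 * y2 for (x1, x2), (y1, y2) in zip(X, Y))
    th = math.atan2(B, A)
    c, s = math.cos(th), math.sin(th)
    rot = [(y1 * c - y2 * s, y1 * s + y2 * c) for (y1, y2) in Y]
    return sum((x1 - r1) ** 2 + (x2 - r2) ** 2
               for (x1, x2), (r1, r2) in zip(X, rot))

X = [(1.0, 0.0), (0.0, 1.0), (-1.0, -1.0)]   # centred toy configuration
t = math.radians(30.0)
# Y is X rotated by -30 degrees, so a pure rotation should fit it exactly.
Y = [(x1 * math.cos(t) + x2 * math.sin(t),
      -x1 * math.sin(t) + x2 * math.cos(t)) for (x1, x2) in X]

print(round(procrustes_residual(X, Y), 6))  # -> 0.0 (pure rotation fits exactly)
```

In PACo-style testing, this residual is compared against its distribution under random host–parasite associations.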
Basic Statistical Techniques
2012
On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction
2020
Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
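The tolerance trade-off described above can be seen in a minimal ABC-MCMC sketch (a toy Gaussian-mean problem with invented values, not the paper's post-correction estimator): a larger `TOL` accepts more proposals and mixes better, at the price of extra bias.

```python
# Illustrative ABC-MCMC: infer the mean 'theta' of a Gaussian from an observed
# summary statistic, accepting a proposal only when the simulated summary falls
# within the tolerance of the observed one. All constants are made up.
import random

random.seed(0)

OBS_MEAN = 2.0   # observed summary statistic (assumed value)
TOL = 0.5        # tolerance: large enough for decent mixing
N_SIM = 50       # simulated sample size per likelihood-free evaluation

def simulate_summary(theta):
    """Simulate a data set under theta and return its summary (sample mean)."""
    xs = [random.gauss(theta, 1.0) for _ in range(N_SIM)]
    return sum(xs) / N_SIM

def abc_mcmc(n_iter=5000, step=0.5):
    theta = OBS_MEAN          # start inside the acceptance region, as usual
    chain = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)  # symmetric random-walk proposal
        # Flat prior + symmetric proposal: accept iff the simulated summary
        # is within TOL of the observed one.
        if abs(simulate_summary(prop) - OBS_MEAN) <= TOL:
            theta = prop
        chain.append(theta)
    return chain

chain = abc_mcmc()
est = sum(chain) / len(chain)
print(round(est, 2))  # posterior-mean estimate, roughly near 2.0
```

The paper's contribution is what happens after such a run: post-correcting the output to emulate a range of finer tolerances, with approximate confidence intervals for the corrected estimators.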
A probabilistic estimation and prediction technique for dynamic continuous social science models: The evolution of the attitude of the Basque Country…
2015
In this paper, a computational technique to deal with uncertainty in dynamic continuous models in the Social Sciences is presented. Considering data from surveys, the method consists of determining the probability distribution of the survey output, which allows one to sample data and fit the model to the sampled data using a goodness-of-fit criterion based on the χ2-test. Taking the fitted parameters that were not rejected by the χ2-test, substituting them into the model and computing the outputs, 95% confidence intervals at each time instant capturing the uncertainty of the survey data (probabilistic estimation) are built. Using the same set of obtained model parameters, a prediction over …
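The sample-fit-band workflow can be sketched in a few lines (the trend `TRUE`, the noise level, and the linear model are invented stand-ins, and the χ2-rejection step is omitted for brevity):

```python
# Sketch: sample plausible data sets from a survey-output distribution, fit a
# simple model to each, and build pointwise 95% confidence intervals from the
# percentiles of the fitted model outputs at each time instant.
import random

random.seed(1)

TIMES = list(range(10))
TRUE = [1.0 + 0.5 * t for t in TIMES]    # invented "survey" trend
NOISE = 0.3

def sample_survey():
    """Draw one plausible data set from the survey-output distribution."""
    return [y + random.gauss(0.0, NOISE) for y in TRUE]

def fit_line(ys):
    """Closed-form least-squares fit of y = a*t + b."""
    n = len(TIMES)
    mt = sum(TIMES) / n
    my = sum(ys) / n
    a = (sum((t - mt) * (y - my) for t, y in zip(TIMES, ys))
         / sum((t - mt) ** 2 for t in TIMES))
    return a, my - a * mt

outputs = []
for _ in range(500):
    a, b = fit_line(sample_survey())          # chi-squared rejection omitted
    outputs.append([a * t + b for t in TIMES])

# Pointwise 95% band: 2.5th and 97.5th percentiles at each time instant.
lo, hi = [], []
for i in range(len(TIMES)):
    vals = sorted(run[i] for run in outputs)
    lo.append(vals[int(0.025 * len(vals))])
    hi.append(vals[int(0.975 * len(vals))])

print(all(l <= y <= h for l, y, h in zip(lo, TRUE, hi)))
```

The percentile band captures the uncertainty propagated from the sampled data into the model outputs, which is the "probabilistic estimation" step the abstract refers to.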
Disentangling Derivatives, Uncertainty and Error in Gaussian Process Models
2020
Gaussian Processes (GPs) are a class of kernel methods that have been shown to be very useful in geoscience applications. They are widely used because they are simple, flexible and provide very accurate estimates for nonlinear problems, especially in parameter retrieval. In addition to a predictive mean function, GPs come equipped with a useful property: the predictive variance function, which provides confidence intervals for the predictions. The GP formulation usually assumes that there is no input noise in the training and testing points, only in the observations. However, this is often not the case in Earth observation problems where an accurate assessment of the instrument error is usually a…
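The predictive mean and variance the abstract mentions follow from the standard GP regression equations; a minimal stdlib-only sketch on an invented 1-D problem (this is plain GP regression, not the paper's input-noise extension):

```python
# Exact GP regression with an RBF kernel: predictive mean k*^T K^{-1} y and
# predictive variance k** - k*^T K^{-1} k*, which yields confidence intervals.
import math

def rbf(a, b, ls=1.0):
    """Squared-exponential (RBF) kernel with length-scale ls."""
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

X = [-2.0, 0.0, 2.0]                 # toy 1-D training inputs
y = [math.sin(v) for v in X]         # toy observations
NOISE = 1e-3                         # assumed observation-noise variance

K = [[rbf(X[i], X[j]) + (NOISE if i == j else 0.0) for j in range(len(X))]
     for i in range(len(X))]
alpha = solve(K, y)                  # K^{-1} y

xs = 1.0                             # test point
ks = [rbf(xs, xi) for xi in X]
mean = sum(k * a for k, a in zip(ks, alpha))
var = rbf(xs, xs) + NOISE - sum(k * w for k, w in zip(ks, solve(K, ks)))
ci = (mean - 1.96 * math.sqrt(var), mean + 1.96 * math.sqrt(var))
print(round(mean, 3), round(var, 4))
```

Note how the variance shrinks near training inputs and grows toward the prior variance far from them; the paper's point is that this picture changes once the inputs themselves are noisy.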
Warped Gaussian Processes in Remote Sensing Parameter Estimation and Causal Inference
2018
This letter introduces warped Gaussian process (WGP) regression in remote sensing applications. WGP models output observations as a parametric nonlinear transformation of a GP. The parameters of such a prior model are then learned via standard maximum likelihood. We show the good performance of the proposed model for the estimation of oceanic chlorophyll content from multispectral data, vegetation parameters (chlorophyll, leaf area index, and fractional vegetation cover) from hyperspectral data, and in the detection of the causal direction in a collection of 28 bivariate geoscience and remote sensing causal problems. The model consistently performs better than the standard GP and the more a…
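The core modelling idea, observations as a parametric nonlinear transform of a latent GP value, can be illustrated with an invented invertible warp (this is not the letter's learned transformation, whose parameters are fit by maximum likelihood):

```python
# Warped-GP idea in miniature: y = g(f) for a latent Gaussian value f and a
# parametric warp g; inference works in latent space and maps back through g.
# The sinh warp and its parameters below are made-up examples.
import math
import random

random.seed(3)

def warp(f, a=1.0, b=0.5):
    """g(f): latent space -> observation space."""
    return a * math.sinh(b * f)

def unwarp(y, a=1.0, b=0.5):
    """g^{-1}(y): observation space -> latent space."""
    return math.asinh(y / a) / b

latent = [random.gauss(0.0, 1.0) for _ in range(5)]   # stand-in for GP draws
obs = [warp(f) for f in latent]                       # what we actually observe
recovered = [unwarp(y) for y in obs]                  # back to latent space

print(all(abs(f - r) < 1e-9 for f, r in zip(latent, recovered)))  # -> True
```

Because the warp is invertible, predictions and confidence intervals computed for the latent GP can be pushed through g to the observation scale, which is what makes the model useful for skewed geophysical variables such as chlorophyll content.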
Can visualization alleviate dichotomous thinking? Effects of visual representations on the cliff effect
2021
Common reporting styles for statistical results in scientific articles, such as p-values and confidence intervals (CI), have been reported to be prone to dichotomous interpretations, especially with respect to the null hypothesis significance testing framework. For example, when the p-value is small enough or the CIs of the mean effects of a studied drug and a placebo are not overlapping, scientists tend to claim significant differences while often disregarding the magnitudes and absolute differences in the effect sizes. This type of reasoning has been shown to be potentially harmful to science. Techniques relying on the visual estimation of the strength of evidence have been recom…
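One facet of the dichotomy the abstract describes is easy to demonstrate numerically: two 95% CIs can overlap while the difference between the means is still "significant" at p < 0.05 (the numbers below are invented; normal approximation throughout):

```python
# Overlapping 95% CIs do not imply a non-significant difference: the standard
# error of a difference is smaller than the sum of the individual CI half-widths.
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

m1, se1 = 0.0, 1.0          # group 1: mean and standard error (made up)
m2, se2 = 3.0, 1.0          # group 2

ci1 = (m1 - 1.96 * se1, m1 + 1.96 * se1)
ci2 = (m2 - 1.96 * se2, m2 + 1.96 * se2)
overlap = ci1[1] > ci2[0]   # upper end of CI 1 exceeds lower end of CI 2

z = (m2 - m1) / math.sqrt(se1 ** 2 + se2 ** 2)
p = 2.0 * (1.0 - norm_cdf(z))

print(overlap, round(p, 3))  # -> True 0.034  (CIs overlap, yet p < 0.05)
```

Reading only "overlap vs. no overlap" off a figure is exactly the kind of cliff-edge judgement the paper's visualization experiments probe.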
Improved FMECA for effective risk management decision making by failure modes classification under uncertainty
2022
Failure Mode, Effects, and Criticality Analysis (FMECA) is a proactive reliability and risk management technique extensively used in practice to ensure high system performance by prioritising failure modes. Owing to the limitations of traditional FMECA, multi-criteria decision-making methods have been employed over the past two decades to enhance its effectiveness. To consider the vagueness and uncertainty of the FMECA evaluation process, an interval-based extension of the ELimination Et Choix Traduisant la REalité (ELECTRE) TRI method is proposed in the present paper for the classification of failure modes into risk categories. Therefore, ratings of failure modes against risk parameters are…
Epistemic uncertainty in fault tree analysis approached by the evidence theory
2012
Process plants may be subjected to dangerous events. Different methodologies are nowadays employed to identify failure events that can lead to severe accidents, and to assess their relative probability of occurrence. As reliability data for rare events are generally poor, leading to partial or incomplete knowledge of the process, the classical probabilistic approach cannot be used successfully. Such uncertainty, called epistemic uncertainty, can be treated by means of different methodologies, alternative to the probabilistic one. In this work, Evidence Theory, or Dempster–Shafer theory (DST), is proposed to deal with this kind of uncertainty. In particular, the classical Fau…
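In Dempster–Shafer theory, a single failure probability is replaced by a belief–plausibility interval; a toy sketch with an invented frame of failure events and made-up mass values:

```python
# Dempster-Shafer basics: a basic probability assignment (bpa) spreads mass
# over *subsets* of the frame, so incomplete knowledge can be expressed.
# Bel(A) <= "true probability" <= Pl(A) gives an epistemic interval for A.
FRAME = {"valve", "pump", "sensor"}   # hypothetical failure causes

bpa = {
    frozenset({"valve"}): 0.4,
    frozenset({"pump"}): 0.2,
    frozenset({"valve", "pump"}): 0.3,  # mass we cannot split further
    frozenset(FRAME): 0.1,              # total ignorance
}

def belief(event):
    """Bel(A): total mass committed to subsets of A."""
    return sum(m for s, m in bpa.items() if s <= event)

def plausibility(event):
    """Pl(A): total mass of sets that do not contradict A."""
    return sum(m for s, m in bpa.items() if s & event)

A = frozenset({"valve"})
print(round(belief(A), 3), round(plausibility(A), 3))  # -> 0.4 0.8
```

In a fault-tree setting, such intervals on basic events propagate through the gates, yielding belief and plausibility bounds on the top event instead of a single point probability.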
Cardiovascular disease burden from ambient air pollution in Europe reassessed using novel hazard ratio functions
2019
Aims: Ambient air pollution is a major health risk, leading to respiratory and cardiovascular mortality. A recent Global Exposure Mortality Model, based on an unmatched number of cohort studies in many countries, provides new hazard ratio functions, calling for re-evaluation of the disease burden. Accordingly, we estimated excess cardiovascular mortality attributed to air pollution in Europe. Methods and results: The new hazard ratio functions have been combined with ambient air pollution exposure data to estimate the impacts in Europe and the 28 countries of the European Union (EU-28). The annual excess mortality rate from ambient air pollution in Europe is 790 000 [95% confidence i…