Search results for "interval"

Showing 10 of 1703 documents

PACo: A Novel Procrustes Application to Cophylogenetic Analysis

2013

We present the Procrustean Approach to Cophylogeny (PACo), a novel statistical tool to test for congruence between phylogenetic trees, or between phylogenetic distance matrices of associated taxa. Unlike previous tests, PACo evaluates the dependence of one phylogeny upon the other. This makes it especially appropriate for testing the classical coevolutionary model, which assumes that parasites that spend part of their life in or on their hosts track the phylogeny of their hosts. The new method does not require fully resolved phylogenies and allows for multiple host-parasite associations. PACo produces a Procrustes superimposition plot enabling a graphical assessment of the fit of the parasite phyloge…
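The core of the approach can be sketched as a least-squares Procrustes superimposition followed by a permutation test. This is a minimal illustration, assuming host and parasite coordinates have already been derived from the phylogenetic distance matrices (e.g. by principal coordinates analysis), which the actual PACo software handles; the function names here are hypothetical.

```python
import numpy as np

def procrustes_residual(X, Y):
    """Least-squares Procrustes superimposition of Y onto X.
    Returns the sum of squared residuals (lower = more congruent)."""
    # Centre both configurations at the origin.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # Scale each to unit centroid size.
    Xc /= np.linalg.norm(Xc)
    Yc /= np.linalg.norm(Yc)
    # Optimal rotation from the SVD of the cross-product matrix.
    U, _, Vt = np.linalg.svd(Yc.T @ Xc)
    R = U @ Vt
    return float(np.sum((Xc - Yc @ R) ** 2))

def paco_like_test(X, Y, n_perm=999, rng=None):
    """Permutation test: is the observed fit better than fits obtained
    after shuffling the host-parasite associations (rows of Y)?"""
    rng = np.random.default_rng(rng)
    obs = procrustes_residual(X, Y)
    perms = [procrustes_residual(X, rng.permutation(Y)) for _ in range(n_perm)]
    p = (1 + sum(m <= obs for m in perms)) / (n_perm + 1)
    return obs, p
```

A small observed residual together with a small permutation p-value indicates that the parasite configuration tracks the host configuration better than chance.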

Evolutionary Processes; Parasites; Zoology; lcsh:Medicine; Biology; Mathematical models; Animal Phylogenetics; Biostatistics; Forms of Evolution; Statistical power; Plot (graphics); Host-Parasite Interactions; Evolution, Molecular; Congruence (geometry); Statistics; Animals; Evolutionary Systematics; Computer Simulation; lcsh:Science; Phylogeny; Evolutionary Biology; Multidisciplinary; Phylogenetic tree; lcsh:R; Confidence interval; Phylogenetics; Parasitology; lcsh:Q; Jackknife resampling; Mathematics; Software; Research Article; Coevolution; Type I and type II errors; PLoS ONE
researchProduct

Basic Statistical Techniques

2012

Exploratory data analysis; Data collection; Computer science; Interval estimation; Statistics; Data analysis; Statistical inference; Sampling (statistics); Statistical and Managerial Techniques for Six Sigma Methodology
researchProduct

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
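The idea of running the sampler at a large tolerance and post-correcting can be sketched as follows. This is a toy illustration, not the authors' algorithm: a one-parameter normal-mean model with a flat prior and symmetric proposals, where the chain records each state's discrepancy so the output can later be filtered to any finer tolerance.

```python
import math
import random

def abc_mcmc(data_mean, n_iter, tol, sigma=1.0, prop_sd=0.5, seed=0):
    """ABC-MCMC for the mean of a N(theta, sigma^2) model, using
    |simulated mean - observed mean| as the discrepancy. Returns the
    chain as (theta, discrepancy) pairs."""
    rng = random.Random(seed)
    n_obs = 50
    theta, dist = 0.0, float("inf")
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, prop_sd)
        # Simulate a data set under the proposal and summarise it.
        sim_mean = prop + rng.gauss(0.0, sigma / math.sqrt(n_obs))
        d = abs(sim_mean - data_mean)
        # Flat prior, symmetric proposal: accept iff within tolerance.
        if d <= tol:
            theta, dist = prop, d
        chain.append((theta, dist))
    return chain

def post_correct(chain, finer_tol):
    """Post-correction by filtering: keep only states whose recorded
    discrepancy also meets a finer tolerance."""
    return [t for t, d in chain if d <= finer_tol]
```

Running with an inflated tolerance keeps the chain mixing; filtering the stored discrepancies then yields estimators for a range of finer tolerances without re-running the sampler.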

FOS: Computer and information sciences; Statistics and Probability; tolerance choice; General Mathematics; Markov chains; Inference; Statistics - Computation (stat.CO); approximate Bayesian computation; Mixing (mathematics); adaptive algorithms; Bayesian methods; Applied Mathematics; Probabilistic logic; Estimator; Markov chain Monte Carlo; Monte Carlo methods; importance sampling; confidence interval; Statistics, Probability and Uncertainty; Agricultural and Biological Sciences (miscellaneous); General Agricultural and Biological Sciences; Algorithm
researchProduct

A probabilistic estimation and prediction technique for dynamic continuous social science models: The evolution of the attitude of the Basque Country…

2015

In this paper, a computational technique to deal with uncertainty in dynamic continuous models in the Social Sciences is presented. Considering data from surveys, the method consists of determining the probability distribution of the survey output; this allows us to sample data and fit the model to the sampled data using a goodness-of-fit criterion based on the χ2-test. Taking the fitted parameters that were not rejected by the χ2-test, substituting them into the model and computing their outputs, 95% confidence intervals capturing the uncertainty of the survey data are built at each time instant (probabilistic estimation). Using the same set of obtained model parameters, a prediction over …
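The accept-and-band procedure can be sketched as follows. This is a minimal stand-in, assuming a toy linear model and a chi-square-style acceptance rule with a hypothetical threshold; the paper's actual model, survey sampling, and rejection criterion are more elaborate.

```python
import random

def model(a, b, ts):
    # Toy dynamic model standing in for the paper's social-science model.
    return [a + b * t for t in ts]

def chi2(obs, pred):
    # Chi-square-style discrepancy between observed and predicted outputs.
    return sum((o - p) ** 2 / max(p, 1e-9) for o, p in zip(obs, pred))

def probabilistic_estimate(obs, ts, n_draw=2000, threshold=5.99, seed=0):
    """Draw parameter sets, keep those not rejected by the goodness-of-fit
    criterion, and build pointwise 95% bands from the retained outputs."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draw):
        a, b = rng.uniform(0, 5), rng.uniform(0, 2)
        pred = model(a, b, ts)
        if chi2(obs, pred) <= threshold:
            kept.append(pred)
    if not kept:
        return [], []
    band = []
    for i in range(len(ts)):
        vals = sorted(p[i] for p in kept)
        lo = vals[int(0.025 * (len(vals) - 1))]
        hi = vals[int(0.975 * (len(vals) - 1))]
        band.append((lo, hi))
    return kept, band
```

The band at each time instant is the 2.5-97.5 percentile range of the non-rejected model outputs, i.e. the paper's probabilistic estimation step.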

FOS: Computer and information sciences; Attitude dynamics; Probabilistic prediction; Computer science; Population; Divergence-from-randomness model; Sample (statistics); Machine Learning (cs.LG); Probabilistic estimation; Social science; Probabilistic relevance model; Applied Mathematics; Probabilistic logic; Confidence interval; Computer Science - Learning; Computational Mathematics; Social dynamic models; Probability distribution; Survey data collection; Data mining; Applied Mathematics and Computation
researchProduct

Disentangling Derivatives, Uncertainty and Error in Gaussian Process Models

2020

Gaussian Processes (GPs) are a class of kernel methods that have been shown to be very useful in geoscience applications. They are widely used because they are simple and flexible and provide very accurate estimates for nonlinear problems, especially in parameter retrieval. In addition to a predictive mean function, GPs come equipped with a useful property: the predictive variance function, which provides confidence intervals for the predictions. The GP formulation usually assumes that there is no input noise in the training and testing points, only in the observations. However, this is often not the case in Earth observation problems, where an accurate assessment of the instrument error is usually a…
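The predictive mean and variance mentioned above come directly from standard GP regression. A minimal sketch with an RBF kernel and fixed hyperparameters (the length-scale and noise values here are illustrative, not from the paper):

```python
import numpy as np

def rbf(A, B, length=1.0, var=1.0):
    """Squared-exponential (RBF) kernel between 1-D input arrays."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

def gp_predict(x_tr, y_tr, x_te, noise=1e-2):
    """GP regression: predictive mean plus a 95% confidence interval
    built from the predictive variance function."""
    K = rbf(x_tr, x_tr) + noise * np.eye(len(x_tr))
    Ks = rbf(x_te, x_tr)
    Kss = rbf(x_te, x_te)
    mean = Ks @ np.linalg.solve(K, y_tr)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    # 95% interval from the Gaussian predictive distribution.
    return mean, mean - 1.96 * std, mean + 1.96 * std
```

Far from the training inputs the predictive variance reverts to the prior variance, so the confidence interval widens; this is the property the paper builds on.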

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning (stat.ML); Computer science; Meteorology & atmospheric sciences; Geological & geomatics engineering; Earth and related environmental sciences; Gaussian process; Variance function; Propagation of uncertainty; Variance; Function (mathematics); Confidence interval; Nonlinear system; Noise; Kernel method; Kernel (statistics); Climate action; Algorithm; IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium
researchProduct

Warped Gaussian Processes in Remote Sensing Parameter Estimation and Causal Inference

2018

This letter introduces warped Gaussian process (WGP) regression in remote sensing applications. WGP models output observations as a parametric nonlinear transformation of a GP. The parameters of such a prior model are then learned via standard maximum likelihood. We show the good performance of the proposed model for the estimation of oceanic chlorophyll content from multispectral data, vegetation parameters (chlorophyll, leaf area index, and fractional vegetation cover) from hyperspectral data, and in the detection of the causal direction in a collection of 28 bivariate geoscience and remote sensing causal problems. The model consistently performs better than the standard GP and the more a…
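The warping idea can be illustrated with a fixed log warp: regress a GP on transformed targets and map predictions back through the inverse warp. This is only a sketch; the letter learns a parametric warp by maximum likelihood rather than fixing it, and the kernel settings here are assumptions.

```python
import numpy as np

def rbf(a, b, length=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def warped_gp_predict(x_tr, y_tr, x_te, noise=1e-2):
    """GP regression on warped targets z = log(y); predictions are mapped
    back through the inverse warp, giving a log-normal predictive law."""
    z = np.log(y_tr)                                  # forward warp g(y)
    K = rbf(x_tr, x_tr) + noise * np.eye(len(x_tr))
    Ks = rbf(x_te, x_tr)
    mu = Ks @ np.linalg.solve(K, z)
    V = np.linalg.solve(K, Ks.T)
    var = np.clip(1.0 - np.sum(Ks * V.T, axis=1), 0.0, None) + noise
    # Median and mean of the induced log-normal predictive distribution.
    return np.exp(mu), np.exp(mu + var / 2.0)
```

Because the inverse warp is nonlinear, the predictive median and mean in the original output space differ, which is one practical reason warped GPs handle skewed geophysical variables better than a standard GP.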

FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Computer Science - Computer Vision and Pattern Recognition (cs.CV); Heteroscedasticity; Remote sensing application; Computer science; Maximum likelihood; Bivariate analysis; Data modeling; Electrical and Electronic Engineering; Gaussian process; Remote sensing; Parametric statistics; Estimation theory; Hyperspectral imaging; Geotechnical Engineering and Engineering Geology; Confidence interval; Causal inference; IEEE Geoscience and Remote Sensing Letters
researchProduct

Can visualization alleviate dichotomous thinking? Effects of visual representations on the cliff effect

2021

Common reporting styles for statistical results in scientific articles, such as p-values and confidence intervals (CIs), have been reported to be prone to dichotomous interpretation, especially with respect to the null hypothesis significance testing framework. For example, when the p-value is small enough, or when the CIs of the mean effects of a studied drug and a placebo do not overlap, scientists tend to claim significant differences while often disregarding the magnitudes and absolute differences of the effect sizes. This type of reasoning has been shown to be potentially harmful to science. Techniques relying on the visual estimation of the strength of evidence have been recom…
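A concrete instance of the dichotomy pitfall the article studies: two group CIs can overlap even when the CI of the difference excludes zero, so "overlapping CIs" is not a valid non-significance rule. A minimal numerical sketch (illustrative data, normal-approximation intervals):

```python
import math

def mean_ci(xs, z=1.96):
    """Sample mean, normal-approximation 95% CI, and standard error."""
    n = len(xs)
    m = sum(xs) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    se = sd / math.sqrt(n)
    return m, (m - z * se, m + z * se), se

# Two groups whose individual 95% CIs overlap...
a = [float(v) for v in range(10)]          # mean 4.5
b = [float(v) + 3.0 for v in range(10)]    # mean 7.5
ma, ci_a, se_a = mean_ci(a)
mb, ci_b, se_b = mean_ci(b)
overlap = ci_a[1] > ci_b[0]

# ...while the CI of the *difference* still excludes zero.
se_d = math.sqrt(se_a**2 + se_b**2)
diff_ci = (mb - ma - 1.96 * se_d, mb - ma + 1.96 * se_d)
```

Here `overlap` is True yet `diff_ci` lies entirely above zero: reading only the two separate intervals invites exactly the dichotomous judgment error discussed above.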

FOS: Computer and information sciences; Visualization; Bayesian inference; Statistical methods; Computer Science - Human-Computer Interaction (cs.HC); Interpretation; Confidence levels; Cliff effect; Data visualization; Hypothesis testing; Statistical inference; Confidence intervals; Statistical hypothesis testing; Bayesian methods; Statistics - Other Statistics (stat.OT); Multilevel model; Software engineering; Statistical graphics; Computer Graphics and Computer-Aided Design; Signal Processing; Computer Vision and Pattern Recognition; Psychology; Null hypothesis; Value (mathematics); Software; Cognitive psychology
researchProduct

Improved FMECA for effective risk management decision making by failure modes classification under uncertainty

2022

Failure Mode, Effects, and Criticality Analysis (FMECA) is a proactive reliability and risk management technique extensively used in practice to ensure high system performance by prioritising failure modes. Owing to the limitations of traditional FMECA, multi-criteria decision-making methods have been employed over the past two decades to enhance its effectiveness. To account for the vagueness and uncertainty of the FMECA evaluation process, an interval-based extension of the Elimination and Choice Translating Reality (ELECTRE) TRI method is proposed in the present paper for the classification of failure modes into risk categories. Therefore, ratings of failure modes against risk parameters are…
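The sorting step can be sketched in a greatly simplified form: failure modes carry interval ratings on the risk parameters, and a pessimistic assignment walks the boundary profiles from the most demanding category downwards. This is an illustrative skeleton only; the paper's interval-valued ELECTRE TRI also uses discordance, thresholds, and fuller interval arithmetic.

```python
def concordance(ratings, profile):
    """Fraction of criteria on which the interval rating supports the
    profile, judged pessimistically from the interval's lower bound."""
    ok = sum(1 for (lo, hi), p in zip(ratings, profile) if lo >= p)
    return ok / len(ratings)

def assign_category(ratings, profiles, lam=0.6):
    """Pessimistic ELECTRE TRI-style assignment: profiles are ordered from
    the most to the least demanding boundary; return the first category
    whose lower profile is outranked at cutting level lam."""
    for cat, profile in enumerate(profiles):
        if concordance(ratings, profile) >= lam:
            return cat
    return len(profiles)
```

For example, with boundary profiles `[[7, 7, 7], [4, 4, 4]]`, a mode rated `[(8, 9), (7, 8), (6, 7)]` outranks the top profile on a majority of criteria and lands in the highest-risk category.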

Failure modes classification; FMECA; Interval-valued ELECTRE TRI; Propulsion system; Settore ING-IND/17 - Impianti Industriali Meccanici; General Engineering; General Materials Science; Engineering Failure Analysis
researchProduct

Epistemic uncertainty in fault tree analysis approached by the evidence theory

2012

Abstract Process plants may be subject to dangerous events. Different methodologies are nowadays employed to identify failure events that can lead to severe accidents and to assess their probability of occurrence. Since reliability data for rare events are generally poor, leading to partial or incomplete knowledge of the process, the classical probabilistic approach cannot be applied successfully. Such uncertainty, called epistemic uncertainty, can be treated by means of different methodologies, alternative to the probabilistic one. In this work, Evidence Theory, or Dempster–Shafer theory (DST), is proposed to deal with this kind of uncertainty. In particular, the classical Fau…
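The key DST quantities behind this treatment are belief and plausibility, which bracket an event's unknown probability in an interval [Bel, Pl]. A minimal sketch with an illustrative mass assignment over a two-element frame (the numbers are made up for the example):

```python
def belief(masses, event):
    """Bel(A): total mass of focal sets entirely contained in A."""
    return sum(m for s, m in masses.items() if s <= event)

def plausibility(masses, event):
    """Pl(A): total mass of focal sets that intersect A."""
    return sum(m for s, m in masses.items() if s & event)

# Focal sets over the frame {'fail', 'work'}; the mass on the whole frame
# encodes epistemic ignorance that a single probability cannot express.
m = {frozenset({'fail'}): 0.2,
     frozenset({'work'}): 0.5,
     frozenset({'fail', 'work'}): 0.3}
fail = frozenset({'fail'})
# The interval [Bel(fail), Pl(fail)] = [0.2, 0.5] bounds the failure
# probability given the incomplete evidence.
```

In a fault tree setting, such intervals on the basic events propagate up to belief/plausibility bounds on the top event instead of a single point probability.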

Fault tree analysis; Epistemic uncertainty; General Chemical Engineering; Probabilistic logic; Energy Engineering and Power Technology; Interval (mathematics); Management Science and Operations Research; Industrial and Manufacturing Engineering; FTA; Risk analysis; Evidence theory; Control and Systems Engineering; Settore ING-IND/17 - Impianti Industriali Meccanici; Rare events; Sensitivity analysis; Data mining; Uncertainty quantification; Safety, Risk, Reliability and Quality; Uncertainty analysis; Food Science; Event (probability theory); Mathematics
researchProduct

Cardiovascular disease burden from ambient air pollution in Europe reassessed using novel hazard ratio functions

2019

Abstract
Aims: Ambient air pollution is a major health risk, leading to respiratory and cardiovascular mortality. A recent Global Exposure Mortality Model, based on an unmatched number of cohort studies in many countries, provides new hazard ratio functions, calling for re-evaluation of the disease burden. Accordingly, we estimated excess cardiovascular mortality attributed to air pollution in Europe.
Methods and results: The new hazard ratio functions have been combined with ambient air pollution exposure data to estimate the impacts in Europe and the 28 countries of the European Union (EU-28). The annual excess mortality rate from ambient air pollution in Europe is 790 000 [95% confidence i…
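The generic arithmetic linking a hazard ratio to excess deaths is the population attributable fraction. This sketch shows only that basic step for a uniformly exposed population; the study itself applies exposure-dependent GEMM hazard ratio functions per cause and country, which this does not reproduce.

```python
def attributable_deaths(hazard_ratio, observed_deaths):
    """Excess deaths via the population attributable fraction,
    PAF = (HR - 1) / HR, assuming the whole population is exposed."""
    paf = (hazard_ratio - 1.0) / hazard_ratio
    return paf * observed_deaths
```

For example, a hazard ratio of 1.25 gives PAF = 0.2, so 20% of the observed cause-specific deaths would be attributed to the exposure; propagating the confidence interval of the hazard ratio through the same formula yields the interval on the excess-mortality estimate.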

Fine particulate matter; Prevention and Epidemiology; Air pollution; Fast Track Clinical Research; Environmental health; Per capita; Medicine; Humans; European Union; Disease burden; Excess mortality rate; Mortality rate; Hazard ratio; Loss of life expectancy; Uncertainty; Environmental Exposure; Cardiovascular risk; Confidence interval; Europe; Editor's Choice; Cardiovascular Diseases; Life expectancy; Cardiology and Cardiovascular Medicine; Health promotion intervention; European Heart Journal
researchProduct