Search results for "ARDS"

Showing 10 of 1,705 documents

Opportunities and challenges of combined effect measures based on prioritized outcomes

2013

Many authors have proposed different approaches in the literature to combine multiple endpoints into a univariate outcome measure. In the case of binary or time-to-event variables, composite endpoints, which combine several event types within a single event or time-to-first-event analysis, are often used to assess the overall treatment effect. A main drawback of this approach is that the composite effect can be difficult to interpret, as a negative effect in one component can be masked by a positive effect in another. Recently, some authors have proposed more general approaches based on a priority ranking of outcomes, which moreover allow outcome variables of different scale levels to be combined. …
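One widely used priority-ranking approach of the kind this abstract describes is the win ratio, which compares every treatment patient with every control patient, working down the outcome hierarchy. A minimal sketch, assuming a hypothetical two-level hierarchy (death before hospitalization) with deliberately simplified censoring handling; all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Patient:
    died: bool
    time_to_death: float     # follow-up time if the patient did not die
    hospitalized: bool
    time_to_hosp: float      # follow-up time if never hospitalized

def compare(t: Patient, c: Patient) -> int:
    """Compare one treatment patient against one control patient,
    walking down the priority list: death first, then hospitalization.
    Returns +1 (treatment 'wins'), -1 ('loses'), or 0 (tie).
    Censoring handling is deliberately simplified for illustration."""
    # Priority 1: death (the patient who dies, or dies earlier, loses)
    if t.died and c.died:
        if t.time_to_death != c.time_to_death:
            return 1 if t.time_to_death > c.time_to_death else -1
        # equal death times: fall through to the next priority level
    elif t.died:
        return -1
    elif c.died:
        return 1
    # Priority 2: hospitalization
    if t.hospitalized and c.hospitalized:
        if t.time_to_hosp != c.time_to_hosp:
            return 1 if t.time_to_hosp > c.time_to_hosp else -1
    elif t.hospitalized:
        return -1
    elif c.hospitalized:
        return 1
    return 0

def win_ratio(treatment: list, control: list) -> float:
    """All pairwise comparisons; assumes at least one loss occurs."""
    wins = losses = 0
    for t in treatment:
        for c in control:
            r = compare(t, c)
            wins += (r == 1)
            losses += (r == -1)
    return wins / losses
```

A value above 1 favors treatment; ties contribute to neither count, which is one reason interpretation of such measures still requires care.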

Keywords: Statistics and Probability, Clinical Trials as Topic, Epidemiology, Univariate, Outcome (game theory), Treatment Outcome, Ranking, Scale (social sciences), Component (UML), Outcome Assessment Health Care, Multiple comparisons problem, Humans, Computer Simulation, Data mining, Proportional Hazards Models, Mathematics, Statistical hypothesis testing, Event (probability theory), Statistics in Medicine

Sample size planning for survival prediction with focus on high-dimensional data

2011

Sample size planning should reflect the primary objective of a trial. If the primary objective is prediction, the sample size determination should focus on prediction accuracy instead of power. We present formulas for the determination of training set sample size for survival prediction. Sample size is chosen to control the difference between optimal and expected prediction error. Prediction is carried out by Cox proportional hazards models. The general approach considers censoring as well as low-dimensional and high-dimensional explanatory variables. For dimension reduction in the high-dimensional setting, a variable selection step is inserted. If not all informative variables are included…

Keywords: Statistics and Probability, Clustering high-dimensional data, Clinical Trials as Topic, Lung Neoplasms, Models Statistical, Kaplan-Meier Estimate, Epidemiology, Proportional hazards model, Dimensionality reduction, Gene Expression, Feature selection, Biostatistics, Prognosis, Brier score, Sample size determination, Carcinoma Non-Small-Cell Lung, Sample Size, Censoring (clinical trials), Statistics, Humans, Proportional Hazards Models, Mathematics, Statistics in Medicine

Bayesian regularization for flexible baseline hazard functions in Cox survival models

2019

Fully Bayesian methods for Cox models specify a model for the baseline hazard function. Parametric approaches generally provide monotone estimates. Semi-parametric choices allow for more flexible patterns, but they can suffer from overfitting and instability. Regularization through prior distributions with correlated structures usually gives reasonable answers in these situations. We discuss Bayesian regularization for Cox survival models defined via flexible baseline hazards specified by a mixture of piecewise constant functions and by a cubic B-spline function. For those "semi-parametric" proposals, different prior scenarios ranging from prior independence to particular c…
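A minimal sketch of the piecewise-constant baseline hazard ingredient mentioned above (the Bayesian regularization itself and the B-spline alternative are omitted); `make_piecewise_hazard`, the breakpoints, and the rates are hypothetical names for illustration:

```python
import math
from bisect import bisect_right

def make_piecewise_hazard(cuts, heights):
    """Baseline hazard that is constant on [cuts[k], cuts[k+1]).
    cuts: increasing breakpoints starting at 0; heights: one rate per interval.
    Returns (h0, H0): the hazard and its cumulative integral."""
    def h0(t):
        # locate the interval containing t (last interval is open-ended)
        return heights[min(bisect_right(cuts, t) - 1, len(heights) - 1)]
    def H0(t):
        # cumulative hazard: rate * time spent in each interval up to t
        total = 0.0
        for k, rate in enumerate(heights):
            lo = cuts[k]
            hi = cuts[k + 1] if k + 1 < len(cuts) else float("inf")
            if t <= lo:
                break
            total += rate * (min(t, hi) - lo)
        return total
    return h0, H0

def survival(t, x_beta, H0):
    """Cox-model survival function S(t | x) = exp(-H0(t) * exp(x'beta))."""
    return math.exp(-H0(t) * math.exp(x_beta))
```

In the fully Bayesian treatments the abstract discusses, the `heights` would carry a prior (independent or correlated across neighboring intervals), which is exactly where the regularization enters.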

Keywords: Statistics and Probability, Computer science, Proportional hazards model, Model selection, Bayesian probability, Posterior probability, Markov chain Monte Carlo, Bayes Theorem, General Medicine, Overfitting, Survival Analysis, Markov Chains, Statistics, Covariate, Piecewise, Statistics Probability and Uncertainty, Monte Carlo Method, Proportional Hazards Models, Biometrical Journal (Biometrische Zeitschrift)

Bayesian joint ordinal and survival modeling for breast cancer risk assessment

2016

We propose a joint model to analyze the structure and intensity of the association between longitudinal measurements of an ordinal marker and time to a relevant event. The longitudinal process is defined in terms of a proportional-odds cumulative logit model. Time-to-event is modeled through a left-truncated proportional-hazards model, which incorporates information of the longitudinal marker as well as baseline covariates. Both longitudinal and survival processes are connected by means of a common vector of random effects. General inferences are discussed under the Bayesian approach and include the posterior distribution of the probabilities associated with each longitudinal category and the …

Keywords: Statistics and Probability, Epidemiology, Computer science, Breast imaging, Left-truncated proportional-hazards model, Bayesian probability, Posterior probability, Population, Breast Neoplasms, Risk Assessment, Bayes' theorem, Breast cancer, Statistics, Covariate, Econometrics, Humans, Breast, BI-RADS scale, Breast Density, Latent process, Random effects model, Proportional-odds cumulative logit model, Female, Statistics in Medicine

Generating survival times to simulate Cox proportional hazards models, by Ralf Bender, Thomas Augustin and Maria Blettner, Statistics in Medicine 2005; …

2006

Keywords: Statistics and Probability, Epidemiology, Proportional hazards model, Computer science, Statistics, Econometrics, MEDLINE, Medical statistics, Survival analysis, Statistics in Medicine

Large deviations results for subexponential tails, with applications to insurance risk

1996

Consider a random walk or Lévy process {S_t} and let τ(u) = inf{t ≥ 0 : S_t > u}, P^(u)(·) = P(· | τ(u) < ∞). Assuming that the upwards jumps are heavy-tailed, say subexponential (e.g. Pareto, Weibull or lognormal), the asymptotic form of the P^(u)-distribution of the process {S_t} up to time τ(u) is described as u → ∞. Essentially, the results confirm the folklore that level crossing occurs as the result of one big jump. Particularly sharp conclusions are obtained for downwards skip-free processes, like the classical compound Poisson insurance risk process, where the formulation is in terms of total variation convergence. The ideas of the proof involve excursions and path decompositions for Mark…
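The "one big jump" folklore the abstract confirms is easy to probe by simulation. A minimal sketch, assuming Pareto-distributed upward jumps against a deterministic per-step drift rather than the paper's general Lévy setting; the function tracks the largest single jump so one can inspect how much of a level crossing it accounts for:

```python
import random

def simulate_path(u, drift=1.0, alpha=1.5, max_steps=100_000, rng=random):
    """Random walk S_t accumulating (Pareto(alpha) jump - drift) per step.
    Returns (crossed, largest_jump): whether S_t exceeded level u within
    max_steps, and the largest single upward jump seen along the way.
    With heavy-tailed jumps and negative net drift, crossings that do occur
    tend to be driven by one jump comparable to u itself."""
    s, largest = 0.0, 0.0
    for _ in range(max_steps):
        jump = rng.paretovariate(alpha)   # heavy-tailed jump, support [1, inf)
        largest = max(largest, jump)
        s += jump - drift
        if s > u:
            return True, largest
    return False, largest
```

With `drift` larger than the mean jump size alpha/(alpha - 1), crossings of a high level u become rare, which is the insurance-ruin regime the paper studies.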

Keywords: Statistics and Probability, Exponential distribution, Regular variation, Ruin probability, Excursion, Random walk, Downwards skip-free process, Lévy process, Conditioned limit theorem, Total variation convergence, Combinatorics, Insurance risk, Path decomposition, Integrated tail, Probability theory, Modelling and Simulation, Extreme value theory, Maximum domain of attraction, Mathematics, Stochastic process, Applied Mathematics, Subexponential distribution, Log-normal distribution, Large deviations theory, 60K10, 60F10, Stochastic Processes and their Applications

A weighted combined effect measure for the analysis of a composite time-to-first-event endpoint with components of different clinical relevance

2018

Composite endpoints combine several events within a single variable, which increases the number of expected events and is thereby meant to increase the power. However, the interpretation of results can be difficult as the observed effect for the composite does not necessarily reflect the effects for the components, which may be of different magnitude or even point in adverse directions. Moreover, in clinical applications, the event types are often of different clinical relevance, which also complicates the interpretation of the composite effect. The common effect measure for composite endpoints is the all-cause hazard ratio, which gives equal weight to all events irrespective of their type …
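Under the simplifying assumption of constant cause-specific hazards, a relevance-weighted all-cause effect of the kind described can be caricatured as a ratio of weighted event rates. This toy illustration is not the paper's estimator, and all names and weights are hypothetical:

```python
def weighted_event_rate(first_events, total_followup, weights):
    """first_events: dict mapping event type -> count of first events of that
    type in the arm. Under a constant-hazard assumption the cause-specific
    rate is n_k / total follow-up time; relevance weights w_k downweight
    the clinically less severe components."""
    return sum(weights[k] * n / total_followup for k, n in first_events.items())

def weighted_hazard_ratio(ev_trt, time_trt, ev_ctl, time_ctl, weights):
    """Ratio of relevance-weighted rates (treatment over control).
    With all weights equal to 1 this reduces to a crude all-cause
    rate ratio, which weights every event type equally."""
    return (weighted_event_rate(ev_trt, time_trt, weights)
            / weighted_event_rate(ev_ctl, time_ctl, weights))
```

Choosing unequal weights (e.g. death weighted above hospitalization) is precisely what lets the combined measure reflect the differing clinical relevance of the components.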

Keywords: Statistics and Probability, Hazard (logic), Epidemiology, Endpoint Determination, Measure (mathematics), Win ratio, Resampling, Statistics, Time-to-event, Humans, Computer Simulation, Relevance weighting, Parametric statistics, Event (probability theory), Mathematics, Proportional Hazards Models, Clinical trials, Hazard ratio, Composite endpoint, Weighting, Prioritized outcomes, Trials, Data Interpretation Statistical, Multistate models, Inference, Null hypothesis, Monte Carlo Method, Statistics in Medicine

Generating survival times to simulate Cox proportional hazards models

2005

Simulation studies are an important statistical tool to investigate the performance, properties and adequacy of statistical models in pre-specified situations. One of the most important statistical models in medical research is the proportional hazards model of Cox. In this paper, techniques to generate survival times for simulation studies regarding Cox proportional hazards models are presented. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived, which is useful in simulation studies. It is shown how the exponential, the Weibull and the Gompertz distribution can be applied to generate appropriate survival times f…
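For the exponential and Weibull baselines mentioned, the inversion relation T = H0^(-1)(-log(U) · exp(-β'x)) has closed form. A minimal sketch, assuming the common parameterizations h0(t) = λ for the exponential and h0(t) = λνt^(ν-1) for the Weibull baseline:

```python
import math
import random

def exponential_cox_time(lam, x_beta, rng=random):
    """Survival time under a Cox model with exponential baseline hazard
    h0(t) = lam: T = -log(U) / (lam * exp(x'beta))."""
    u = 1.0 - rng.random()   # uniform on (0, 1], avoids log(0)
    return -math.log(u) / (lam * math.exp(x_beta))

def weibull_cox_time(lam, nu, x_beta, rng=random):
    """Survival time under a Cox model with Weibull baseline hazard
    h0(t) = lam * nu * t^(nu-1):
    T = (-log(U) / (lam * exp(x'beta)))^(1/nu)."""
    u = 1.0 - rng.random()
    return (-math.log(u) / (lam * math.exp(x_beta))) ** (1.0 / nu)
```

Setting nu = 1 recovers the exponential case, which is a convenient sanity check; the Gompertz baseline also inverts in closed form but is omitted here.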

Keywords: Statistics and Probability, Hazard (logic), Exponential distribution, Epidemiology, Computer science, Proportional hazards model, Statistics, Econometrics, Statistical model, Survival analysis, Gompertz distribution, Exponential function, Weibull distribution, Statistics in Medicine

Using Statistical and Computer Models to Quantify Volcanic Hazards

2009

Risk assessment of rare natural hazards, such as large volcanic block-and-ash or pyroclastic flows, is addressed. Assessment is approached through a combination of computer modeling, statistical modeling, and extreme-event probability computation. A computer model of the natural hazard is used to provide the needed extrapolation to unseen parts of the hazard space. Statistical modeling of the available data is needed to determine the initializing distribution for exercising the computer model. In dealing with rare events, direct simulations involving the computer model are prohibitively expensive. The solution instead requires a combination of adaptive design of computer model approximation…
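The workflow the abstract outlines (a statistical model of the inputs driving a cheap stand-in for the expensive computer model, followed by a probability computation) can be caricatured in a few lines. The surrogate and the Pareto frequency-volume law here are purely illustrative, and the adaptive-design machinery that makes genuinely rare events tractable is omitted:

```python
import random

def runout_surrogate(volume):
    """Hypothetical cheap surrogate standing in for the expensive flow
    simulator: runout distance grows like volume^(1/3) (illustrative only)."""
    return 2.0 * volume ** (1.0 / 3.0)

def hazard_probability(distance_km, n=100_000, alpha=1.2, rng=None):
    """Estimate P(a randomly occurring flow reaches distance_km), with flow
    volumes drawn from a heavy-tailed (Pareto) frequency-volume distribution
    playing the role of the initializing distribution."""
    rng = rng or random.Random(0)
    hits = sum(runout_surrogate(rng.paretovariate(alpha)) > distance_km
               for _ in range(n))
    return hits / n
```

In a real application, each surrogate call would be replaced by an emulator fitted to a small, carefully designed ensemble of full simulator runs, which is the point of the approximation methods the abstract goes on to describe.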

Keywords: Statistics and Probability, Hazard (logic), Risk analysis, Volcanic hazards, Computer science, Applied Mathematics, Computation, Initialization, Statistical model, Modeling and Simulation, Natural hazard, Rare events, Data mining, Technometrics

Probabilistic quantification of hazards: a methodology using small ensembles of physics-based simulations and statistical surrogates

2015

This paper presents a novel approach to assessing the hazard threat to a locale due to a large volcanic avalanche. The methodology combines: (i) mathematical modeling of volcanic mass flows; (ii) field data of avalanche frequency, volume, and runout; (iii) large-scale numerical simulations of flow events; (iv) use of statistical methods to minimize computational costs, and to capture unlikely events; (v) calculation of the probability of a catastrophic flow event over the next T years at a location of interest; and (vi) innovative computational methodology to implement these methods. This unified presentation collects elements that have been separately developed, and incorporates new contri…

Keywords: Statistics and Probability, Hazard (logic), Volcanic hazards, Control and Optimization, Process (engineering), Probabilistic logic, Hazard analysis, Flow (mathematics), Volcano, Modeling and Simulation, Econometrics, Discrete Mathematics and Combinatorics, Environmental science, Data mining, Event (probability theory), International Journal for Uncertainty Quantification