Search results for "Hazard"

Showing 10 of 1,517 documents

Bayesian joint ordinal and survival modeling for breast cancer risk assessment

2016

We propose a joint model to analyze the structure and intensity of the association between longitudinal measurements of an ordinal marker and the time to a relevant event. The longitudinal process is defined in terms of a proportional-odds cumulative logit model. Time-to-event is modeled through a left-truncated proportional-hazards model, which incorporates information from the longitudinal marker as well as baseline covariates. Both the longitudinal and survival processes are connected by means of a common vector of random effects. General inferences are discussed under the Bayesian approach and include the posterior distribution of the probabilities associated with each longitudinal category and the …
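The longitudinal submodel named in this abstract is a proportional-odds cumulative logit model. As a minimal sketch of how it maps a linear predictor to ordinal category probabilities (the cutpoints and predictor value below are hypothetical, not the paper's fitted values):

```python
import math

def cumulative_logit_probs(thetas, eta):
    """Category probabilities under a proportional-odds cumulative logit model.

    logit P(Y <= k) = theta_k - eta, with ordered cutpoints theta_1 < ... < theta_{K-1}.
    Returns the K probabilities P(Y = k).
    """
    cum = [1.0 / (1.0 + math.exp(-(t - eta))) for t in thetas] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Four ordinal categories (e.g. an ordinal imaging scale), hypothetical cutpoints
probs = cumulative_logit_probs([-1.0, 0.5, 2.0], eta=0.3)
```

Raising `eta` shifts mass toward higher categories while keeping the same cutpoints, which is what "proportional odds" buys.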

Subjects: Statistics and Probability; Epidemiology; Breast imaging; Left-truncated proportional-hazards model; Proportional-odds cumulative logit model; Bayesian probability; Bayes' theorem; Posterior probability; Random effects model; Latent process; Covariate; Breast cancer; Breast density; BI-RADS scale; Risk assessment; Mathematics and statistics :: Operations research [UPC subject areas]. Journal: Statistics in Medicine.

Generating survival times to simulate Cox proportional hazards models by Ralf Bender, Thomas Augustin and Maria Blettner, Statistics in Medicine 2005; …

2006

Subjects: Statistics and Probability; Epidemiology; Proportional hazards model; Medical statistics; Survival analysis; Econometrics; MEDLINE. Journal: Statistics in Medicine.

Absolute Risk and Loss-of-Lifetime Estimates for Quantitative Risk Assessment

1998

Quantitative risk assessments in public health settings aim to describe the hazard of a specific exposure in a given population on the basis of epidemiological and/or experimental results. Two different risk quantities, the absolute lifetime excess risk and the loss-of-lifetime, which differ in their definition of hazard, are discussed and compared. For both measures, estimation procedures are derived, and the relationship between the various estimates currently in use is investigated. It is shown that the two most common estimators can be written as special cases of a more general concept. This leads to conclusions about the assumptions on which different estimation procedures …
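The two quantities compared in this abstract can be illustrated with their textbook definitions: absolute risk over a horizon is 1 − exp(−∫h), and loss-of-lifetime can be taken as the difference in restricted mean survival. A hedged numerical sketch under assumed constant hazards (the rates below are hypothetical, and the paper's actual estimators are more general):

```python
import math

def lifetime_risk(hazard, horizon, steps=10_000):
    """Absolute risk over [0, horizon]: 1 - exp(-integral of the hazard)."""
    dt = horizon / steps
    cum = sum(hazard(i * dt) * dt for i in range(steps))  # left Riemann sum
    return 1.0 - math.exp(-cum)

def restricted_mean_survival(hazard, horizon, steps=10_000):
    """Mean survival time restricted to [0, horizon]: integral of S(t)."""
    dt = horizon / steps
    cum, rms = 0.0, 0.0
    for i in range(steps):
        cum += hazard(i * dt) * dt
        rms += math.exp(-cum) * dt
    return rms

# Hypothetical constant hazards: baseline 0.01/year, exposed 0.015/year, 80-year horizon
base, exposed = (lambda t: 0.01), (lambda t: 0.015)
excess_risk = lifetime_risk(exposed, 80) - lifetime_risk(base, 80)
loss_of_lifetime = restricted_mean_survival(base, 80) - restricted_mean_survival(exposed, 80)
```

The two measures need not rank exposures the same way, which is one reason the abstract treats them separately.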

Subjects: Statistics and Probability; Estimation; Estimator; Variance; Population; Absolute risk reduction; Residential radon; Hazard; Risk assessment; General Medicine. Journal: Biometrical Journal.

A weighted combined effect measure for the analysis of a composite time-to-first-event endpoint with components of different clinical relevance

2018

Composite endpoints combine several events within a single variable, which increases the number of expected events and is thereby meant to increase the power. However, the interpretation of results can be difficult as the observed effect for the composite does not necessarily reflect the effects for the components, which may be of different magnitude or even point in adverse directions. Moreover, in clinical applications, the event types are often of different clinical relevance, which also complicates the interpretation of the composite effect. The common effect measure for composite endpoints is the all-cause hazard ratio, which gives equal weight to all events irrespective of their type …
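The paper's weighted effect measure is not reproduced here, but the core contrast it motivates can be sketched with simple event-rate arithmetic (which equals the hazard ratio only under an assumed constant hazard). All counts and weights below are hypothetical:

```python
# Events per component in treatment and control arms (hypothetical counts)
events_treat = {"death": 10, "hosp": 60}
events_ctrl = {"death": 20, "hosp": 50}
pt_treat, pt_ctrl = 1000.0, 1000.0  # person-years at risk per arm

def rate_ratio(weights):
    """Ratio of weighted event rates; weights of 1 give the all-cause version."""
    num = sum(weights[k] * n for k, n in events_treat.items()) / pt_treat
    den = sum(weights[k] * n for k, n in events_ctrl.items()) / pt_ctrl
    return num / den

hr_all_cause = rate_ratio({"death": 1.0, "hosp": 1.0})  # equal weights mask the mortality benefit
hr_weighted = rate_ratio({"death": 1.0, "hosp": 0.2})   # hypothetical relevance weights
```

With equal weights the composite shows no effect (ratio 1.0), while down-weighting the less relevant component reveals the benefit on mortality, which is the interpretability problem the abstract describes.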

Subjects: Statistics and Probability; Epidemiology; Composite endpoint; Time-to-event; Hazard ratio; Relevance weighting; Clinical trials; Win ratio; Prioritized outcomes; Multistate models; Proportional hazards models; Resampling; Monte Carlo method; Null hypothesis. Journal: Statistics in Medicine.

Generating survival times to simulate Cox proportional hazards models

2005

Simulation studies present an important statistical tool to investigate the performance, properties and adequacy of statistical models in pre-specified situations. One of the most important statistical models in medical research is the proportional hazards model of Cox. In this paper, techniques to generate survival times for simulation studies regarding Cox proportional hazards models are presented. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived, which is useful in simulation studies. It is shown how the exponential, the Weibull and the Gompertz distribution can be applied to generate appropriate survival times f…
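The general relation described in this abstract is the inverse-transform formula T = H0⁻¹(−log(U) · exp(−β′x)), where H0 is the cumulative baseline hazard. A minimal sketch for the three baseline distributions the abstract mentions (parameter values are hypothetical):

```python
import math, random

def sim_cox(u, lin_pred, baseline="exponential", lam=0.1, nu=1.5, alpha=0.05):
    """Survival time via T = H0^{-1}(-log(U) * exp(-beta'x)), U ~ Uniform(0,1)."""
    v = -math.log(u) * math.exp(-lin_pred)
    if baseline == "exponential":            # h0(t) = lam
        return v / lam
    if baseline == "weibull":                # H0(t) = lam * t**nu
        return (v / lam) ** (1.0 / nu)
    if baseline == "gompertz":               # h0(t) = lam * exp(alpha*t)
        return math.log(1.0 + alpha * v / lam) / alpha
    raise ValueError(baseline)

random.seed(1)
times = [sim_cox(random.random(), lin_pred=0.5, baseline="weibull") for _ in range(5)]
```

Because each baseline has a closed-form H0⁻¹, the generated times follow a Cox model with exactly the intended hazard ratio, which is what makes the approach useful for simulation studies.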

Subjects: Statistics and Probability; Epidemiology; Proportional hazards model; Survival analysis; Statistical model; Exponential distribution; Weibull distribution; Gompertz distribution. Journal: Statistics in Medicine.

Using Statistical and Computer Models to Quantify Volcanic Hazards

2009

Risk assessment of rare natural hazards, such as large volcanic block and ash or pyroclastic flows, is addressed. Assessment is approached through a combination of computer modeling, statistical modeling, and extreme-event probability computation. A computer model of the natural hazard is used to provide the needed extrapolation to unseen parts of the hazard space. Statistical modeling of the available data is needed to determine the initializing distribution for exercising the computer model. In dealing with rare events, direct simulations involving the computer model are prohibitively expensive. The solution instead requires a combination of adaptive design of computer model approximation…
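The pipeline this abstract outlines, fit an initializing distribution to field data, replace the expensive computer model with a cheap approximation, and compute an exceedance probability, can be sketched as follows. Everything here is hypothetical (the surrogate, the fitted distribution, and the threshold stand in for the paper's actual components):

```python
import math, random

random.seed(0)

# Hypothetical surrogate of a flow simulator: predicted runout (km) vs. volume (m^3)
def runout_surrogate(volume):
    return 0.8 * math.log10(volume) - 4.0

# Initializing distribution "fitted" to field data: log10-volume ~ Normal(7.0, 0.8)
def sample_log10_volume():
    return random.gauss(7.0, 0.8)

# Monte Carlo estimate of P(runout reaches a site 3 km away)
n = 100_000
hits = sum(runout_surrogate(10 ** sample_log10_volume()) >= 3.0 for _ in range(n))
p_hit = hits / n
```

Plain Monte Carlo like this is exactly what becomes prohibitive with the real simulator, which is why the paper turns to adaptive design and statistical approximation of the computer model.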

Subjects: Statistics and Probability; Risk analysis; Volcanic hazards; Natural hazard; Rare events; Applied Mathematics; Computation; Initialization; Statistical model; Modeling and Simulation. Journal: Technometrics.

Probabilistic Quantification of Hazards: A Methodology Using Small Ensembles of Physics-Based Simulations and Statistical Surrogates

2015

This paper presents a novel approach to assessing the hazard threat to a locale due to a large volcanic avalanche. The methodology combines: (i) mathematical modeling of volcanic mass flows; (ii) field data of avalanche frequency, volume, and runout; (iii) large-scale numerical simulations of flow events; (iv) use of statistical methods to minimize computational costs, and to capture unlikely events; (v) calculation of the probability of a catastrophic flow event over the next T years at a location of interest; and (vi) innovative computational methodology to implement these methods. This unified presentation collects elements that have been separately developed, and incorporates new contri…

Subjects: Statistics and Probability; Volcanic hazards; Hazard analysis; Probabilistic logic; Rare events; Control and Optimization; Modeling and Simulation; Environmental science. Journal: International Journal for Uncertainty Quantification.

Sparse kernel methods for high-dimensional survival data

2008

Sparse kernel methods like support vector machines (SVM) have been applied with great success to classification and (standard) regression settings. Existing support vector classification and regression techniques, however, are not suitable for partly censored survival data, which are typically analysed using Cox's proportional hazards model. As the partial likelihood of the proportional hazards model depends on the covariates only through inner products, it can be 'kernelized'. The kernelized proportional hazards model, however, yields a solution that is dense, i.e. the solution depends on all observations. One of the key features of an SVM is that it yields a sparse solution, dependin…
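The "kernelization" rests on the fact that the partial likelihood uses covariates only through inner products, so the linear predictor β′xᵢ can be replaced by fᵢ = Σⱼ αⱼ k(xⱼ, xᵢ). A minimal sketch with a linear kernel, no tied event times, and hypothetical data:

```python
import numpy as np

def neg_log_partial_likelihood(alpha, K, times, events):
    """Cox negative log partial likelihood with kernelized risk score f = K @ alpha.

    K is the Gram matrix of inner products k(x_j, x_i); assumes no tied event times.
    """
    f = K @ alpha
    nll = 0.0
    for i in np.flatnonzero(events):
        at_risk = times >= times[i]              # risk set at the i-th event time
        nll -= f[i] - np.log(np.exp(f[at_risk]).sum())
    return nll

# Tiny example: 4 subjects, linear kernel
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
K = X @ X.T                                      # Gram matrix of inner products
times = np.array([2.0, 5.0, 3.0, 7.0])
events = np.array([1, 0, 1, 1])
nll = neg_log_partial_likelihood(np.zeros(4), K, times, events)
```

Note that the solution vector `alpha` has one entry per observation, which is exactly the density problem the abstract raises and the sparse method aims to fix.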

Subjects: Statistics and Probability; Support vector machine; Kernel method; Proportional hazards models; Gene expression profiling; Computational biology; Regression analysis; Cluster analysis; Lung neoplasms; Lymphoma. Journal: Bioinformatics.

Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry.

2013

For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivat…
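One of the selection techniques named in this abstract, component-wise boosting for the Cox model, can be sketched as repeated small updates to the single covariate with the largest partial-likelihood gradient. This is a generic illustration on simulated data, not the registry analysis itself, and the step size and data-generating parameters are hypothetical:

```python
import numpy as np

def cox_score(beta, X, times, events):
    """Gradient of the Cox log partial likelihood (assumes no tied event times)."""
    w = np.exp(X @ beta)
    score = np.zeros(X.shape[1])
    for i in np.flatnonzero(events):
        at_risk = times >= times[i]                    # risk set at this event time
        score += X[i] - (w[at_risk] @ X[at_risk]) / w[at_risk].sum()
    return score

def componentwise_boost(X, times, events, steps=150, nu=0.1):
    """Each step updates only the covariate with the largest score component."""
    beta = np.zeros(X.shape[1])
    n_events = events.sum()
    for _ in range(steps):
        u = cox_score(beta, X, times, events) / n_events   # per-event gradient
        j = int(np.argmax(np.abs(u)))
        beta[j] += nu * u[j]
    return beta

rng = np.random.default_rng(1)
n = 150
X = rng.normal(size=(n, 5))                     # only column 0 is truly prognostic
times = rng.exponential(1.0 / np.exp(X[:, 0]))  # hazard proportional to exp(x0)
events = np.ones(n, dtype=int)                  # no censoring, for simplicity
beta_hat = componentwise_boost(X, times, events)
```

Because each step touches one covariate, noise covariates stay near zero for many iterations, which is what gives boosting its built-in variable selection.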

Subjects: Statistics and Probability; Epidemiology; Proportional hazards models; Feature selection; Boosting; Regression analysis; Cancer registry; Hepatocellular carcinoma; Liver neoplasms; Sorafenib; Prognosis; Decision support techniques. Journal: Statistics in Medicine.

Elasticity function of a discrete random variable and its properties

2017

Elasticity (or the elasticity function) is a new concept that allows us to characterize the probability distribution of any random variable in the same way as characteristic functions and hazard and reverse hazard functions do. Elasticity was initially defined for continuous variables, so it was necessary to extend the definition and study its properties for discrete variables. A first attempt to define discrete elasticity appears in Veres-Ferrer and Pavia (2014a). This paper develops this definition and makes a comparative study of its properties, relating them to the properties of discrete hazard and reverse hazard, both as defined in Chechile (2011). Similar to continuou…
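The discrete hazard and reverse hazard the abstract compares against have standard definitions, h(x) = p(x)/P(X ≥ x) and r(x) = p(x)/P(X ≤ x); the paper's own discrete elasticity definition is not reproduced here. A minimal sketch of the two comparison quantities:

```python
def discrete_rates(pmf):
    """Discrete hazard h(x) = p(x)/P(X >= x) and reverse hazard r(x) = p(x)/P(X <= x).

    `pmf` maps support points to probabilities summing to 1.
    """
    cdf, surv, out = 0.0, 1.0, {}
    for x in sorted(pmf):
        cdf += pmf[x]
        out[x] = {"hazard": pmf[x] / surv, "reverse_hazard": pmf[x] / cdf}
        surv -= pmf[x]
    return out

# Example: a fair four-sided die
rates = discrete_rates({1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25})
```

Two characterizing properties fall out directly: the hazard equals 1 at the last support point, and the reverse hazard equals 1 at the first.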

Subjects: Statistics and Probability; Elasticity of a function; Elasticity (economics); Hazard ratio; Discretization; Probability distribution; Random variable; Applied mathematics; Mathematical optimization. Journal: Communications in Statistics - Theory and Methods.