Search results for "Hazard"
Showing 10 of 1,517 documents
Bayesian joint ordinal and survival modeling for breast cancer risk assessment
2016
We propose a joint model to analyze the structure and intensity of the association between longitudinal measurements of an ordinal marker and time to a relevant event. The longitudinal process is defined in terms of a proportional-odds cumulative logit model. Time-to-event is modeled through a left-truncated proportional hazards model, which incorporates information of the longitudinal marker as well as baseline covariates. Both longitudinal and survival processes are connected by means of a common vector of random effects. General inferences are discussed under the Bayesian approach and include the posterior distribution of the probabilities associated with each longitudinal category and the …
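The longitudinal submodel above is a proportional-odds cumulative logit, in which P(Y ≤ k) = expit(θ_k − η) for increasing cutpoints θ_k and a linear predictor η. A minimal sketch of how the category probabilities follow from that definition (cutpoint values and η are illustrative, not taken from the paper):

```python
import numpy as np

def cumulative_logit_probs(cutpoints, eta):
    """Category probabilities under a proportional-odds cumulative logit model.

    P(Y <= k) = expit(theta_k - eta); individual category probabilities are
    differences of adjacent cumulative probabilities.  `cutpoints` must be
    strictly increasing for the probabilities to be non-negative.
    """
    theta = np.asarray(cutpoints, dtype=float)
    expit = lambda z: 1.0 / (1.0 + np.exp(-z))
    cum = expit(theta - eta)                  # P(Y <= k), k = 1..K-1
    cum = np.concatenate([[0.0], cum, [1.0]])
    return np.diff(cum)                       # P(Y = k), k = 1..K

# Example: four ordinal categories, linear predictor eta = 0.5
p = cumulative_logit_probs([-1.0, 0.0, 1.5], eta=0.5)
```

The joint model then links η to the survival submodel through shared random effects; the sketch covers only the ordinal likelihood contribution.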
Generating survival times to simulate Cox proportional hazards models by Ralf Bender, Thomas Augustin and Maria Blettner, Statistics in Medicine 2005;…
2006
Absolute Risk and Loss-of-Lifetime Estimates for Quantitative Risk Assessment
1998
Quantitative risk assessments in public health settings aim to describe the hazard of a specific exposure in a given population on the basis of epidemiological and/or experimental results. Two different risk quantities, the absolute lifetime excess risk and the loss-of-lifetime, which differ in their definition of hazard, are discussed and compared. For both measures, estimation procedures are derived and the relationship between the various estimates currently in use is investigated. It is shown that the two most common estimators can be written as special cases of a more general concept. This leads to conclusions about the assumptions on which different estimation procedures …
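Both quantities compared in the abstract reduce to integrals of survival curves: the absolute lifetime excess risk is a difference of cumulative death probabilities, and the loss-of-lifetime is a difference of life expectancies. A numerical sketch under an assumed Gompertz-like baseline hazard and a constant added excess hazard (all rates are hypothetical):

```python
import numpy as np

def survival_from_hazard(hazard, ages):
    """S(t) = exp(-H(t)), with H(t) the trapezoid-rule cumulative hazard."""
    steps = np.diff(ages) * 0.5 * (hazard[1:] + hazard[:-1])
    return np.exp(-np.concatenate([[0.0], np.cumsum(steps)]))

def trapz(y, x):
    """Trapezoid-rule integral (kept explicit for NumPy-version portability)."""
    return float((np.diff(x) * 0.5 * (y[1:] + y[:-1])).sum())

ages = np.linspace(0.0, 100.0, 1001)
h0 = 1e-4 * np.exp(0.08 * ages)      # assumed baseline hazard
h1 = h0 + 5e-4                       # assumed excess hazard from exposure

S0 = survival_from_hazard(h0, ages)
S1 = survival_from_hazard(h1, ages)

# Absolute lifetime excess risk: extra probability of death by age 100
excess_risk = (1.0 - S1[-1]) - (1.0 - S0[-1])
# Loss-of-lifetime: difference in life expectancy, the integral of S(t)
loss_of_lifetime = trapz(S0, ages) - trapz(S1, ages)
```

The two measures can rank exposures differently, which is why the abstract treats them as distinct definitions of hazard impact.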
A weighted combined effect measure for the analysis of a composite time-to-first-event endpoint with components of different clinical relevance
2018
Composite endpoints combine several events within a single variable, which increases the number of expected events and is thereby meant to increase the power. However, the interpretation of results can be difficult as the observed effect for the composite does not necessarily reflect the effects for the components, which may be of different magnitude or even point in opposite directions. Moreover, in clinical applications, the event types are often of different clinical relevance, which also complicates the interpretation of the composite effect. The common effect measure for composite endpoints is the all-cause hazard ratio, which gives equal weight to all events irrespective of their type …
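The composite time-to-first-event endpoint the abstract starts from is built by collapsing the component endpoints to the earliest observed event. A minimal sketch of that construction (array layout and function name are illustrative; the paper's weighted effect measure is not reproduced here):

```python
import numpy as np

def composite_first_event(times, observed):
    """Collapse component endpoints into a time-to-first-event composite.

    times:    (n, k) component event/censoring times
    observed: (n, k) booleans, True where the component event occurred
    Returns composite time, composite event indicator, and the index of the
    first component to occur (-1 if the subject is censored for all).
    """
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    # Censored components cannot be the first event: push them to +inf,
    # keeping the raw censoring times as the fallback for no-event subjects.
    event_times = np.where(observed, times, np.inf)
    any_event = observed.any(axis=1)
    t_comp = np.where(any_event, event_times.min(axis=1), times.min(axis=1))
    which = np.where(any_event, event_times.argmin(axis=1), -1)
    return t_comp, any_event, which

times = [[5.0, 3.0], [2.0, 4.0]]
observed = [[True, False], [False, False]]
t_comp, any_event, which = composite_first_event(times, observed)
```

The all-cause hazard ratio is then estimated from `(t_comp, any_event)` alone, discarding `which`, which is exactly the equal-weighting the abstract criticizes.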
Generating survival times to simulate Cox proportional hazards models
2005
Simulation studies present an important statistical tool to investigate the performance, properties and adequacy of statistical models in pre-specified situations. One of the most important statistical models in medical research is the proportional hazards model of Cox. In this paper, techniques to generate survival times for simulation studies regarding Cox proportional hazards models are presented. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived, which is useful in simulation studies. It is shown how the exponential, the Weibull and the Gompertz distribution can be applied to generate appropriate survival times f…
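The general relation derived in the paper inverts the Cox survival function at a uniform random number: with cumulative baseline hazard H0, T = H0⁻¹(−log(U) · exp(−x'β)). A sketch for the Weibull baseline case, one of the three distributions the abstract names (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

def cox_weibull_times(beta, X, lam=0.01, nu=1.5, rng=rng):
    """Survival times for a Cox model with Weibull baseline hazard.

    Inverting S(t|x) = exp(-lam * t**nu * exp(x'beta)) at U ~ Uniform(0,1)
    gives T = (-log(U) / (lam * exp(x'beta)))**(1/nu), the relation derived
    in Bender, Augustin and Blettner (2005).
    """
    eta = X @ np.asarray(beta, dtype=float)
    u = rng.uniform(size=len(X))
    return (-np.log(u) / (lam * np.exp(eta))) ** (1.0 / nu)

X = rng.normal(size=(1000, 2))
T = cox_weibull_times(beta=[0.5, -0.25], X=X)
```

Setting `nu=1` recovers the exponential baseline; the Gompertz case follows the same inversion with its own H0⁻¹.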
Using Statistical and Computer Models to Quantify Volcanic Hazards
2009
Risk assessment of rare natural hazards, such as large volcanic block and ash or pyroclastic flows, is addressed. Assessment is approached through a combination of computer modeling, statistical modeling, and extreme-event probability computation. A computer model of the natural hazard is used to provide the needed extrapolation to unseen parts of the hazard space. Statistical modeling of the available data is needed to determine the initializing distribution for exercising the computer model. In dealing with rare events, direct simulations involving the computer model are prohibitively expensive. The solution instead requires a combination of adaptive design of computer model approximation…
Probabilistic quantification of hazards: a methodology using small ensembles of physics-based simulations and statistical surrogates
2015
This paper presents a novel approach to assessing the hazard threat to a locale due to a large volcanic avalanche. The methodology combines: (i) mathematical modeling of volcanic mass flows; (ii) field data of avalanche frequency, volume, and runout; (iii) large-scale numerical simulations of flow events; (iv) use of statistical methods to minimize computational costs, and to capture unlikely events; (v) calculation of the probability of a catastrophic flow event over the next T years at a location of interest; and (vi) innovative computational methodology to implement these methods. This unified presentation collects elements that have been separately developed, and incorporates new contri…
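Step (v), the probability of a catastrophic flow over the next T years, combines an event-frequency model with the surrogate's prediction of whether a given event reaches the site. A toy numpy sketch of that combination (all rates, distributions, and the stand-in for the surrogate are assumptions, not values from the paper):

```python
import numpy as np

# Hypothetical inputs: avalanches arrive as a Poisson process with `rate`
# events per year, and each event's volume is log-normal.  In the paper a
# statistical surrogate of the flow simulator supplies P(flow reaches the
# site | volume); a simple volume threshold stands in for it here.
rate = 0.2                                        # events/year (assumed)

def p_hit(volume_km3):
    return (volume_km3 > 0.05).astype(float)      # stand-in surrogate

rng = np.random.default_rng(0)
volumes = rng.lognormal(mean=-4.0, sigma=1.0, size=100_000)
p_catastrophic = p_hit(volumes).mean()            # P(event reaches site)

# Thinning the Poisson process: catastrophic events arrive at rate
# rate * p_catastrophic, so over T years
T_years = 50.0
p_T = 1.0 - np.exp(-rate * p_catastrophic * T_years)
```

The hard part the paper addresses is replacing the crude threshold with an emulator fitted to a small ensemble of expensive simulations, so that the rare-event tail of `p_hit` is estimated rather than assumed.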
Sparse kernel methods for high-dimensional survival data
2008
Sparse kernel methods like support vector machines (SVM) have been applied with great success to classification and (standard) regression settings. Existing support vector classification and regression techniques however are not suitable for partly censored survival data, which are typically analysed using Cox's proportional hazards model. As the partial likelihood of the proportional hazards model only depends on the covariates through inner products, it can be ‘kernelized’. The kernelized proportional hazards model however yields a solution that is dense, i.e. the solution depends on all observations. One of the key features of an SVM is that it yields a sparse solution, dependin…
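The kernelization the abstract describes replaces the linear predictor x_i'β with f_i = Σ_j α_j k(x_i, x_j), leaving the partial likelihood otherwise unchanged. A sketch of the resulting negative log partial likelihood (Breslow-style handling of ties; names are illustrative):

```python
import numpy as np

def kernel_cox_nll(alpha, K, time, event):
    """Negative log Cox partial likelihood with f = K @ alpha as predictor.

    Because the partial likelihood touches covariates only through the
    linear predictor, substituting the kernel expansion f_i = sum_j
    alpha_j k(x_i, x_j) 'kernelizes' the model.
    """
    f = K @ alpha
    order = np.argsort(-time)             # decreasing time
    f_sorted = f[order]
    e_sorted = np.asarray(event, dtype=bool)[order]
    # Risk set of subject i is everyone with time >= time_i: a running
    # log-sum-exp over the decreasing-time ordering.
    running = np.logaddexp.accumulate(f_sorted)
    return -np.sum((f_sorted - running)[e_sorted])

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)                     # RBF Gram matrix
alpha = rng.normal(scale=0.1, size=20)
time = rng.exponential(size=20)
event = rng.random(20) < 0.7
nll = kernel_cox_nll(alpha, K, time, event)
```

Minimizing this over α gives the dense kernel Cox solution the abstract mentions; the paper's contribution is the sparsity-inducing modification, which is not sketched here.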
Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry
2013
For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivat…
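The competing-risks setup mentioned above fits one cause-specific hazard model per event type, treating all other event types as censoring. The data recoding behind that is mechanical and can be sketched as (function name and encoding are illustrative):

```python
import numpy as np

def cause_specific_datasets(time, cause, n_causes):
    """Per-cause event indicators for cause-specific Cox models.

    In a competing-risks setting, the cause-specific hazard model for
    cause k keeps every observation time as-is and treats any other
    event type as censoring: only the event indicator changes.
    cause: 0 = censored, 1..n_causes = observed event type.
    """
    time = np.asarray(time, dtype=float)
    cause = np.asarray(cause)
    return {k: (time, cause == k) for k in range(1, n_causes + 1)}

time = [2.0, 5.0, 3.5, 7.0]
cause = [1, 0, 2, 1]                 # subject 2 censored, others had events
per_cause = cause_specific_datasets(time, cause, n_causes=2)
```

Each `(time, indicator)` pair then feeds a standard Cox fit, and the coupled variable selection the abstract motivates operates across these per-cause models jointly.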
Elasticity function of a discrete random variable and its properties
2017
Elasticity (or elasticity function) is a new concept that allows us to characterize the probability distribution of any random variable in the same way as characteristic functions and hazard and reverse hazard functions do. Initially defined for continuous variables, it was necessary to extend the definition of elasticity and study its properties in the case of discrete variables. A first attempt to define discrete elasticity is seen in Veres-Ferrer and Pavia (2014a). This paper develops this definition and makes a comparative study of its properties, relating them to the properties shown by discrete hazard and reverse hazard, as both defined in Chechile (2011). Similar to continuou…
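The discrete hazard and reverse hazard that the abstract compares elasticity against have standard definitions: h(x) = P(X = x)/P(X ≥ x) and r(x) = P(X = x)/P(X ≤ x). A small sketch computing both from a pmf (the paper's elasticity function itself is not reproduced here):

```python
import numpy as np

def discrete_hazard_functions(pmf):
    """Discrete hazard and reverse hazard from a pmf over support 0, 1, 2, ...

    hazard:          h(x) = P(X = x) / P(X >= x)
    reverse hazard:  r(x) = P(X = x) / P(X <= x)
    """
    p = np.asarray(pmf, dtype=float)
    cdf = np.cumsum(p)
    surv_geq = 1.0 - np.concatenate([[0.0], cdf[:-1]])   # P(X >= x)
    return p / surv_geq, p / cdf

pmf = np.array([0.4, 0.3, 0.2, 0.1])   # example pmf on {0, 1, 2, 3}
h, r = discrete_hazard_functions(pmf)
```

Note the boundary behavior: the hazard equals 1 at the top of a finite support and the reverse hazard equals 1 at the bottom, properties the discrete elasticity is related to in the paper.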