Search results for "Proportional Hazards Model"

Showing 10 of 721 documents

Elasticity as a measure for online determination of remission points in ongoing epidemics.

2020

The correct identification of change-points during ongoing outbreak investigations of infectious diseases is a matter of paramount importance in epidemiology, with major implications for the management of health care resources, public health and, as the COVID-19 pandemic has shown, social life. Onsets, peaks, and inflexion points are examples of such change-points. An onset is the moment when the epidemic starts. A "peak" is a moment at which the neighbouring values, both before and after, are lower: a maximum. Inflexion points identify moments at which the growth rate of new cases changes in intensity. In this study, after interpreting the concept of elasticity of a random va…
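The truncated abstract does not reproduce the paper's elasticity criterion itself, but the change-points it defines can be located numerically. The following sketch is purely illustrative (function names, the smoothing window, and the toy data are assumptions, not taken from the paper): it finds peaks and inflexion points of a smoothed daily-incidence curve via finite differences.

```python
import numpy as np

def change_points(daily_cases, window=7):
    """Locate peaks and inflexion points of a smoothed incidence curve.

    A peak is a local maximum (first difference changes sign + to -);
    an inflexion point is where the growth rate changes intensity
    (second difference changes sign). Illustrative sketch only.
    """
    kernel = np.ones(window) / window
    smooth = np.convolve(daily_cases, kernel, mode="valid")  # moving average
    d1 = np.diff(smooth)   # growth rate
    d2 = np.diff(d1)       # change in growth rate
    peaks = np.where((d1[:-1] > 0) & (d1[1:] < 0))[0] + 1
    inflexions = np.where(np.sign(d2[:-1]) != np.sign(d2[1:]))[0] + 1
    return smooth, peaks, inflexions

# toy epidemic curve: rise, peak, decay
t = np.arange(120)
cases = 1000 * np.exp(-0.5 * ((t - 60) / 15) ** 2)
_, peaks, inflexions = change_points(cases)
print("peak day(s):", peaks, "inflexion day(s):", inflexions)
```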

Statistics and Probability; 2019-20 Coronavirus Outbreak; Coronavirus Disease 2019 (COVID-19); Computer Science; Epidemiology; Epidemiologic Methods; Time; Remission Induction; Pandemics; Health Care; Econometrics; Humans; Computer Simulation; Elasticity (economics); Epidemics; Proportional Hazards Models; Random Variable; Rate of Growth; Statistics in Medicine

A Log-Rank Test for Equivalence of Two Survivor Functions

1993

We consider a hypothesis testing problem in which the alternative states that the vertical distance between the underlying survivor functions nowhere exceeds some prespecified bound delta > 0. Under the assumption of proportional hazards, this hypothesis is shown to be (logically) equivalent to the statement |beta| < log(1 + epsilon), where beta denotes the regression coefficient associated with the treatment group indicator, and epsilon is a simple strictly increasing function of delta. The testing procedure proposed consists of carrying out, in terms of beta-hat (i.e., the standard Cox partial likelihood estimator of beta), the uniformly most powerful level-alpha test for a suitable interval hypothesis about…
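The exact uniformly most powerful test is not reproduced here; as a rough large-sample surrogate, one can run a Wald-type TOST-style check of |beta| < log(1 + epsilon) after a Cox fit. The sketch below assumes the lifelines library and hypothetical column names; it is not Wellek's exact procedure.

```python
from scipy.stats import norm
from lifelines import CoxPHFitter

def equivalence_check(df, bound, alpha=0.05):
    """Approximate (Wald/TOST) check of the equivalence hypothesis
    |beta| < bound, with bound = log(1 + epsilon) under proportional
    hazards. A large-sample surrogate, not the exact UMP test.
    df must have columns 'time', 'event', 'group' (0/1 treatment flag).
    """
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    beta_hat = cph.params_["group"]
    se = cph.standard_errors_["group"]
    z = norm.ppf(1 - alpha)
    # declare equivalence if the (1 - 2*alpha) CI lies inside (-bound, bound)
    lower, upper = beta_hat - z * se, beta_hat + z * se
    return (-bound < lower) and (upper < bound)
```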

Statistics and Probability; Biometry; Gaussian; General Biochemistry, Genetics and Molecular Biology; Combinatorics; Neoplasms; Linear Regression; Statistics; Chi-square Test; Humans; Computer Simulation; Cerebellar Neoplasms; Child; Equivalence; Proportional Hazards Models; Statistical Hypothesis Testing; Mathematics; Clinical Trials as Topic; General Immunology and Microbiology; Applied Mathematics; Estimator; General Medicine; Survival Analysis; Log-rank Test; Linear Models; General Agricultural and Biological Sciences; Medulloblastoma; Quantile; Biometrics

Marginal hazard ratio estimates in joint frailty models for heart failure trials

2019

This work is motivated by clinical trials in chronic heart failure disease, where treatment has effects both on morbidity (assessed as recurrent non‐fatal hospitalisations) and on mortality (assessed as cardiovascular death, CV death). Recently, a joint frailty proportional hazards model has been proposed for this kind of efficacy outcome to account for a potential association between the risk rates for hospital admissions and CV death. However, clinical trial results are more often presented as treatment effect estimates derived from marginal proportional hazards models, that is, a Cox model for mortality and an Andersen–Gill model for recurrent hospitalisations. …
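A hedged sketch of the data-generating mechanism behind such a joint frailty model (the notation and every parameter value are illustrative assumptions, not taken from the paper): a shared gamma frailty multiplies both the recurrent-hospitalisation rate and the death hazard, inducing exactly the association that separate marginal Cox and Andersen–Gill analyses ignore.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_patient(x, beta_r=-0.2, beta_d=-0.3, r0=0.5, h0=0.1,
                     theta=0.5, gamma=1.0, cens=3.0):
    """One patient under a joint gamma-frailty model (illustrative).

    Recurrent-event rate:  Z * r0 * exp(beta_r * x)
    Death hazard:          Z**gamma * h0 * exp(beta_d * x)
    Z ~ Gamma(1/theta, scale=theta) has mean 1 and variance theta.
    """
    z = rng.gamma(1.0 / theta, theta)
    death = rng.exponential(1.0 / (z**gamma * h0 * np.exp(beta_d * x)))
    follow_up = min(death, cens)
    # homogeneous Poisson process of hospitalisations until follow-up ends
    rate = z * r0 * np.exp(beta_r * x)
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)
        if t >= follow_up:
            break
        times.append(t)
    return times, follow_up, death <= cens

hosp, fu, died = simulate_patient(x=1)
print(f"{len(hosp)} hospitalisations over {fu:.2f} years; died={died}")
```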

Statistics and Probability; Biometry; Least False Parameter; Disease; Joint Frailty Model; Risk Assessment; Study Duration; Cardiovascular Death; Unexplained Heterogeneity; Humans; Treatment Effect; Complex Regression Models; Proportional Hazards Models; Heart Failure; Clinical Trials as Topic; Heart Failure Trials; Hazard Ratio; General Medicine; Recurrent Events; Asymptomatic Diseases; Statistics, Probability and Uncertainty; Demography; Biometrical Journal. Biometrische Zeitschrift

Opportunities and challenges of combined effect measures based on prioritized outcomes

2013

Many authors have proposed approaches for combining multiple endpoints into a univariate outcome measure. For binary or time-to-event variables, composite endpoints, which combine several event types within a single event or time-to-first-event analysis, are often used to assess the overall treatment effect. A main drawback of this approach is that the interpretation of the composite effect can be difficult, as a negative effect in one component can be masked by a positive effect in another. Recently, some authors have proposed more general approaches based on a priority ranking of outcomes, which moreover allow outcome variables of different scale levels to be combined. …
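One widely used prioritized approach of this kind is the win ratio of Pocock and colleagues. The minimal sketch below counts pairwise wins and losses in priority order; it is a simplification (proper handling of censoring, ties, and confidence intervals is omitted), and the toy outcomes are invented.

```python
from itertools import product

def win_ratio(treatment, control):
    """Unmatched win ratio over prioritized outcomes (illustrative).

    Each subject is a tuple of outcomes ordered by priority, higher = better.
    Every treatment/control pair is compared on the top-priority outcome;
    ties are broken by the next-priority outcome, and so on.
    """
    wins = losses = 0
    for t, c in product(treatment, control):
        for t_k, c_k in zip(t, c):
            if t_k > c_k:
                wins += 1
                break
            if t_k < c_k:
                losses += 1
                break
    return wins / losses

# priority 1: survival time; priority 2: a functional score
treated = [(24, 7), (18, 5), (30, 6)]
controls = [(20, 6), (18, 4), (25, 6)]
print(win_ratio(treated, controls))  # > 1 favours treatment
```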

Statistics and Probability; Clinical Trials as Topic; Epidemiology; Univariate Analysis; Treatment Outcome; Ranking; Outcome Assessment, Health Care; Multiple Comparisons; Humans; Computer Simulation; Data Mining; Proportional Hazards Models; Mathematics; Statistical Hypothesis Testing; Statistics in Medicine

Sample size planning for survival prediction with focus on high-dimensional data

2011

Sample size planning should reflect the primary objective of a trial. If the primary objective is prediction, the sample size determination should focus on prediction accuracy instead of power. We present formulas for the determination of training set sample size for survival prediction. Sample size is chosen to control the difference between optimal and expected prediction error. Prediction is carried out by Cox proportional hazards models. The general approach considers censoring as well as low-dimensional and high-dimensional explanatory variables. For dimension reduction in the high-dimensional setting, a variable selection step is inserted. If not all informative variables are included…
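The paper derives analytic formulas; as a purely empirical stand-in, a resampling learning curve can show how held-out prediction accuracy approaches its plateau with training-set size. The sketch below assumes the lifelines library and hypothetical column names, and uses the concordance index rather than the paper's prediction-error criterion.

```python
import numpy as np
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

def learning_curve(df, sizes, n_rep=20, seed=0):
    """Held-out concordance of a Cox model versus training-set size.

    A crude empirical analogue of controlling the gap between expected
    and optimal prediction error: choose the smallest n whose mean score
    is within tolerance of the largest n's. df needs 'time' and 'event'
    columns plus covariates.
    """
    rng = np.random.default_rng(seed)
    results = {}
    for n in sizes:
        scores = []
        for _ in range(n_rep):
            idx = rng.permutation(len(df))
            train, test = df.iloc[idx[:n]], df.iloc[idx[n:]]
            cph = CoxPHFitter(penalizer=0.1).fit(
                train, duration_col="time", event_col="event")
            risk = cph.predict_partial_hazard(test)
            # higher risk should mean shorter survival, hence the minus sign
            scores.append(concordance_index(test["time"], -risk,
                                            test["event"]))
        results[n] = np.mean(scores)
    return results
```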

Statistics and Probability; Clustering High-dimensional Data; Clinical Trials as Topic; Lung Neoplasms; Models, Statistical; Kaplan-Meier Estimate; Epidemiology; Proportional Hazards Models; Dimensionality Reduction; Gene Expression; Feature Selection; Biostatistics; Prognosis; Brier Score; Sample Size Determination; Carcinoma, Non-Small-Cell Lung; Censoring (clinical trials); Statistics; Humans; Mathematics; Statistics in Medicine

Bayesian regularization for flexible baseline hazard functions in Cox survival models.

2019

Fully Bayesian methods for Cox models specify a model for the baseline hazard function. Parametric approaches generally provide monotone estimations. Semi-parametric choices allow for more flexible patterns, but they can suffer from overfitting and instability. Regularization through prior distributions with correlated structures usually gives reasonable answers in these situations. We discuss Bayesian regularization for Cox survival models defined via flexible baseline hazards, specified by a mixture of piecewise constant functions and by a cubic B-spline function. For these "semi-parametric" proposals, different prior scenarios ranging from prior independence to particular c…
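A compressed sketch of one such prior scenario: a piecewise-constant log baseline hazard with a first-order random-walk (correlated) prior, explored by a plain random-walk Metropolis sampler. The cut points, hyperparameters, and sampler are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def piecewise_exposure(t, cuts):
    """Time each subject spends in each hazard interval."""
    lo, hi = cuts[:-1], cuts[1:]
    return np.clip(t[:, None], lo, hi) - lo

def log_post(theta, t, d, cuts, tau=1.0):
    """Log posterior of theta = log piecewise-constant baseline hazard.

    Likelihood: piecewise exponential. Prior: first-order random walk,
    theta_j | theta_{j-1} ~ N(theta_{j-1}, tau^2), smoothing neighbours.
    """
    lam = np.exp(theta)
    expo = piecewise_exposure(t, cuts)
    interval = np.searchsorted(cuts, t, side="right") - 1
    loglik = np.sum(d * theta[np.minimum(interval, len(theta) - 1)]) \
             - np.sum(expo @ lam)
    logprior = -0.5 * np.sum(np.diff(theta) ** 2) / tau**2
    return loglik + logprior

# toy data: exponential(1) times, administratively censored at 1.5
t = np.minimum(rng.exponential(1.0, 200), 1.5)
d = (t < 1.5).astype(float)
cuts = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])
theta = np.zeros(len(cuts) - 1)
for _ in range(5000):  # random-walk Metropolis
    prop = theta + rng.normal(0, 0.1, theta.shape)
    if np.log(rng.uniform()) < log_post(prop, t, d, cuts) - log_post(theta, t, d, cuts):
        theta = prop
print("posterior draw of baseline hazard:", np.exp(theta).round(2))
```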

Statistics and Probability; Computer Science; Proportional Hazards Models; Model Selection; Bayesian Probability; Posterior Probability; Markov Chain Monte Carlo; Bayes Theorem; General Medicine; Overfitting; Survival Analysis; Markov Chains; Statistics; Covariate; Piecewise Functions; Statistics, Probability and Uncertainty; Monte Carlo Method; Biometrical Journal. Biometrische Zeitschrift

Generating survival times to simulate Cox proportional hazards models by Ralf Bender, Thomas Augustin and Maria Blettner, Statistics in Medicine 2005;…

2006

Statistics and Probability; Epidemiology; Proportional Hazards Model; Computer Science; Statistics; Econometrics; MEDLINE; Medical Statistics; Survival Analysis; Statistics in Medicine

A weighted combined effect measure for the analysis of a composite time-to-first-event endpoint with components of different clinical relevance

2018

Composite endpoints combine several events within a single variable, which increases the number of expected events and is thereby meant to increase the power. However, the interpretation of results can be difficult, as the observed effect for the composite does not necessarily reflect the effects for the components, which may be of different magnitude or even point in opposite directions. Moreover, in clinical applications, the event types are often of different clinical relevance, which further complicates the interpretation of the composite effect. The common effect measure for composite endpoints is the all-cause hazard ratio, which gives equal weight to all events irrespective of their type …
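A sketch of the general idea of relevance weighting follows. The specific combination rule shown, a weighted average of log cause-specific hazard ratios, is an illustration and not necessarily the authors' estimator; the lifelines library and all column names are assumptions.

```python
import numpy as np
from lifelines import CoxPHFitter

def weighted_composite_loghr(df, event_types, weights):
    """Relevance-weighted combination of cause-specific log hazard ratios.

    For each component event type, competing types are treated as censoring
    and a Cox model is fit for the treatment indicator; the log HRs are then
    averaged with clinically chosen weights summing to 1. Illustrative only;
    df needs 'time', 'type' (0 = censored) and 'group' columns.
    """
    log_hrs = []
    for k in event_types:
        sub = df[["time", "group"]].copy()
        sub["event"] = (df["type"] == k).astype(int)
        cph = CoxPHFitter().fit(sub, duration_col="time", event_col="event")
        log_hrs.append(cph.params_["group"])
    return float(np.dot(weights, log_hrs))

# e.g. weight death 0.8 and hospitalisation 0.2:
# np.exp(weighted_composite_loghr(df, event_types=[1, 2], weights=[0.8, 0.2]))
```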

Statistics and Probability; Hazard; Epidemiology; Endpoint Determination; Win Ratio; Resampling; Statistics; Time-to-event; Humans; Computer Simulation; Relevance Weighting; Parametric Statistics; Mathematics; Proportional Hazards Models; Clinical Trials; Hazard Ratio; Composite Endpoint; Weighting; Prioritized Outcomes; Multistate Models; Inference; Data Interpretation, Statistical; Null Hypothesis; Monte Carlo Method; Statistics in Medicine

Generating survival times to simulate Cox proportional hazards models

2005

Simulation studies present an important statistical tool to investigate the performance, properties and adequacy of statistical models in pre-specified situations. One of the most important statistical models in medical research is the proportional hazards model of Cox. In this paper, techniques to generate survival times for simulation studies regarding Cox proportional hazards models are presented. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived, which is useful in simulation studies. It is shown how the exponential, the Weibull and the Gompertz distribution can be applied to generate appropriate survival times f…
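The central relation is the inversion T = H0^{-1}(-log(U) * exp(-beta'x)) for U ~ Uniform(0, 1); specialising the cumulative baseline hazard H0 gives closed forms for the exponential, Weibull and Gompertz cases, as in this sketch (parameter values are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(42)

def cox_survival_times(x, beta, baseline="weibull",
                       lam=0.1, nu=1.5, alpha=0.2):
    """Draw survival times from a Cox model via inversion:
    T = H0^{-1}(-log(U) * exp(-beta'x)), U ~ Uniform(0, 1).
    """
    u = rng.uniform(size=len(x))
    v = -np.log(u) * np.exp(-x @ beta)        # H0(T) must equal v
    if baseline == "exponential":             # H0(t) = lam * t
        return v / lam
    if baseline == "weibull":                 # H0(t) = lam * t**nu
        return (v / lam) ** (1.0 / nu)
    if baseline == "gompertz":                # H0(t) = lam/alpha * (e^(alpha*t) - 1)
        return np.log(1.0 + alpha * v / lam) / alpha
    raise ValueError(baseline)

X = rng.normal(size=(1000, 2))
T = cox_survival_times(X, beta=np.array([0.5, -0.25]))
print("median simulated time:", np.median(T).round(3))
```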

Statistics and Probability; Hazard; Exponential Distribution; Epidemiology; Computer Science; Proportional Hazards Model; Statistics; Econometrics; Statistical Model; Survival Analysis; Gompertz Distribution; Exponential Function; Weibull Distribution; Statistics in Medicine

Sparse kernel methods for high-dimensional survival data

2008

Sparse kernel methods like support vector machines (SVMs) have been applied with great success to classification and (standard) regression settings. Existing support vector classification and regression techniques, however, are not suitable for partly censored survival data, which are typically analysed using Cox's proportional hazards model. As the partial likelihood of the proportional hazards model depends on the covariates only through inner products, it can be 'kernelized'. The kernelized proportional hazards model, however, yields a solution that is dense, i.e. the solution depends on all observations. One of the key features of an SVM is that it yields a sparse solution, dependin…
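The 'kernelization' works because the partial likelihood touches covariates only through inner products. A minimal numpy sketch of the resulting objective follows (Breslow form, ridge penalty, no sparsity-inducing loss, which is precisely the gap the paper addresses); the kernel choice and penalty are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the radial basis function kernel."""
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def neg_log_partial_likelihood(alpha, K, time, event, ridge=1e-2):
    """Kernelized Cox objective: risk score eta_i = sum_j alpha_j K(x_j, x_i).

    Breslow partial likelihood; the ridge term penalizes the K-norm of
    alpha. The minimizer is generally dense in alpha -- the motivation
    for the sparse formulations discussed in the paper.
    """
    eta = K @ alpha
    nll = 0.0
    for i in np.where(event == 1)[0]:
        at_risk = time >= time[i]                  # risk set at event i
        nll += -eta[i] + np.log(np.sum(np.exp(eta[at_risk])))
    return nll + ridge * alpha @ K @ alpha
```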

Statistics and Probability; Lung Neoplasms; Lymphoma; Computer Science; Computing Methodologies; Biochemistry; Pattern Recognition, Automated; Artificial Intelligence; Covariate; Cluster Analysis; Humans; Computer Simulation; Molecular Biology; Proportional Hazards Models; Models, Statistical; Training Set; Gene Expression Profiling; Computational Biology; Computer Science Applications; Support Vector Machine; Kernel Method; Computational Mathematics; Regression Analysis; Data Mining; Algorithms; Software; Bioinformatics