Search results for "Hazard"
Showing 10 of 1517 documents
FEV1 and FVC predict all-cause mortality independent of cardiac function - Results from the population-based Gutenberg Health Study.
2017
Abstract Background Lung function has previously been related to increased mortality. Whether pulmonary impairment is associated with increased mortality independent of cardiac dysfunction remains unclear. Methods In 15,010 individuals from the general population (age range 35–74 years, 51% men) in the Gutenberg Health Study we performed spirometry and transthoracic echocardiography. N-terminal pro-B-type natriuretic peptide (Nt-proBNP) and high-sensitivity troponin I (hsTnI) were measured in all individuals. 1819 individuals with pulmonary diseases were excluded from further analysis. Results The median for forced expiratory volume in 1 s (FEV1) was 94.2% and for forced vital capacity (FVC) …
Abstract 13257: FEV1 and FVC predict Mortality in Individuals Without Manifest Lung Disease Independent of Cardiac Performance - Results From the Pop…
2015
Background: Pulmonary disease has consistently been related to increased mortality. We investigated central spirometry variables in relation to total mortality in individuals from the general population without diagnosed lung disease, also accounting for cardiac function. Methods: In 15,010 individuals from the general population (mean age 55±11 years, age range 35-74 years, 50.5% men) in the Gutenberg Health Study we performed spirometry and multimodal transthoracic echocardiography. The biomarkers N-terminal pro-B-type natriuretic peptide (Nt-proBNP) and high-sensitivity troponin I (TnI) were measured in the first 5000 individuals using commercially available assays. Multivariable Cox regre…
Prognostic value of FEV1/FEV6 in elderly people*
2010
Summary Background: The ratio of forced expiratory volume in 1 s and forced expiratory volume in 6 s (FEV1/FEV6) has been proposed as an alternative for FEV1/forced vital capacity (FVC) to diagnose obstructive diseases with less effort during spirometry; however, its prognostic value is unknown. We evaluated whether FEV1/FEV6 is a significant predictor of mortality in elderly subjects and compared its prognostic value with that of FEV1/FVC and FEV1. Methods: One thousand nine hundred and seventy-one subjects, aged >65 years, participated in the population-based SA.R.A. study. During the baseline exam, a multidimensional assessment included spirometry. Vital status was determined during 6 …
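As an illustrative aside, the fixed-ratio obstruction criteria this abstract compares can be sketched in a few lines. The cutoffs below (0.73 for FEV1/FEV6, 0.70 for FEV1/FVC) are assumptions of this sketch taken from common fixed-ratio conventions, not the SA.R.A. study's own criteria:

```python
def obstruction_flags(fev1, fev6, fvc,
                      ratio6_cutoff=0.73, ratio_fvc_cutoff=0.70):
    """Flag airflow obstruction under the two ratio criteria.

    Cutoffs are illustrative fixed-ratio assumptions, not values
    taken from the study above. Volumes are in litres.
    """
    return {
        "FEV1/FEV6": fev1 / fev6 < ratio6_cutoff,
        "FEV1/FVC": fev1 / fvc < ratio_fvc_cutoff,
    }

# Example: FEV1 = 2.1 L, FEV6 = 3.0 L, FVC = 3.2 L
# ratios are 0.70 and ~0.66, so both criteria flag obstruction
print(obstruction_flags(2.1, 3.0, 3.2))
# → {'FEV1/FEV6': True, 'FEV1/FVC': True}
```

The appeal of FEV1/FEV6, as the abstract notes, is purely practical: a six-second exhalation is easier for elderly subjects to complete than a full forced vital capacity manoeuvre.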
Prevalence of bacteria and absence of anisakid parasites in raw and prepared fish and seafood dishes in Spanish restaurants
2015
This study evaluated the presence of bacteria and anisakid parasites in 45 samples of raw anchovies in vinegar, a dish widely eaten in Spain, and in 227 samples of cooked fish and cephalopods served in Spanish food service establishments. Our analysis showed that, according to European and Spanish regulation, 14 to 30% of the prepared fish and cephalopod dishes exceeded the maximum allowable level for mesophilic aerobic counts, and 10 to 40% of these samples exceeded the allowable levels for Enterobacteriaceae. None of the studied samples showed evidence of anisakid parasites, Escherichia coli, Staphylococcus aureus, Salmonella, or Listeria monocytogenes. These results indicate that applic…
Elasticity as a measure for online determination of remission points in ongoing epidemics.
2020
The correct identification of change-points during ongoing outbreak investigations of infectious diseases is a matter of paramount importance in epidemiology, with major implications for the management of health care resources, public health and, as the COVID-19 pandemic has shown, social life. Onsets, peaks, and inflexion points are examples of such change-points. An onset is the moment when the epidemic starts. A "peak" indicates a moment at which the incorporated values, both before and after, are lower: a maximum. The inflexion points identify moments in which the rate of growth of the incorporation of new cases changes intensity. In this study, after interpreting the concept of elasticity of a random va…
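The three change-points the abstract defines can be located in a daily incidence series with plain finite differences: the peak is a local maximum, and inflexion points are where the second difference changes sign. This is only a naive discrete sketch, not the paper's elasticity-based estimator:

```python
def peak_and_inflexions(cases):
    """Locate the peak (global maximum) and approximate inflexion
    points (sign changes of the second difference) of a daily
    incidence series. Naive finite-difference sketch, not the
    elasticity method of the paper above."""
    peak = max(range(len(cases)), key=cases.__getitem__)
    # second differences: d2[k] corresponds to cases index k + 1
    d2 = [cases[i + 1] - 2 * cases[i] + cases[i - 1]
          for i in range(1, len(cases) - 1)]
    # report the index just after each sign change of d2
    inflexions = [k + 1 for k in range(1, len(d2))
                  if d2[k - 1] * d2[k] < 0]
    return peak, inflexions

# A symmetric toy outbreak: growth accelerates, slows, peaks, declines
print(peak_and_inflexions([1, 2, 4, 7, 9, 10, 9, 7, 4, 2, 1]))
# → (5, [3, 8])
```

On real, noisy surveillance counts the raw second difference flips sign constantly, which is exactly why the paper pursues a smoother online criterion instead.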
A Log-Rank Test for Equivalence of Two Survivor Functions
1993
We consider a hypothesis testing problem in which the alternative states that the vertical distance between the underlying survivor functions nowhere exceeds some prespecified bound delta_0. Under the assumption of proportional hazards, this hypothesis is shown to be (logically) equivalent to the statement |beta| < log(1 + epsilon), where beta denotes the regression coefficient associated with the treatment group indicator, and epsilon is a simple strictly increasing function of delta_0. The testing procedure proposed consists of carrying out in terms of beta-hat (i.e., the standard Cox likelihood estimator of beta) the uniformly most powerful level alpha test for a suitable interval hypothesis about…
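The decision rule in this abstract can be approximated with a TOST-style (two one-sided tests) check on the estimated log-hazard ratio: conclude equivalence when a (1 - 2*alpha) confidence interval for beta lies entirely inside (-log(1+epsilon), log(1+epsilon)). This Wald approximation is an illustration, not Wellek's exact uniformly most powerful construction:

```python
import math

def equivalence_test(beta_hat, se, epsilon, z=1.6449):
    """TOST-style approximation to testing
       H0: |beta| >= log(1 + epsilon)  vs  H1: |beta| < log(1 + epsilon)
    from a Cox-model estimate beta_hat with standard error se.
    z is the one-sided normal critical value (1.6449 for alpha = 0.05).
    Returns True when equivalence (H1) is concluded.
    Illustrative sketch, not the exact UMP test of the paper above."""
    margin = math.log(1.0 + epsilon)
    lower = beta_hat - z * se   # bounds of a (1 - 2*alpha) CI
    upper = beta_hat + z * se
    return -margin < lower and upper < margin

# Example: beta_hat = 0.05, SE = 0.08, epsilon = 0.35
# margin = log(1.35) ≈ 0.300; CI ≈ (-0.082, 0.182) lies inside it
print(equivalence_test(0.05, 0.08, 0.35))  # → True
```

The TOST form is conservative relative to the exact interval-hypothesis test, but it makes the logic of the equivalence margin log(1 + epsilon) concrete.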
Marginal hazard ratio estimates in joint frailty models for heart failure trials
2019
Abstract This work is motivated by clinical trials in chronic heart failure disease, where treatment has effects both on morbidity (assessed as recurrent non‐fatal hospitalisations) and on mortality (assessed as cardiovascular death, CV death). Recently, a joint frailty proportional hazards model has been proposed for these kinds of efficacy outcomes to account for a potential association between the risk rates for hospital admissions and CV death. However, clinical trial results are more often presented as treatment effect estimates derived from marginal proportional hazards models, that is, a Cox model for mortality and an Andersen–Gill model for recurrent hospitalisations. …
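The gap between conditional (frailty-model) and marginal hazard ratios that motivates this paper has a closed form in the standard shared gamma frailty model: with frailty variance theta and conditional log-hazard ratio beta, the marginal hazard ratio at baseline cumulative hazard Lambda(t) attenuates toward 1 as Lambda(t) grows. A small illustrative sketch of that textbook formula (not the paper's joint frailty estimator):

```python
import math

def marginal_hr(beta, theta, cum_hazard):
    """Marginal (population-averaged) hazard ratio at baseline
    cumulative hazard Lambda(t) = cum_hazard, under a shared gamma
    frailty with mean 1 and variance theta, when the conditional
    log-hazard ratio is beta. Standard gamma-frailty attenuation
    formula, shown for illustration only."""
    hr_c = math.exp(beta)
    return hr_c * (1 + theta * cum_hazard) / (1 + theta * cum_hazard * hr_c)

beta, theta = math.log(0.7), 0.5   # conditional HR 0.7, frailty variance 0.5
for L in (0.0, 0.5, 1.0, 2.0):
    print(round(marginal_hr(beta, theta, L), 3))
# → 0.7, 0.745, 0.778, 0.824 — the marginal HR drifts toward 1 over time
```

This drift is exactly why a marginal Cox or Andersen–Gill estimate need not match the conditional treatment effect of a joint frailty model.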
Opportunities and challenges of combined effect measures based on prioritized outcomes
2013
Many authors have proposed different approaches to combining multiple endpoints into a univariate outcome measure in the literature. In the case of binary or time-to-event variables, composite endpoints, which combine several event types within a single event or time-to-first-event analysis, are often used to assess the overall treatment effect. A main drawback of this approach is that the interpretation of the composite effect can be difficult, as a negative effect in one component can be masked by a positive effect in another. Recently, some authors have proposed more general approaches based on a priority ranking of outcomes, which moreover allow outcome variables of different scale levels to be combined. …
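A well-known instance of such prioritized comparisons is the win ratio: every treated patient is compared pairwise with every control patient on the highest-priority outcome first, falling through to the next outcome on ties. The sketch below assumes fully observed outcomes where larger values are better — simplifications of this illustration, not of the approaches the abstract surveys:

```python
def win_ratio(treated, control):
    """Win ratio from generalized pairwise comparisons.

    Each element of `treated`/`control` is a tuple of outcomes in
    priority order; larger is assumed better, and all outcomes are
    assumed fully observed (no censoring) — both are simplifying
    assumptions of this sketch. Returns wins / losses; fully tied
    pairs count toward neither."""
    wins = losses = 0
    for t in treated:
        for c in control:
            for t_k, c_k in zip(t, c):  # outcomes in priority order
                if t_k > c_k:
                    wins += 1
                    break
                if t_k < c_k:
                    losses += 1
                    break
    return wins / losses

# Priority 1: survival time; priority 2: a functional score
print(win_ratio([(2, 5), (1, 7)], [(1, 4), (2, 3)]))  # → 3.0
```

The fall-through on ties is what lets outcomes of different scale levels be combined, which composite time-to-first-event endpoints cannot do.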
Sample size planning for survival prediction with focus on high-dimensional data
2011
Sample size planning should reflect the primary objective of a trial. If the primary objective is prediction, the sample size determination should focus on prediction accuracy instead of power. We present formulas for the determination of training set sample size for survival prediction. Sample size is chosen to control the difference between optimal and expected prediction error. Prediction is carried out by Cox proportional hazards models. The general approach considers censoring as well as low-dimensional and high-dimensional explanatory variables. For dimension reduction in the high-dimensional setting, a variable selection step is inserted. If not all informative variables are included…
Bayesian regularization for flexible baseline hazard functions in Cox survival models.
2019
Fully Bayesian methods for Cox models specify a model for the baseline hazard function. Parametric approaches generally provide monotone estimates. Semi-parametric choices allow for more flexible patterns, but they can suffer from overfitting and instability. Regularization methods through prior distributions with correlated structures usually give reasonable answers in these situations. We discuss Bayesian regularization for Cox survival models defined via flexible baseline hazards specified by a mixture of piecewise constant functions and by a cubic B-spline function. For those "semi-parametric" proposals, different prior scenarios ranging from prior independence to particular c…
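A piecewise-constant baseline hazard, as used in the first of the two specifications above, makes the survival function trivial to evaluate: the cumulative hazard is a sum of rate-times-width terms over the intervals below t. A minimal evaluation sketch (the interval cuts and rates here are made up for illustration, and no Bayesian estimation is shown):

```python
import math

def piecewise_survival(t, cuts, hazards):
    """Survival S(t) = exp(-Lambda(t)) for a piecewise-constant
    baseline hazard: hazards[j] applies on [cuts[j], cuts[j+1]),
    with cuts[0] = 0 and the last interval open-ended.
    Illustrative evaluation only, not the paper's regularized
    Bayesian estimator."""
    cum = 0.0
    for j, h in enumerate(hazards):
        a = cuts[j]
        b = cuts[j + 1] if j + 1 < len(cuts) else float("inf")
        if t <= a:
            break
        cum += h * (min(t, b) - a)   # hazard * time spent in interval
    return math.exp(-cum)

# Two intervals: hazard 0.2 on [0, 1), then 0.5 on [1, inf)
# Lambda(2) = 0.2*1 + 0.5*1 = 0.7, so S(2) = exp(-0.7)
print(round(piecewise_survival(2.0, [0.0, 1.0], [0.2, 0.5]), 4))  # → 0.4966
```

The instability the abstract mentions arises when many narrow intervals each get an independent hazard; correlated priors across neighbouring intervals smooth exactly these per-interval rates.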