Search results for "Models"
Showing 10 of 8,211 documents
Elasticity as a measure for online determination of remission points in ongoing epidemics.
2020
The correct identification of change-points during ongoing outbreak investigations of infectious diseases is a matter of paramount importance in epidemiology, with major implications for the management of health care resources, public health and, as the COVID-19 pandemic has shown, social life. Onsets, peaks, and inflexion points are among these change-points. An onset is the moment when the epidemic starts. A "peak" is a moment at which the incorporated values, both before and after, are lower: a local maximum. Inflexion points identify moments at which the rate of growth in the incorporation of new cases changes intensity. In this study, after interpreting the concept of elasticity of a random va…
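The change-points named in this abstract can be illustrated with plain finite differences on a daily new-case series. This is only a minimal sketch (the function name and the synthetic data are ours, not the paper's elasticity-based method): a peak is a local maximum, and an inflexion point is a sign change in the second difference.

```python
import numpy as np

def change_points(new_cases):
    """Locate peaks and inflexion points in a daily new-case series.

    A peak is a local maximum of the series; an inflexion point is a
    sign change in the second difference, i.e. the growth of new cases
    switches between accelerating and decelerating.
    """
    x = np.asarray(new_cases, dtype=float)
    d2 = np.diff(x, n=2)  # second difference: d2[i] is curvature at day i + 1
    peaks = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    inflexions = [i + 1 for i in range(len(d2) - 1)
                  if d2[i] * d2[i + 1] < 0]
    return peaks, inflexions

# Synthetic bell-shaped outbreak curve peaking on day 15
t = np.arange(30)
cases = np.exp(-((t - 15) ** 2) / 20.0)
peaks, infl = change_points(cases)
print(peaks)  # [15], the single maximum
```

On the synthetic curve the two inflexion points fall symmetrically on either side of the peak, as expected for a bell-shaped epidemic wave.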
Disorder relevance for the random walk pinning model in dimension 3
2011
We study the continuous time version of the random walk pinning model, where conditioned on a continuous time random walk Y on Z^d with jump rate \rho>0, which plays the role of disorder, the law up to time t of a second independent random walk X with jump rate 1 is Gibbs transformed with weight e^{\beta L_t(X,Y)}, where L_t(X,Y) is the collision local time between X and Y up to time t. As the inverse temperature \beta varies, the model undergoes a localization-delocalization transition at some critical \beta_c>=0. A natural question is whether or not there is disorder relevance, namely whether or not \beta_c differs from the critical point \beta_c^{ann} for the annealed model. In Birkner a…
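The central object in this abstract, the collision local time L_t(X, Y), is easy to approximate by Monte Carlo. The sketch below (function name and parameter defaults are ours, purely illustrative) runs an event-driven simulation of two independent continuous-time simple random walks on Z^d, where X jumps at rate 1 and Y at rate rho, and accumulates the time they spend at the same site:

```python
import random

def collision_local_time(t_max, rho=1.0, d=1, seed=0):
    """Monte Carlo estimate of L_t(X, Y): total time the walks coincide.

    X jumps at rate 1, Y at rate rho; both are simple random walks on Z^d
    started at the origin. Holding times between jumps are Exp(1 + rho),
    and the jumper is X with probability 1 / (1 + rho).
    """
    rng = random.Random(seed)
    X = [0] * d
    Y = [0] * d
    t, L = 0.0, 0.0
    while t < t_max:
        dt = min(rng.expovariate(1.0 + rho), t_max - t)
        if X == Y:           # accrue local time while the walks coincide
            L += dt
        t += dt
        if t >= t_max:
            break
        walker = X if rng.random() < 1.0 / (1.0 + rho) else Y
        axis = rng.randrange(d)
        walker[axis] += rng.choice((-1, 1))
    return L

print(collision_local_time(100.0, rho=0.5))
```

Since both walks start at the origin, L_t is always strictly positive; in the Gibbs transform of the abstract, paths with large L_t(X, Y) receive weight e^{beta L_t(X, Y)}.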
A Log-Rank Test for Equivalence of Two Survivor Functions
1993
We consider a hypothesis testing problem in which the alternative states that the vertical distance between the underlying survivor functions nowhere exceeds some prespecified bound delta > 0. Under the assumption of proportional hazards, this hypothesis is shown to be (logically) equivalent to the statement |beta| <= log(1 + epsilon), where beta denotes the regression coefficient associated with the treatment group indicator, and epsilon is a simple strictly increasing function of delta. The testing procedure proposed consists of carrying out in terms of beta-hat (i.e., the standard Cox partial likelihood estimator of beta) the uniformly most powerful level-alpha test for a suitable interval hypothesis about…
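The equivalence formulation |beta| <= log(1 + epsilon) can be illustrated with a simple large-sample two-one-sided-tests (TOST) procedure on the Cox estimate and its standard error. This is a stand-in of our own, not the uniformly most powerful interval test developed in the paper, and all names below are hypothetical:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def cox_equivalence_tost(beta_hat, se, eps, alpha=0.05):
    """Asymptotic TOST for the equivalence hypothesis |beta| <= log(1 + eps).

    Rejects the null of non-equivalence (|beta| > bound) only when both
    one-sided tests reject at level alpha.
    """
    bound = math.log(1.0 + eps)
    p_upper = norm_cdf((beta_hat - bound) / se)        # H0: beta >= bound
    p_lower = 1.0 - norm_cdf((beta_hat + bound) / se)  # H0: beta <= -bound
    return max(p_upper, p_lower) < alpha

# A near-zero estimated log hazard ratio with a small standard error
# supports equivalence; a large, imprecise estimate does not.
print(cox_equivalence_tost(beta_hat=0.02, se=0.05, eps=0.3))  # True
print(cox_equivalence_tost(beta_hat=0.30, se=0.20, eps=0.1))  # False
```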
Automatic variable selection for exposure-driven propensity score matching with unmeasured confounders.
2020
Multivariable model building for propensity score modeling approaches is challenging. A common propensity score approach is exposure-driven propensity score matching, where the best model selection strategy is still unclear. In particular, the situation may require variable selection, while it is still unclear whether variables included in the propensity score should be associated with the exposure and the outcome, with either the exposure or the outcome, with at least the exposure, or with at least the outcome. Unmeasured confounders, complex correlation structures, and non-normal covariate distributions further complicate matters. We consider the performance of different modeling strategies in …
Testing for homogeneity in meta-analysis I. The one-parameter case: standardized mean difference.
2010
Meta-analysis seeks to combine the results of several experiments in order to improve the accuracy of decisions. It is common to use a test for homogeneity to determine if the results of the several experiments are sufficiently similar to warrant their combination into an overall result. Cochran's Q statistic is frequently used for this homogeneity test. It is often assumed that Q follows a chi-square distribution under the null hypothesis of homogeneity, but it has long been known that this asymptotic distribution for Q is not accurate for moderate sample sizes. Here, we present an expansion for the mean of Q under the null hypothesis that is valid when the effect and the weight for each s…
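Cochran's Q statistic mentioned in this abstract is the inverse-variance-weighted sum of squared deviations of the study effects from their weighted mean. A minimal sketch (the function name and example values are ours):

```python
import numpy as np

def cochran_q(effects, variances):
    """Cochran's Q for k study effects (e.g. standardized mean differences).

    Q = sum_i w_i * (theta_i - theta_bar)^2 with inverse-variance weights
    w_i = 1 / v_i and theta_bar the weighted mean. Under the null of
    homogeneity, Q is conventionally referred to a chi-square distribution
    with k - 1 degrees of freedom; the paper shows this approximation can
    be inaccurate at moderate sample sizes.
    """
    theta = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    theta_bar = np.sum(w * theta) / np.sum(w)
    return float(np.sum(w * (theta - theta_bar) ** 2))

# Identical effects give Q = 0; heterogeneous effects inflate Q.
print(cochran_q([0.3, 0.3, 0.3], [0.1, 0.2, 0.1]))  # ~0.0
print(cochran_q([0.1, 0.5], [0.1, 0.1]))            # 0.8
```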
Marginal hazard ratio estimates in joint frailty models for heart failure trials
2019
Abstract This work is motivated by clinical trials in chronic heart failure disease, where treatment has effects both on morbidity (assessed as recurrent non‐fatal hospitalisations) and on mortality (assessed as cardiovascular death, CV death). Recently, a joint frailty proportional hazards model has been proposed for this kind of efficacy outcome to account for a potential association between the risk rates for hospital admissions and CV death. However, more often clinical trial results are presented by treatment effect estimates that have been derived from marginal proportional hazards models, that is, a Cox model for mortality and an Andersen–Gill model for recurrent hospitalisations. …
Cluster-Localized Sparse Logistic Regression for SNP Data
2012
The task of analyzing high-dimensional single nucleotide polymorphism (SNP) data in a case-control design using multivariable techniques has only recently been tackled. While many available approaches investigate only main effects in a high-dimensional setting, we propose a more flexible technique, cluster-localized regression (CLR), based on localized logistic regression models, that allows different SNPs to have an effect for different groups of individuals. Separate multivariable regression models are fitted for the different groups of individuals by incorporating weights into componentwise boosting, which provides simultaneous variable selection, hence sparse fits. For model fitting, th…
Morphology changes induced by intercellular gap junction blocking: A reaction-diffusion mechanism.
2021
Complex anatomical form is regulated in part by endogenous physiological communication between cells; however, the dynamics by which gap junctional (GJ) states across tissues regulate morphology are still poorly understood. We employed a biophysical modeling approach combining different signaling molecules (morphogens) to qualitatively describe the anteroposterior and lateral morphology changes in model multicellular systems due to intercellular GJ blockade. The model is based on two assumptions for blocking-induced patterning: (i) the local concentrations of two small antagonistic morphogens diffusing through the GJs along the axial direction, together with that of an independent, uncouple…
Opportunities and challenges of combined effect measures based on prioritized outcomes
2013
In the literature, many authors have proposed different approaches to combining multiple endpoints into a univariate outcome measure. In the case of binary or time-to-event variables, composite endpoints, which combine several event types within a single event or time-to-first-event analysis, are often used to assess the overall treatment effect. A main drawback of this approach is that the interpretation of the composite effect can be difficult, as a negative effect in one component can be masked by a positive effect in another. Recently, some authors proposed more general approaches based on a priority ranking of outcomes, which moreover allow combining outcome variables of different scale levels. …
Sample size planning for survival prediction with focus on high-dimensional data
2011
Sample size planning should reflect the primary objective of a trial. If the primary objective is prediction, the sample size determination should focus on prediction accuracy instead of power. We present formulas for the determination of training set sample size for survival prediction. Sample size is chosen to control the difference between optimal and expected prediction error. Prediction is carried out by Cox proportional hazards models. The general approach considers censoring as well as low-dimensional and high-dimensional explanatory variables. For dimension reduction in the high-dimensional setting, a variable selection step is inserted. If not all informative variables are included…