Search results for "Core"
Showing 10 of 1999 documents
Imperfect information and consumer inflation expectations: evidence from microdata
2017
This paper explores which factors trigger an adjustment in consumers’ inflation expectations and looks at the implications regarding forecast errors. We find support for imperfect information models, as inflation volatility and news trigger an adjustment in expectations. Furthermore, we document that individual expectations become more accurate if they have been adjusted.
Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry.
2013
For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivat…
Dilute and semi dilute solutions of block copolymers in water, near-critical and super-critical CO2: a small angle scattering study of the monomer–ag…
2002
Abstract Small angle neutron (SANS) and X-ray (SAXS) scattering measurements on aggregate formation of block copolymers in water and in near-critical and supercritical CO2 are reported here. Time-resolved SAXS (TR-SAXS) has also been performed in the supercritical region. Experiments have been carried out for a series of different thermodynamic conditions, changing the solvent density by profiling the pressure at constant temperature. A sharp transition between monomers dissolved as random coils and micelles characterized by a solvophilic shell and a solvophobic core occurs when the solvent density reaches the critical micellization value. This is easily shown in the case of scCO2.
Bayesian analysis of a Gibbs hard-core point pattern model with varying repulsion range
2014
A Bayesian solution is suggested for the modelling of spatial point patterns with inhomogeneous hard-core radius using Gaussian processes in the regularization. The key observation is that a straightforward use of the finite Gibbs hard-core process likelihood together with a log-Gaussian random field prior does not work without penalisation towards high local packing density. Instead, a nearest neighbour Gibbs process likelihood is used. This approach to hard-core inhomogeneity is an alternative to the transformation inhomogeneous hard-core modelling. The computations are based on recent Markovian approximation results for Gaussian fields. As an application, data on the nest locations of Sa…
A parallel and sensitive software tool for methylation analysis on multicore platforms.
2015
Abstract Motivation: DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that analyzes these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. Results: We present a new software tool, called HPG-Methyl, which efficiently maps bis…
Hard-Core Thinnings of Germ‒Grain Models with Power-Law Grain Sizes
2013
Random sets with long-range dependence can be generated using a Boolean model with power-law grain sizes. We study thinnings of such Boolean models which have the hard-core property that no grains overlap in the resulting germ‒grain model. A fundamental question is whether long-range dependence is preserved under such thinnings. To answer this question, we study four natural thinnings of a Poisson germ‒grain model where the grains are spheres with a regularly varying size distribution. We show that a thinning which favors large grains preserves the slow correlation decay of the original model, whereas a thinning which favors small grains does not. Our most interesting finding concerns the c…
Nearly exact sample size calculation for powerful non-randomized tests for differences between binomial proportions
2015
In the case of two independent samples, it turns out that, among the procedures taken into consideration, Boschloo's technique of raising the nominal level in the standard conditional test as far as admissible performs best in terms of power against almost all alternatives. The computational burden entailed in exact sample size calculation is comparatively modest for both the uniformly most powerful unbiased randomized and the conservative non-randomized version of the exact Fisher-type test. Computing these values yields a pair of bounds enclosing the exact sample size required for the Boschloo test, and it seems reasonable to replace the exact value with the middle of the corresponding inter…
The “ThreePlusOne” Likelihood-Based Test Statistics: Unified Geometrical and Graphical Interpretations
2014
The presentation of the well-known Likelihood Ratio, Wald, and Score test statistics in textbooks appears to lack a unified graphical and geometrical interpretation. We present two simple graphical representations on a common scale for these three test statistics, as well as for the recently proposed Gradient test statistic. These unified graphical displays may favour a better understanding of the geometrical meaning of the likelihood-based statistics and provide useful insights into their connections.
Inferential tools in penalized logistic regression for small and sparse data: A comparative study.
2016
This paper focuses on inferential tools in the logistic regression model fitted by the Firth penalized likelihood. In this context, the Likelihood Ratio statistic is often reported to be the preferred choice over the ‘traditional’ Wald statistic. In this work, we consider and discuss a wider range of test statistics, including the robust Wald, the Score, and the recently proposed Gradient statistic. We compare all these asymptotically equivalent statistics in terms of interval estimation and hypothesis testing via simulation experiments and analyses of two real datasets. We find that the Likelihood Ratio statistic does not appear to be the best inferential device in the Firth penal…
Testing with a nuisance parameter present only under the alternative: a score-based approach with application to segmented modelling
2016
Abstract We introduce a score-type statistic to test for a non-zero regression coefficient when the relevant term involves a nuisance parameter present only under the alternative. Despite the non-regularity and complexity of the problem, and unlike previous approaches, the proposed test statistic does not require the nuisance parameter to be estimated. It is simple to implement, relying on conventional distributions such as the Normal or t, and it is justified in the setting of probabilistic coherence. We focus on testing for the existence of a breakpoint in segmented regression, and illustrate the methodology with an analysis of data on DNA copy number aberrations and gene expression profiles from…