Search results for "methods"
Showing 10 of 4,526 documents
A network agent-based model of ethnocentrism and intergroup cooperation
2019
We present a network agent-based model of ethnocentrism and intergroup cooperation in which agents from two groups (majority and minority) change their communality (feeling of group solidarity), cooperation strategy and social ties, depending on a barrier of “likeness” (affinity). Our purpose was to study the model’s capability for describing how the mechanisms of preexisting markers (or “tags”) that can work as cues for inducing in-group bias, imitation, and reaction to non-cooperating agents, lead to ethnocentrism or intergroup cooperation and influence the formation of the network of mixed ties between agents of different groups. We explored the model’s behavior via four experiments in w…
A Comparison of Formulae for Calculating Cost-Efficient Sample Sizes of Case-Control Studies with an Internal Validation Scheme
2000
When a case-control study is planned to include an internal validation study, the sample size of the study and the proportion of validated observations have to be calculated. There are a variety of alternative methods to accomplish this. In this article, some possible procedures are compared in order to clarify whether the suggested optimal designs differ considerably depending on the method used.
Testing for homogeneity in meta-analysis I. The one-parameter case: standardized mean difference.
2010
Meta-analysis seeks to combine the results of several experiments in order to improve the accuracy of decisions. It is common to use a test for homogeneity to determine if the results of the several experiments are sufficiently similar to warrant their combination into an overall result. Cochran's Q statistic is frequently used for this homogeneity test. It is often assumed that Q follows a chi-square distribution under the null hypothesis of homogeneity, but it has long been known that this asymptotic distribution for Q is not accurate for moderate sample sizes. Here, we present an expansion for the mean of Q under the null hypothesis that is valid when the effect and the weight for each s…
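As a quick illustration of the statistic this abstract discusses (a generic sketch, not code from the paper), Cochran's Q for k study effects y_i with weights w_i is Q = Σ w_i (y_i − ȳ)², where ȳ is the weighted mean effect; under homogeneity Q is conventionally compared to a chi-square distribution with k − 1 degrees of freedom:

```python
def cochran_q(effects, weights):
    """Cochran's Q homogeneity statistic: Q = sum_i w_i * (y_i - ybar)^2,
    where ybar is the weighted mean of the study effects."""
    wsum = sum(weights)
    ybar = sum(w * y for w, y in zip(weights, effects)) / wsum
    return sum(w * (y - ybar) ** 2 for w, y in zip(weights, effects))

# Three studies with identical effects give Q = 0 (perfect homogeneity);
# spread-out effects give a positive Q to compare against chi-square(k - 1).
print(cochran_q([0.5, 0.5, 0.5], [1, 2, 3]))   # 0.0
print(cochran_q([0.2, 0.5, 0.8], [10, 10, 10]))
```

The paper's point is precisely that this chi-square reference distribution is inaccurate for moderate sample sizes, which is why a corrected expansion for the mean of Q is of interest.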
Online Principal Component Analysis in High Dimension: Which Algorithm to Choose?
2017
Principal component analysis (PCA) is a method of choice for dimension reduction. In the current context of data explosion, online techniques that do not require storing all data in memory are indispensable to perform the PCA of streaming data and/or massive data. Despite the wide availability of recursive algorithms that can efficiently update the PCA when new data are observed, the literature offers little guidance on how to select a suitable algorithm for a given application. This paper reviews the main approaches to online PCA, namely, perturbation techniques, incremental methods and stochastic optimisation, and compares the most widely employed techniques in terms of statistical a…
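Among the stochastic-optimisation approaches the abstract mentions, the classic example is Oja's rule, which updates an estimate of the leading principal direction one observation at a time. A minimal sketch of the general idea (not the paper's code; the step size and per-step renormalization are arbitrary choices here):

```python
import random

def oja_fit(stream, lr=0.01):
    """Track the leading principal direction of 2-D streaming data with
    Oja's rule: w <- w + lr * y * (x - y * w), where y = w . x,
    renormalizing w to unit length after each update."""
    w = [1.0, 1.0]
    for x in stream:
        y = sum(wi * xi for wi, xi in zip(w, x))        # projection of x onto w
        w = [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]
        norm = sum(wi * wi for wi in w) ** 0.5
        w = [wi / norm for wi in w]                      # keep w on the unit sphere
    return w

# Synthetic stream whose variance is dominated by the first coordinate:
# the estimate should align (up to sign) with the axis (1, 0).
rng = random.Random(0)
xs = [(rng.gauss(0.0, 3.0), rng.gauss(0.0, 0.3)) for _ in range(3000)]
w = oja_fit(xs)
print(w)
```

Nothing beyond one observation is stored at a time, which is the defining constraint of the online setting the paper surveys.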
Sequential Monte Carlo methods in Bayesian joint models for longitudinal and time-to-event data
2020
The statistical analysis of the information generated by medical follow-up is a very important challenge in the field of personalized medicine. As a patient's disease progresses, his/her medical follow-up generates more and more information that should be processed immediately in order to review and update his/her prognosis and treatment. Hence, we focus on this update process through sequential inference methods for joint models of longitudinal and time-to-event data from a Bayesian perspective. More specifically, we propose the use of sequential Monte Carlo (SMC) methods for static-parameter joint models with the intention of reducing computational time in each…
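The paper's SMC machinery targets static parameters in joint models; as generic background only, the simplest member of the SMC family is the bootstrap particle filter, sketched here for a toy Gaussian random-walk state-space model (the model, noise levels and particle count are illustrative assumptions, not taken from the paper):

```python
import math
import random

def bootstrap_filter(ys, n_particles=500, sigma_x=1.0, sigma_y=1.0, seed=0):
    """Bootstrap particle filter for the toy model
    x_t = x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2).
    Returns the filtering means E[x_t | y_1..t] estimated from particles."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in ys:
        # 1) propagate each particle through the state transition
        parts = [p + rng.gauss(0.0, sigma_x) for p in parts]
        # 2) weight by the Gaussian likelihood of the new observation
        ws = [math.exp(-0.5 * ((y - p) / sigma_y) ** 2) for p in parts]
        tot = sum(ws)
        ws = [w / tot for w in ws]
        means.append(sum(w * p for w, p in zip(ws, parts)))
        # 3) multinomial resampling to avoid weight degeneracy
        parts = rng.choices(parts, weights=ws, k=n_particles)
    return means

# With a constant observation level, the filtering mean settles near it.
means = bootstrap_filter([5.0] * 25, seed=1)
print(round(means[-1], 2))
```

Each observation triggers exactly one propagate/weight/resample cycle, which is what makes sequential updating cheap compared with refitting the model from scratch.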
Weighted distance-based trees for ranking data
2017
Within the framework of preference rankings, interest often lies in identifying which predictors and which interactions explain the observed preference structures, because preference decisions usually depend on the characteristics of both the judges and the objects being judged. This work proposes a univariate decision tree for ranking data based on weighted distances for complete and incomplete rankings, and considers the area under the ROC curve both for pruning and for model assessment. Two real and well-known datasets, the SUSHI preference data and the University ranking data, are used to display the performance of the methodology.
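The weighted distances referred to above are developed in the paper for complete and incomplete rankings; as a plain baseline (an illustration, not the paper's measure), the Kendall distance counts discordant item pairs, and per-pair weights can make disagreements on selected pairs count more:

```python
from itertools import combinations

def kendall_distance(r1, r2, pair_weight=None):
    """Kendall distance between two rankings given as lists of ranks for the
    same items; an optional weight per item pair (defaulting to 1) lets
    disagreements on selected pairs count more heavily."""
    if pair_weight is None:
        pair_weight = {}
    d = 0.0
    for i, j in combinations(range(len(r1)), 2):
        if (r1[i] - r1[j]) * (r2[i] - r2[j]) < 0:   # pair ranked in opposite order
            d += pair_weight.get((i, j), 1.0)
    return d

print(kendall_distance([1, 2, 3], [3, 2, 1]))  # 3.0: every pair is reversed
```

A distance of this kind is what a distance-based tree uses to measure how homogeneous the rankings within a candidate node are.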
Second-order diagnostics for space-time point processes with application to seismic events
2008
A diagnostic method for space-time point processes is introduced and used to interpret and assess the goodness of fit of particular models to real data, such as seismic data. The proposed method is founded on the definition of a weighted process and makes it possible to detect second-order features of the data, such as long-range dependence and fractal behavior, that are not accounted for by the fitted model. Applications to earthquake data are provided.
To what extent do individual preferences constrain the development of the long-term care insurance market?
2015
In the context of an ageing population, various scenarios are being considered for reforming the organisation and financing of care for dependent elderly people. The role of individual provision in financing long-term care is, in this respect, widely debated. At present, despite potentially substantial out-of-pocket costs, few individuals hold insurance coverage. This article aims to enrich the existing literature by assessing the extent to which the preferences observed in the population limit this coverage. To do so, we draw on the Patrimoine et préférences vis-à-vis du temps et du risque (Pater) survey of 201…
On the derivation of a linear Boltzmann equation from a periodic lattice gas
2004
We consider the problem of deriving the linear Boltzmann equation from the Lorentz process with hard-sphere obstacles. In a suitable limit (the Boltzmann-Grad limit), it has been proved that the linear Boltzmann equation can be obtained when the positions of the obstacles are Poisson distributed, while the validation fails, even for the "correct" ratio between obstacle size and lattice parameter, when they are distributed on a purely periodic lattice, because of the existence of very long free trajectories. Here we validate the linear Boltzmann equation, in the limit when the scatterer's radius epsilon vanishes, for a family of Lorentz processes such that the obstacles have a random distributio…
Wardowski conditions to the coincidence problem
2015
In this article we first discuss the existence and uniqueness of a solution for the coincidence problem: find p ∈ X such that Tp = Sp, where X is a nonempty set, Y is a complete metric space, and T, S: X → Y are two mappings satisfying a Wardowski-type contractivity condition. Later on, we state the convergence of the Picard-Jungck iteration process to the above coincidence problem, as well as a rate of convergence for this iteration scheme. Finally, we apply our results to study the existence and uniqueness of a solution, as well as the convergence of the Picard-Jungck iteration process toward the solution, of a second-order differential equation.
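When S is invertible, the Picard-Jungck scheme for the coincidence problem Tp = Sp iterates p_{n+1} = S⁻¹(T p_n). A toy numeric sketch (the mappings T and S below are hypothetical examples chosen so that S⁻¹∘T is a contraction; this is not the paper's setting):

```python
import math

def picard_jungck(T, S_inv, p0, tol=1e-10, max_iter=1000):
    """Iterate p_{n+1} = S_inv(T(p_n)) until successive iterates are
    within tol, approximating a coincidence point T(p) = S(p)."""
    p = p0
    for _ in range(max_iter):
        p_next = S_inv(T(p))
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    return p

# Hypothetical example: T(p) = cos(p) and S(p) = 2p, so S_inv(y) = y / 2 and
# the coincidence point solves 2p = cos(p).
p = picard_jungck(math.cos, lambda y: y / 2.0, 0.0)
print(p)  # coincidence point of T and S
```

Because the composed map p ↦ cos(p)/2 has Lipschitz constant at most 1/2, the iteration converges geometrically, which mirrors the rate-of-convergence results the abstract announces.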