Search results for "Statistical Model"
Showing 10 of 163 documents
Individual measurements and nested designs in aquaculture experiments: a simulation study
1998
Simple and nested models for analysis of variance (ANOVA) in aquaculture experiments were compared with the help of computer simulations. Simple models for analysing variables that are based on tank means, such as final weight and growth rate, were found to be sensitive to differences in the number of individual observations in each tank. Compared with nested models that take individual measurements into account, the simple models were found to overestimate the F ratio and increase the risk of committing a type I error, i.e., rejecting a true null hypothesis. Further, nested models permit greater flexibility in experimental design and allow more economical solutions within a given…
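The paper compares specific simple and nested ANOVA models; as a loose illustration of the underlying issue, choosing the right unit of replication in tank experiments, here is a toy simulation (not the study's design) contrasting an analysis that treats individual fish as independent replicates against one based on tank means, under a true null treatment effect, so every rejection at alpha = 0.05 is a type I error.

```python
# Toy simulation sketch: no true treatment effect, so the rejection rate
# at alpha = 0.05 is each analysis's type I error rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n_treatments, tanks_per_trt = 2000, 2, 3
hits_fish = hits_tank = 0
for _ in range(n_sims):
    fish_groups, tank_mean_groups = [], []
    for _t in range(n_treatments):
        n_fish = rng.integers(15, 30, size=tanks_per_trt)   # unequal counts per tank
        tank_fx = rng.normal(0.0, 1.0, size=tanks_per_trt)  # between-tank variation
        tanks = [mu + rng.normal(0.0, 1.0, size=n)          # within-tank variation
                 for mu, n in zip(tank_fx, n_fish)]
        fish_groups.append(np.concatenate(tanks))
        tank_mean_groups.append([tk.mean() for tk in tanks])
    _, p_fish = stats.f_oneway(*fish_groups)        # fish as (pseudo)replicates
    _, p_tank = stats.f_oneway(*tank_mean_groups)   # tanks as replicates
    hits_fish += p_fish < 0.05
    hits_tank += p_tank < 0.05

print(f"type I rate, fish as replicates: {hits_fish / n_sims:.3f}")  # inflated
print(f"type I rate, tank means:         {hits_tank / n_sims:.3f}")  # near 0.05
```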
Rateless Codes Performance Analysis in Correlated Channel Model for GEO Free Space Optics Downlinks
2012
TSVD as a Statistical Estimator in the Latent Semantic Analysis Paradigm
2015
The aim of this paper is to present a new point of view that makes it possible to give a statistical interpretation of the traditional latent semantic analysis (LSA) paradigm based on the truncated singular value decomposition (TSVD) technique. We show how the TSVD can be interpreted as a statistical estimator derived from the LSA co-occurrence relationship matrix by mapping probability distributions onto Riemannian manifolds. In addition, the quality of the estimator model can be expressed by introducing a figure of merit arising from the Solomonoff approach. This figure of merit takes into account both the adherence to the sample data and the simplicity of the model. In our model, the simplicity…
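As a concrete reference point (toy data assumed, not the paper's experiments), the TSVD estimator at the heart of LSA is a rank-k reconstruction from the singular value decomposition of the co-occurrence matrix; a Solomonoff-style figure of merit would then trade the reconstruction error off against the retained rank.

```python
# Minimal TSVD sketch on a toy term-document co-occurrence matrix.
import numpy as np

rng = np.random.default_rng(1)
X = rng.poisson(1.0, size=(50, 20)).astype(float)   # toy term-document counts

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 5                                               # latent dimensions kept
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]         # rank-k TSVD estimate of X

# Report the two ingredients of a fit-versus-simplicity figure of merit:
fit = np.linalg.norm(X - X_k, "fro")
print(f"rank {k}: Frobenius reconstruction error {fit:.2f}")
```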
A Conceptual Probabilistic Model for the Induction of Image Semantics
2010
In this paper we propose a model based on a conceptual space automatically induced from data. The model is inspired by a well-founded cognitive architecture from robotics, organized into three computational areas: sub-conceptual, linguistic, and conceptual. Images are objects in the sub-conceptual area that become "knoxels" in the conceptual area. The application of the framework enables image semantics to emerge automatically in the linguistic area. The core of the model is a conceptual space induced automatically from a set of annotated images, which exploits and mixes different kinds of information about the image set. Multiple low-level features are extracted to represent images and…
Deep Gaussian processes for biogeophysical parameter retrieval and model inversion
2020
Parameter retrieval and model inversion are key problems in remote sensing and Earth observation. Currently, different approximations exist: a direct, yet costly, inversion of radiative transfer models (RTMs); statistical inversion with in situ data, which often runs into extrapolation problems outside the study area; and the most widely adopted hybrid modeling, by which statistical models, mostly nonlinear and non-parametric machine learning algorithms, are applied to invert RTM simulations. We focus on the latter. Among the existing algorithms, in the last decade kernel-based methods, and Gaussian processes (GPs) in particular, have provided useful and informative so…
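A hedged sketch of the hybrid scheme described above, with an invented stand-in for the RTM: simulate (parameter, spectrum) pairs with the forward model, then train a GP on the inverted pairs so it maps observed spectra back to the biogeophysical parameter. The toy_rtm function and all parameter values are illustrative assumptions, not the paper's setup.

```python
# Hybrid retrieval sketch: learn the inverse of a toy forward model with a GP.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

def toy_rtm(lai):
    """Invented forward model: 4-band 'reflectance' as a function of LAI."""
    bands = np.linspace(0.5, 2.0, 4)
    return 1.0 - np.exp(-np.outer(lai, bands)) + rng.normal(0, 0.01, (len(lai), 4))

lai_train = rng.uniform(0.0, 6.0, 200)      # sampled biophysical parameter
spectra = toy_rtm(lai_train)                # "RTM" simulations

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(spectra, lai_train)                  # learn spectrum -> parameter

lai_hat, std = gp.predict(toy_rtm(np.array([1.5, 4.0])), return_std=True)
print(lai_hat, std)                         # retrieval with uncertainty
```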
Understanding disease mechanisms with models of signaling pathway activities
2014
Background: Understanding the aspects of cell functionality that account for disease or drug-action mechanisms is one of the main challenges in the analysis of genomic data, and underlies the future implementation of precision medicine. Results: Here we propose a simple probabilistic model in which signaling pathways are separated into elementary sub-pathways, or signal transmission circuits (which ultimately trigger cell functions); the model then transforms gene expression measurements into probabilities of activation of these circuits. Using this model, differential activation of the circuits between biological conditions can be estimated. Thus, circuit activation s…
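The abstract leaves the model's details out; as a very loose sketch of the idea (the normalization and propagation rule below are assumptions, not the authors' formulation), one can map each gene's expression to an activation probability and score a linear circuit as active only when every node transmits the signal.

```python
# Loose sketch: circuit activation probability as the product of per-gene
# activation probabilities along a linear signal-transmission chain.
import numpy as np

expr = np.array([7.2, 5.1, 8.9])          # toy expression values, 3-gene circuit

def gene_activation_prob(x, lo=4.0, hi=10.0):
    """Assumed rescaling of expression into [0, 1], for illustration only."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

p_genes = gene_activation_prob(expr)
p_circuit = np.prod(p_genes)              # all nodes must transmit the signal
print(f"circuit activation probability: {p_circuit:.3f}")
```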
Southern-Tyrrhenian seismicity in space-time-magnitude domain
2006
An analysis is conducted on a catalogue containing more than 2000 seismic events that occurred in the southern Tyrrhenian Sea between 1988 and October 2002, in an attempt to characterise the main seismogenetic processes active in the area in the space, time, and magnitude domains by means of the parameters of phenomenological laws. We chose to adopt simple phenomenological models, since the low number of data points did not allow the use of more complex laws. The two main seismogenetic volumes present in the area were considered for the purposes of this work. The first includes a nearly homogeneous distribution of hypocentres in a NW steeply dipping layer as far as a…
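The abstract does not name the laws used; the canonical phenomenological law in the magnitude domain is the Gutenberg-Richter relation log10 N(>=M) = a - b*M, fit below to a toy catalogue purely as an illustration of how such parameters are estimated.

```python
# Illustrative Gutenberg-Richter fit on a synthetic magnitude catalogue.
import numpy as np

rng = np.random.default_rng(3)
mags = rng.exponential(0.5, size=2000) + 2.0           # toy magnitude catalogue
m_grid = np.arange(2.0, mags.max(), 0.1)
log_n = np.log10([np.sum(mags >= m) for m in m_grid])  # cumulative counts

slope, a = np.polyfit(m_grid, log_n, 1)                # log10 N = a - b*M
print(f"a = {a:.2f}, b = {-slope:.2f}")
```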
Covid-19 in Italy: Modelling, Communications, and Collaborations
2022
When Covid-19 arrived in Italy in early 2020, a group of statisticians came together to provide tools to make sense of the unfolding epidemic and to counter misleading media narratives. Here, members of StatGroup-19 reflect on their work to date.
Generating survival times to simulate Cox proportional hazards models
2005
Simulation studies are an important statistical tool for investigating the performance, properties, and adequacy of statistical models in pre-specified situations. One of the most important statistical models in medical research is the Cox proportional hazards model. In this paper, techniques to generate survival times for simulation studies of Cox proportional hazards models are presented. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived, which is useful in simulation studies. It is shown how the exponential, the Weibull, and the Gompertz distributions can be applied to generate appropriate survival times f…
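The general formula the abstract refers to is an inverse-transform identity: if U ~ Uniform(0,1) and H0 is the baseline cumulative hazard, then T = H0^{-1}(-log(U) * exp(-beta'x)) follows a Cox model with linear predictor beta'x. A short sketch with illustrative parameter values (the paper also covers the Gompertz case):

```python
# Generate Cox-model survival times by inverting the baseline cumulative
# hazard; shown for exponential and Weibull baselines.
import numpy as np

rng = np.random.default_rng(4)
n, beta = 1000, 0.7
x = rng.binomial(1, 0.5, n)               # a single binary covariate
u = rng.uniform(size=n)
c = -np.log(u) * np.exp(-beta * x)        # target value of H0(T) per subject

lam, nu = 0.1, 1.5                        # illustrative baseline parameters
t_exponential = c / lam                   # H0(t) = lam * t
t_weibull = (c / lam) ** (1.0 / nu)       # H0(t) = lam * t**nu

print(t_exponential[:5], t_weibull[:5])
```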
Using Statistical and Computer Models to Quantify Volcanic Hazards
2009
Risk assessment of rare natural hazards, such as large volcanic block-and-ash or pyroclastic flows, is addressed. Assessment is approached through a combination of computer modeling, statistical modeling, and extreme-event probability computation. A computer model of the natural hazard is used to provide the needed extrapolation to unseen parts of the hazard space. Statistical modeling of the available data is needed to determine the initializing distribution for exercising the computer model. In dealing with rare events, direct simulations involving the computer model are prohibitively expensive. The solution instead requires a combination of adaptive design of computer model approximation…
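A toy sketch of the workflow's shape (the simulator, distributions, and threshold below are all invented for illustration, not the paper's models): fit a cheap emulator to a small designed set of expensive computer-model runs, then push Monte Carlo draws from the statistically fitted initializing distribution through the emulator to estimate an exceedance probability.

```python
# Emulator-based exceedance probability sketch for a rare-event hazard.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(5)

def simulator(volume):
    """Stand-in for the expensive flow model: runout as a function of volume."""
    return 2.0 * np.log1p(volume)

design = np.linspace(0.1, 20.0, 12)[:, None]        # small designed run set
runs = simulator(design).ravel()                    # expensive model outputs

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=5.0)).fit(design, runs)

# Cheap Monte Carlo over an assumed initializing distribution of volumes.
volumes = rng.lognormal(mean=0.5, sigma=1.0, size=100_000)[:, None]
runout = emulator.predict(volumes)
print(f"P(runout > 5 km) ~= {np.mean(runout > 5.0):.4f}")
```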