Search results for "Statistical Model"

Showing 10 of 163 documents

Individual measurements and nested designs in aquaculture experiments: a simulation study

1998

Simple and nested models for analysis of variance (ANOVA) in aquaculture experiments were compared with the help of computer simulations. Simple models for analysing variables based on tank means, such as final weight and growth rate, were found to be sensitive to differences in the number of individual observations in each tank. Compared with nested models that take individual measurements into account, the simple models were found to overestimate the F ratio and increase the risk of committing a Type I error, i.e., rejecting a true null hypothesis. Further, nested models permit greater flexibility in experimental design, and allow more economical solutions within a given…
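The inflation described above can be reproduced in a few lines. A minimal sketch, not the paper's simulation design: fish-level observations share a random tank effect, yet a naive one-way ANOVA treats them as independent replicates, so even with no true treatment effect the test rejects far more often than the nominal 5%. All parameter values below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulate_type1(n_sims=2000, tanks_per_trt=4, fish_per_tank=20,
                   tank_sd=0.5, fish_sd=1.0):
    """Type I error rate when individual fish are (wrongly) treated as
    independent replicates, ignoring the nested tank level.
    No treatment effect is simulated, so every rejection is a false positive."""
    false_pos = 0
    for _ in range(n_sims):
        groups = []
        for _trt in range(2):
            fish = []
            for _ in range(tanks_per_trt):
                tank_effect = rng.normal(0.0, tank_sd)   # shared within a tank
                fish.extend(tank_effect + rng.normal(0.0, fish_sd, fish_per_tank))
            groups.append(fish)
        _, p = stats.f_oneway(*groups)   # naive ANOVA on individual fish
        false_pos += p < 0.05
    return false_pos / n_sims

# With a real tank effect, the naive analysis rejects far more than 5%.
print(simulate_type1())
```

A nested (tank-within-treatment) model, or simply testing on the tank means with equal replication, restores the nominal error rate.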

Set (abstract data type); Power analysis; Alternative hypothesis; Statistics; Statistical model; Replicate; Aquatic Science; Biology; Statistical power; Type I and type II errors; Nested set model; Aquaculture
researchProduct

Rateless Codes Performance Analysis in Correlated Channel Model for GEO Free Space Optics Downlinks

2012

Free Space Optics (FSO) technologies for satellite communications offer several advantages: wide bandwidth, high rate capability, immunity to electromagnetic interference, and small equipment size. They are therefore suitable for inter-satellite links, deep-space communications, and high-data-rate ground-to-satellite/satellite-to-ground communications. Nevertheless, FSO links suffer impairments that degrade the signal power at the receiver. Scattering and absorption cause power attenuations that are predictable by suitable deterministic models. Optical turbulence causes random irradiance fluctuations which can generate signal fading events and can therefore only be predicted by statistical models. Attenuation and fading events can corrupt FSO links, so it is recommended to add error-mitigation codes on the communication link. The FSO channel can be described as an erasure channel: fading events can cause erasure errors. We have identified rateless codes (RCs) as a suitable solution for FSO links. RCs need no feedback, and they add redundant coding to the source data that allows the receiver to recover the whole payload despite erasure errors. We implemented two different types of rateless codes: Luby Transform (LT) and Raptor. We analyzed their performance on a simulated turbulent GEO FSO downlink (1 Gbps, OOK modulation) at a 1.06 μm wavelength and for different values of the zenith angle. Assuming plane-wave propagation and employing the Hufnagel-Valley model, we modelled the downlink using: 1) a temporally correlated channel model based on the Gamma-Gamma probability distribution, and 2) an irradiance covariance function that we converted into a time function using Taylor's frozen-eddies hypothesis. Our new channel model is able to simulate irradiance fluctuations under different turbulence conditions, as will be shown in the full paper.
We will also report performance results of LT and Raptor codes for overheads ranging between 0 and 50% and for different numbers of source packets.

Settore ING-INF/03 - Telecomunicazioni; Settore ING-INF/01 - Elettronica
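The turbulence part of such a channel can be sketched compactly. This toy sketch is not the paper's correlated model (temporal correlation via the covariance function is omitted): it draws independent normalized irradiance samples from the Gamma-Gamma distribution, whose scintillation index is 1/α + 1/β + 1/(αβ). The α, β values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma_gamma_irradiance(alpha, beta, n):
    """Normalized irradiance I = X * Y under the Gamma-Gamma model:
    X and Y are unit-mean Gamma variates shaped by the large-scale (alpha)
    and small-scale (beta) scintillation parameters."""
    x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n)
    y = rng.gamma(shape=beta, scale=1.0 / beta, size=n)
    return x * y

# Scintillation index: sigma_I^2 = 1/alpha + 1/beta + 1/(alpha*beta)
I = gamma_gamma_irradiance(alpha=4.0, beta=2.0, n=200_000)
print(I.mean(), I.var())   # mean ~ 1; variance ~ the scintillation index
```

Thresholding such samples against a receiver sensitivity turns fades into erasures, which is the channel view under which LT/Raptor codes are evaluated.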

TSVD as a Statistical Estimator in the Latent Semantic Analysis Paradigm

2015

The aim of this paper is to present a new point of view that makes it possible to give a statistical interpretation of the traditional latent semantic analysis (LSA) paradigm based on the truncated singular value decomposition (TSVD) technique. We show how the TSVD can be interpreted as a statistical estimator derived from the LSA co-occurrence relationship matrix by mapping probability distributions onto Riemannian manifolds. Moreover, the quality of the estimator model can be expressed by introducing a figure of merit arising from the Solomonoff approach. This figure of merit takes into account both the adherence to the sample data and the simplicity of the model. In our model, the simplicity…
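The TSVD at the core of LSA is a one-liner on top of a full SVD. A minimal sketch with an illustrative term-document matrix (by the Eckart-Young theorem, the rank-k truncation is the best rank-k approximation in the Frobenius norm):

```python
import numpy as np

def tsvd(A, k):
    """Rank-k truncated SVD of A: keep the k largest singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Toy term-document co-occurrence matrix (rows: terms, columns: documents).
A = np.array([[2., 0., 1., 0.],
              [1., 1., 0., 0.],
              [0., 2., 0., 1.],
              [0., 0., 1., 2.]])

A_2 = tsvd(A, k=2)
print(np.linalg.norm(A - A_2))   # Frobenius reconstruction error
```

In the LSA setting, documents are then compared in the reduced space rather than through the raw counts; the paper's contribution is to read `A_2` as a statistical estimator of the underlying co-occurrence distribution.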

Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Hellinger distance; Latent semantic analysis; Computer science; Probabilistic logic; Estimator; Statistical model; Pattern recognition; Computer Science Applications; Human-Computer Interaction; Data-driven modeling; Data models; Semantics; Probability distribution; Matrix decomposition; Computational modeling; LSA; Singular value decomposition; Computer Science (miscellaneous); Truncation (statistics); Artificial intelligence; Algorithm; Information Systems; IEEE Transactions on Emerging Topics in Computing

A Conceptual Probabilistic Model for the Induction of Image Semantics

2010

In this paper we propose a model based on a conceptual space automatically induced from data. The model is inspired by a well-founded robotics cognitive architecture organized in three computational areas: sub-conceptual, linguistic and conceptual. Images are objects in the sub-conceptual area that become "knoxels" in the conceptual area. The application of the framework allows image semantics to emerge automatically in the linguistic area. The core of the model is a conceptual space, induced automatically from a set of annotated images, that exploits and mixes different information concerning the set of images. Multiple low-level features are extracted to represent images and…

Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Image classification; Computer science; Feature extraction; Image semantics; Conceptual space; Conceptual model (computer science); Statistical model; Conceptual schema; Visualization; Set (abstract data type); Data set; Automatic image annotation; Latent semantic analysis; Artificial intelligence; Natural language processing

Deep Gaussian processes for biogeophysical parameter retrieval and model inversion

2020

Parameter retrieval and model inversion are key problems in remote sensing and Earth observation. Currently, different approaches exist: a direct, yet costly, inversion of radiative transfer models (RTMs); statistical inversion with in situ data, which often runs into problems with extrapolation outside the study area; and the most widely adopted hybrid modelling, by which statistical models, mostly nonlinear and non-parametric machine learning algorithms, are applied to invert RTM simulations. We will focus on the latter. Among the different existing algorithms, in the last decade kernel-based methods, and Gaussian Processes (GPs) in particular, have provided useful and informative so…
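The hybrid strategy can be illustrated with a toy forward model standing in for an RTM. Everything below is an illustrative assumption (the forward function, design size and kernel length-scale are made up, and the paper's deep GPs are replaced by a plain GP regression): run the simulator on a small design, then regress parameter on observation.

```python
import numpy as np

def rbf(a, b, ell=0.2):
    """Squared-exponential kernel between two 1-D arrays of observations."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# Stand-in forward model ("RTM"): maps a biophysical parameter x to a
# simulated observation y. Monotone, so the inversion is well posed.
def toy_rtm(x):
    return np.exp(-0.8 * x)

# Hybrid approach: a small design of forward runs, then a GP regression
# from observation space back to parameter space.
x_design = np.linspace(0.0, 2.0, 25)        # parameter values
y_design = toy_rtm(x_design)                # simulated observations
K = rbf(y_design, y_design) + 1e-6 * np.eye(y_design.size)
alpha = np.linalg.solve(K, x_design)        # GP weights

def gp_invert(y_new):
    """Estimate the parameter behind a new observation (GP posterior mean)."""
    return float(rbf(np.atleast_1d(y_new), y_design) @ alpha)

print(gp_invert(toy_rtm(1.0)))              # should be close to 1.0
```

Real retrieval chains replace `toy_rtm` with an actual RTM (e.g. PROSAIL-type simulators) and add observation noise and predictive uncertainty, which is where GPs earn their keep.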

Signal Processing (eess.SP); FOS: Computer and information sciences; Computer Science - Machine Learning; Earth observation; IASI; Computer science; Extrapolation; Deep Gaussian Processes; Machine Learning (cs.LG); Copernicus programme; Sentinels; Machine learning; Radiative transfer; FOS: Electrical engineering, electronic engineering, information engineering; Electrical Engineering and Systems Science - Signal Processing; Computers in Earth Sciences; Model inversion; Statistical retrieval; Engineering (miscellaneous); Gaussian process; Chlorophyll content; Moisture; Inorganic suspended matter; Temperature; Inversion (meteorology); Statistical model; Atomic and Molecular Physics and Optics; Computer Science Applications; Infrared sounder; Nonlinear system; Global Positioning System; Coloured dissolved matter; Algorithm; Article

Understanding disease mechanisms with models of signaling pathway activities

2014

Background: Understanding the aspects of cell functionality that account for disease or drug action mechanisms is one of the main challenges in the analysis of genomic data, and underlies the future implementation of precision medicine. Results: Here we propose a simple probabilistic model in which signaling pathways are separated into elementary sub-pathways or signal transmission circuits (which ultimately trigger cell functions), and gene expression measurements are then transformed into probabilities of activation of such signal transmission circuits. Using this model, differential activation of such circuits between biological conditions can be estimated. Thus, circuit activation s…
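A toy version of the circuit-activation idea. The gene names, reference levels, rescaling rule, and the independence assumption below are illustrative simplifications, not the paper's exact model: each gene's expression is mapped to an activation probability, and a circuit is active only if the signal traverses every node.

```python
import numpy as np

def gene_activation_prob(expr, lo, hi):
    """Map an expression value to an activation probability by linear
    rescaling between cohort-wide low/high reference levels (toy choice)."""
    return float(np.clip((expr - lo) / (hi - lo), 0.0, 1.0))

def circuit_activation(expr_by_gene, circuit, refs):
    """Probability that a signal traverses the whole circuit, assuming
    independent nodes: the product of per-gene activation probabilities."""
    p = 1.0
    for gene in circuit:
        lo, hi = refs[gene]
        p *= gene_activation_prob(expr_by_gene[gene], lo, hi)
    return p

# Hypothetical reference levels and a hypothetical sample.
refs = {"EGFR": (2.0, 8.0), "RAS": (1.0, 6.0), "ERK": (3.0, 9.0)}
sample = {"EGFR": 7.5, "RAS": 5.8, "ERK": 8.0}
print(circuit_activation(sample, ["EGFR", "RAS", "ERK"], refs))
```

Differential activation between conditions would then be assessed by comparing these circuit probabilities across sample groups.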

Signaling pathways; Computer science; Systems biology; Stem cells; Disease; Drug action; Computational biology; Models, Biological; Mice; Species Specificity; Structural Biology; Modelling and Simulation; Animals; Humans; Computer Simulation; Obesity; Molecular Biology; Cancer; Regulation of gene expression; Internet; Mechanism (biology); Methodology Article; Applied Mathematics; Probabilistic model; Precision medicine; Statistical model; Computer Science Applications; Gene Expression Regulation; Fanconi anemia; Disease mechanism; Signal transduction; Algorithm; Biomarkers; Software; BMC Systems Biology

Southern-Tyrrhenian seismicity in space-time-magnitude domain

2006

An analysis is conducted on a catalogue containing more than 2000 seismic events that occurred in the southern Tyrrhenian Sea between 1988 and October 2002, as an attempt to characterise the main seismogenetic processes active in the area in the space, time and magnitude domains by means of the parameters of phenomenological laws.

We chose to adopt simple phenomenological models, since the low number of data did not allow the use of more complex laws.

The two main seismogenetic volumes present in the area were considered for the purpose of this work. The first includes a nearly homogeneous distribution of hypocentres in a steeply NW-dipping layer as far as a…
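In the magnitude domain, the standard phenomenological law is the Gutenberg-Richter relation. A minimal sketch of its maximum-likelihood b-value estimate (Aki's formula) on a synthetic catalogue; the completeness magnitude and true b-value below are illustrative, and the paper's specific laws and catalogue may differ.

```python
import numpy as np

def aki_b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki's formula):
    b = log10(e) / (mean(M) - Mmin), using magnitudes at or above the
    completeness magnitude Mmin."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic catalogue: under Gutenberg-Richter, magnitudes above Mmin
# are exponentially distributed with rate b * ln(10).
rng = np.random.default_rng(2)
b_true = 1.0
mags = 2.0 + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

print(aki_b_value(mags, m_min=2.0))   # should recover b close to 1.0
```

With only a few thousand events, as in the catalogue above, the estimate is already stable to a few percent, which is why such simple laws remain usable where more complex models are not.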

Southern-Tyrrhenian Sea; Plane (geometry); Space time; Magnitude (mathematics); Statistical models; Aftershock sequences; Background seismicity; Induced seismicity; Homogeneous distribution; Domain (mathematical analysis); Geophysics; Lithosphere; Slab; Seismology; Geology; Annals of Geophysics

Covid-19 in Italy: Modelling, Communications, and Collaborations

2022

When Covid-19 arrived in Italy in early 2020, a group of statisticians came together to provide tools to make sense of the unfolding epidemic and to counter misleading media narratives. Here, members of StatGroup-19 reflect on their work to date.

Statistics and Probability; COVID-19; Statistical modelling; Settore SECS-S/01 - Statistica; Richards generalised logistic curve; Significance

Generating survival times to simulate Cox proportional hazards models

2005

Simulation studies are an important statistical tool for investigating the performance, properties and adequacy of statistical models in pre-specified situations. One of the most important statistical models in medical research is the Cox proportional hazards model. In this paper, techniques to generate survival times for simulation studies of Cox proportional hazards models are presented. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived, which is useful in simulation studies. It is shown how the exponential, the Weibull and the Gompertz distribution can be applied to generate appropriate survival times f…
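The general recipe reduces to one line for an exponential baseline hazard. A minimal sketch (the values of λ and β are illustrative): since S(t|x) = exp(-λ t e^{x'β}), setting S(T|x) = U with U ~ Uniform(0, 1) and solving for T gives T = -ln(U) / (λ e^{x'β}).

```python
import numpy as np

rng = np.random.default_rng(3)

def cox_exponential_times(X, beta, lam):
    """Survival times under a Cox model with exponential baseline hazard:
    T = -log(U) / (lam * exp(X @ beta)), U ~ Uniform(0, 1).
    This inverts the survival function S(t|x) = exp(-lam * t * exp(x'beta))."""
    u = rng.uniform(size=X.shape[0])
    return -np.log(u) / (lam * np.exp(X @ beta))

# Simulate 10,000 subjects with one covariate and a positive log-hazard ratio:
# higher covariate values should yield systematically shorter survival times.
X = rng.normal(size=(10_000, 1))
T = cox_exponential_times(X, beta=np.array([0.7]), lam=0.1)
```

Swapping the baseline for a Weibull or Gompertz hazard changes only the inverse cumulative hazard applied to `-log(u) * exp(-X @ beta)`.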

Statistics and Probability; Hazard (logic); Exponential distribution; Epidemiology; Computer science; Proportional hazards model; Statistics; Econometrics; Statistical model; Survival analysis; Gompertz distribution; Exponential function; Weibull distribution; Statistics in Medicine

Using Statistical and Computer Models to Quantify Volcanic Hazards

2009

Risk assessment of rare natural hazards, such as large volcanic block-and-ash or pyroclastic flows, is addressed. Assessment is approached through a combination of computer modeling, statistical modeling, and extreme-event probability computation. A computer model of the natural hazard is used to provide the needed extrapolation to unseen parts of the hazard space. Statistical modeling of the available data is needed to determine the initializing distribution for exercising the computer model. In dealing with rare events, direct simulations involving the computer model are prohibitively expensive. The solution instead requires a combination of adaptive design of computer model approximation…
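The overall pipeline — a few expensive simulator runs, a cheap surrogate, a statistical model of the initiating event, and Monte Carlo for an exceedance probability — can be caricatured as follows. Everything here is an illustrative assumption: the forward function, the lognormal initiating distribution, and the polynomial surrogate all stand in for the paper's proper emulator and extreme-event computation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in for an expensive simulator: flow runout as a function of volume.
def expensive_model(volume):
    return 2.0 * np.sqrt(volume)

# 1) A handful of expensive runs over the design space.
design = np.linspace(0.1, 100.0, 8)
runs = expensive_model(design)

# 2) A cheap surrogate fitted to those runs (a polynomial here; the paper's
#    setting would use a statistical emulator with quantified uncertainty).
emulator = np.poly1d(np.polyfit(design, runs, deg=3))

# 3) Statistical model of the initiating event (lognormal volumes), then
#    Monte Carlo on the cheap surrogate to estimate an exceedance probability.
volumes = rng.lognormal(mean=1.0, sigma=1.0, size=200_000)
p_exceed = np.mean(emulator(volumes) > 15.0)
print(p_exceed)
```

Running 200,000 samples through the surrogate costs almost nothing, whereas the same Monte Carlo through the real simulator would be infeasible — which is exactly the motivation stated in the abstract.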

Statistics and Probability; Hazard (logic); Risk analysis; Volcanic hazards; Computer science; Applied Mathematics; Computation; Initialization; Statistical model; Modeling and Simulation; Natural hazard; Rare events; Data mining; Technometrics