Search results for "Data point"

Showing 9 of 29 documents

Analysis methods of safe Coulomb-excitation experiments with radioactive ion beams using the gosia code

2016

With the recent advances in radioactive ion beam technology, Coulomb excitation at safe energies has become an important experimental tool in nuclear-structure physics. The usefulness of the technique for extracting key information on the electromagnetic properties of nuclei has been demonstrated since the 1960s with stable beam and target combinations. New challenges present themselves when studying exotic nuclei with this technique, including dealing with low statistics or few data points, absolute and relative normalisation of the measured cross sections, and a lack of complementary experimental data, such as excited-state lifetimes and branching ratios. This paper addresses some of these…

Keywords: Radioactive ion beams; Reaccelerated radioactive beams; Ion beam; heavy ions; hadrons; Coulomb excitation; fusion reaction; Nuclear physics; Nuclear Experiment (nucl-ex); particle and nuclear physics; Electromagnetic moments; Quadrupole; Data analysis; Analysis method; Data point; PACS: 25.70.De, 21.10.Ky, 29.38.Gj, 29.85.Fj
researchProduct

Iteratively reweighted least squares in crystal structure refinements

2011

The use of robust techniques in crystal-structure multipole refinements of small molecules, as an alternative to the commonly adopted weighted least squares, is presented and discussed. As is well known, the main disadvantage of least-squares fitting is its sensitivity to outliers. Eliminating the most aberrant reflections (due both to experimental errors and to incompleteness of the model) from the data set is an effective practice that can yield satisfactory results, but it is often complicated in the presence of a great number of bad data points, whose one-by-one elimination can become impractical. This problem can be circumvented by means of a robust least-squares regression that…
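The robust alternative to weighted least squares can be sketched with a generic iteratively reweighted least squares (IRLS) loop. The Huber weight function, the tuning constant, and the toy line-fit data below are illustrative assumptions, not the multipole refinement scheme of the paper:

```python
import numpy as np

def irls(X, y, n_iter=20, delta=1.345):
    """Iteratively reweighted least squares with Huber weights.

    Observations with large residuals are down-weighted on each pass,
    so a gross outlier influences the fit far less than in ordinary
    (weighted) least squares.
    """
    w = np.ones(len(y))                              # first pass = OLS
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12    # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= delta, 1.0, delta / u)     # Huber weight function
    return beta

# Toy example: a straight line where one gross outlier barely moves the fit.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 50)
y[10] += 50.0                                        # one bad data point
X = np.column_stack([np.ones_like(x), x])
print(irls(X, y))                                    # close to [1.0, 2.0]
```

The outlier never has to be identified and removed by hand: its weight shrinks automatically as the iterations proceed.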

Keywords: Iteratively reweighted least squares; Least trimmed squares; Robust regression; Regression; Outlier; Robust regression outlier refinement; Data set; Data point; Data mining; Sensitivity (control systems); Algorithm; Structural Biology; Mathematics; Settore GEO/06 - Mineralogia
Journal: Acta Crystallographica Section A: Foundations of Crystallography

Optimal signed-rank tests based on hyperplanes

2005

For analysing k-variate data sets, Randles (J. Amer. Statist. Assoc. 84 (1989) 1045) considered hyperplanes going through k − 1 data points and the origin. He then introduced an empirical angular distance between two k-variate data vectors based on the number of hyperplanes (the so-called interdirections) that separate these two points, and proposed a multivariate sign test based on those interdirections. In this paper, we present an analogous concept (namely, lift-interdirections) to measure the regular distances between data points. The empirical distance between two k-variate data vectors is again determined by the number of hyperplanes that separate these two points; in th…
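Randles's interdirections are easy to illustrate in the bivariate case (k = 2), where each hyperplane passes through k − 1 = 1 data point and the origin, i.e. it is the line spanned by that data point. The data and test vectors below are made-up illustrations, not from the paper:

```python
import numpy as np

def interdirections_2d(x, y, data):
    """Empirical angular distance between 2-vectors x and y.

    x and y lie on opposite sides of the line spanned by a data point z
    exactly when the 2x2 determinants det([z, x]) and det([z, y]) have
    opposite signs; the distance is the fraction of such separating lines.
    """
    count = 0
    for z in data:
        dx = z[0] * x[1] - z[1] * x[0]   # det([z, x])
        dy = z[0] * y[1] - z[1] * y[0]   # det([z, y])
        if dx * dy < 0:
            count += 1
    return count / len(data)

rng = np.random.default_rng(1)
data = rng.normal(size=(500, 2))
# Orthogonal vectors: about half of all lines through the origin separate them.
print(interdirections_2d(np.array([1.0, 0.0]), np.array([0.0, 1.0]), data))
# Nearly parallel vectors: very few separating lines.
print(interdirections_2d(np.array([1.0, 0.0]), np.array([2.0, 0.1]), data))
```

Because only the sign of each determinant matters, the measure is invariant to rescaling of the observations, which is what makes sign tests built on it distribution-free in direction.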

Keywords: Sign test; Test statistic; Hyperplane; Studentized residual; Random variate; Elliptical distribution; Norm (mathematics); Combinatorics; Calculus; Data point; Mathematical statistics; Statistics and Probability; Applied Mathematics; Statistics, Probability and Uncertainty; Mathematics

Agritourism and local development: A methodology for assessing the role of public contributions in the creation of competitive advantage

2018

Agriculture has been an economic mainstay of countries throughout history. Recently, its importance and essential role in local and regional growth and development have also received due acknowledgment. However, agritourism entrepreneurs often lack the necessary means when relying on their own funds alone. For these reasons, targeted aid measures are provided at the regional, national and European levels, provided the applicants fulfil specified subjective and objective eligibility criteria. This investigation aims to understand the rationale, if any, according to which the Public Administration allots such contributions. To this end, a two-tiered analysis was conducted on the data…

Keywords: Agritourism; Rural development; Competitive advantage; Marketing; Business; Descriptive statistics; Statistical model; Data point; Resource conservation; Nature and Landscape Conservation; Forestry; Geography, Planning and Development; Management, Monitoring, Policy and Law; Value (ethics); Context (language use); law; CLARITY; Settore AGR/01 - Economia Ed Estimo Rurale; Settore SECS-S/01 - Statistica

A principled approach to network-based classification and data representation

2013

Measures of similarity are fundamental in pattern recognition and data mining. Typically the Euclidean metric is used in this context, weighting all variables equally and therefore assuming equal relevance, which is very rare in real applications. In contrast, given an estimate of a conditional density function, the Fisher information calculated in primary data space implicitly measures the relevance of variables in a principled way by reference to auxiliary data such as class labels. This paper proposes a framework that uses a distance metric based on Fisher information to construct similarity networks that achieve a more informative and principled representation of data. The framework ena…
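The idea of weighting variables by their relevance to auxiliary class labels can be sketched with a crude, diagonal stand-in: a per-feature Fisher score (between-class over within-class variance) used as the weight in a weighted Euclidean distance. This is an illustrative simplification, not the conditional-density-based Fisher information metric the paper constructs:

```python
import numpy as np

def fisher_scores(X, labels):
    """Per-feature Fisher score: between-class / within-class variance.

    Features that separate the classes get large weights and therefore
    dominate the resulting distance; irrelevant features are suppressed.
    """
    classes = np.unique(labels)
    mu = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[labels == c]
        between += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return between / (within + 1e-12)

def weighted_dist(a, b, w):
    """Relevance-weighted Euclidean distance."""
    return np.sqrt(np.sum(w * (a - b) ** 2))

# Feature 0 separates the two classes; feature 1 is pure noise.
rng = np.random.default_rng(2)
X0 = rng.normal([0, 0], 1, size=(200, 2))
X1 = rng.normal([4, 0], 1, size=(200, 2))
X = np.vstack([X0, X1])
labels = np.array([0] * 200 + [1] * 200)
w = fisher_scores(X, labels)
print(w)   # the weight on feature 0 is much larger than on feature 1
print(weighted_dist(X0[0], X1[0], w))
```

A full Fisher metric additionally captures correlations between features; the diagonal score above already shows why an unweighted Euclidean metric, which treats both features as equally relevant, is a poor similarity measure here.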

Keywords: Fisher information; Fisher kernel; Pattern recognition; Probability density function; Conditional probability distribution; Euclidean distance; Weighting; External Data Representation; Data point; Data mining; Artificial intelligence; Cognitive Neuroscience; Computer Science Applications; Mathematics
Journal: Neurocomputing

Information Transfer in Linear Multivariate Processes Assessed through Penalized Regression Techniques: Validation and Application to Physiological N…

2020

The framework of information dynamics allows the dissection of the information processed in a network of multiple interacting dynamical systems into meaningful elements of computation that quantify the information generated in a target system, stored in it, transferred to it from one or more source systems, and modified in a synergistic or redundant way. The concepts of information transfer and modification have been recently formulated in the context of linear parametric modeling of vector stochastic processes, linking them to the notion of Granger causality and providing efficient tools for their computation based on the state–…
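A minimal linear-Gaussian sketch of the transfer measure: fit an autoregressive model of the target with and without the source's past, using ridge regression as the simplest penalized regression technique, and take the log ratio of the residual variances. The lag order, penalty, and simulated coupling below are illustrative assumptions, not the paper's estimator:

```python
import numpy as np

def lagged(s, p):
    """Design matrix: row t (for t = p..n-1) holds s[t-1], ..., s[t-p]."""
    n = len(s)
    return np.column_stack([s[p - 1 - j : n - 1 - j] for j in range(p)])

def ridge_fit(A, b, lam=0.1):
    """L2-penalized least squares: solve (A'A + lam I) beta = A'b."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

def granger_transfer(x, y, p=2, lam=0.1):
    """Transfer from y to x: half the log ratio of the residual variance of
    the restricted model (past of x only) to that of the full model
    (past of x and y) -- the linear-Gaussian Granger-causality index."""
    target = x[p:]
    A_r = lagged(x, p)
    A_f = np.hstack([lagged(x, p), lagged(y, p)])
    res_r = target - A_r @ ridge_fit(A_r, target, lam)
    res_f = target - A_f @ ridge_fit(A_f, target, lam)
    return 0.5 * np.log(np.var(res_r) / np.var(res_f))

# Simulated pair: y drives x with one sample of delay; the reverse does not.
rng = np.random.default_rng(3)
n = 2000
y = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()
print(granger_transfer(x, y))   # clearly positive
print(granger_transfer(y, x))   # near zero
```

Penalization matters when the network has many nodes and short records: the ordinary least-squares version of the same estimator becomes unstable as the number of lagged regressors approaches the number of samples.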

Keywords: information dynamics; information transfer; partial information decomposition; entropy; conditional transfer entropy; network physiology; network topology; multivariate time series analysis; state–space models; vector autoregressive model; autoregressive model; parametric model; penalized regression techniques; linear prediction; ordinary least squares; dynamical systems theory; Data point; Settore ING-INF/06 - Bioingegneria Elettronica E Informatica

Towards an efficient meshfree solver

2016

In this paper we focus on enhancing the accuracy of approximating a function and its derivatives via smoothed-particle hydrodynamics. We discuss improvements obtained by reformulating the original method by means of the Taylor series expansion and by projecting with the kernel function and its derivatives. The accuracy of a function and its derivatives, up to a fixed order, can be simultaneously improved by treating them as the unknowns of a linear system. The improved formulation has been assessed with gridded and scattered data-point distributions, and convergence has been analysed with reference to a case study in a 2D domain.
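The core idea of treating the function and its derivatives as unknowns of a linear system built from the Taylor expansion can be shown in a simplified 1D analogue. Instead of projecting with an SPH kernel as the paper does, the sketch below solves the Taylor-expansion system directly by least squares on scattered sample points:

```python
import math
import numpy as np

def taylor_ls(points, values, x0, order=2):
    """Estimate f(x0), f'(x0), ..., f^(order)(x0) simultaneously from
    scattered samples, using the Taylor expansion
        f(x) ~ sum_m f^(m)(x0) (x - x0)^m / m!
    as an overdetermined linear system in the unknown derivatives."""
    h = points - x0
    A = np.column_stack([h ** m / math.factorial(m)
                         for m in range(order + 1)])
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return coeffs                      # [f(x0), f'(x0), f''(x0), ...]

# Scattered (non-gridded) samples of sin(x) near x0 = 0.3.
rng = np.random.default_rng(4)
x0 = 0.3
pts = x0 + rng.uniform(-0.1, 0.1, 30)
f, df, ddf = taylor_ls(pts, np.sin(pts), x0, order=2)
print(f, df)   # close to sin(0.3) ~ 0.2955 and cos(0.3) ~ 0.9553
```

Solving for all derivatives at once is what couples their accuracies: truncating the expansion at a higher order improves the function estimate and the derivative estimates simultaneously, mirroring the behaviour reported for the kernel-projected formulation.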

Keywords: Smoothed-particle hydrodynamics; meshless method; Taylor series; Linear system; Solver; convergence; accuracy; Function (mathematics); Data point; Algorithm; Mathematics; Settore MAT/08 - Analisi Numerica; Settore ING-IND/31 - Elettrotecnica

Evaluation of Innovation Statistics Data Quality Dimensions.

2017

[Translated from Latvian:] The doctoral thesis analyses the practical situation in the collection and processing of innovation statistics in Latvia, reviews the methodology of innovation statistics and the practice of its application in Latvia, paying particular attention to the specific features of the Latvian innovation system. Within the framework of the Latvian innovation management and information flow system developed in the study, the importance of informational support for decision-making is substantiated. Based on a methodology for assessing data quality dimensions, an assessment of the quality of innovation statistics data in Latvia was carried out. An iterative method for mitigating the influence of outlying data points has been developed and validated. The study also considers the qu…
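The abstract mentions an iterative method for mitigating the influence of outlying data points ("neraksturīgie punkti") but does not specify its rule. A generic iterative sigma-clipping loop illustrates the general idea; the threshold and the toy sample are assumptions, not the thesis's actual method:

```python
import numpy as np

def sigma_clip(data, k=3.0, max_iter=10):
    """Iteratively discard points more than k standard deviations from the
    mean, recomputing mean and deviation on the surviving points each
    pass, until the sample is stable.  Recomputing matters: outliers
    inflate the first-pass deviation and can hide behind it."""
    kept = np.asarray(data, dtype=float)
    for _ in range(max_iter):
        m, s = kept.mean(), kept.std()
        mask = np.abs(kept - m) <= k * s
        if mask.all():
            break
        kept = kept[mask]
    return kept

rng = np.random.default_rng(5)
sample = np.concatenate([rng.normal(10, 1, 200), [45.0, 60.0]])  # 2 outliers
clean = sigma_clip(sample)
print(len(sample), len(clean))
```

After clipping, summary statistics computed on `clean` reflect the bulk of the responses rather than a handful of atypical survey returns.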

Keywords: outlying data points; data quality dimensions; quality of statistical data; quality of methodology; methodology of statistical data collection and processing; Statistics; Economics

Statistical Approximation of Fourier Transform-IR Spectroscopy Data for Zinc White Pigment from Twentieth-Century Russian Paintings

2017

We present a statistical model for approximation of experimental Fourier transform-IR spectroscopy (FTIR) data for paint samples from paintings of different ages. The model utilizes random variations in some parameters (initial ageing rate, degree of change in ageing rate and time at which the change occurs). We determine the parameters characterizing variation in the paint composition and the storage conditions for the paintings. The numerical calculation is qualitatively consistent with the experimental data. In the proposed model, changes in the initial composition of the paint and the storage conditions make about the same contribution to the experimentally observed scatter in the data …
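The Monte Carlo structure of such a model can be sketched by drawing the three named random parameters (initial ageing rate, degree of change in the rate, time of the change) and simulating an ensemble of ageing curves. The piecewise-linear functional form and the parameter ranges below are invented for illustration, not the fitted values from the paper:

```python
import numpy as np

def ageing_curve(t, r0, factor, t_change):
    """Hypothetical piecewise-linear ageing: the quantity grows at rate r0
    until t_change, after which the rate is multiplied by `factor`."""
    early = np.minimum(t, t_change)
    late = np.clip(t - t_change, 0, None)
    return r0 * early + factor * r0 * late

rng = np.random.default_rng(6)
t = np.linspace(0, 80, 81)                 # painting age, years
curves = np.array([
    ageing_curve(t,
                 r0=rng.uniform(0.8, 1.2),       # initial ageing rate
                 factor=rng.uniform(0.2, 0.6),   # degree of rate change
                 t_change=rng.uniform(20, 40))   # time the change occurs
    for _ in range(500)
])
scatter = curves.std(axis=0)               # ensemble scatter vs. age
print(scatter[-1] > scatter[1])            # spread grows with age
```

Comparing the simulated ensemble scatter with the spread of measured FTIR band intensities across paintings of known age is what lets one attribute the observed scatter to variation in composition and storage conditions.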

Keywords: IR spectroscopy; Fourier transform infrared spectroscopy; Fourier transform; Infrared spectroscopy; Spectroscopy; dating; zinc white; simulation; Painting; Experimental data; Statistical model; Mineralogy; Chemistry; Condensed Matter Physics; Data point; Settore FIS/07 - Fisica Applicata (Beni Culturali, Ambientali, Biol. e Medicin.)
Journal: Journal of Applied Spectroscopy