Search results for "weighting"

Showing 10 of 117 documents

The propensity score: a methodological guide for experimental and quasi-experimental research in education

2016

The propensity score method is becoming increasingly popular for estimating the causal effects of an intervention programme. While empirical applications of this method are still rare in education research, examples of its use are readily found in other disciplines. Its implementation, however, raises several questions. The aim of this article is to provide guidance for researchers and evaluators in the field of education on the estimation and use of the propensity score. The successive steps of its application are presented one by one: assessment of selection bias, construction of the propensity score, and …
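
The workflow sketched in the abstract (model treatment assignment, then weight observations by the inverse probability of treatment) can be illustrated with a minimal example. Everything below is an illustrative assumption: the simulated covariates, the logistic model, and the variable names are not taken from the article.

```python
# Minimal propensity-score / IPTW sketch (illustrative data and names only).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "prior_score": rng.normal(50, 10, n),   # pre-treatment covariate (assumed)
    "ses": rng.normal(0, 1, n),             # socio-economic index (assumed)
})
# Treatment assignment depends on the covariates -> selection bias.
p_true = 1 / (1 + np.exp(-(0.05 * (df["prior_score"] - 50) + 0.8 * df["ses"])))
df["treated"] = rng.binomial(1, p_true)
df["outcome"] = 2.0 * df["treated"] + 0.3 * df["prior_score"] + rng.normal(0, 5, n)

# Step 1: estimate the propensity score with a logistic regression.
X = df[["prior_score", "ses"]]
ps = LogisticRegression(max_iter=1000).fit(X, df["treated"]).predict_proba(X)[:, 1]

# Step 2: inverse probability of treatment weighting (IPTW).
w = np.where(df["treated"] == 1, 1 / ps, 1 / (1 - ps))

# Weighted difference in mean outcomes as a crude effect estimate.
treated, control = df["treated"] == 1, df["treated"] == 0
ate = (np.average(df.loc[treated, "outcome"], weights=w[treated])
       - np.average(df.loc[control, "outcome"], weights=w[control]))
print(f"IPTW estimate of the treatment effect: {ate:.2f}")
```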

propensity score; propensity score matching; inverse probability of treatment weighting; selection bias; difference in differences; education; social sciences methods; statistics & probability; sociology; mathematics; General Medicine; Humanities and Social Sciences / Education
researchProduct

Modeling user preferences in content-based image retrieval: A novel attempt to bridge the semantic gap

2015

This paper is concerned with content-based image retrieval from a stochastic point of view. The semantic gap problem is addressed in two ways. First, a dimensional reduction is applied using the (pre-calculated) distances among images. The dimension of the reduced vector is the number of preferences that we allow the user to choose from, in this case, three levels. Second, the conditional probability distribution of the random user preference, given this reduced feature vector, is modeled using a proportional odds model. A new model is fitted at each iteration. The score used to rank the image database is based on the estimated probability function of the random preference. Additionally, so…
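
A rough sketch of the loop described above: reduce the images to as many dimensions as there are preference levels using the pre-calculated distances, fit an ordered (proportional odds) model of the user's preference, and rank the database by the predicted probability of the highest level. The distance matrix, the simulated feedback, and all names below are assumptions for illustration, not the paper's data or code.

```python
# Illustrative sketch: MDS on precomputed distances + ordered logit ranking.
import numpy as np
from sklearn.manifold import MDS
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n_images = 60

# Pre-calculated pairwise distances between images (here: random symmetric matrix).
A = rng.random((n_images, n_images))
D = (A + A.T) / 2
np.fill_diagonal(D, 0.0)

# Step 1: reduce each image to as many dimensions as there are preference levels (3).
Z = MDS(n_components=3, dissimilarity="precomputed", random_state=0).fit_transform(D)

# Simulated ordinal feedback (0 < 1 < 2) on a handful of rated images.
rated = rng.choice(n_images, size=30, replace=False)
prefs = np.digitize(Z[rated, 0], np.quantile(Z[rated, 0], [1 / 3, 2 / 3]))

# Step 2: proportional odds (ordered logit) model of preference given reduced features.
res = OrderedModel(prefs, Z[rated], distr="logit").fit(method="bfgs", disp=False)

# Rank the whole database by the estimated probability of the highest preference level.
p_levels = res.predict(Z)            # shape (n_images, 3): P(pref = 0, 1, 2)
ranking = np.argsort(-p_levels[:, 2])
print("top-5 image indices:", ranking[:5])
```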

Cognitive Neuroscience; Feature vector; Dimensionality reduction; Pattern recognition; Probability density function; Conditional probability distribution; Content-based image retrieval; Computer Science Applications; Weighting; Artificial intelligence; Data mining; Image retrieval; Semantic gap; Mathematics; Neurocomputing
researchProduct

A principled approach to network-based classification and data representation

2013

Measures of similarity are fundamental in pattern recognition and data mining. Typically the Euclidean metric is used in this context, weighting all variables equally and therefore assuming equal relevance, which is very rare in real applications. In contrast, given an estimate of a conditional density function, the Fisher information calculated in primary data space implicitly measures the relevance of variables in a principled way by reference to auxiliary data such as class labels. This paper proposes a framework that uses a distance metric based on Fisher information to construct similarity networks that achieve a more informative and principled representation of data. The framework ena…
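
The general construction can be illustrated with a simplified stand-in: estimate each feature's relevance to the class labels, use a correspondingly weighted Euclidean distance, and build a nearest-neighbour similarity network in that space. Mutual information replaces the Fisher-information metric here, so this is only a sketch of the idea, not the paper's method.

```python
# Relevance-weighted distance + k-NN similarity network (simplified illustration).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import kneighbors_graph

X, y = load_iris(return_X_y=True)

# Per-feature relevance with respect to the labels (auxiliary data).
w = mutual_info_classif(X, y, random_state=0)
w = w / w.sum()

# Weighted Euclidean distance == ordinary Euclidean distance after scaling by sqrt(w).
X_scaled = X * np.sqrt(w)

# Similarity network: symmetric k-nearest-neighbour graph in the weighted space.
A = kneighbors_graph(X_scaled, n_neighbors=5, mode="connectivity")
A = ((A + A.T) > 0).astype(int)
print("edges in the similarity network:", A.sum() // 2)
```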

Cognitive Neuroscience; Fisher kernel; Pattern recognition; Probability density function; Conditional probability distribution; External Data Representation; Computer Science Applications; Weighting; Euclidean distance; Data point; Artificial intelligence; Data mining; Fisher information; Mathematics; Neurocomputing
researchProduct

Perceptually weighted optical flow for motion-based segmentation in MPEG-4 paradigm

2000

In the MPEG-4 paradigm, the sequence must be described in terms of meaningful objects. This meaningful, high-level representation should emerge from low-level primitives such as optical flow and prediction error which are the basic elements of previous-generation video coders. The accuracy of the high-level models strongly depends on the robustness of the primitives used. It is shown how perceptual weighting in optical flow computation gives rise to better motion estimates which consistently improve motion-based segmentation compared to equivalent unweighted motion estimates.
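
A rough sketch of the idea: compute dense optical flow, attach a weight to each flow vector using a simple perceptual proxy (local contrast, since flow is unreliable in flat regions), and feed those weights into a motion-based segmentation. The contrast weight and the k-means clustering below are illustrative substitutes for the paper's perceptual weighting and MPEG-4 object segmentation.

```python
# Weighted optical flow feeding a crude motion-based segmentation (illustrative only).
import cv2
import numpy as np
from sklearn.cluster import KMeans

# Synthetic pair of frames: a textured square translated by a few pixels.
rng = np.random.default_rng(0)
prev = (rng.random((120, 160)) * 40).astype(np.uint8)
prev[40:80, 60:100] += 120
curr = np.roll(prev, shift=(0, 3), axis=(0, 1))

# Dense optical flow (Farneback) between consecutive frames.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Crude perceptual weight: local gradient magnitude of the reference frame.
gx = cv2.Sobel(prev, cv2.CV_32F, 1, 0)
gy = cv2.Sobel(prev, cv2.CV_32F, 0, 1)
weight = np.sqrt(gx ** 2 + gy ** 2).ravel()
weight = weight / (weight.max() + 1e-9)

# Motion-based segmentation: cluster flow vectors, giving more influence to reliable ones.
vectors = flow.reshape(-1, 2)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    vectors, sample_weight=weight)
segmentation = labels.reshape(prev.shape)
print("moving-object pixels:", int(segmentation.sum()))
```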

Mean squared prediction error; Optical flow; Perceptual weighting; Optical flow computation; Robustness (computer science); Motion estimation; MPEG-4; Computer vision; Segmentation; Computer Science::Multimedia; Artificial intelligence; Electrical and Electronic Engineering; Mathematics; Electronics Letters
researchProduct

Computational Techniques for the Analysis of Small Signals in High-Statistics Neutrino Oscillation Experiments

2020

The current and upcoming generation of Very Large Volume Neutrino Telescopes – collecting unprecedented quantities of neutrino events – can be used to explore subtle effects in oscillation physics, such as (but not restricted to) the neutrino mass ordering. The sensitivity of an experiment to these effects can be estimated from Monte Carlo simulations. With the high number of events that will be collected, there is a trade-off between the computational expense of running such simulations and the inherent statistical uncertainty in the determined values. In such a scenario, it becomes impractical to produce and use adequately-sized sets of simulated events with traditional methods, such as M…
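
The core smoothing idea can be shown in a few lines: estimate the expected distribution of an observable from a limited set of weighted Monte Carlo events with a kernel density estimate rather than a raw histogram, which fluctuates more when the simulated sample is small. The event sample, the weights, and the binning below are purely schematic, not the analysis chain of the paper.

```python
# Weighted KDE vs. weighted histogram of simulated events (schematic illustration).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Simulated events: reconstructed energies plus per-event weights
# (e.g. flux x cross-section x oscillation probability).
energies = rng.lognormal(mean=1.0, sigma=0.5, size=5_000)
weights = rng.uniform(0.5, 1.5, size=5_000)

# KDE of the observable; gaussian_kde accepts per-event weights (SciPy >= 1.2).
kde = gaussian_kde(energies, weights=weights)

grid = np.linspace(0.5, 10.0, 200)
smooth = kde(grid) * weights.sum()            # smoothed expected event density

# Raw weighted histogram for comparison; noisier for small Monte Carlo sets.
hist, edges = np.histogram(energies, bins=40, range=(0.5, 10.0), weights=weights)
print("peak of the smoothed expectation:", float(smooth.max()))
```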

Nuclear and High Energy Physics; Monte Carlo method; data analysis methods; Data analysis; Detector; KDE; Smoothing; Statistics; statistical analysis; numerical methods; Neutrino; Neutrino mass ordering; Neutrino oscillation; VLVνT; IceCube; Observable; Sensitivity (control systems); Weighting; FOS: Physical sciences; High Energy Physics - Experiment (hep-ex); Instrumentation and Methods for Astrophysics (astro-ph.IM); Data Analysis, Statistics and Probability (physics.data-an); Physics and Astronomy; Elementary particle physics; Instrumentation; ddc:530
researchProduct

Adaptive variable structure fuzzy neural identification and control for a class of MIMO nonlinear system

2013

This paper presents a novel adaptive variable structure (AVS) method for designing a fuzzy neural network (FNN). The AVS-FNN is based on radial basis function (RBF) neurons, each with a center and a width vector. The network performs sequential learning through a sliding data window that reflects dynamic changes in the system, together with dynamic growing and pruning of the FNN structure. The salient characteristics of the AVS-FNN are as follows: (1) Structure learning and parameter estimation are performed automatically and simultaneously, without partitioning the input space or selecting initial parameters a priori; the structure-learning approach relies on the contribution of the size of the output. (2) A set of fuzzy r…
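
A heavily simplified sketch of what "growing" an RBF network online looks like: a neuron is added whenever the current prediction error is large, and only the output weights are adapted otherwise. The thresholds, the toy plant, and the update rules are illustrative assumptions; this is not the AVS-FNN algorithm of the paper (in particular, it omits the sliding window and the pruning step).

```python
# Minimal online growing-RBF-network sketch (illustrative rules and thresholds only).
import numpy as np

rng = np.random.default_rng(0)
centers, widths, heights = [], [], []          # RBF centres, widths, output weights

def predict(x):
    if not centers:
        return 0.0
    c, s, h = np.array(centers), np.array(widths), np.array(heights)
    phi = np.exp(-np.sum((x - c) ** 2, axis=1) / (2 * s ** 2))
    return float(phi @ h)

grow_threshold = 0.2       # add a neuron when the error is large (assumed value)
learning_rate = 0.05

# Stream of input/output pairs from an (unknown) nonlinear system.
for _ in range(2_000):
    x = rng.uniform(-1, 1, size=2)
    y = np.sin(3 * x[0]) * np.cos(2 * x[1])   # stand-in for the plant output

    err = y - predict(x)
    if abs(err) > grow_threshold or not centers:
        # Structure learning: spawn a new RBF neuron centred at the current input.
        centers.append(x.copy()); widths.append(0.3); heights.append(err)
    else:
        # Parameter learning: gradient step on the output weights only.
        c, s = np.array(centers), np.array(widths)
        phi = np.exp(-np.sum((x - c) ** 2, axis=1) / (2 * s ** 2))
        heights = list(np.array(heights) + learning_rate * err * phi)

print("neurons grown:", len(centers), " last error:", abs(err))
```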

fuzzy neural network; Artificial neural network; Neuro-fuzzy; Computer Networks and Communications; Applied Mathematics; Process (computing); Fuzzy logic; Weighting; Control and Systems Engineering; Control theory; Signal Processing; A priori and a posteriori; Radial basis function; Sequence learning; Mathematics
researchProduct

Can Dasymetric Mapping Significantly Improve Population Data Reallocation in a Dense Urban Area?

2016

The issue of reallocating population figures from a set of geographical units onto another set of units has received a great deal of attention in the literature. Every other day, a new algorithm is proposed, claiming that it outperforms competitor procedures. Unfortunately, when the new (usually more complex) methods are applied to a new data set, the improvements attained are sometimes just marginal. The relationship cost-effectiveness of the solutions is case-dependent. The majority of studies have focused on large areas with heterogeneous population density distributions. The general conclusion is that as a rule more sophisticated methods are worth the effort. It could be argued, however…
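
The two reallocation strategies the abstract weighs against each other can be contrasted in a compact sketch: plain areal weighting distributes a source zone's population to target zones in proportion to intersected area, while a binary dasymetric variant first masks out uninhabited land. The geometries, population figures, and residential mask below are made up for illustration.

```python
# Areal weighting vs. binary dasymetric reallocation (toy geometries and figures).
import geopandas as gpd
from shapely.geometry import box

# Source zones with known population, and smaller target zones we want estimates for.
src = gpd.GeoDataFrame({"pop": [1000, 400]},
                       geometry=[box(0, 0, 2, 2), box(2, 0, 4, 2)])
tgt = gpd.GeoDataFrame({"tgt_id": [0, 1, 2, 3]},
                       geometry=[box(0, 0, 1, 2), box(1, 0, 2, 2),
                                 box(2, 0, 3, 2), box(3, 0, 4, 2)])

def reallocate(source):
    """Distribute source population to targets proportionally to intersected area."""
    source = source.copy()
    source["src_area"] = source.geometry.area
    pieces = gpd.overlay(tgt, source, how="intersection")
    pieces["pop_share"] = pieces["pop"] * pieces.geometry.area / pieces["src_area"]
    return pieces.groupby("tgt_id")["pop_share"].sum()

# 1) Plain areal weighting.
print(reallocate(src))

# 2) Binary dasymetric mapping: clip source zones to inhabited land first
#    (a made-up residential mask), then reallocate as before.
residential = gpd.GeoDataFrame(geometry=[box(0, 0, 1, 2), box(2, 0, 4, 1)])
src_masked = gpd.overlay(src, residential, how="intersection")
print(reallocate(src_masked))
```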

geography; Computer science; Geography, Planning and Development; Population; Urban area; Weighting; Data set; Set (abstract data type); Variable (computer science); Dasymetric map; Statistics; Econometrics; education; Interpolation; Earth-Surface Processes; Geographical Analysis
researchProduct

An efficient grid-based RF fingerprint positioning algorithm for user location estimation in heterogeneous small cell networks

2014

This paper proposes a novel technique to enhance the performance of a grid-based Radio Frequency (RF) fingerprint position estimation framework. The first enhancement is the introduction of two overlapping grids of training signatures. As the second enhancement, the location of the testing signature is estimated as the weighted geometric center of a set of nearest grid units, whereas in traditional grid-based RF fingerprinting only the center point of the nearest grid unit is used to determine the user location. By using this weighting-based location estimation, the accuracy of the location estimation can be improved. The performance evaluation of the enhanced RF fingerprinting algorithm was c…
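
The weighted-geometric-centre step described above is easy to sketch: compare the test signature to the stored grid signatures in signal space, take the K closest grid units, and average their coordinates with similarity-based weights. The grid layout, the path-loss model, and the inverse-distance weights below are illustrative assumptions, not the paper's setup.

```python
# Weighted geometric centre over the K nearest grid units (illustrative data).
import numpy as np

rng = np.random.default_rng(0)
anchors = np.array([[0, 0], [9, 0], [0, 9], [9, 9]], dtype=float)   # base stations

# Training grid: one mean RSSI signature (4 base stations) per grid-unit centre.
grid_coords = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float)
grid_rssi = -40 - 2.0 * np.linalg.norm(
    grid_coords[:, None, :] - anchors[None, :, :], axis=2)          # shape (100, 4)

# Test signature measured somewhere inside the grid (with noise).
true_pos = np.array([3.4, 6.2])
test_rssi = -40 - 2.0 * np.linalg.norm(true_pos[None, :] - anchors, axis=1)
test_rssi += rng.normal(0, 1.0, size=4)

# Distance in signal space, K nearest grid units, inverse-distance weights.
d = np.linalg.norm(grid_rssi - test_rssi, axis=1)
K = 4
nearest = np.argsort(d)[:K]
w = 1.0 / (d[nearest] + 1e-9)

# Weighted geometric centre of the K nearest grid units.
estimate = (grid_coords[nearest] * w[:, None]).sum(axis=0) / w.sum()
print("estimated position:", estimate, " true position:", true_pos)
```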

grid-based RF fingerprint; Kullback-Leibler divergence; Position (vector); Computer science; Fingerprint (computing); Point (geometry); Small cell; Radio frequency; Grid; Algorithm; Weighting; Interpolation; minimization of drive tests
researchProduct

Dealing with risk: Gender, stakes, and probability effects

2015

This paper investigates how subjects deal with financial risk, both "upside" (with a small chance of a high payoff) and "downside" (with a small chance of a low payoff). We find that the same people who avoid risk in the downside setting tend to make more risky choices in the upside one. The experiment is designed to disentangle the probability-weighting and utility-curvature components of risk attitudes, and to differentiate settings in which gender differences arise from those in which they do not. Women are more risk averse for downside risks, but gender differences are diminished for upside risks.
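
The two components the experiment disentangles can be made concrete with a worked toy calculation under rank-dependent utility: a concave utility captures curvature, while an inverse-S probability weighting function overweights the small probabilities that define upside and downside gambles. The functional forms (power utility, Tversky-Kahneman weighting) and the parameter values are illustrative, not estimates from the paper.

```python
# Toy rank-dependent utility calculation separating curvature and probability weighting.
def weight(p, gamma):
    """Tversky-Kahneman probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def rdu(outcomes, probs, alpha, gamma):
    """Rank-dependent utility of a two-outcome gamble with x1 > x2."""
    (x1, x2), (p1, _) = outcomes, probs
    u = lambda x: x ** alpha
    w1 = weight(p1, gamma)                  # decision weight on the better outcome
    return w1 * u(x1) + (1 - w1) * u(x2)

alpha, gamma = 0.8, 0.6                     # concave utility, inverse-S weighting (assumed)

# "Upside" gamble: small chance of a high payoff; "downside": small chance of a low one.
upside = rdu((100.0, 1.0), (0.05, 0.95), alpha, gamma)
downside = rdu((10.0, 1.0), (0.95, 0.05), alpha, gamma)

sure_upside = (0.05 * 100 + 0.95 * 1) ** alpha
sure_downside = (0.95 * 10 + 0.05 * 1) ** alpha
print("upside:   gamble", round(upside, 2), "vs expected value for sure", round(sure_upside, 2))
print("downside: gamble", round(downside, 2), "vs expected value for sure", round(sure_downside, 2))
```

With these illustrative parameters the overweighted 5% probability makes the upside gamble more attractive than its expected value for sure, while the downside gamble is less attractive, mirroring the pattern of choices the abstract describes.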

JEL: C91; JEL: G02; risk aversion; probability weighting; rank-dependent utility; gender differences; experiments
researchProduct

Unstable feature relevance in classification tasks

2011

knowledge discovery; datasets; data management; artificial intelligence; feature relevance; feature weighting; relevance; feature selection; machine learning; classification; analysis; ensemble learning; databases; data mining; information retrieval; clustering
researchProduct