Search results for "methodologie"

Showing 10 of 2,141 documents

Surface Reconstruction Based on a Descriptive Approach

2000

The design of complex surfaces is generally hard to achieve. A natural method is to subdivide the global surface into basic surface elements. The different elements are designed independently and then assembled to represent the final surface. This method requires a classification and a formal description of the basic elements. This chapter presents a general framework for surface description based on a constructive tree approach, in which the leaves are surface primitives and the nodes are constructive operators.
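The constructive-tree idea in the abstract (leaves are surface primitives, internal nodes are constructive operators) can be sketched as a small data structure. This is an illustrative sketch, not the chapter's formalism: the `Primitive`/`Operator` classes and the `join`/`blend` operator names are invented for the example.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Primitive:
    """Leaf of the constructive tree: a basic surface element."""
    name: str                                  # e.g. a Bezier-patch identifier

    def describe(self) -> str:
        return self.name

@dataclass
class Operator:
    """Internal node: a constructive operator applied to sub-surfaces."""
    name: str                                  # e.g. "join", "blend" (illustrative)
    children: List["Node"]

    def describe(self) -> str:
        return f"{self.name}({', '.join(c.describe() for c in self.children)})"

Node = Union[Primitive, Operator]

# Final surface = blend(join(patchA, patchB), patchC)
tree = Operator("blend", [
    Operator("join", [Primitive("patchA"), Primitive("patchB")]),
    Primitive("patchC"),
])
print(tree.describe())  # blend(join(patchA, patchB), patchC)
```

Evaluating such a tree bottom-up (here just pretty-printing it) mirrors how the final surface is assembled from independently designed elements.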

Bézier surface; Surface (mathematics); Tree (data structure); Theoretical computer science; Computer science; Descriptive research; Constructive; Surface reconstruction; Formal description; ComputingMethodologies_COMPUTERGRAPHICS; Subdivision

A Fuzzy Logic C-Means Clustering Algorithm to Enhance Microcalcifications Clusters in Digital Mammograms

2011

The detection of microcalcifications is a hard task, since they are quite small and often poorly contrasted against the background of the image. Computer-Aided Detection (CAD) systems could therefore be very useful for breast cancer control. In this paper, we report a method to enhance microcalcification clusters in digital mammograms. A fuzzy logic clustering algorithm with a set of features is used for clustering microcalcifications. The method was tested on simulated clusters of microcalcifications, so that the location of each cluster within the breast and the exact number of microcalcifications are known.
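A minimal fuzzy c-means sketch (NumPy) illustrates the kind of clustering the paper applies to microcalcification features. The synthetic 2-D data, c = 2 clusters and fuzzifier m = 2 are assumptions for the example, not the paper's actual features or configuration.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means: alternate center and membership updates."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, X.shape[0]))
    u /= u.sum(axis=0)                          # memberships sum to 1 per point
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ X) / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)                   # avoid division by zero
        p = 2.0 / (m - 1.0)
        u = (d ** -p) / (d ** -p).sum(axis=0)   # u_ik = 1 / sum_j (d_ik/d_jk)^p
    return centers, u

# Two tight synthetic clusters around (0,0) and (1,1)
rng_data = np.random.default_rng(42)
X = np.vstack([rng_data.normal(0.0, 0.05, (20, 2)),
               rng_data.normal(1.0, 0.05, (20, 2))])
centers, u = fuzzy_c_means(X)
```

Unlike hard k-means, each point keeps a graded membership in every cluster, which suits poorly contrasted structures such as microcalcifications.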

C-means; Computer-aided detection (CAD); Computer science; Fuzzy logic; Set (abstract data type); Cluster (physics); Medicine; Mammography; Cancer; Computer vision; Classification; Cluster analysis; Breast; Pattern recognition; Image enhancement; Settore FIS/07 - Fisica Applicata (Beni Culturali Ambientali Biol. e Medicin); Microcalcifications; Clustering; ComputingMethodologies_PATTERNRECOGNITION; Artificial intelligence

Transverse-momentum-dependent Multiplicities of Charged Hadrons in Muon-Deuteron Deep Inelastic Scattering

2017

A semi-inclusive measurement of charged hadron multiplicities in deep inelastic muon scattering off an isoscalar target was performed using data collected by the COMPASS Collaboration at CERN. The following kinematic domain is covered by the data: photon virtuality $Q^{2}>1$ (GeV/$c$)$^2$, invariant mass of the hadronic system $W > 5$ GeV/$c^2$, Bjorken scaling variable in the range $0.003 < x < 0.4$, fraction of the virtual photon energy carried by the hadron in the range $0.2 < z < 0.8$, and square of the hadron transverse momentum with respect to the virtual photon direction in the range $0.02~(\mathrm{GeV}/c)^2 < P_{\mathrm{hT}}^{2} < 3~(\mathrm{GeV}/c)^2$. The multiplicities are pres…

CERN Lab; ComputerSystemsOrganization_COMPUTERSYSTEMIMPLEMENTATION; MULTIPLICITIES; dimension: 3; PT DEPENDENT; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; FOS: Physical sciences; ComputerApplications_COMPUTERSINOTHERSYSTEMS; target: isoscalar; muon deuteron: deep inelastic scattering; [PHYS.NEXP] Physics [physics]/Nuclear Experiment [nucl-ex]; nucl-ex; transverse momentum dependence; hadron: transverse momentum; SIDIS; COMPASS; GeneralLiterature_MISCELLANEOUS; High Energy Physics - Experiment (hep-ex); scaling: Bjorken; Subatomic Physics; charged particle: multiplicity; [PHYS.HEXP] Physics [physics]/High Energy Physics - Experiment [hep-ex]; mass: hadronic; Nuclear Physics - Experiment; Nuclear Experiment (nucl-ex); quantum chromodynamics: perturbation theory; DIS; hadron: multiplicity; effect: nonperturbative; perturbation theory: higher-order; CERN SPS; photon: energy; semi-inclusive reaction; ComputingMethodologies_PATTERNRECOGNITION; kinematics; High Energy Physics::Experiment; Particle Physics - Experiment; experimental results; photon: virtual

Measurement of the lifetime of the tau lepton

1996

The tau lepton lifetime is measured with the L3 detector at LEP using the complete data taken at centre-of-mass energies around the Z pole, resulting in tau_tau = 293.2 +/- 2.0 (stat) +/- 1.5 (syst) fs. The comparison of this result with the muon lifetime supports lepton universality of the weak charged current at the level of six per mille. Assuming lepton universality, the value of the strong coupling constant alpha_s is found to be alpha_s(m_tau^2) = 0.319 +/- 0.015 (exp.) +/- 0.014 (theory).

COLLISIONS; Nuclear and High Energy Physics; Particle physics; LUND MONTE-CARLO; PAIR PRODUCTION; Electron–positron annihilation; FOS: Physical sciences; Elementary particle; ddc:500.2; 01 natural sciences; 7. Clean energy; Resonance (particle physics); JET FRAGMENTATION; DECAYS; High Energy Physics - Experiment (hep-ex); Nuclear physics; Particle decay; 0103 physical sciences; [PHYS.HEXP] Physics [physics]/High Energy Physics - Experiment [hep-ex]; SILICON MICROVERTEX DETECTOR; PRECISE MEASUREMENT; Limit (mathematics); QCD ANALYSIS; 010306 general physics; L3 EXPERIMENT; Coupling constant; Physics; Muon; Annihilation; TEST BEAM; E+E- PHYSICS; 010308 nuclear & particles physics; ALPHA(S); High Energy Physics::Phenomenology; Detector; Pair production; SPECTRAL FUNCTIONS; ComputingMethodologies_DOCUMENTANDTEXTPROCESSING; High Energy Physics::Experiment; Particle Physics - Experiment; Lepton; Nuclear and Particle Physics Proceedings

Stronger proprioceptive BOLD-responses in the somatosensory cortices reflect worse sensorimotor function in adolescents with and without cerebral palsy

2020

Graphical abstract

CP syndrome (CP-oireyhtymä); CHILDREN; SM1; PASSIVE FINGER; DP = diplegic; 3124 Neurology and psychiatry; EVOKED-POTENTIALS; BRAIN; Child; MOTOR CORTEX; Passive movement; TE = echo time; EM = expectation maximization; kinesthesia (liikeaisti); BOLD = Blood-Oxygen-Level-Dependent signal; Regular Article; Magnetic Resonance Imaging; TD = typically-developed; TR = repetition time; SII; GMFCS = Gross Motor Function Classification System; MANCOVA = multivariate analysis of covariance; EPI = echo planar imaging; HP = hemiplegic; fMRI = functional magnetic resonance imaging; Female; TACTILE STIMULATION; paralysis (halvaus); AGE-RELATED DIFFERENCES; Adolescent; Computer applications to medicine. Medical informatics; R858-859.7; Hemiplegia; ORGANIZATION; Diplegia; sense of touch (tuntoaisti); MOVEMENT; SIPT = Sensory Integration and Praxis Tests; ROI = regions of interest; Humans; SI; SII cortex = secondary somatosensory cortex; CP = cerebral palsy; RC346-429; ComputingMethodologies_COMPUTERGRAPHICS; GLM = General Linear Model; Cerebral Palsy; 3112 Neurosciences; SPM = Statistical Parametric Mapping; Somatosensory Cortex; Hand; Proprioception; SI cortex = primary somatosensory cortex; GABA CONCENTRATION; Kinesthesia; Neurology. Diseases of the nervous system; PSC = percent signal change

A Fast GPU-Based Motion Estimation Algorithm for H.264/AVC

2012

H.264/AVC is the most recent predictive video compression standard; it outperforms other existing video coding standards at the cost of higher computational complexity. In recent years, heterogeneous computing has emerged as a cost-efficient solution for high-performance computing. In the literature, several algorithms have been proposed to accelerate video compression, but so far there have not been many solutions that deal with video codecs on heterogeneous systems. This paper proposes an algorithm to perform H.264/AVC inter prediction. The proposed algorithm performs the motion estimation, both with full-pixel and sub-pixel accuracy, using CUDA to assist the CPU, obtaining remarkable time …
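The full-pixel motion estimation mentioned above is classically a full-search block match minimising the sum of absolute differences (SAD); GPU implementations parallelise this kernel per block. The sketch below is a plain NumPy illustration of that kernel, not the paper's CUDA code; block size and search range are assumptions.

```python
import numpy as np

def full_search(ref, cur, bx, by, block=8, rng=4):
    """Best (dy, dx) motion vector for the block at (by, bx) of `cur` in `ref`."""
    h, w = ref.shape
    target = cur[by:by+block, bx:bx+block].astype(np.int32)
    best, best_mv = None, (0, 0)
    for dy in range(-rng, rng + 1):
        for dx in range(-rng, rng + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > h or x + block > w:
                continue                        # candidate outside the frame
            cand = ref[y:y+block, x:x+block].astype(np.int32)
            sad = np.abs(target - cand).sum()   # sum of absolute differences
            if best is None or sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best

ref = np.zeros((32, 32), dtype=np.uint8)
ref[10:18, 12:20] = 200                         # bright block in the reference
cur = np.zeros((32, 32), dtype=np.uint8)
cur[12:20, 14:22] = 200                         # same block shifted by (+2, +2)
mv, sad = full_search(ref, cur, bx=14, by=12)
print(mv)  # (-2, -2)
```

Each candidate displacement is independent of the others, which is exactly what makes this search map well onto thousands of GPU threads.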

CUDA; Computational complexity theory; Computer science; Motion estimation; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Codec; Symmetric multiprocessor system; Image processing; Data_CODINGANDINFORMATIONTHEORY; Central processing unit; Parallel computing; Data compression

Fuzzy subgroup mining for gene associations

2004

When studying the therapeutic efficacy of potential new drugs, it would be much more efficient to use predictors to assess their toxicity before going into clinical trials. One promising line of research has focused on the discovery of sets of candidate gene profiles to be used as toxicity indicators in future drug development. In particular, genomic microarrays may be used to analyze the causality relationship between the administration of a drug and the so-called gene expression, a parameter typically used by biologists to measure the drug's influence at the gene level. This kind of experiment involves a high-throughput analysis of noisy and particularly unreliable data, which makes the …

Candidate gene; Apriori algorithm; Measure (data warehouse); Fuzzy control system; Biology; Causality; Fuzzy logic; ComputingMethodologies_PATTERNRECOGNITION; Drug development; Data mining; ddc:004; Throughput (business)

Unmanned aerial system imagery and photogrammetric canopy height data in area-based estimation of forest variables

2015

In this paper we examine the feasibility of data from unmanned aerial vehicle (UAV)-borne aerial imagery in stand-level forest inventory. As airborne sensor platforms, UAVs offer advantages in cost and flexibility over traditional manned aircraft in forest remote sensing applications over small areas, but they lack the range and endurance required for larger areas. On the other hand, advances in the processing of digital stereo photography make it possible to produce three-dimensional (3D) forest canopy data from images acquired with simple lightweight digital camera sensors. In this study, an aerial image orthomosaic and 3D photogrammetric canopy height data were derived from the images acquired …

Canopy; Aerial survey; UAV; ta1172; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; ta1171; ComputerApplications_COMPUTERSINOTHERSYSTEMS; Photogrammetric surface model; Basal area; Aerial photography; Aerial imagery; lcsh:Forestry; Forest inventory; Canopy height model; Remote sensing; ta113; Ecological Modeling; Forestry; ta4112; Unmanned aerial system; Photogrammetry; lcsh:SD1-669.5; Environmental science; Woody plant; Silva Fennica

GridNet with Automatic Shape Prior Registration for Automatic MRI Cardiac Segmentation

2018

In this paper, we propose a fully automatic MRI cardiac segmentation method based on a novel deep convolutional neural network (CNN) designed for the 2017 ACDC MICCAI challenge. The novelty of our network lies in its embedded shape prior and its loss function tailored to the cardiac anatomy. Our model includes a cardiac center-of-mass regression module which allows for an automatic shape prior registration. Also, since our method processes raw MR images without any manual preprocessing and/or image cropping, our CNN learns both high-level features (useful to distinguish the heart from other organs with a similar shape) and low-level features (useful to obtain accurate segmentation results).…
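As a rough illustration of the shape-prior registration step, the sketch below computes a binary mask's centroid and translates a prior mask so that its centroid lands on a predicted cardiac center of mass. The paper regresses that center with a CNN module; here the target center, mask sizes and helper names are assumptions invented for the example.

```python
import numpy as np

def centroid(mask):
    """(row, col) center of mass of a binary mask."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

def align_prior(prior, target_center):
    """Integer-translate `prior` so its centroid matches `target_center`."""
    cy, cx = centroid(prior)
    dy = int(round(target_center[0] - cy))
    dx = int(round(target_center[1] - cx))
    out = np.zeros_like(prior)
    h, w = prior.shape
    ys, xs = np.nonzero(prior)
    ys2, xs2 = ys + dy, xs + dx
    keep = (ys2 >= 0) & (ys2 < h) & (xs2 >= 0) & (xs2 < w)
    out[ys2[keep], xs2[keep]] = 1               # pixels shifted off-canvas are dropped
    return out

prior = np.zeros((16, 16), dtype=np.uint8)
prior[2:6, 2:6] = 1                             # prior shape near the top-left
aligned = align_prior(prior, (10.5, 10.5))      # hypothetical predicted center
```

Registering the prior by centroid alone is deliberately crude; it only conveys why predicting the heart's center of mass makes an automatic registration possible.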

Cardiac anatomy; Computer science; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Novelty; 030204 cardiovascular system & hematology; Grid; Convolutional neural network; Accurate segmentation; 030218 nuclear medicine & medical imaging; 03 medical and health sciences; 0302 clinical medicine; Fully automatic; Preprocessor; Segmentation; Computer vision; Artificial intelligence

Rethinking the sGLOH Descriptor

2018

sGLOH (shifting GLOH) is a histogram-based keypoint descriptor that can be associated with multiple quantized rotations of the keypoint patch without any recomputation. This property can be exploited to define the best distance between two descriptor vectors, thus avoiding the computation of the dominant orientation. In addition, sGLOH can reject incongruous correspondences by adding a global constraint on the rotations, either as a priori knowledge or based on the data. This paper thoroughly reconsiders sGLOH and improves it in terms of robustness, speed and descriptor dimension. The revised sGLOH embeds more quantized rotations, thus yielding more correct matches. A novel fast matching scheme is a…
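The shift-matching idea can be illustrated in a few lines: a descriptor built from q sector histograms is "rotated" by cyclically shifting whole sector blocks, and the minimum distance over all q shifts stands in for dominant-orientation estimation. The sector count and bin sizes below are illustrative, not sGLOH's actual layout.

```python
import numpy as np

def shifted_distance(d1, d2, q=8):
    """Min Euclidean distance over the q cyclic sector shifts of d2."""
    sectors = d2.reshape(q, -1)                 # q per-sector histograms
    best = np.inf
    for k in range(q):
        rotated = np.roll(sectors, k, axis=0).ravel()  # rotate patch by k sectors
        best = min(best, float(np.linalg.norm(d1 - rotated)))
    return best

rng = np.random.default_rng(0)
d = rng.random(8 * 16)                          # q=8 sectors of 16 bins each
d_rot = np.roll(d.reshape(8, 16), 3, axis=0).ravel()   # same patch, rotated
print(shifted_distance(d, d_rot))  # 0.0
```

Because each shift is just a block permutation of the same vector, no part of the descriptor is recomputed for the candidate rotations, which is the property the abstract highlights.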

Cascade matching; 0209 industrial biotechnology; Histogram binarization; RFD; Computer science; GLOH; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; 02 engineering and technology; CNN descriptor; LIOP; 020901 industrial engineering & automation; MROGH; Artificial Intelligence; Robustness (computer science); Keypoint matching; SIFT; sGLOH; Histogram; 0202 electrical engineering, electronic engineering, information engineering; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Settore INF/01 - Informatica; Applied Mathematics; Cognitive neuroscience of visual object recognition; Pattern recognition; Rotation invariant descriptors; MIOP; Computational Theory and Mathematics; Principal component analysis; 020201 artificial intelligence & image processing; Computer Vision and Pattern Recognition; Artificial intelligence; Software