Search results for "VALIDATION"

Showing 10 of 313 documents

A Simple Cluster Validation Index with Maximal Coverage

2017

Clustering is an unsupervised technique for detecting general, distinct profiles in a given dataset. Just as there are many different clustering methods and algorithms, there are many cluster validation methods and indices for suggesting the number of clusters. The purpose of this paper is, firstly, to propose a new, simple internal cluster validation index. The index has maximal coverage: even a single cluster, i.e., the absence of any division of the dataset into disjoint subsets, can be detected. Secondly, the proposed index is compared to the indices available in five different packages implemented in R or Matlab to assess its utility. The comparison also suggests many interesting f…
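The paper's own index is not reproduced in this snippet; as a hedged illustration of what an internal cluster validation index computes, here is a minimal silhouette coefficient in plain Python (note that, unlike the proposed index, the silhouette is undefined for a single cluster, so it lacks the "maximal coverage" property):

```python
def silhouette(points, labels):
    """Mean silhouette coefficient: an internal cluster validation index.

    For each point, a = mean distance to its own cluster and b = smallest
    mean distance to any other cluster; the score is (b - a) / max(a, b).
    """
    def dist(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5

    clusters = {}
    for p, l in zip(points, labels):
        clusters.setdefault(l, []).append(p)

    scores = []
    for p, l in zip(points, labels):
        own = [dist(p, q) for q in clusters[l] if q is not p]
        if not own:                 # singleton cluster: score defined as 0
            scores.append(0.0)
            continue
        a = sum(own) / len(own)
        b = min(
            sum(dist(p, q) for q in other) / len(other)
            for k, other in clusters.items() if k != l
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Two well-separated clusters score close to +1.
pts = [(0, 0), (0, 1), (10, 0), (10, 1)]
print(round(silhouette(pts, [0, 0, 1, 1]), 3))  # → 0.9
```

Packages such as those the paper compares against (in R or Matlab) wrap indices like this behind a single "suggest the number of clusters" call.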

Computing Methodologies: Pattern Recognition; cluster validation
researchProduct

Analysis, DSP Implementation and Experimental Validation of a Loss Minimization Algorithm Applied to Permanent Magnet Synchronous Motor Drives

2004

In this paper a new loss minimization control algorithm for inverter-fed permanent-magnet synchronous motors (PMSM), which reduces the power losses of the electric drive without penalizing its dynamic performance, is analyzed, experimentally realized and validated. In particular, after a brief review of two loss minimization control strategies (the "search control" and the "loss-model control"), both a modified dynamic model of the PMSM, which takes the iron losses into account, and a "loss-model" control strategy are presented. Experimental tests on a specific PMSM drive employing the proposed loss minimization algorithm were performed to validate the actual implementation.…
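As a rough sketch of the "loss-model control" idea (not the paper's actual model; all parameters and the loss model below are hypothetical and highly simplified), one can choose the d-axis current that minimizes a copper-plus-iron loss expression for a given torque-producing q-axis current:

```python
# Hypothetical, simplified loss model for a surface-mounted PMSM:
# copper loss + iron loss (iron modeled as an equivalent resistance R_c).
RS, RC = 0.5, 200.0          # stator / iron-loss resistance [ohm] (made up)
LD, LQ = 0.01, 0.01          # d/q-axis inductances [H] (made up)
PSI_M = 0.1                  # magnet flux linkage [Wb] (made up)

def losses(i_d, i_q, omega):
    p_cu = RS * (i_d ** 2 + i_q ** 2)
    p_fe = (omega ** 2 / RC) * ((PSI_M + LD * i_d) ** 2 + (LQ * i_q) ** 2)
    return p_cu + p_fe

def best_id(i_q, omega, lo=-10.0, hi=0.0, steps=1000):
    """Loss-model control step: pick the d-axis current minimizing losses."""
    grid = [lo + (hi - lo) * k / steps for k in range(steps + 1)]
    return min(grid, key=lambda i_d: losses(i_d, i_q, omega))

i_d = best_id(i_q=5.0, omega=300.0)
print(round(i_d, 2))  # a small negative i_d lowers the total losses here
```

The "search control" strategy mentioned in the abstract would instead perturb i_d online and watch the measured input power, with no model at all.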

Control systems; Permanent magnet synchronous motor; Computer science; Experimental validation; Settore ING-IND/32 - Convertitori Macchine E Azionamenti Elettrici; Control theory; Efficiency improvement; Motor drives; Loss minimization; Variable speed drives; Synchronous motor; Algorithm; Digital signal processing
researchProduct

Psychometric evaluation of the Finnish version of the impact on participation and autonomy questionnaire in persons with multiple sclerosis

2017

Objective: The objective of this study was to evaluate the psychometric properties of the Impact on Participation and Autonomy (IPA) questionnaire. The Finnish version of the IPA (IPAFin) was produced using the protocol for linguistic validation of patient-reported outcome instruments. Methods: A total of 194 persons with multiple sclerosis (MS) (mean age 50 years, SD 9; 72% female) with moderate to severe disability participated in this study. A confirmatory factor analysis (CFA) was used to confirm the four-factor structure of the IPAFin. The work and educational opportunities domain was excluded from the analysis because it was applicable to only 51 persons. Internal consistency…
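The internal-consistency analysis mentioned at the truncation point is commonly quantified with Cronbach's alpha; a minimal, generic implementation (not the study's actual analysis code) looks like this:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one list of scores per questionnaire item, all over the
    same respondents in the same order.
    """
    k = len(items)

    def var(xs):                          # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two items that score respondents identically -> perfect consistency.
print(round(cronbach_alpha([[1, 2, 3], [1, 2, 3]]), 3))  # → 1.0
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale's domain.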

Cross-Cultural Comparison; Male; Occupational therapy; confirmatory factor analysis; rehabilitation; Psychometrics; assessment; multiple sclerosis; Factor structure; Linguistic validation; Structural equation modeling; Cronbach's alpha; MS-tauti (Finnish: MS disease); Surveys and Questionnaires; Humans; participation; Disabled Persons; autonomy; Finland; osallistuminen (Finnish: participation); Public Health, Environmental and Occupational Health; Reproducibility of Results; Construct validity; Middle Aged; Translating; Cross-Sectional Studies; Personal Autonomy; kuntoutus (Finnish: rehabilitation); Female; Patient Participation; Psychology; Neurology & neurosurgery; Clinical psychology; Scandinavian Journal of Occupational Therapy
researchProduct

Applying a Data Quality Model to Experiments in Software Engineering

2014

Data collection and analysis are key activities in any software engineering experiment; however, the collected data may contain errors. We propose a Data Quality model specific to data obtained from software engineering experiments, which provides a framework for analyzing and improving such data. Applying the model to two controlled experiments uncovered data quality problems that needed to be addressed. We conclude that data quality issues have to be considered before deriving experimental results.
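The paper gives no code; as an illustrative sketch of what rule-based auditing of experiment records can look like (the rule names, fields, and thresholds below are hypothetical, not taken from the paper's model):

```python
# Hypothetical rules in the spirit of a data quality model: completeness
# (no missing fields) and validity (values within a plausible range).
RULES = {
    "completeness": lambda r: all(r.get(k) is not None
                                  for k in ("subject", "task", "minutes")),
    "validity":     lambda r: r.get("minutes") is not None
                              and 0 < r["minutes"] <= 480,
}

def audit(records):
    """Return {rule_name: [indices of records violating that rule]}."""
    problems = {name: [] for name in RULES}
    for i, rec in enumerate(records):
        for name, ok in RULES.items():
            try:
                good = ok(rec)
            except (KeyError, TypeError):
                good = False
            if not good:
                problems[name].append(i)
    return problems

data = [
    {"subject": "S1", "task": "T1", "minutes": 35},
    {"subject": "S2", "task": "T1", "minutes": None},   # incomplete
    {"subject": "S3", "task": "T2", "minutes": 900},    # out of range
]
print(audit(data))  # → {'completeness': [1], 'validity': [1, 2]}
```

Running such checks before analysis is exactly the "consider data quality before obtaining results" conclusion the abstract draws.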

Data collection; Software sizing; Computer science; Data quality; Software construction; Software verification and validation; Computer-aided engineering; Software engineering; Software verification; Data modeling
researchProduct

Sun Induced Fluorescence Calibration and Validation for Field Phenotyping

2018

Reliable measurements of Sun-Induced Fluorescence (SIF) require good instrument characterization as well as a complex processing chain. In this paper, we summarize state-of-the-art SIF retrieval methods and measurement platforms for field phenotyping. Furthermore, we use HyScreen, a hyperspectral imaging system for top-of-canopy measurements of SIF, as an example of the instrument requirements, data processing, and data validation needed to obtain reliable measurements of SIF.
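One classic retrieval typically covered in such surveys is the Fraunhofer Line Discrimination (FLD) method, which exploits a solar absorption band (e.g. O2-A); a minimal sketch, under the standard FLD assumption that reflectance and fluorescence are constant across the band (values below are made up):

```python
def sif_fld(e_in, e_out, l_in, l_out):
    """Standard FLD estimate of sun-induced fluorescence.

    e_*: downwelling irradiance, l_*: upwelling radiance, measured
    inside ('in') and just outside ('out') a solar absorption band.
    If L = r*E + F with constant r and F, this recovers F exactly.
    """
    return (e_out * l_in - e_in * l_out) / (e_out - e_in)

# Synthetic check: reflectance r = 0.3, fluorescence F = 1.5.
e_in, e_out = 20.0, 100.0
l_in, l_out = 0.3 * e_in + 1.5, 0.3 * e_out + 1.5
print(sif_fld(e_in, e_out, l_in, l_out))  # → 1.5
```

Real processing chains such as HyScreen's add instrument characterization, atmospheric effects, and per-pixel validation on top of a retrieval like this.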

Data processing; Field spectrometer; Calibration and validation; Retrieval methods; Meteorology & atmospheric sciences; Field (physics); FIS/06 - FISICA PER IL SISTEMA TERRA E PER IL MEZZO CIRCUMTERRESTRE; Computer science; Sun Induced Fluorescence; GEO/12 - OCEANOGRAFIA E FISICA DELL'ATMOSFERA; Data validation; Hyperspectral measurement; Environmental sciences; Reflectivity; Fluorescence; Field phenotyping; Remote sensing; IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium
researchProduct

A multicenter evaluation of deep learning software (LungQuant) for lung parenchyma characterization in COVID-19 pneumonia

2023

Abstract. Background: The role of computed tomography (CT) in the diagnosis and characterization of coronavirus disease 2019 (COVID-19) pneumonia has been widely recognized. We evaluated the performance of a software package for quantitative analysis of chest CT, the LungQuant system, by comparing its results with independent visual evaluations by a group of 14 clinical experts. The aim of this work is to evaluate the ability of the automated tool to extract quantitative information from lung CT relevant for the design of a diagnosis support model. Methods: LungQuant segments both the lungs and the lesions associated with COVID-19 pneumonia (ground-glass opacities and consolidations) and computes derived…
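The derived quantities such a tool reports typically include the lesion volume and the percentage of lung it occupies; a toy sketch on made-up flat binary masks (illustrative only, not LungQuant's actual code or data):

```python
# Given binary masks (1 = voxel belongs to the class), compute the lesion
# volume and the fraction of the lung it occupies. Masks are invented.
def lesion_stats(lung_mask, lesion_mask, voxel_ml=0.001):
    lung = sum(lung_mask)
    # count only lesion voxels that fall inside the lung mask
    lesion = sum(a and b for a, b in zip(lesion_mask, lung_mask))
    return {"lesion_ml": lesion * voxel_ml,
            "lesion_pct": 100.0 * lesion / lung if lung else 0.0}

lung   = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]
lesion = [0, 1, 1, 0, 0, 0, 0, 0, 1, 0]  # one lesion voxel outside the lung
print(lesion_stats(lung, lesion, voxel_ml=1.0))
# → {'lesion_ml': 2.0, 'lesion_pct': 25.0}
```

In the multicenter evaluation described above, numbers like this percentage are what get compared against the experts' visual scores.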

Deep Learning; Software validation; COVID-19; Radiology, Nuclear Medicine and Imaging; Tomography (X-ray computed); Lung
researchProduct

CREAPP K6-12: A tool to evaluate the creative potential of apps oriented to the design of personal digital storytelling

2018

An instrument is presented that allows primary school teachers to evaluate the potential of playful online K6-12 apps focused on digital storytelling (DST) to develop creativity, so that teachers can select those that can be used in the classroom for that purpose. The validated instrument consists of 48 indicators associated with six dimensions of creativity: flexibility, originality, fluency, problem solving, elaboration of products, and co-editing and dissemination. It was designed based on the opinions of experts in creativity and in Information and Communication Technologies from the fields of didactics, theory of education, and methodology, as well as on the assessments and…
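Expert-validation studies of this kind often report inter-rater agreement with Cohen's kappa (which appears among the article's keywords); a minimal two-rater implementation, separate from whatever analysis the authors actually ran:

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    # observed agreement and agreement expected by chance
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Example: 3/4 raw agreement corrects down to kappa = 0.5.
print(cohens_kappa(["yes", "yes", "no", "no"],
                   ["yes", "yes", "no", "yes"]))  # → 0.5
```

Kappa near 1 indicates that experts agree on an indicator well beyond chance; low values flag indicators needing revision.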

Digital storytelling; Computer science; Primary education; Flexibility (personality); education; pedagogy; Creativity; Focus group; mobile applications; expert validation; Computer Science Applications; Education; Fluency; Cohen's kappa; Originality; Mathematics education
researchProduct

Building Semantic Trees from XML Documents

2016

The distributed nature of the Web, as a decentralized system exchanging information between heterogeneous sources, has underlined the need to manage interoperability, i.e., the ability to automatically interpret information in Web documents exchanged between different sources, which is necessary for efficient information management and search applications. In this context, XML was introduced as a data representation standard that simplifies the tasks of interoperation and integration among heterogeneous data sources, allowing data to be represented in (semi-)structured documents consisting of hierarchically nested elements and atomic attributes. However, while XML was shown most …
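A first step toward such semantic trees is reducing an XML document to its labeled hierarchical structure; a minimal sketch using Python's standard library (the document and element names are invented for the example, and the paper's semantic disambiguation of labels is not attempted here):

```python
import xml.etree.ElementTree as ET

def to_label_tree(xml_text):
    """Reduce an XML document to a label tree of (tag, [children]) pairs,
    the kind of structure semantic-aware XML processing starts from."""
    def walk(elem):
        return (elem.tag, [walk(child) for child in elem])
    return walk(ET.fromstring(xml_text))

doc = "<library><book><title/><author/></book><book><title/></book></library>"
print(to_label_tree(doc))
# → ('library', [('book', [('title', []), ('author', [])]),
#                ('book', [('title', [])])])
```

Attaching word senses to each tag (e.g. deciding whether "title" means a book title or an honorific) is where the semantic-tree construction goes beyond this purely structural view.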

Document Structure Description; Computer Networks and Communications; Computer science; Semantic analysis (machine learning); Efficient XML Interchange; Interoperability; XML Signature; Word sense disambiguation; Semantic network; Semantic ambiguity; XML Schema Editor; Node (computer science); [INFO] Computer Science [cs]; XML schema; Context representation; XML tree; Information retrieval; Knowledge bases; Semi-structured data; XML validation; Semantic interoperability; Human-Computer Interaction; XML database; Semantic similarity; Semantic-aware processing; Web service; Software; XML
researchProduct

A novel XML document structure comparison framework based on sub-tree commonalities and label semantics

2012

XML similarity evaluation has become a central issue in the database and information communities, with applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as ordered labeled trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree-related structural and semantic similarities, which are not sufficient…
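The tree edit distance these approaches build on can be illustrated with Selkow's simpler top-down variant (a relative of the general Zhang-Shasha algorithm, not the paper's own framework), in which inserting or deleting a node removes its whole subtree:

```python
def size(t):
    label, kids = t
    return 1 + sum(size(k) for k in kids)

def dist(t1, t2):
    """Selkow's top-down edit distance between ordered labeled trees
    represented as (label, [children]) tuples."""
    (l1, c1), (l2, c2) = t1, t2
    cost = 0 if l1 == l2 else 1      # relabel cost at the root
    # sequence edit distance over the child lists, where inserting or
    # deleting a child costs the size of its entire subtree
    m, n = len(c1), len(c2)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = d[i - 1][0] + size(c1[i - 1])
    for j in range(1, n + 1):
        d[0][j] = d[0][j - 1] + size(c2[j - 1])
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + size(c1[i - 1]),
                          d[i][j - 1] + size(c2[j - 1]),
                          d[i - 1][j - 1] + dist(c1[i - 1], c2[j - 1]))
    return cost + d[m][n]

a = ("doc", [("sec", []), ("sec", [("p", [])])])
b = ("doc", [("sec", [])])
print(dist(a, b))  # → 2: deleting the second <sec> subtree costs 2 nodes
```

The sub-tree commonalities the paper adds on top of this would also credit matching subtrees that such purely ordered, structural edit scripts miss.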

Document Structure Description; Computer Networks and Communications; Computer science; Efficient XML Interchange; [SCCO.COMP] Cognitive science/Computer science; Semantic similarity; XML Schema Editor; XML schema; Information retrieval; [INFO.INFO-DB] Computer Science [cs]/Databases [cs.DB]; [INFO.INFO-WB] Computer Science [cs]/Web; [INFO.INFO-MM] Computer Science [cs]/Multimedia [cs.MM]; XML validation; Document clustering; Human-Computer Interaction; XML framework; Tree (data structure); XML database; Tree structure; [INFO.INFO-IR] Computer Science [cs]/Information Retrieval [cs.IR]; Semi-structured data; Edit distance; Software; XML; XML Catalog; Data integration
researchProduct

Building Ontologies from XML Data Sources

2009

In this paper, we present a tool called X2OWL that aims at building an OWL ontology from an XML data source. The method is based on the XML schema, from which it automatically generates the ontology structure as well as a set of mapping bridges. The presented method also includes a refinement step that allows the mapping bridges to be cleaned and, possibly, the generated ontology to be restructured.
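X2OWL's mapping is schema-driven; as a loose, hypothetical sketch of the underlying idea only, one can derive candidate classes and properties from an XML instance's nesting (the namespace, property naming, and use of an instance rather than a schema are all simplifications):

```python
import xml.etree.ElementTree as ET

def xml_to_triples(xml_text, ns="http://example.org/onto#"):
    """Naive sketch of the X2OWL idea: element tags become candidate OWL
    classes, and nesting becomes candidate object properties. The real
    tool works from the XML schema and adds mapping bridges plus a
    refinement step; this is illustrative only."""
    triples = set()
    def walk(elem):
        triples.add((ns + elem.tag, "rdf:type", "owl:Class"))
        for child in elem:
            triples.add((ns + elem.tag, ns + "has_" + child.tag,
                         ns + child.tag))
            walk(child)
    walk(ET.fromstring(xml_text))
    return triples

doc = "<museum><painting><artist/></painting></museum>"
for t in sorted(xml_to_triples(doc)):
    print(t)
```

The refinement step described in the abstract would then prune or merge such candidate classes and restructure the hierarchy.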

Document Structure Description; Computer science; Process ontology; Efficient XML Interchange; XML Signature; Ontology (information science); XML Schema Editor; Streaming XML; Upper ontology; XML schema; RDF; Semantic Web; Information retrieval; Ontology-based data integration; Suggested Upper Merged Ontology; Web Ontology Language; XML validation; XML framework; XML database; Computing Methodologies: Document and Text Processing; Ontology; XML; 2009 20th International Workshop on Database and Expert Systems Application
researchProduct