Search results for "Data model"

Showing 10 of 162 documents

Quality Improvement Based on Big Data Analysis

2016

Big data analysis has become an important trend in computer science, and quality improvement is a constant concern in current industry. In this paper, we present an approach to quality improvement based on big data analysis, aided by linked data and ontologies, and apply it to the case of automotive parts production. We consider defective automotive products and try to find the best refurbishment solution for them given their characteristics. Moreover, we propose to develop a recommender system that can give recommendations to prevent or alleviate defects and provide insights into the possible causes that led to these defective parts. This study intends to hel…

Quality management; Computer science; Big data; Automotive industry; Linked data; Recommender system; Data modeling; Risk analysis (engineering); Business intelligence; Quality (business)

Low-Rank Tucker-2 Model for Multi-Subject fMRI Data Decomposition with Spatial Sparsity Constraint

2022

Tucker decomposition can provide an intuitive summary for understanding brain function by decomposing multi-subject fMRI data into a core tensor and multiple factor matrices, and has mostly been used to extract functional connectivity patterns across time/subjects under orthogonality constraints. However, these algorithms are unsuitable for extracting common spatial and temporal patterns across subjects because of distinct characteristics of fMRI data such as high-level noise. Motivated by a successful application of Tucker decomposition to image denoising and the intrinsic sparsity of spatial activations in fMRI, we propose a low-rank Tucker-2 model with a spatial sparsity constraint to analyze multi-subject fMRI dat…
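The core operation, stripped of the paper's sparsity constraint, can be sketched as a plain Tucker-2 decomposition (truncated HOSVD over two modes of a 3-way tensor) in NumPy. The tensor shape, mode ordering (voxels × time × subjects), and rank choices below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def tucker2(X, r1, r2):
    """Tucker-2 via truncated HOSVD: factor the first two modes of a
    3-way tensor X, leave the third mode (here: subjects) intact."""
    # Mode-1 unfolding: rows index mode 1; leading left singular vectors
    U1, _, _ = np.linalg.svd(X.reshape(X.shape[0], -1), full_matrices=False)
    A = U1[:, :r1]
    # Mode-2 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(X.shape[1], -1)
    U2, _, _ = np.linalg.svd(X2, full_matrices=False)
    B = U2[:, :r2]
    # Core tensor G = X x_1 A^T x_2 B^T
    G = np.einsum('ijk,ia,jb->abk', X, A, B)
    return G, A, B

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 20, 5))   # toy voxels x time x subjects
G, A, B = tucker2(X, 4, 3)             # G: (4, 3, 5), A: (30, 4), B: (20, 3)
```

The factor matrices come out with orthonormal columns, matching the orthogonality constraints the abstract mentions; the paper's contribution is replacing/augmenting this with a spatial sparsity constraint.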

Rank (linear algebra); Computer science; Matrix norm; low-rank; matrix decomposition; functional magnetic resonance imaging; Orthogonality; tensors; Tensor (intrinsic definition); Kronecker delta; Tucker decomposition; Humans; Electrical and Electronic Engineering; core tensor; sparsity constraint; Radiological and Ultrasound Technology; signal processing; feature extraction; sparse matrices; Brain; Pattern recognition; brain modeling; Magnetic Resonance Imaging; Computer Science Applications; Constraint (information theory); data models; Noise (video); Artificial intelligence; multi-subject fMRI data; Software; Algorithms

Compartmental analysis of dynamic nuclear medicine data: Models and identifiability

2016

Compartmental models based on tracer mass balance are extensively used in clinical and pre-clinical nuclear medicine to obtain quantitative information on tracer metabolism in biological tissue. This paper is the first in a series of two dealing with the problem of tracer coefficient estimation via compartmental modelling in an inverse problem framework. Specifically, here we discuss the identifiability problem for a general n-dimensional compartmental system and provide uniqueness results for two-compartment and three-compartment models. The second paper will use this framework to show how non-linear regularization schemes can be applied t…
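For reference, the two-compartment (two-tissue) model in question is a small linear ODE system driven by the plasma input function. A minimal forward-Euler simulation, with rate constants and input curve that are purely illustrative (not taken from the paper), can be sketched as:

```python
import numpy as np

def two_compartment(Cp, k1, k2, k3, k4, dt=0.1):
    """Forward-Euler simulation of the standard two-tissue compartment model:
        dC1/dt = k1*Cp - (k2 + k3)*C1 + k4*C2
        dC2/dt = k3*C1 - k4*C2
    Cp: sampled plasma input function; k1..k4: rate constants (1/time)."""
    C1 = np.zeros_like(Cp)
    C2 = np.zeros_like(Cp)
    for t in range(len(Cp) - 1):
        dC1 = k1 * Cp[t] - (k2 + k3) * C1[t] + k4 * C2[t]
        dC2 = k3 * C1[t] - k4 * C2[t]
        C1[t + 1] = C1[t] + dt * dC1
        C2[t + 1] = C2[t] + dt * dC2
    return C1 + C2  # total tissue concentration, which is what is measured

t = np.arange(0, 60, 0.1)
Cp = t * np.exp(-t / 5.0)               # toy bolus-shaped input
Ct = two_compartment(Cp, 0.5, 0.2, 0.1, 0.05)
```

The identifiability question the paper studies is whether (k1, k2, k3, k4) can be recovered uniquely from Ct and Cp alone; the simulation above only covers the forward problem.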

Regularization (mathematics); Quantitative Methods (q-bio.QM); nuclear medicine & medical imaging; Theoretical Computer Science; Data modeling; clinical medicine; compartmental analysis; identifiability; nuclear medicine data; tracer; Uniqueness; Numerical Analysis (math.NA); Mathematical Physics; Applied Mathematics; Biological tissue; Inverse problem; Computer Science Applications; Nonlinear system; Signal Processing; Nuclear medicine; neurology & neurosurgery

Dataset shift adaptation with active queries

2011

In remote sensing image classification, it is commonly assumed that the distribution of the classes is stable over the entire image. This way, training pixels labeled by photointerpretation are assumed to be representative of the whole image. However, differences in distribution of the classes throughout the image make this assumption weak and a model built on a single area may be suboptimal when applied to the rest of the image. In this paper, we investigate the use of active learning to correct the shifts that may appear when training and test data do not come from the same distribution. Experiments are carried out on a VHR remote sensing classification scenario showing that active learni…
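A generic pool-based sketch of the idea, using uncertainty sampling under covariate shift with scikit-learn's `LogisticRegression` (the synthetic data, query budget, and choice of classifier are illustrative assumptions, not necessarily the paper's setup):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Source area: training pixels; target area: same classes, shifted means
X_src = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y_src = np.repeat([0, 1], 100)
X_tgt = np.vstack([rng.normal(1, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
y_tgt = np.repeat([0, 1], 200)          # oracle labels (the photointerpreter)

clf = LogisticRegression().fit(X_src, y_src)
labeled_X, labeled_y = list(X_src), list(y_src)
pool = np.arange(len(X_tgt))            # unlabeled target pixels
for _ in range(10):                     # query budget: 10 pixels
    proba = clf.predict_proba(X_tgt[pool])
    # least-confident pixel in the target pool
    q = pool[np.argmin(np.max(proba, axis=1))]
    labeled_X.append(X_tgt[q]); labeled_y.append(y_tgt[q])
    pool = pool[pool != q]
    clf = LogisticRegression().fit(np.array(labeled_X), np.array(labeled_y))
```

Each query adds a target-area pixel near the current decision boundary, gradually pulling the model trained on the source area toward the shifted target distribution.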

Rest (physics); Pixel; Contextual image classification; Computer science; Active learning (machine learning); Life Science; Data mining; Covariance; Test data; Image (mathematics); Data modeling

Investigating the Potential of 3D GIS for Full Lifecycle Road Cadastral Modelling. Requirements and Opportunities

Road Cadastre; 3D GIS; data model

An ontology-based metamodel for multiagent-based simulations

2014

Multiagent-based simulations enable us to validate different use-case scenarios in many application domains. The idea is to develop a realistic virtual environment to test particular domain-specific procedures. This paper presents our general framework for interactive multiagent-based simulations in virtual environments. The major contribution of this paper is the integration of the notion of ontology as a core element of the design process of a behavioral simulation. The proposed metamodel describes the concepts of a multiagent simulation using situated agents moving in a semantically enriched 3D environment. The agents perceive the geometric and semantic data in the surrounding environmen…

SIMPLE (military communications protocol); Computer science; Ontology (information science); Semantic data model; Metamodeling; Hardware and Architecture; Virtual machine; Modeling and Simulation; Industry Foundation Classes; Situated; Systems engineering; Software engineering; Engineering design process; Software; Simulation Modelling Practice and Theory

An Information Aggregation and Analytics System for ATLAS Frontier

2019

ATLAS event processing requires access to centralized database systems where information about calibrations, detector status and data-taking conditions is stored. This processing is done at more than 150 computing sites on a world-wide computing grid, which access the database through the Squid-Frontier system. Some processing workflows have been found to overload the Frontier system because of the Conditions data model currently in use, specifically because some Conditions data requests have low caching efficiency. The underlying cause is that requests that are non-identical as far as the cache is concerned actually retrieve a much smaller num…

SQL; QC1-999; Complex event processing; programming; data compilation; Spark; Computer Science [cs]; activity report; Database; nuclear & particle physics; Physics; ATLAS; Centralized database; Data model; Analytics; efficiency; Container (abstract data type); interface; data management; User interface; Particle Physics - Experiment; performance

An Approach to Data Quality Evaluation

2018

This research proposes a new approach to data quality evaluation comprising three aspects: (1) definition of the data object whose quality will be analyzed; (2) specification of quality requirements for the data object using a Domain Specific Language (DSL); (3) implementation of an executable data quality model that can scan the data object and detect its deficiencies. As in Model Driven Architecture (MDA), data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models: the PIM comprises informal specifications of data quality, while the PSM describes the implementation of the data quality model, making it executable. The approbation of the proposed…
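The "executable data quality model" idea can be illustrated (purely hypothetically; the paper's DSL syntax and PIM/PSM mapping are not reproduced here) as named predicates scanned over a data object:

```python
# Hypothetical mini-model: quality requirements as named predicates over rows
quality_model = {
    "id_not_null":  lambda row: row.get("id") is not None,
    "age_in_range": lambda row: 0 <= row.get("age", -1) <= 130,
}

def scan(rows, model):
    """Executable quality model: scan a data object, collect deficiencies
    as (row index, violated requirement) pairs."""
    return [(i, name) for i, row in enumerate(rows)
            for name, check in model.items() if not check(row)]

rows = [{"id": 1, "age": 34}, {"id": None, "age": 200}]
scan(rows, quality_model)  # -> [(1, 'id_not_null'), (1, 'age_in_range')]
```

In MDA terms, the `quality_model` dict plays the role of a platform-specific model: the informal requirement ("every record has an id") has been compiled down to something a scanner can execute.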

SQL; Computer science; Software requirements specification; Linked data; Data modeling; Data integrity; Data quality; Quality (business); Executable; Software engineering; 2018 Fifth International Conference on Social Networks Analysis, Management and Security (SNAMS)

On the distribution of education and democracy

2006

This paper empirically analyzes the influence of the distribution of education on democracy, controlling for unobservable heterogeneity and taking into account the persistence of some of the variables. The most novel finding is that an increase in the education attained by the majority of the population, rather than the average years of schooling, is what matters for the implementation and sustainability of democracy. We show this result is robust to issues pertaining to omitted variables, outliers, sample selection, and a narrow definition of the variables used to measure democracy.

Sample selection; Economics and Econometrics; Population; Distribution (economics); Development; Unobservable; Democracy; Microeconomics; political economy; education; inequality; dynamic panel data model; Sustainability; Outlier; Econometrics; Economics; JEL: O10; JEL: P16; Journal of Development Economics

Contributions to the knowledge base on PV performance: Evaluation of the operation of PV systems using different technologies installed in southern N…

2011

To help establish an accepted knowledge base on the performance of PV modules and systems across a representative range of technologies, devices have to be installed at diverse locations covering a broad range of environmental conditions. As an example of a high-latitude location, modules and systems are installed and under investigation in southern Norway (Kristiansand region) by the University of Agder in cooperation with industrial partners. This paper presents first results of the analysis of module performance. The operational behavior of the modules is used to derive a modelling scheme applicable to performance prediction. This use is demonstrated by giving the expected annual perf…

Scheme (programming language); Computer science; Photovoltaic system; Electrical engineering; Data modeling; Knowledge base; Range (aeronautics); Systems engineering; Performance prediction; Operational behavior; 2011 37th IEEE Photovoltaic Specialists Conference