Search results for "data quality"

Showing 10 of 96 documents

The Performance of Belle II High Level Trigger in the First Physics Run

2020

The Belle II experiment is a new-generation B-factory experiment at KEK in Japan, aiming to search for New Physics in a huge sample of B-meson decays. Commissioning of the accelerator and the detector for the first physics run began in March of this year. The Belle II High Level Trigger (HLT) is fully operational in the beam run. The HLT currently runs on 1600 cores clustered in 5 units, one quarter of the full configuration. The software trigger uses the same reconstruction code as offline processing, and events are classified into a set of physics categories. Only events in the categories of interest are finally sent to storage. Live data quality monitoring is also…
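The category-based event selection described above can be illustrated with a minimal sketch. This is not the Belle II HLT code: the category names, selection rules, and thresholds are invented for illustration only.

```python
# Hypothetical sketch of a software trigger: classify each
# reconstructed event into a physics category and keep only the
# categories of interest. All rules and thresholds are invented.

CATEGORIES_OF_INTEREST = {"hadronic", "bhabha_monitor"}

def classify(event):
    """Assign an event to a physics category (toy rules)."""
    if event["n_tracks"] >= 3 and event["energy_sum"] > 1.0:
        return "hadronic"
    if event["n_tracks"] == 2 and event["energy_sum"] > 7.0:
        return "bhabha_monitor"
    return "background"

def software_trigger(events):
    """Return only the events whose category is of interest."""
    return [e for e in events if classify(e) in CATEGORIES_OF_INTEREST]

events = [
    {"n_tracks": 5, "energy_sum": 4.2},   # hadronic
    {"n_tracks": 1, "energy_sum": 0.3},   # background, dropped
    {"n_tracks": 2, "energy_sum": 9.1},   # bhabha_monitor
]
kept = software_trigger(events)
```

In a real HLT the classification runs the full offline reconstruction; the point here is only the classify-then-filter structure.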

Keywords: physics; high level trigger; detector; real-time computing; data acquisition; software; data quality; real-time data. Published in: EPJ Web of Conferences.

Data quality monitors of vertex detectors at the start of the Belle II experiment

2019

The Belle II experiment features a substantial upgrade of the Belle detector and will operate at the SuperKEKB energy-asymmetric e+e− collider at KEK in Tsukuba, Japan. The accelerator completed its first phase of commissioning in 2016, and the Belle II detector saw its first electron-positron collisions in April 2018. Belle II features a newly designed silicon vertex detector based on double-sided strip layers and DEPFET pixel layers. A subset of the vertex detector was operated in 2018 to determine background conditions (Phase 2 operation). The collaboration completed full detector installation in January 2019, and the experiment started full data taking. This paper will report on the fin…

Keywords: instrumentation and detectors; vertex detector; Belle II; quality monitoring; silicon vertex detector; pixel semiconductor detector; microstrip semiconductor detector; collider; detector design; upgrade; on-line monitoring; performance.

An Approach to Clinical Proteomics Data Quality Control and Import

2011

The biomedical domain, and proteomics in particular, is faced with an increasing volume of data. The heterogeneity of data sources implies heterogeneity in both the representation and the content of data. Data may also be incorrect or contain errors, which can compromise the analysis of experimental results. Our approach aims to ensure the initial quality of data during import into an information system dedicated to proteomics. It is based on the joint use of models, which represent the system's sources, and ontologies, which are used as mediators between them. The controls we propose ensure the validity of values, semantics, and data consistency during the import process.
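The three kinds of import-time controls mentioned above (validity of values, semantics, consistency) can be sketched as follows. The field names, controlled vocabulary, and rules are invented stand-ins, not the paper's actual system.

```python
# Hypothetical sketch of import-time quality controls: validate value
# ranges, a controlled vocabulary ("semantics"), and cross-field
# consistency before a record enters the system. All field names and
# rules are invented for illustration.

ALLOWED_SPECIES = {"Homo sapiens", "Mus musculus"}

def check_record(record):
    """Return a list of quality-control errors for one record."""
    errors = []
    # Validity of values: mass must be a positive number.
    mass = record.get("mass_da")
    if not isinstance(mass, (int, float)) or mass <= 0:
        errors.append("mass_da must be a positive number")
    # Semantic control: species must come from a controlled vocabulary.
    if record.get("species") not in ALLOWED_SPECIES:
        errors.append("unknown species")
    # Consistency: peptide count cannot exceed spectrum count.
    if record.get("n_peptides", 0) > record.get("n_spectra", 0):
        errors.append("more peptides than spectra")
    return errors

def import_records(records):
    """Split records into accepted and rejected during import."""
    accepted, rejected = [], []
    for r in records:
        (rejected if check_record(r) else accepted).append(r)
    return accepted, rejected

good = {"mass_da": 1200.5, "species": "Homo sapiens",
        "n_peptides": 3, "n_spectra": 10}
bad = {"mass_da": -1, "species": "unknown",
       "n_peptides": 5, "n_spectra": 2}
accepted, rejected = import_records([good, bad])
```

In the paper's approach these rules would be derived from source models and mediated by ontologies rather than hard-coded.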

Keywords: proteomics; ontology (information science); information systems; data quality; databases; bioinformatics; quantitative methods; data mining.

Enterprise System Implementation in a Franchise Context: An Action Case Study

2015

This research paper reports on a study focusing on a franchise relationship that utilizes an enterprise system (ES) to optimize its supply chain. The ES is in its post-implementation phase; however, the franchisee has challenges with inventory management due to poor data quality, which causes problems with vendor analysis and revenue control. An action case study was carried out to identify and diagnose the source of the problem, and interventions were implemented and evaluated. The findings demonstrate that several of the challenges related to poor data quality in the supply chain were influenced by problems of a socio-technical character. These included a lack of understanding of…

Keywords: ERP post-implementation; supply chain; enterprise systems; opportunism; data quality; action research; workarounds; interorganizational challenges; franchise; retail; SCM. Published in: Procedia Computer Science.

Advantages and limitations of using national administrative data on obstetric blood transfusions to estimate the frequency of obstetric hemorrhages

2013

BACKGROUND: Obstetric hemorrhages are a frequent cause of maternal death worldwide, but they are not routinely monitored. Health-system administrative databases could be used for this purpose, but their data quality needs to be assessed. OBJECTIVES: To use blood transfusion data recorded in administrative databases to estimate the frequency of obstetric hemorrhages. RESEARCH DESIGN: A population-based study. SUBJECTS: Validation sub-sample: all mothers who gave birth in a French region in 2006-07 (35 123 pregnancies). Main study: all mothers who gave birth in France in 2006-07 (1 629 537 pregnancies). METHOD: Linkage and comparison of administrative data on blood transfu…
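The linkage-and-comparison step at the heart of such a validation can be sketched in miniature: join records from two sources on a common identifier and count agreements. The identifiers, field names, and values below are invented, not the study's actual data.

```python
# Hypothetical sketch of validating administrative coding against a
# reference source: link discharge records to blood-bank records on a
# shared identifier and compare the transfusion flags. All names and
# values are invented.

discharge = [
    {"stay_id": 1, "transfusion_coded": True},
    {"stay_id": 2, "transfusion_coded": False},  # missed coding
    {"stay_id": 3, "transfusion_coded": False},
]
blood_bank = {1: True, 2: True}  # stay_id -> transfusion delivered

def compare_sources(discharge_records, bank_index):
    """Count agreements and disagreements between the two sources."""
    agree = disagree = 0
    for rec in discharge_records:
        reference = bank_index.get(rec["stay_id"], False)
        if rec["transfusion_coded"] == reference:
            agree += 1
        else:
            disagree += 1
    return agree, disagree
```

Agreement rates of this kind are what allow administrative data to be judged fit (or not) for estimating hemorrhage frequency.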

Keywords: blood transfusion; pregnancy; postpartum hemorrhage; newborn infants; data collection; reproducibility of results; logistic models; databases as topic; feasibility studies; maternal death; France; data quality; healthcare systems.

Determining the appropriate timing of the next forest inventory: incorporating forest owner risk preferences and the uncertainty of forest data quali…

2017

Key message: The timing of new forest inventories should be based on the requirements of the decision maker. Importance should be placed on the objectives of the decision maker and his/her risk preferences related to those objectives. Context: The appropriate use of pertinent, available information is paramount in any decision-making process. Within forestry, a new forest inventory is typically conducted before creating a forest management plan. The acquisition of new forest inventory data is justified by the simple statement that "good decisions require good data." Aims: By integrating potential risk preferences, we examine the specific needs to collect new forest information. Meth…
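How risk preferences can change the preferred inventory timing can be shown with a toy expected-utility comparison. The scenarios, payoffs, and risk-aversion coefficient below are invented numbers, not the paper's model; the two options are deliberately constructed to have equal expected value, so only the risk attitude breaks the tie.

```python
# Hypothetical sketch: compare inventory timings by the decision
# maker's expected utility under uncertain data quality. Scenario
# probabilities, payoffs, and the utility function are invented.
import math

# Net-benefit scenarios (probability, payoff) for inventorying now
# versus waiting five years with older, more uncertain data.
# Both options have the same expected value of 80.
options = {
    "inventory_now": [(1.0, 80.0)],                # certain outcome
    "wait_5_years":  [(0.6, 120.0), (0.4, 20.0)],  # cheaper but risky
}

def expected_utility(scenarios, risk_aversion=0.01):
    """Exponential (CARA) utility: u(x) = 1 - exp(-a * x)."""
    return sum(p * (1 - math.exp(-risk_aversion * x))
               for p, x in scenarios)

best = max(options, key=lambda k: expected_utility(options[k]))
```

A risk-neutral decision maker is indifferent between the two options here, while the risk-averse utility prefers the certain outcome, which is the qualitative effect the paper's stochastic-programming framework formalizes.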

Keywords: risk; operations research; stochastic modelling; forest management; stochastic programming; even-flow forestry; recourse options; uncertainty; forest inventory; data quality. Published in: Annals of Forest Science.

An Approach to Data Quality Evaluation

2018

This research proposes a new approach to data quality evaluation comprising three aspects: (1) definition of the data object whose quality will be analyzed; (2) specification of quality requirements for the data object using a Domain Specific Language (DSL); (3) implementation of an executable data quality model that scans the data object and detects its shortcomings. As in Model Driven Architecture (MDA), data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models. The PIM comprises informal specifications of data quality; the PSM describes the implementation of the data quality model, making it executable. The approbation of the proposed…
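The three aspects above can be sketched minimally. The record fields, rule names, and predicates are invented, and plain Python predicates stand in for the paper's DSL-based requirement specifications.

```python
# Hypothetical sketch of the three aspects: (1) a data object,
# (2) declarative quality requirements standing in for a DSL spec,
# (3) an executable quality model that scans the object and reports
# shortcomings. All names and rules are invented.

# (1) The data object whose quality will be analyzed.
data_object = [
    {"name": "Alice", "age": 34},
    {"name": "", "age": 34},     # violates the name rule
    {"name": "Bob", "age": -5},  # violates the age rule
]

# (2) Quality requirements, one predicate per rule (a stand-in
# for a Domain Specific Language specification).
requirements = {
    "name must be non-empty": lambda rec: rec["name"] != "",
    "age must be in [0, 130]": lambda rec: 0 <= rec["age"] <= 130,
}

# (3) Executable quality model: scan every record against every rule
# and collect (record index, violated rule) pairs.
def scan(records, rules):
    violations = []
    for i, rec in enumerate(records):
        for rule_name, predicate in rules.items():
            if not predicate(rec):
                violations.append((i, rule_name))
    return violations

found = scan(data_object, requirements)
```

In MDA terms, the `requirements` dictionary plays the role of the PIM and the `scan` implementation the role of the PSM that makes the model executable.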

Keywords: SQL; software requirements specification; linked data; data modeling; data integrity; data quality; executable models. Published in: 2018 Fifth International Conference on Social Networks Analysis, Management and Security (SNAMS).

An Approach to Cadastre Map Quality Evaluation

2008

An approach to data quality evaluation is proposed, elaborated and implemented by the State Land Service of the Republic of Latvia. The approach is based on the opinions of Land Service experts about Cadastre map quality, which depends on the map's intended use. Quality parameters of Cadastre map objects identified by the experts, together with their limit values, are used for the evaluation. An assessment matrix allows the required Cadastre map quality to be defined for each usage purpose: the matrix is used to determine of what quality a Cadastre map should be in order to be used for the chosen purpose. The approach is flexible, as it allows changing the sets of quality parameters and their limit…
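The assessment-matrix idea can be sketched as a table of per-purpose limit values. The purposes, parameter names, and limits below are invented, not the Latvian State Land Service's actual values.

```python
# Hypothetical sketch of an assessment matrix: for each usage purpose,
# the limit value of each quality parameter a map must meet. All
# parameter names and limit values are invented.

# Rows: usage purpose; columns: quality-parameter limits.
assessment_matrix = {
    "land transactions": {"positional_accuracy_m": 0.5,
                          "completeness_pct": 99},
    "regional planning": {"positional_accuracy_m": 2.0,
                          "completeness_pct": 90},
}

def fit_for_purpose(map_quality, purpose):
    """True if the map meets every limit for the given purpose."""
    limits = assessment_matrix[purpose]
    return (map_quality["positional_accuracy_m"]
            <= limits["positional_accuracy_m"]
            and map_quality["completeness_pct"]
            >= limits["completeness_pct"])

survey_map = {"positional_accuracy_m": 1.2, "completeness_pct": 95}
```

Because the matrix is just data, new quality parameters or revised limit values can be added without changing the evaluation code, which mirrors the flexibility the abstract claims for the approach.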

Keywords: cadastre; data quality; map quality; data mining.

An Approach to Cadastral Map Quality Evaluation in the Republic of Latvia

2009

An approach to cadastral map quality evaluation is proposed, elaborated and implemented by the State Land Service of the Republic of Latvia. The approach is based on the opinions of Land Service experts about cadastral map quality, which depends on the map's intended use. Quality parameters of cadastral map objects identified by the experts, together with their limit values, are used for the evaluation. An assessment matrix allows the required cadastral map quality to be defined for each usage purpose: the matrix is used to determine of what quality a cadastral map should be in order to be used for the chosen purpose. The approach is flexible, as it allows changing the sets of quality parameters an…

Keywords: cadastre; data quality; map quality; data mining. Published in: Data Quality and High-Dimensional Data Analysis.

Data Quality in Tourism. Non-Sampling Errors in the Aeolian Islands Research.

2013

Settore SECS-S/05 - Statistica Sociale (Sector SECS-S/05 - Social Statistics). Keywords: data quality; non-sampling errors; interviewer effect; Aeolian Islands.