Search results for "data quality"
Showing 10 of 96 documents
The Performance of Belle II High Level Trigger in the First Physics Run
2020
The Belle II experiment is a new-generation B-factory experiment at KEK in Japan, aiming to search for New Physics in a huge sample of B-meson decays. Commissioning of the accelerator and the detector for the first physics run started in March of this year. The Belle II High Level Trigger (HLT) is fully operational in the beam run. The HLT currently runs on 1600 cores clustered into 5 units, one quarter of the full configuration. The software trigger runs the same reconstruction code used offline, and events are classified into a set of physics categories. Only events in the categories of interest are sent to storage. Live data quality monitoring is also…
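The category-based selection described here can be pictured as a simple filter stage. A minimal sketch in Python, assuming invented category names and a placeholder classify_event helper (not the actual Belle II HLT software):

    # Minimal sketch of an HLT-style software trigger filter.
    # The physics categories and classify_event() are hypothetical
    # illustrations, not the actual Belle II HLT code.

    CATEGORIES_OF_INTEREST = {"hadronic", "mumu", "tau"}

    def classify_event(event):
        """Assign an event to a physics category (placeholder logic)."""
        if event["n_tracks"] >= 3 and event["energy_sum"] > 4.0:
            return "hadronic"
        if event["n_tracks"] == 2:
            return "mumu"
        return "other"

    def hlt_filter(events):
        """Keep only events whose category is of interest; drop the rest."""
        for event in events:
            category = classify_event(event)
            if category in CATEGORIES_OF_INTEREST:
                event["category"] = category
                yield event  # would be sent to storage

    events = [
        {"n_tracks": 5, "energy_sum": 6.2},
        {"n_tracks": 1, "energy_sum": 0.3},
    ]
    print([e["category"] for e in hlt_filter(events)])  # -> ['hadronic']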
Data quality monitors of vertex detectors at the start of the Belle II experiment
2019
The Belle II experiment features a substantial upgrade of the Belle detector and will operate at the SuperKEKB energy-asymmetric e+e− collider at KEK in Tsukuba, Japan. The accelerator completed its first phase of commissioning in 2016, and the Belle II detector saw its first electron-positron collisions in April 2018. Belle II features a newly designed silicon vertex detector based on double-sided strip layers and DEPFET pixel layers. A subset of the vertex detector was operated in 2018 to determine background conditions (Phase 2 operation). The collaboration completed full detector installation in January 2019, and the experiment started full data taking. This paper will report on the fin…
An Approach to Clinical Proteomics Data Quality Control and Import
2011
The biomedical domain, and proteomics in particular, faces an increasing volume of data. The heterogeneity of data sources implies heterogeneity in both the representation and the content of data. Data may also be incorrect or contain errors, which can compromise the analysis of experimental results. Our approach aims to ensure the quality of data at the point of import into an information system dedicated to proteomics. It is based on the joint use of models, which represent the system's sources, and ontologies, which are used as mediators between them. The controls we propose ensure the validity of values, semantics, and data consistency during the import process.
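A minimal sketch of this kind of import-time control, assuming an invented controlled vocabulary as a stand-in for the ontology layer and hypothetical field names:

    # Sketch of import-time data quality controls: value validity plus
    # semantic checks against a controlled vocabulary (a stand-in for
    # the ontology mediation the paper describes). All names here are
    # hypothetical illustrations.

    ALLOWED_SPECIES = {"Homo sapiens", "Mus musculus"}  # toy vocabulary

    def validate_record(record):
        """Return a list of data quality violations for one record."""
        errors = []
        # Value validity: mass must be a positive number.
        mass = record.get("mass_da")
        if not isinstance(mass, (int, float)) or mass <= 0:
            errors.append("mass_da must be a positive number")
        # Semantic validity: species must map to a known vocabulary term.
        if record.get("species") not in ALLOWED_SPECIES:
            errors.append(f"unknown species term: {record.get('species')!r}")
        return errors

    def import_records(records):
        """Import only records that pass all controls; report the rest."""
        accepted, rejected = [], []
        for rec in records:
            errs = validate_record(rec)
            (accepted if not errs else rejected).append((rec, errs))
        return accepted, rejected

    ok, bad = import_records([
        {"mass_da": 14321.5, "species": "Homo sapiens"},
        {"mass_da": -1, "species": "H. sapiens"},
    ])
    print(len(ok), "accepted;", len(bad), "rejected")  # 1 accepted; 1 rejected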
Enterprise System Implementation in a Franchise Context: An Action Case Study
2015
This research paper reports on a study focusing on a franchise relationship that utilizes an enterprise system (ES) to optimize its supply chain. The ES is in its post-implementation phase; however, the franchisee has challenges with inventory management due to poor data quality, which causes problems with vendor analysis and revenue control. An action case study was carried out to identify and diagnose the source of the problem, and interventions were implemented and evaluated. The findings demonstrate that several of the challenges related to poor data quality in the supply chain were influenced by problems of a socio-technical character. These included a lack of understanding of…
Advantages and limitations of using national administrative data on obstetric blood transfusions to estimate the frequency of obstetric hemorrhages
2013
BACKGROUND: Obstetric hemorrhages are a frequent cause of maternal death worldwide, but they are not routinely monitored. Administrative databases of health systems could be used for this purpose, but their data quality needs to be assessed. OBJECTIVES: To estimate the frequency of obstetric hemorrhages using blood transfusion data recorded in administrative databases. RESEARCH DESIGN: A population-based study. SUBJECTS: Validation sub-sample: all mothers who gave birth in a French region in 2006-07 (35 123 pregnancies). Main study: all mothers who gave birth in France in 2006-07 (1 629 537 pregnancies). METHOD: Linkage and comparison of administrative data on blood transfu…
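Validating an administrative indicator against a reference sub-sample typically reduces to a 2x2 cross-tabulation of the two sources. A generic sketch follows, with invented counts that are not the study's results:

    # Generic sketch of validating an administrative-data indicator
    # (e.g., a recorded transfusion) against a reference standard from
    # a validation sub-sample. The counts below are invented, not
    # results from the study.

    def validation_metrics(tp, fp, fn, tn):
        """Sensitivity and positive predictive value of the indicator."""
        sensitivity = tp / (tp + fn)   # share of true cases the database detects
        ppv = tp / (tp + fp)           # share of flagged cases that are true
        return sensitivity, ppv

    # Hypothetical 2x2 cross-tabulation: database flag vs. reference.
    sens, ppv = validation_metrics(tp=420, fp=60, fn=80, tn=34563)
    print(f"sensitivity = {sens:.2f}, PPV = {ppv:.2f}")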
Determining the appropriate timing of the next forest inventory: incorporating forest owner risk preferences and the uncertainty of forest data quality
2017
Key message: The timing of new forest inventories should be based on the requirements of the decision maker. Importance should be placed on the objectives of the decision maker and his/her risk preferences related to those objectives. Context: The appropriate use of pertinent, available information is paramount in any decision-making process. Within forestry, a new forest inventory is typically conducted prior to creating a forest management plan. The acquisition of new forest inventory data is justified by the simple statement that "good decisions require good data." Aims: By integrating potential risk preferences, we examine the specific needs to collect new forest information. Meth…
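One way to formalize this trade-off is to compare the expected utility of planning with current, uncertain data against planning with fresh inventory data, net of its acquisition cost. A toy sketch under an exponential (risk-averse) utility; the utility function, scenarios, and all numbers are invented, not the paper's model:

    # Toy sketch of the decision "acquire a new inventory or keep old
    # data", framed as an expected-utility comparison under risk
    # aversion. All numbers are invented for illustration.
    import math

    def utility(net_value, risk_aversion=1e-3):
        """Exponential (CARA) utility: concave, penalizes spread."""
        return 1.0 - math.exp(-risk_aversion * net_value)

    def expected_utility(outcomes, probs, cost=0.0, risk_aversion=1e-3):
        """Probability-weighted utility of net outcomes (value minus cost)."""
        return sum(p * utility(v - cost, risk_aversion)
                   for v, p in zip(outcomes, probs))

    # Planning with old data: wider spread of plan outcomes (EUR/ha).
    eu_old = expected_utility([900, 1500, 2100], [0.3, 0.4, 0.3])
    # Fresh inventory: tighter spread, but acquisition costs 40 EUR/ha.
    eu_new = expected_utility([1350, 1500, 1650], [0.3, 0.4, 0.3], cost=40)

    # A sufficiently risk-averse owner prefers the tighter spread even
    # at a cost; a risk-neutral owner would not.
    print("acquire new inventory" if eu_new > eu_old else "keep old data")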
An Approach to Data Quality Evaluation
2018
This research proposes a new approach to data quality evaluation comprising three aspects: (1) definition of the data object whose quality will be analyzed, (2) specification of quality requirements for the data object using a Domain Specific Language (DSL), and (3) implementation of an executable data quality model that scans the data object and detects its deficiencies. As in Model Driven Architecture (MDA), the data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models: the PIM comprises informal specifications of data quality, while the PSM describes the implementation of the data quality model, making it executable. The approbation of the proposed…
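The PIM/PSM split can be pictured as declarative quality requirements compiled into executable checks for a concrete platform. A minimal sketch, assuming a tiny invented requirement language rather than the paper's DSL:

    # Minimal sketch of an executable data quality model: declarative
    # requirements (a PIM-like layer) compiled into callable checks
    # (a PSM-like layer). The tiny requirement language is invented,
    # not the paper's DSL.

    # Platform-independent layer: what quality means, stated declaratively.
    requirements = [
        {"field": "email", "rule": "not_null"},
        {"field": "age",   "rule": "range", "min": 0, "max": 130},
    ]

    # Platform-specific layer: compile each requirement to a Python check.
    def compile_rule(req):
        if req["rule"] == "not_null":
            return lambda row: row.get(req["field"]) is not None
        if req["rule"] == "range":
            return lambda row: req["min"] <= row.get(req["field"], req["min"] - 1) <= req["max"]
        raise ValueError(f"unknown rule: {req['rule']}")

    checks = [(req, compile_rule(req)) for req in requirements]

    def scan(rows):
        """Run every compiled check over the data object, report deficiencies."""
        for i, row in enumerate(rows):
            for req, check in checks:
                if not check(row):
                    print(f"row {i}: field {req['field']!r} violates {req['rule']}")

    scan([{"email": "a@b.com", "age": 34}, {"email": None, "age": 215}])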
An Approach to Cadastre Map Quality Evaluation
2008
An approach to data quality evaluation is proposed, elaborated and implemented by the State Land Service of the Republic of Latvia. The approach is based on the opinion of Land Service experts about Cadastre map quality, which depends on the purposes for which the map is used. Quality parameters of Cadastre map objects identified by the experts, and their limit values, are used for the evaluation. An assessment matrix is used, which allows defining the Cadastre map quality required for each usage purpose. The matrix shows of what quality a Cadastre map should be in order to be used for the chosen purpose. The given approach is flexible: it allows changing the sets of quality parameters and their limit…
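The assessment matrix amounts to a lookup from usage purpose to the limit values each quality parameter must satisfy. A schematic sketch with invented purposes, parameters, and limits (not the Land Service's actual matrix):

    # Schematic sketch of an assessment matrix: each usage purpose maps
    # to limit values that quality parameters of the map must satisfy.
    # Purposes, parameters, and limits are invented for illustration.

    assessment_matrix = {
        # purpose: {parameter: maximum allowed value}
        "boundary_dispute":  {"positional_error_m": 0.1, "missing_parcels_pct": 0.0},
        "regional_planning": {"positional_error_m": 2.0, "missing_parcels_pct": 5.0},
    }

    def fit_for_purpose(map_quality, purpose):
        """Check measured map quality against the limits for a purpose."""
        limits = assessment_matrix[purpose]
        return all(map_quality[param] <= limit for param, limit in limits.items())

    measured = {"positional_error_m": 0.8, "missing_parcels_pct": 1.2}
    for purpose in assessment_matrix:
        verdict = "usable" if fit_for_purpose(measured, purpose) else "not usable"
        print(purpose, "->", verdict)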
An Approach to Cadastral Map Quality Evaluation in the Republic of Latvia
2009
An approach to cadastral map quality evaluation is proposed, elaborated and implemented by the State Land Service of the Republic of Latvia. The approach is based on the opinion of Land Service experts about cadastral map quality, which depends on the purposes for which the map is used. Quality parameters of cadastral map objects identified by the experts, and their limit values, are used for the evaluation. An assessment matrix is used, which allows defining the cadastral map quality required for each usage purpose. The matrix shows of what quality a cadastral map should be in order to be used for the chosen purpose. The given approach is flexible: it allows changing the sets of quality parameters an…