Search results for "data quality"
Showing 10 of 96 documents
The identifiability analysis for setting up measuring campaigns in integrated water quality modelling.
2012
Abstract: Identifiability analysis enables the quantification of the number of model parameters that can be assessed by calibration against a given data set. The methodology is based on the appraisal of sensitivity coefficients of the model parameters by means of Monte Carlo runs. By employing the Fisher Information Matrix, it provides insight into the number of model parameters that can be reliably assessed. The paper presents a study in which identifiability analysis is used as a tool for setting up measuring campaigns for integrated water quality modelling. In particular, by means of the identifiability analysis, the information about the location and …
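The Fisher-Information-based identifiability idea in this abstract can be sketched in a few lines: build a scaled sensitivity matrix of model outputs with respect to the parameters, form the FIM, and count its well-conditioned eigenvalues. The toy first-order decay model, the sampling times, and the eigenvalue threshold below are illustrative assumptions, not the paper's actual model or data.

```python
import numpy as np

# Hypothetical toy model: first-order decay of a water quality variable,
# y(t) = y0 * exp(-k * t), with parameters theta = (y0, k).
def model(theta, t):
    y0, k = theta
    return y0 * np.exp(-k * t)

def sensitivity_matrix(theta, t, rel_step=1e-4):
    """Finite-difference sensitivities dy/dtheta_j, scaled by the parameter value."""
    base = model(theta, t)
    S = np.empty((t.size, len(theta)))
    for j, p in enumerate(theta):
        pert = list(theta)
        pert[j] = p * (1 + rel_step)
        # (dy/dtheta_j) * theta_j, so sensitivities are comparable across parameters
        S[:, j] = (model(pert, t) - base) / (p * rel_step) * p
    return S

t = np.linspace(0.0, 10.0, 25)   # candidate sampling times of a campaign
theta = (8.0, 0.35)              # nominal parameter values (assumed)
S = sensitivity_matrix(theta, t)
FIM = S.T @ S                    # Fisher Information Matrix (unit measurement noise)
eigvals = np.linalg.eigvalsh(FIM)
# Parameters supported by the data: eigenvalues above a relative threshold.
n_identifiable = int(np.sum(eigvals > 1e-6 * eigvals.max()))
print(n_identifiable)
```

Comparing `n_identifiable` across candidate sampling designs is one way such an analysis can rank measuring campaigns before any field work is done.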
A parsimonious dynamic model for river water quality assessment
2010
Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent in recent years, with model types ranging from detailed physical models to simplified conceptual models. A possible middle ground between detailed and simplified models is the parsimonious model, which represents the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When data are inadequate, it is mandatory to focus on a simple river water qua…
Open Data Quality Evaluation: A Comparative Analysis of Open Data in Latvia
2020
Nowadays open data is entering the mainstream: it is freely available to every stakeholder and is often used in business decision-making. It is important to be sure that data is trustworthy and error-free, as quality problems can lead to huge losses. The research discusses how (open) data quality can be assessed. It also covers the main points that should be considered when developing a data quality management solution. One specific approach is applied to several Latvian open data sets. The research provides a step-by-step open data set analysis guide and summarizes its results. It is also shown that differences in data quality can exist depending on the data supplier (centralized and decentralized d…
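A common starting point for the kind of data quality assessment this abstract describes is a per-field completeness metric. The records and field names below are made-up examples, not taken from the Latvian data sets the paper analyzes.

```python
# Hypothetical records from an open data set; None marks a missing value.
records = [
    {"id": 1, "name": "Riga", "population": 605802},
    {"id": 2, "name": None, "population": 68945},
    {"id": 3, "name": "Liepaja", "population": None},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

for field in ("id", "name", "population"):
    print(field, round(completeness(records, field), 2))
# id 1.0 / name 0.67 / population 0.67
```

Running such checks per data supplier is one simple way to surface the centralized-versus-decentralized differences the study points to.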
Relative risk estimation of dengue disease at small spatial scale
2017
Abstract: Background: Dengue is a high-incidence arboviral disease in tropical countries around the world. Colombia is an endemic country due to environmental conditions favourable to vector survival and spread. Dengue surveillance in Colombia is based on passive notification of cases, supporting monitoring, prediction, risk factor identification and intervention measures. Even though the surveillance network works adequately, the disease mapping techniques currently developed and employed for many health problems are not widely applied. We selected the Colombian city of Bucaramanga to apply Bayesian areal disease mapping models and to test the challenges and difficulties of the approach. Methods…
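The relative risk estimation this abstract refers to can be illustrated with a much simpler precursor of the full Bayesian areal model: raw standardized incidence ratios (SIRs) smoothed by a conjugate Poisson-gamma prior. The counts, the expected cases, and the moment-matched prior below are illustrative assumptions, not the Bucaramanga data or the paper's model.

```python
import numpy as np

# Hypothetical counts: observed dengue cases and expected cases per small area.
observed = np.array([4, 0, 9, 2, 1, 7, 3, 0])
expected = np.array([3.1, 1.2, 5.0, 2.8, 0.9, 4.4, 2.2, 1.5])

# Raw standardized incidence ratio (SIR): unstable at small spatial scale.
sir = observed / expected

# Poisson-gamma smoothing: theta_i ~ Gamma(a, b); the posterior mean
# (y_i + a) / (E_i + b) shrinks each area's SIR toward the overall rate.
# a and b are set here by simple moment matching (an assumption).
overall = observed.sum() / expected.sum()
var_sir = np.var(sir, ddof=1)
b = overall / max(var_sir, 1e-9)
a = overall * b
smoothed = (observed + a) / (expected + b)

print(np.round(sir, 2))
print(np.round(smoothed, 2))
```

The smoothed estimate is a precision-weighted compromise between each area's own SIR and the city-wide rate, which is exactly the instability problem that motivates the hierarchical Bayesian models the paper applies.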
Design, validation, and testing of an observational tool for technical and tactical analysis in the taekwondo competition at the 2016 Olympic games.
2020
Abstract: Observational methodology uses validated observational tools to collect information in sports where multiple variables interact in the sporting context. Given the importance of data quality for observational tools, the purpose of this study was to design, validate, and test the reliability of a mixed observational instrument combining field formats and category systems for analyzing technical and tactical actions in an Olympic taekwondo (TKD) tournament. The instrument collects information on six criteria and 25 categories covering tactical and technical actions, kicking zone, laterality, kicking leg, guard, and score. A total of 2,374 actions were analyzed from 10 bouts involvin…
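Reliability testing of an observational instrument typically involves chance-corrected agreement between coders, for example Cohen's kappa. The codings and category labels below are invented for illustration and are not the paper's category system or data.

```python
# Hypothetical codings of the same 10 actions by two observers
# (category names are illustrative, not the instrument's 25 categories).
obs_a = ["roundhouse", "back", "roundhouse", "axe", "back",
         "roundhouse", "axe", "back", "roundhouse", "axe"]
obs_b = ["roundhouse", "back", "axe", "axe", "back",
         "roundhouse", "axe", "roundhouse", "roundhouse", "axe"]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(a)
    cats = sorted(set(a) | set(b))
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # by chance
    return (p_obs - p_exp) / (1 - p_exp)

kappa = cohens_kappa(obs_a, obs_b)
print(round(kappa, 2))
```

Values near 1 indicate agreement well beyond chance; reliability studies of observational tools usually report such coefficients per criterion.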
ATLAS tile calorimeter data quality assessment with commissioning data
2008
TileCal is the barrel hadronic calorimeter of the ATLAS experiment, presently in an advanced state of installation and commissioning at the LHC accelerator. The complexity of the experiment, the number of electronics channels and the high rate of acquired events require a detailed commissioning of the detector, during the installation phase of the experiment and in the early life of ATLAS, to verify the correct behaviour of the hardware and software systems. This is done through the acquisition, monitoring, reconstruction and validation of calibration signals, as well as by processing data obtained with cosmic-ray muons. To assess the detector status and verify its performance, a set of tools ha…
Single-channel and two-channel methods for land surface temperature retrieval from DAIS data and its application to the Barrax site
2004
In this paper, a methodology using a single-channel and a two-channel method is presented to estimate land surface temperature from the DAIS (Digital Airborne Imaging Spectrometer) thermal channels 74 (8.747 µm), 75 (9.648 µm), 76 (10.482 µm), 77 (11.266 µm), 78 (11.997 µm) and 79 (12.668 µm). The land surface temperature retrieved with both methods has been validated over the Barrax site (Albacete, Spain) in the framework of the DAISEX (Digital Airborne Imaging Spectrometer Experiment) field campaigns. Prior to the validation, an analysis of DAIS data quality was performed to check the agreement between in situ data and the values extracted from the DAIS images supplie…
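The single-channel idea can be sketched as a radiative-transfer inversion: remove the atmospheric path radiance and reflected downwelling term from the at-sensor radiance, then invert Planck's law at the channel wavelength. The transmittance, atmospheric radiances and emissivity below are round-number assumptions for a self-consistency check, not DAIS calibration values, and the single-band treatment ignores the channel's spectral response function.

```python
import numpy as np

# Physical constants for Planck's law (SI units).
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(T, wavelength_um):
    """Blackbody spectral radiance (W m^-2 sr^-1 um^-1) at temperature T (K)."""
    lam = wavelength_um * 1e-6
    B = (2 * H * C**2 / lam**5) / (np.exp(H * C / (lam * KB * T)) - 1)
    return B * 1e-6  # per metre -> per micrometre

def inverse_planck(L, wavelength_um):
    """Brightness temperature (K) from radiance in W m^-2 sr^-1 um^-1."""
    lam = wavelength_um * 1e-6
    L_m = L * 1e6
    return (H * C / (lam * KB)) / np.log(1 + 2 * H * C**2 / (lam**5 * L_m))

def lst_single_channel(L_sensor, wavelength_um, emissivity, tau, L_up, L_down):
    """Invert L_sensor = tau*(eps*B(T) + (1-eps)*L_down) + L_up for T."""
    L_surface = ((L_sensor - L_up) / tau - (1 - emissivity) * L_down) / emissivity
    return inverse_planck(L_surface, wavelength_um)

# Round-trip check at DAIS channel 77 (11.266 um): a 300 K surface seen
# through a simplified atmosphere should come back as 300 K.
L_true = planck_radiance(300.0, 11.266)
L_at_sensor = 0.85 * (0.98 * L_true + 0.02 * 1.5) + 1.2  # tau=0.85, eps=0.98
T_retrieved = lst_single_channel(L_at_sensor, 11.266, 0.98, 0.85, 1.2, 1.5)
print(round(T_retrieved, 1))  # ~300.0
```

The two-channel (split-window) variant instead combines the brightness temperatures of two adjacent channels to cancel most of the atmospheric effect.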
Bayesian approach for uncertainty quantification in water quality modelling: The influence of prior distribution
2010
Summary: Mathematical models are in common use in urban drainage, and they are increasingly being applied to support decisions about design and alternative management strategies. In this context, uncertainty analysis is indispensable in urban drainage modelling. However, despite the crucial role played by uncertainty quantification, several methodological aspects need to be clarified and deserve further investigation, especially in water quality modelling. One of them concerns the "a priori" hypotheses involved in the uncertainty analysis. Such hypotheses are usually condensed into "a priori" distributions expressing the most likely values for the model parameters. This paper explores…
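The influence of the prior that this abstract highlights can be made concrete with the simplest conjugate case: the same data updated under a vague prior and under a confident but misplaced one. The "true" parameter value, noise level and prior settings below are illustrative assumptions, not the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: noisy observations of a water quality
# parameter whose true value is 2.0 (arbitrary units).
true_value, noise_sd, n = 2.0, 0.8, 12
data = rng.normal(true_value, noise_sd, n)

def posterior_normal(prior_mean, prior_sd, data, noise_sd):
    """Conjugate normal-normal update: returns (posterior mean, posterior sd)."""
    prec = 1 / prior_sd**2 + len(data) / noise_sd**2
    mean = (prior_mean / prior_sd**2 + data.sum() / noise_sd**2) / prec
    return mean, prec**-0.5

# Same data, two different "a priori" hypotheses.
vague = posterior_normal(0.0, 10.0, data, noise_sd)   # weakly informative
strong = posterior_normal(0.5, 0.2, data, noise_sd)   # confident but off-target
print(vague, strong)
```

With the vague prior the posterior mean essentially equals the sample mean, while the strong prior pulls the estimate toward 0.5 and shrinks the posterior spread: the same calibration data yield visibly different conclusions, which is the sensitivity the paper investigates.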
Water quality modelling for ephemeral rivers: Model development and parameter assessment
2010
Summary: River water quality models can be valuable tools for the assessment and management of receiving water body quality. However, such models require accurate calibration in order to specify model parameters. Reliable model calibration requires an extensive array of water quality data that are generally scarce and resource-intensive, both economically and in terms of human effort, to collect. In the case of small rivers, such data are scarce because these rivers are generally considered too insignificant, from a practical and economic viewpoint, to justify the investment of considerable time and resources. As a consequence, the literature contains v…
A practical framework for data management processes and their evaluation in population-based medical registries.
2013
We present a framework for data management processes in population-based medical registries. Existing guidelines lack the concreteness we deem necessary for them to be of practical use, especially concerning the establishment of new registries. We therefore propose adjustments and concretisations with regard to data quality, data privacy, data security and registry purposes. First, we separately elaborate on the issues to be included in the framework and present proposals for their improvement. Thereafter, we provide a framework for medical registries based on quasi-standard operating procedures. The main result is a concise and scientifically based framework that tries to be both broad a…