Search results for "Data Quality"

Showing 10 of 96 documents

Population geocoding for healthcare management. Technical challenges and quality issues

2015

The present work aims to describe the main issues related to population geocoding for healthcare management. Some of the available procedures for geocoding multiple addresses are described, and an indicator of the quality of the geocoded addresses is proposed. As a case study, the geocoding of the population addresses of a set of 9 Sicilian municipalities is described, and the results obtained with two different methods are compared in terms of quality. Some potential applications of population geocoding in healthcare management are finally discussed.
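
A minimal sketch of the kind of quality indicator the abstract mentions, assuming each address is scored by the match level returned by the geocoder and the indicator is the weighted share of well-matched addresses; the levels and weights below are illustrative assumptions, not the indicator actually proposed in the paper.

    # Hypothetical geocoding quality indicator: weighted share of addresses
    # matched at a precise level; levels and weights are illustrative only.
    MATCH_WEIGHTS = {"rooftop": 1.0, "street": 0.7, "locality": 0.3, "unmatched": 0.0}

    def geocoding_quality(match_levels):
        """Return a 0-1 quality score for a list of per-address match levels."""
        if not match_levels:
            return 0.0
        return sum(MATCH_WEIGHTS.get(level, 0.0) for level in match_levels) / len(match_levels)

    # Comparing two geocoding methods run on the same address list:
    method_a = ["rooftop", "street", "unmatched", "rooftop"]
    method_b = ["street", "street", "locality", "locality"]
    print(geocoding_quality(method_a), geocoding_quality(method_b))  # 0.675 0.5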

Data quality; Population health; Address geocoding; Spatial databases; Settore SECS-S/05 - Statistica Sociale

Attention Check Items and Instructions in Online Surveys: Boon or Bane for Data Quality?

2019

In this paper, we examine rates of careless responding and reactions to detection methods (i.e., attention check items and instructions) in an experimental setting based on two different samples. First, we use a quota sample (with monetary incentive), a central data source for internet-based surveys in sociological and political research. Second, we include a voluntary opt-in panel (without monetary incentive) well suited for conducting survey experiments (e.g., factorial surveys). Respondents’ reactions to the detection items are analyzed by objective, nonreactive indicators (i.e., break-off, item non-response, and measurement quality), and two self-report scales. Our reaction analyses rev…
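
A minimal sketch of how an attention check item can be used to flag potentially careless respondents, assuming an instructed-response item of the kind discussed in the abstract; the item name, instructed value, and treatment of item non-response are illustrative assumptions, not the survey design used in the study.

    # Hypothetical flagging via an instructed-response item ("select 'strongly agree'");
    # the item name and pass value are assumptions for illustration.
    def flag_careless(responses, check_item="attention_1", pass_value=5):
        """Return ids of respondents whose answer to the attention check is not the instructed one."""
        return [r["id"] for r in responses if r.get(check_item) != pass_value]

    sample = [
        {"id": 1, "attention_1": 5},
        {"id": 2, "attention_1": 3},    # failed the check
        {"id": 3, "attention_1": None}, # item non-response counted as a fail here
    ]
    print(flag_careless(sample))  # [2, 3]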

Data source; Incentive; Data quality; Applied psychology; Quota sampling; Satisficing; Quality (business); The Internet; Justice (ethics); SSRN Electronic Journal

A framework for data quality handling in enterprise service bus

2013

Enterprise Service Bus (ESB) is proposed to address the application integration problem by facilitating communication among different systems in a loosely coupled, standards-based, and protocol-independent manner. Data sources are maintained outside the ESB's control, and there should be a mechanism to select the most suitable data source among all those available, especially when two or more data sources describe the same object. For instance, it is normal to use more than one sensor to measure pressure or temperature at a particular point. Data quality can play an important role in selecting data sources in an ESB, since the quality of data is an essential factor in the success of organizat…
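
A minimal sketch of the selection idea the abstract describes: when several sources report the same object, pick the one with the best aggregate quality. The quality dimensions, weights, and source records below are illustrative assumptions, not the scoring model of the proposed framework.

    # Hypothetical quality-based source selection: each candidate carries scores
    # on a few quality dimensions; the dimensions and weights are assumptions.
    WEIGHTS = {"accuracy": 0.5, "timeliness": 0.3, "completeness": 0.2}

    def quality_score(source):
        return sum(WEIGHTS[dim] * source["quality"][dim] for dim in WEIGHTS)

    def select_source(candidates):
        """Pick the candidate data source with the highest weighted quality score."""
        return max(candidates, key=quality_score)

    sensors = [
        {"name": "pressure_sensor_A", "quality": {"accuracy": 0.9, "timeliness": 0.6, "completeness": 1.0}},
        {"name": "pressure_sensor_B", "quality": {"accuracy": 0.7, "timeliness": 0.9, "completeness": 0.9}},
    ]
    print(select_source(sensors)["name"])  # pressure_sensor_A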

Database; Computer science; Group method of data handling; Service-oriented architecture; Object (computer science); Enterprise service bus; Component (UML); Data quality; Quality (business); Protocol (object-oriented programming); Third International Conference on Innovative Computing Technology (INTECH 2013)

Advances in automated diffraction tomography

2009

Crystal structure solution by means of electron diffraction, as well as the investigation of special structural features, requires high-quality data acquisition followed by data processing that delivers cell parameters, the space group and, finally, a 3D data set. The final step is the structure analysis itself, including structure solution and subsequent refinement.

Diffraction tomography; Data set; Data processing; Materials science; Electron diffraction; Data quality; Tomography; Crystal structure; Automation; Computational science

Models of Data Quality

2018

The research proposes a new approach to data quality management built on three groups of DSLs (domain-specific languages). The first language group uses the concept of a data object to describe the data to be analysed, the second group describes the data quality requirements, and the third group describes the data quality management process. The proposed approach involves the development of executable quality specifications for each kind of data object. A specification can be executed step by step according to the business process descriptions, ensuring the gradual accumulation of data in the database and data quality verification according to the specific use case.
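
A minimal sketch of the three-part structure the abstract describes: a data-object description, a quality requirement on it, and a process step that executes the requirement against stored data. The plain Python stand-ins below are illustrative assumptions; the paper defines its own DSL notations for each group.

    # Hypothetical, much-simplified stand-ins for the three DSL groups.
    data_object = {"name": "patient", "fields": ["id", "birth_date", "address"]}    # group 1: data description
    requirement = {"field": "birth_date", "rule": lambda v: v not in (None, "")}    # group 2: quality requirement

    def quality_step(record, data_object, requirement):
        """Group 3: one step of the quality process, run against a stored record."""
        field = requirement["field"]
        assert field in data_object["fields"], "requirement must reference a described field"
        return requirement["rule"](record.get(field))

    print(quality_step({"id": 1, "birth_date": "1980-05-01"}, data_object, requirement))  # True
    print(quality_step({"id": 2, "birth_date": ""}, data_object, requirement))            # False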

Domain-specific language; Business process; Computer science; Process (engineering); First language; Digital subscriber line; Data quality; Quality (business); Executable; Software engineering

Towards Data Quality Runtime Verification

2019

This paper discusses data quality checking during business process execution using runtime verification. While runtime verification verifies the correctness of business process execution, data quality checks ensure that a particular process did not negatively impact the stored data. Both runtime verification and data quality checks run in parallel with the base processes, affecting them only insignificantly. The proposed idea allows verifying (a) whether the process ended correctly, as well as (b) whether the results of a correctly ended process negatively impacted the stored data through the modifications made by that specific process. The desired result will be achieved by use of domain sp…
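
A minimal sketch of the two checks the abstract distinguishes: (a) did the process end correctly, and (b) did its modifications leave the stored data in a valid state. The process, data store, and checks below are illustrative assumptions, not the domain-specific languages the paper relies on.

    # Hypothetical pairing of runtime verification (did the run end correctly?)
    # with a post-execution data quality check (is the stored data still valid?).
    store = {"orders": [{"id": 1, "total": 120.0}]}

    def add_order(order):
        store["orders"].append(order)
        return "completed"                     # final state reported by the process

    def verify_run(final_state):
        return final_state == "completed"      # (a) runtime verification of the run

    def verify_data():
        return all(o["total"] >= 0 for o in store["orders"])   # (b) data quality check

    state = add_order({"id": 2, "total": -5.0})
    print(verify_run(state), verify_data())    # True False: the run ended correctly, the data degraded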

Domain-specific language; Correctness; Business process; Computer science; Data quality; Runtime verification; Process (computing); Software engineering; Proceedings of the 2019 Federated Conference on Computer Science and Information Systems

A Step Towards a Data Quality Theory

2019

Data quality issues have been topical for many decades. However, a unified data quality theory has not been proposed yet, since many concepts associated with the term “data quality” are not straightforward enough. The paper proposes a user-oriented data quality theory based on clearly defined concepts. The concepts are defined by using three groups of domain-specific languages (DSLs): (1) the first group uses the concept of a data object to describe the data to be analysed, (2) the second group describes the data quality requirements, and (3) the third group describes the process of data quality evaluation. The proposed idea proved to be simple enough, but at the same time very effective in…

Domain-specific language; SQL; Information retrieval; Computer science; Data quality; Information system; Quality (business); Executable; Natural language; Abstraction (linguistics); 2019 Sixth International Conference on Social Networks Analysis, Management and Security (SNAMS)

Uncertainty assessment of a membrane bioreactor model using the GLUE methodology

2010

A mathematical model for the simulation of physical-biological organic removal by means of a membrane bioreactor (MBR) has been previously developed and tested. This paper presents an analysis of the uncertainty of the MBR model. In particular, the research explores the applicability of the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, one of the most widely used methods for investigating uncertainty in hydrology, which is now spreading to other research fields. For the application of the GLUE methodology, several Monte Carlo simulations were run, varying all the influential model parameters simultaneously. The model was applied to an MBR pilot pl…
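
A minimal sketch of the GLUE procedure the abstract applies: sample all parameters simultaneously with Monte Carlo, run the model, score each parameter set with a likelihood measure, and retain only the "behavioural" sets above a threshold. The toy model, the likelihood measure, and the threshold are illustrative assumptions, not the MBR model or the settings used in the paper.

    import random

    observed = 5.0

    def model(k1, k2):
        return 2.0 * k1 + k2                              # toy stand-in for the process model

    def likelihood(simulated):
        return 1.0 / (1.0 + (simulated - observed) ** 2)  # simple inverse-error measure

    behavioural = []
    for _ in range(10000):                                # Monte Carlo sampling of all parameters at once
        k1 = random.uniform(0.0, 5.0)
        k2 = random.uniform(0.0, 5.0)
        weight = likelihood(model(k1, k2))
        if weight > 0.8:                                  # keep only behavioural parameter sets
            behavioural.append((k1, k2, weight))

    print(len(behavioural), "behavioural parameter sets retained")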

Engineering; Environmental Engineering; Settore ICAR/03 - Ingegneria Sanitaria-Ambientale; Settore ICAR/02 - Costruzioni Idrauliche E Marittime E Idrologia; Mathematical model; Monte Carlo method; Biomedical Engineering; Backwashing; Bioengineering; Membrane bioreactor; Pilot plant; Robustness (computer science); Data quality; ASM; Uncertainty assessment; Process engineering; GLUE; Biotechnology; Biochemical Engineering Journal

Proposal of geographic information systems methodology for quality control procedures of data obtained in naturalistic driving studies

2015

The primary goal of naturalistic driving studies is to provide a comprehensive observation of the driver's behaviour under real-life conditions by measuring a great number of parameters at high temporal frequencies. Achieving this goal, however, is a complex endeavor that faces many challenges, such as the complexity of the vehicle instrumentation during the data collection phase and the difficulty of handling large data volumes during the data analysis phase. These drawbacks often cause episodes of data loss. Improving the technical aspects of the collection of naturalistic data is of paramount importance to increase the return on the investment made in it. An aspect to consider is…
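
A minimal sketch of one quality-control check implied by the abstract's concern with data losses: scanning a high-frequency recording for gaps longer than the expected sampling interval. The sampling rate, timestamp units, and tolerance are illustrative assumptions, not the GIS-based procedures the paper proposes.

    # Hypothetical gap detection in a naturalistic driving signal sampled at 10 Hz;
    # timestamps are in seconds and the tolerance factor is an assumption.
    def find_gaps(timestamps, expected_dt=0.1, tolerance=1.5):
        """Return (start, end) pairs where the interval exceeds tolerance * expected_dt."""
        gaps = []
        for prev, curr in zip(timestamps, timestamps[1:]):
            if curr - prev > tolerance * expected_dt:
                gaps.append((prev, curr))
        return gaps

    trip = [0.0, 0.1, 0.2, 0.3, 1.2, 1.3, 1.4]  # a 0.9 s hole in a 10 Hz stream
    print(find_gaps(trip))                       # [(0.3, 1.2)]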

Engineering; Geographic information system; Data collection; Group method of data handling; Mechanical Engineering; Quality control; Poison control; Transportation; Data quality; Instrumentation (computer programming); Data mining; Law; Reliability (statistics); General Environmental Science; IET Intelligent Transport Systems

Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

2012

In the last few years, the use of mathematical models in WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is imperative. However, despite the importance of uncertainty quantification, only a few studies have been carried out in the wastewater treatment field, and those studies only included a few of …

Engineering; Settore ICAR/03 - Ingegneria Sanitaria-Ambientale; Mathematical model; Nitrogen phosphorus removal; Monte Carlo method; Uncertainty analysis; Environmental engineering; Wastewater modelling; Geophysics; Geochemistry and Petrology; Data quality; Calibration; Probability distribution; Biochemical engineering; Uncertainty quantification; GLUE; Activated-sludge model; Reliability (statistics); Physics and Chemistry of the Earth, Parts A/B/C