AUTHOR
Ivo Oditis
An Approach to Data Quality Evaluation
This research proposes a new approach to data quality evaluation comprising three aspects: (1) definition of the data object whose quality will be analyzed, (2) specification of quality requirements for the data object using a Domain Specific Language (DSL), and (3) implementation of an executable data quality model that enables scanning the data object and detecting its deficiencies. As in Model Driven Architecture (MDA), data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models: the PIM comprises informal specifications of data quality, while the PSM describes the implementation of the data quality model, thus making it executable. The approbation of the proposed…
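The abstract above is truncated at the source and the paper's concrete DSL syntax is not reproduced here. Purely as an illustration of the PIM/PSM split it describes, a minimal Python sketch might pair an informal, platform-independent rule label with an executable, platform-specific predicate; all names and rules below are invented, not taken from the paper:

```python
# A minimal sketch of the PIM/PSM idea; the data object, its fields and
# the quality rules are hypothetical, not the paper's DSL.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str                      # informal, platform-independent label (PIM level)
    check: Callable[[dict], bool]  # executable predicate (PSM level)

person_rules = [
    QualityRule("birth_date must not be empty",
                lambda rec: bool(rec.get("birth_date"))),
    QualityRule("age must be between 0 and 130",
                lambda rec: 0 <= rec.get("age", -1) <= 130),
]

def scan(records, rules):
    """Scan a data object and report records violating any rule."""
    for i, rec in enumerate(records):
        for rule in rules:
            if not rule.check(rec):
                yield i, rule.name

records = [{"birth_date": "1990-05-01", "age": 35},
           {"birth_date": "", "age": 150}]
for idx, violation in scan(records, person_rules):
    print(f"record {idx}: {violation}")
```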
Data Quality Model-based Testing of Information Systems
This paper proposes a model-based testing approach in which the data quality model (DQ-model) is used as the testing model instead of the program's control flow graph. The DQ-model contains definitions and conditions that must hold for a data object to be considered correct. The study proposes to automatically generate a complete test set (CTS) from the DQ-model, allowing all data quality conditions to be tested and resulting in full coverage of the DQ-model. In addition, the approach makes it possible to check the conformity of both the data to be entered and the data already stored in the database. The proposed alternative approach changes the testing process: (1) CTS can be generated prior to software developmen…
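As an informal illustration of the CTS idea (not the paper's actual generator), the sketch below derives one passing and one failing test case from each data quality condition, so every condition of a toy DQ-model is covered; the condition names and sample values are invented:

```python
# A hypothetical sketch of complete-test-set (CTS) generation from a
# DQ-model: every quality condition contributes one satisfying and one
# violating test case, covering all conditions of the model.
conditions = {
    "email_not_empty": {"pass": "a@example.com", "fail": ""},
    "amount_positive": {"pass": 10, "fail": -5},
}

def generate_cts(conditions):
    tests = []
    for name, samples in conditions.items():
        tests.append({"condition": name, "value": samples["pass"], "expected": "accepted"})
        tests.append({"condition": name, "value": samples["fail"], "expected": "rejected"})
    return tests

for test in generate_cts(conditions):
    print(test)
```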
Data Quality Model-based Testing of Information Systems: the Use-case of E-scooters
The paper proposes a data quality model-based testing methodology aimed at improving the testing of information systems (IS) using the previously proposed data quality model. The solution involves creating a description of the data to be processed by the IS and of the data quality requirements used for developing the tests, followed by an automated test run of the system on the generated tests, verifying the correctness of the data to be entered and stored in the database. Generating tests for all possible data quality conditions yields a complete set of tests that verifies the operation of the IS under all possible data quality conditions. The proposed solution is demonstra…
Data Quality Model-Based Testing of Information Systems: Two-Level Testing of the Insurance System
In order to develop reliable software, its operation must be verified for all possible cases of use. This can be achieved, at least partly, by means of model-based testing (MBT), by establishing tests that check all conditions covered by the model. This paper presents Data Quality Model-based Testing (DQMBT), which uses the data quality model (DQ-model) as the testing model. The DQ-model contains definitions and conditions that must hold for a data object to be considered correct. The proposed testing approach allows complete testing of the conformity of both the data to be entered and the data already stored in the database. The data to be entered shall be verified by means of predefined pre-conditio…
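A hypothetical sketch of the two-level idea follows: pre-conditions screen a record before it is entered, and post-conditions re-check the stored data after the operation, rolling it back on failure. The insurance-flavoured fields and rules are invented for illustration:

```python
# A minimal two-level sketch, assuming (hypothetically) that pre-conditions
# guard the data to be entered and post-conditions re-check the stored data.
db = []  # stands in for the insurance system's database

def pre_condition(policy):
    # Level 1: verify the record before it enters the system.
    return policy.get("premium", 0) > 0 and bool(policy.get("holder"))

def post_condition(database):
    # Level 2: verify the database as a whole after the operation
    # (here: policy ids must stay unique).
    return len({p["policy_id"] for p in database}) == len(database)

def enter_policy(policy):
    if not pre_condition(policy):
        return "rejected by pre-condition"
    db.append(policy)
    if not post_condition(db):
        db.pop()
        return "rolled back by post-condition"
    return "accepted"

print(enter_policy({"policy_id": 1, "holder": "A. Berzins", "premium": 120}))
print(enter_policy({"policy_id": 1, "holder": "J. Ozols", "premium": 90}))  # duplicate id
```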
Towards Data Quality Runtime Verification
This paper discusses data quality checking during business process execution by using runtime verification. While runtime verification verifies the correctness of business process execution, data quality checks assure that a particular process did not negatively impact the stored data. Both runtime verification and data quality checks run in parallel with the base processes, affecting them only insignificantly. The proposed idea allows verifying (a) whether the process ended correctly, as well as (b) whether the results of a correctly executed process negatively impacted the stored data through the modifications caused by that process. The desired result will be achieved by use of domain sp…
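To make the parallel-execution idea concrete, here is a rough sketch (not the paper's mechanism) in which quality checks consume the base process's output records on a background thread, so the base process itself only enqueues results and continues; all record fields are illustrative:

```python
# A rough sketch of running data quality checks in parallel with the base
# process: the base process enqueues its results and continues immediately,
# so it is affected only insignificantly.
import queue
import threading

check_queue: "queue.Queue" = queue.Queue()

def quality_checker():
    while True:
        record = check_queue.get()
        if record is None:          # sentinel: base process finished
            break
        if not record.get("customer_id"):
            print(f"quality violation after process step: {record}")

checker = threading.Thread(target=quality_checker)
checker.start()

# The base process only hands off its results to the checker.
for record in [{"customer_id": 7}, {"customer_id": None}]:
    check_queue.put(record)
check_queue.put(None)
checker.join()
```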
Asynchronous Runtime Verification of Business Processes
The authors propose a runtime verification mechanism for business processes. The mechanism verifies the correctness of business process execution and runs in parallel with the base processes, affecting them only insignificantly. The authors have identified cases in which business process runtime verification is helpful and applicable. The verification mechanism monitors business process execution and verifies its compliance with the base process description. A prototype of the verification mechanism was developed and tested on real business processes, and the limits of the runtime verification overhead were evaluated.
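As a simplified illustration of such a monitor (the paper's actual mechanism and process notation are not shown in the abstract), the sketch below replays a logged event trace against a base process description expressed as an allowed-transition relation; the process steps are invented:

```python
# A hypothetical sketch: the monitor checks a process instance's event
# trace against the base process description, here a transition relation.
BASE_PROCESS = {
    ("start", "order_received"),
    ("order_received", "payment_checked"),
    ("payment_checked", "shipped"),
    ("shipped", "end"),
}

def verify(trace):
    """Return the first transition not allowed by the base process, if any."""
    for step, next_step in zip(trace, trace[1:]):
        if (step, next_step) not in BASE_PROCESS:
            return (step, next_step)
    return None

ok_trace = ["start", "order_received", "payment_checked", "shipped", "end"]
bad_trace = ["start", "order_received", "shipped", "end"]  # payment skipped
print(verify(ok_trace))    # None: execution complies with the description
print(verify(bad_trace))   # ('order_received', 'shipped'): violation found
```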
Smart Technologies for Improved Software Maintenance
The steadily increasing complexity of software systems makes them difficult to configure and use without special IT knowledge. One solution is to make software systems "smarter", i.e. to supplement them, at least partially, with self-management features. This paper describes several software components, known as smart technologies, which facilitate software use and maintenance. To date, smart technologies incorporate version updating, execution environment testing, self-testing, runtime verification and business process execution. The proposed approach has been successfully applied in several software projects.
Models of Data Quality
The research proposes a new approach to data quality management based on three groups of domain-specific languages (DSLs). The first language group uses the concept of a data object to describe the data to be analysed, the second group describes the requirements on data quality, and the third group describes the data quality management process. The proposed approach deals with the development of executable quality specifications for each kind of data object. A specification can be executed step by step according to business process descriptions, ensuring the gradual accumulation of data in the database and data quality verification according to the specific use case.
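A toy sketch of the three-layer structure might look as follows; the dictionary-and-lambda notation is invented purely for illustration and is not the paper's DSL syntax:

```python
# An illustrative sketch of the three DSL groups: (1) a data object
# description, (2) quality requirements, (3) a quality management process.
data_object = {                      # group 1: what the data looks like
    "name": "Invoice",
    "fields": {"number": str, "total": float},
}
requirements = [                     # group 2: when the data is of quality
    ("number is present", lambda r: bool(r.get("number"))),
    ("total is non-negative", lambda r: r.get("total", -1) >= 0),
]
process = ["on_insert", "on_update"]  # group 3: when the checks are executed

def run_checks(record, stage):
    """Execute the quality requirements at the given process stage."""
    if stage in process:
        return [name for name, check in requirements if not check(record)]
    return []

print(run_checks({"number": "INV-1", "total": -3.0}, "on_insert"))
```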
Data quality evaluation: a comparative analysis of company registers’ open data in four European countries
A Step Towards a Data Quality Theory
Data quality issues have been topical for many decades. However, a unified data quality theory has not been proposed yet, since many concepts associated with the term “data quality” are not straightforward enough. The paper proposes a user-oriented data quality theory based on clearly defined concepts. The concepts are defined by using three groups of domain-specific languages (DSLs): (1) the first group uses the concept of a data object to describe the data to be analysed, (2) the second group describes the data quality requirements, and (3) the third group describes the process of data quality evaluation. The proposed idea proved to be simple enough, but at the same time very effective in…
Asynchronous Runtime Verification of Business Processes: Proof of Concept
Self-management of Information Systems
The paper discusses self-management features intended to support the usage and maintenance processes in the life of an information system. Instead of the universal solutions evolved by many researchers in the autonomic computing field, this approach, called smart technologies, provides self-management features by including autonomic components directly into information systems. The approach has been applied in practice in several information systems, and the results show that the implementation of self-management features requires relatively modest resources. Thereby the approach is suitable even for smaller projects and companies.
Risks of Concurrent Execution in E-Commerce Processes
The development of ICT facilitates replacing traditional buying and selling processes with e-commerce solutions. If several customers are served concurrently, i.e. at the same time, the processes can interfere with each other, causing risks for both the buyer and the seller. The paper offers a method to identify purchase/sale risks in simultaneous multi-customer service processes. First, an exact model of the buying-selling processes is created and the conditions for correct process execution are formulated. Then an analysis of all possible scenarios, including the concurrently executed buying-selling scenarios, is performed using symbolic execution of the process descriptions. The obtai…
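The following sketch illustrates the flavour of such an interleaving analysis on an invented two-step check-then-buy model (it is not the paper's symbolic execution engine): all valid interleavings of two customers' steps are enumerated, and the executions that oversell the last remaining item are flagged:

```python
# A simplified sketch of the interleaving analysis: two customers each run
# a check-then-buy scenario, all interleavings of their steps are explored,
# and executions that oversell the single remaining item are reported.
from itertools import permutations

STOCK = 1
steps = [("A", "check"), ("A", "buy"), ("B", "check"), ("B", "buy")]

def valid(order):
    # Each customer must check availability before buying.
    return all(order.index((c, "check")) < order.index((c, "buy"))
               for c in ("A", "B"))

def oversells(order):
    """Execute one interleaving; return True if stock goes negative."""
    stock, saw_available = STOCK, {}
    for customer, action in order:
        if action == "check":
            saw_available[customer] = stock > 0
        elif saw_available.get(customer):   # buys based on a possibly stale check
            stock -= 1
    return stock < 0

risky = [order for order in permutations(steps) if valid(order) and oversells(order)]
print(f"{len(risky)} interleavings oversell the last item")  # 4 of 6 valid ones
```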
Domain-Specific Characteristics of Data Quality
The research discusses how to describe data quality and what should be taken into account when developing a universal data quality management solution. The proposed approach is to create quality specifications for each kind of data object and to make them executable. A specification can be executed step by step according to business process descriptions, ensuring the gradual accumulation of data in the database and data quality checking according to the specific use case. The described approach can be applied to check the completeness, accuracy, timeliness, and consistency of accumulated data.