
AUTHOR

Ivo Oditis

Showing 14 related works from this author.

An Approach to Data Quality Evaluation

2018

This research proposes a new approach to data quality evaluation comprising three aspects: (1) definition of the data object whose quality will be analysed, (2) specification of quality requirements for the data object using a Domain-Specific Language (DSL), and (3) implementation of an executable data quality model that scans the data object and detects its shortages. As in Model Driven Architecture (MDA), data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models: the PIM comprises informal specifications of data quality, while the PSM describes the implementation of the data quality model, thus making it executable. The approbation of the proposed…
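The three aspects can be illustrated with a minimal Python sketch: a data object definition, quality requirements as predicates (standing in for the DSL), and an executable model that scans records and reports shortages. All names, fields and rules below are illustrative assumptions, not the paper's notation.

```python
# Minimal sketch of an executable data quality model:
# (1) a data object definition, (2) quality rules standing in for
# the DSL requirements, (3) an executable scan that reports shortages.

from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]   # True = record passes the rule

# (1) data object definition: the fields we expect (illustrative)
PERSON_FIELDS = {"name", "age"}

# (2) quality requirements, here as plain predicates
rules = [
    QualityRule("name is present", lambda r: bool(r.get("name"))),
    QualityRule("age is a non-negative integer",
                lambda r: isinstance(r.get("age"), int) and r["age"] >= 0),
]

# (3) executable model: scan the data object and collect shortages
def scan(records):
    shortages = []
    for i, rec in enumerate(records):
        for rule in rules:
            if not rule.check(rec):
                shortages.append((i, rule.name))
    return shortages

data = [{"name": "Ann", "age": 30}, {"name": "", "age": -1}]
print(scan(data))   # record 1 violates both rules
```

A PSM in the paper's sense would translate such rules into executable checks against a concrete platform (e.g. SQL over a database) rather than in-memory predicates.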

Keywords: SQL; software requirements specification; linked data; data modeling; data integrity; data quality; executable; software engineering.
Published in: 2018 Fifth International Conference on Social Networks Analysis, Management and Security (SNAMS).

Data Quality Model-based Testing of Information Systems

2020

This paper proposes a model-based testing approach that uses a data quality model (DQ-model) instead of the program's control flow graph as the testing model. The DQ-model contains definitions and conditions that a data object must satisfy to be considered correct. The study proposes to automatically generate a complete test set (CTS) from the DQ-model, allowing all data quality conditions to be tested and resulting in full coverage of the DQ-model. In addition, the conformity of both the data to be entered and the data already stored in the database can be checked. The proposed alternative approach changes the testing process: (1) the CTS can be generated prior to software developmen…
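The CTS idea can be sketched as follows: for every data quality condition, the test set contains at least one value that satisfies it and one that violates it, so each condition is exercised both ways. The condition names and boundary values are illustrative assumptions, not taken from the paper.

```python
# Sketch of deriving a complete test set (CTS) from data quality
# conditions instead of a control flow graph.

conditions = {
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 120,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

# One satisfying and one violating candidate per condition (illustrative)
candidates = {
    "age":   [30, -5],
    "email": ["a@b.lv", "not-an-email"],
}

def complete_test_set():
    tests = []
    for field, check in conditions.items():
        for value in candidates[field]:
            # the expected verdict comes from the DQ condition itself,
            # so tests exist before any implementation is written
            tests.append((field, value, check(value)))
    return tests

for field, value, expected in complete_test_set():
    print(f"{field}={value!r} -> {'accept' if expected else 'reject'}")
```

Because the expected verdicts are derived from the conditions alone, such a test set can indeed be generated before the system under test exists, which is the point the paper makes about the changed testing process.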

Keywords: model-based testing; software development; process (computing); system under test; data quality; test set; control flow graph; data mining.
Published in: Proceedings of the 2020 Federated Conference on Computer Science and Information Systems.

Data Quality Model-based Testing of Information Systems: the Use-case of E-scooters

2020

The paper proposes a data quality model-based testing methodology aimed at improving the testing of information systems (IS) using the previously proposed data quality model. The solution involves creating a description of the data to be processed by the IS and of the data quality requirements used for developing the tests, followed by an automated test run of the system on the generated tests, verifying the correctness of the data to be entered and stored in the database. Generating tests for all possible data quality conditions yields a complete set of tests that verify the operation of the IS under all possible data quality conditions. The proposed solution is demonstra…

Keywords: model-based testing; program testing; correctness; reliability engineering; test (assessment); set (abstract data type); data quality; information system.
Published in: 2020 7th International Conference on Internet of Things: Systems, Management and Security (IOTSMS).

Data Quality Model-Based Testing of Information Systems: Two-Level Testing of the Insurance System

2021

In order to develop reliable software, its operation must be verified for all possible use cases. This can be achieved, at least partly, by means of model-based testing (MBT), establishing tests that check all conditions covered by the model. This paper presents Data Quality Model-based Testing (DQMBT), which uses the data quality model (DQ-model) as the testing model. The DQ-model contains definitions and conditions that a data object must satisfy to be considered correct. The proposed testing approach allows complete testing of the conformity of both the data to be entered and the data already stored in the database. The data to be entered shall be verified by means of predefined pre-conditio…

Keywords: model-based testing; context (language use); pre-condition; software; data quality; information system; quality (business); data mining; data objects.

Towards Data Quality Runtime Verification

2019

This paper discusses data quality checking during business process execution by means of runtime verification. While runtime verification verifies the correctness of business process execution, data quality checks assure that a particular process did not negatively impact the stored data. Both runtime verification and data quality checks run in parallel with the base processes, affecting them insignificantly. The proposed idea allows verifying (a) whether the process ended correctly, and (b) whether the results of a correct process negatively impacted the stored data through the modifications that process caused. The desired result will be achieved by use of domain sp…

Keywords: domain-specific language; correctness; business process; data quality; runtime verification; process (computing); software engineering.
Published in: Proceedings of the 2019 Federated Conference on Computer Science and Information Systems.

Asynchronous Runtime Verification of Business Processes

2015

The authors propose a runtime verification mechanism for business processes. The mechanism verifies the correctness of business process execution and runs in parallel with the base processes, affecting them insignificantly. The authors have identified cases where business process runtime verification is helpful and applicable. The verification mechanism monitors business process execution and verifies compliance with the base process description. A prototype of the mechanism was developed and tested on real business processes, and the limits of the runtime verification overhead were evaluated.
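The asynchronous arrangement can be sketched with a queue and a verifier thread: the base process only publishes events (a cheap operation), while compliance with the expected step order is checked in parallel. The three-step process and its event names are illustrative assumptions, not the paper's case study.

```python
# Sketch of asynchronous runtime verification: the base process
# publishes events to a queue; a separate verifier thread checks them
# against the expected step order, so the base process is barely affected.

import queue
import threading

EXPECTED_ORDER = ["order_received", "payment_checked", "goods_shipped"]

events = queue.Queue()
violations = []

def verifier():
    position = 0
    while True:
        ev = events.get()
        if ev is None:                      # sentinel: process instance finished
            if position != len(EXPECTED_ORDER):
                violations.append("process ended before all steps ran")
            return
        if position < len(EXPECTED_ORDER) and ev == EXPECTED_ORDER[position]:
            position += 1                   # step matches the base description
        else:
            violations.append(f"unexpected event: {ev}")

t = threading.Thread(target=verifier)
t.start()

# Base process: emitting an event is just a queue.put()
for ev in ["order_received", "goods_shipped"]:   # payment step skipped!
    events.put(ev)
events.put(None)
t.join()
print(violations)
```

Because the verifier consumes events at its own pace, the overhead imposed on the monitored process is roughly the cost of the enqueue calls, which matches the paper's claim of insignificant impact.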

Keywords: Business Process Model and Notation; high-level verification; functional verification; distributed computing; runtime verification; business process modeling; software verification; intelligent verification.
Published in: 2015 7th International Conference on Computational Intelligence, Communication Systems and Networks.

Smart Technologies for Improved Software Maintenance

2015

The steadily increasing complexity of software systems makes them difficult to configure and use without special IT knowledge. One solution is to make software systems “smarter”, i.e. to supplement them, at least partially, with self-management features. This paper describes several software components, known as smart technologies, which facilitate software use and maintenance. To date, smart technologies incorporate version updating, execution environment testing, self-testing, runtime verification and business process execution. The proposed approach has been successfully applied in several software projects.

Keywords: software development; software maintenance; software analytics; software construction; operating system; package development process; backporting; software system; software verification and validation; software engineering.
Published in: Annals of Computer Science and Information Systems.

Models of Data Quality

2018

The research proposes a new approach to data quality management based on three groups of DSLs (Domain-Specific Languages). The first language group uses the concept of a data object to describe the data to be analysed, the second group describes the requirements on data quality, and the third group describes the data quality management process. The proposed approach involves developing executable quality specifications for each kind of data object. A specification can be executed step by step according to business process descriptions, ensuring the gradual accumulation of data in the database and data quality verification according to the specific use case.

Keywords: domain-specific language; business process; process (engineering); data quality; quality (business); executable; software engineering.

Data quality evaluation: a comparative analysis of company registers’ open data in four European countries

2018

Keywords: open data; data quality; data science.
Published in: Communication Papers of the 2018 Federated Conference on Computer Science and Information Systems.

A Step Towards a Data Quality Theory

2019

Data quality issues have been topical for many decades. However, a unified data quality theory has not been proposed yet, since many concepts associated with the term “data quality” are not straightforward enough. The paper proposes a user-oriented data quality theory based on clearly defined concepts. The concepts are defined by using three groups of domain-specific languages (DSLs): (1) the first group uses the concept of a data object to describe the data to be analysed, (2) the second group describes the data quality requirements, and (3) the third group describes the process of data quality evaluation. The proposed idea proved to be simple enough, but at the same time very effective in…

Keywords: domain-specific language; SQL; information retrieval; data quality; information system; quality (business); executable; natural language; abstraction (linguistics).
Published in: 2019 Sixth International Conference on Social Networks Analysis, Management and Security (SNAMS).

Asynchronous Runtime Verification of Business Processes: Proof of Concept

2020

Keywords: business process; programming language; proof of concept; asynchronous communication; modeling and simulation; runtime verification; software.
Published in: International Journal of Simulation Systems Science & Technology.

Self-management of Information Systems

2016

The paper discusses self-management features intended to support the usage and maintenance processes over an information system's life. Instead of the universal solutions evolved by many researchers in the autonomic computing field, this approach, called smart technologies, provides self-management features by including autonomic components directly in information systems. The approach has been applied in practice in several information systems, and the results show that implementing self-management features requires relatively modest resources. The approach is thereby suitable even for smaller projects and companies.

Keywords: system of systems; information architecture; autonomic computing; management information systems; knowledge base; risk analysis (engineering); information system; automated information system; information integration.

Risks of Concurrent Execution in E-Commerce Processes

2021

The development of ICT facilitates replacing traditional buying and selling processes with e-commerce solutions. If several customers are served concurrently, i.e. at the same time, the processes can interfere with each other, causing risks for both the buyer and the seller. The paper offers a method to identify purchase/sale risks in simultaneous multi-customer service processes. First, an exact model of the buying-selling processes is created and the conditions for correct process execution are formulated. Then an analysis of all possible scenarios, including the concurrently executed buying-selling scenarios, is performed using symbolic execution of the process descriptions. The obtai…
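The kind of interference the analysis has to find can be sketched by enumerating all interleavings of two concurrent buy processes, each modelled as "read stock, then buy", and flagging the schedules where the last item is sold twice. The two-step process model is an illustrative assumption; the paper works with an exact model and symbolic execution rather than this explicit enumeration.

```python
# Enumerate all interleavings of two concurrent buying processes and
# flag the schedules where both customers buy the single remaining item.

from itertools import permutations

def interleavings(a, b):
    """All merges of sequences a and b that keep each one's internal order."""
    for mask in permutations([0] * len(a) + [1] * len(b)):
        ia = ib = 0
        merged = []
        for who in mask:
            if who == 0:
                merged.append(("A", a[ia])); ia += 1
            else:
                merged.append(("B", b[ib])); ib += 1
        yield tuple(merged)

def oversold(schedule, stock=1):
    seen = {}
    for customer, step in schedule:
        if step == "read":
            seen[customer] = stock          # customer observes current stock
        else:                               # "buy": decide on the stale read
            if seen[customer] > 0:
                stock -= 1
    return stock < 0                        # both bought the last item

steps = ["read", "buy"]
risky = {s for s in interleavings(steps, steps) if oversold(s)}
print(f"{len(risky)} of {len(set(interleavings(steps, steps)))} schedules oversell")
```

Even in this toy model, 4 of the 6 distinct schedules let both customers buy the last item, which is exactly the kind of risk scenario the paper's exhaustive analysis is meant to surface.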

Keywords: service (systems architecture); exact model; risk analysis (engineering); process (engineering); information and communications technology; e-commerce; symbolic execution.
Published in: Annals of Computer Science and Information Systems.

Domain-Specific Characteristics of Data Quality

2017

The research discusses how to describe data quality and what should be taken into account when developing a universal data quality management solution. The proposed approach is to create quality specifications for each kind of data object and to make them executable. A specification can be executed step by step according to business process descriptions, ensuring the gradual accumulation of data in the database and data quality checking according to the specific use case. The described approach can be applied to check the completeness, accuracy, timeliness and consistency of accumulated data.
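The four dimensions named above can each be sketched as a small check over accumulated records. The field names, the reference set of country codes, and the freshness threshold below are illustrative assumptions, not from the paper.

```python
# Sketch of the four data quality dimensions as executable checks:
# completeness, accuracy, timeliness, consistency.

from datetime import date

records = [
    {"id": 1, "country": "LV", "updated": date(2017, 6, 1), "total": 10, "parts": [4, 6]},
    {"id": 2, "country": "??", "updated": date(2010, 1, 1), "total": 9,  "parts": [4, 6]},
]

def completeness(r):            # all required fields present and non-empty
    return all(r.get(f) not in (None, "") for f in ("id", "country", "updated"))

def accuracy(r):                # value comes from an allowed reference set
    return r["country"] in {"LV", "LT", "EE"}

def timeliness(r, today=date(2017, 7, 1)):   # fixed "today" for determinism
    return (today - r["updated"]).days <= 365

def consistency(r):             # parts must sum to the reported total
    return sum(r["parts"]) == r["total"]

for r in records:
    verdict = {dim.__name__: dim(r)
               for dim in (completeness, accuracy, timeliness, consistency)}
    print(r["id"], verdict)
```

Executing such checks step by step as records accumulate, as the paper describes, lets each use case decide which dimensions matter for its data.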

Keywords: business process; data modeling; Unified Modeling Language; data quality; data mining; executable; completeness (statistics); data objects.
Published in: Proceedings of the 2017 Federated Conference on Computer Science and Information Systems.