Search results for "quality"

Showing 10 of 2,995 documents

An integrated methodological approach for optimising complex systems subjected to predictive maintenance

2021

Abstract The present paper addresses the relevant topic of maintenance management, widely recognised as a fundamental issue for complex engineering systems, leading companies to optimise their assets while pursuing cost efficiency. In this regard, our research aims to provide companies with a hybrid methodological approach based on Multi-Criteria Decision-Making (MCDM) capable of dealing with the main failures potentially affecting complex systems subjected to predictive maintenance. The approach is integrated within the framework of traditional Failure Mode, Effects and Criticality Analysis (FMECA), whose strengths and weaknesses are considered. In pa…
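The abstract names DEMATEL among the MCDM tools it combines. As a minimal sketch of the standard DEMATEL computation only (the paper's actual integration with FMECA is not reproduced here), the total-relation matrix for a small, entirely illustrative direct-influence matrix can be obtained as follows:

```python
import numpy as np

# Toy direct-influence matrix among three hypothetical failure factors
# (values are illustrative, not taken from the paper).
A = np.array([[0.0, 3.0, 2.0],
              [1.0, 0.0, 4.0],
              [2.0, 1.0, 0.0]])

# Normalize by the largest row sum (standard DEMATEL normalization).
D = A / A.sum(axis=1).max()

# Total-relation matrix: T = D (I - D)^{-1}
T = D @ np.linalg.inv(np.eye(3) - D)

# Prominence (r + c) and relation (r - c) for each factor.
r = T.sum(axis=1)   # influence given
c = T.sum(axis=0)   # influence received
prominence = r + c
relation = r - c
```

Factors with high prominence are the most involved in the failure network; a negative relation marks a factor that is mostly influenced by the others.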

Keywords: Complex systems; Service system; Cost efficiency; Computer science; Specific risk; Predictive maintenance; DEMATEL; Multiple-criteria decision analysis; Industrial and Manufacturing Engineering; Failure mode, effects and criticality analysis; Risk management; Risk analysis (engineering); Settore ING-IND/17 - Impianti Industriali Meccanici; ELECTRE TRI; ELECTRE; Maintenance management; Safety, Risk, Reliability and Quality; Strengths and weaknesses; FMECA; Reliability Engineering & System Safety
researchProduct

Use of wavelet for image processing in smart cameras with low hardware resources

2013

Images from embedded sensors need digital processing to recover high-quality images and to extract features of a scene. Depending on the properties of the sensor and on the application, the designer combines different algorithms to process images. In the context of embedded devices, the hardware supporting these applications is heavily constrained in terms of power consumption and silicon area. The algorithms therefore have to comply with the embedded specifications, i.e., reduced computational complexity and low memory requirements. We investigate the opportunity to use the wavelet representation to perform good-quality image processing algorithms at a lower compu…
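To make the wavelet idea concrete, here is a minimal sketch (not the paper's implementation) of a single-level 2-D Haar DWT with soft-threshold denoising of the detail subbands, using only NumPy:

```python
import numpy as np

def haar2d(x):
    """One level of the 2-D Haar DWT (image sides must be even)."""
    s2 = np.sqrt(2.0)
    # Rows: average/difference of adjacent column pairs.
    lo = (x[:, 0::2] + x[:, 1::2]) / s2
    hi = (x[:, 0::2] - x[:, 1::2]) / s2
    # Columns: same operation on row pairs of each half.
    ll = (lo[0::2, :] + lo[1::2, :]) / s2
    lh = (lo[0::2, :] - lo[1::2, :]) / s2
    hl = (hi[0::2, :] + hi[1::2, :]) / s2
    hh = (hi[0::2, :] - hi[1::2, :]) / s2
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    s2 = np.sqrt(2.0)
    lo = np.empty((ll.shape[0] * 2, ll.shape[1]))
    hi = np.empty_like(lo)
    lo[0::2, :], lo[1::2, :] = (ll + lh) / s2, (ll - lh) / s2
    hi[0::2, :], hi[1::2, :] = (hl + hh) / s2, (hl - hh) / s2
    x = np.empty((lo.shape[0], lo.shape[1] * 2))
    x[:, 0::2], x[:, 1::2] = (lo + hi) / s2, (lo - hi) / s2
    return x

def denoise(img, thresh):
    """Soft-threshold the detail subbands, keep the approximation."""
    ll, lh, hl, hh = haar2d(img)
    soft = lambda d: np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    return ihaar2d(ll, soft(lh), soft(hl), soft(hh))
```

The Haar filters need only additions and a scaling, which is exactly the kind of low-complexity, low-memory operation the embedded constraints in the abstract call for.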

Keywords: Computational complexity theory; Computer science; Image quality; Embedded systems; Image processing; [SPI] Engineering Sciences [physics]; Wavelet; Digital image processing; Computer vision; Smart camera; DWT; Digital signal processing; Denoising; Demosaicing; Recognition; Hardware and Architecture; Artificial intelligence; Software; Computer hardware

Hypervisor-based Protection of Code

2019

The code of a compiled program is susceptible to reverse-engineering attacks on the algorithms and the business logic that are contained within the code. The main existing countermeasure to reverse-engineering is obfuscation. Generally, obfuscation methods suffer from two main deficiencies: 1) the obfuscated code is less efficient than the original and 2) with sufficient effort, the original code may be reconstructed. We propose a method that is based on cryptography and virtualization. The most valuable functions are encrypted and remain inaccessible even during their execution, thus preventing their reconstruction. A specially crafted hypervisor is responsible for decryption, execution, a…
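The core idea — the valuable function stays encrypted at rest and is decrypted only at call time — can be illustrated with a toy userspace analogy in Python. This is not the paper's hypervisor mechanism, and the single-byte XOR is a loud stand-in for real encryption:

```python
import marshal
import types

KEY = b"\x5a"  # toy single-byte XOR "key" -- a placeholder, NOT real crypto

def xor(blob: bytes) -> bytes:
    return bytes(b ^ KEY[0] for b in blob)

def protect(fn):
    """Keep a function's compiled code only in encrypted form."""
    blob = xor(marshal.dumps(fn.__code__))
    def call(*args, **kwargs):
        # Decrypt just-in-time for this one invocation, loosely analogous
        # to the hypervisor decrypting a protected function on entry.
        code = marshal.loads(xor(blob))
        return types.FunctionType(code, globals())(*args, **kwargs)
    return call

def secret_logic(x):      # the "valuable" function to be protected
    return x * x + 1

secret_logic = protect(secret_logic)
```

Unlike this sketch, the paper's hypervisor keeps the plaintext out of reach of the operating system even during execution; here the decrypted code object is briefly visible in the process's own memory.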

Keywords: Computer Networks and Communications; Computer science; Cryptography; Security; Computer security; Encryption; Obfuscation; Code (cryptography); Information security; Virtual machine monitors; Safety, Risk, Reliability and Quality; System bus; Trusted platform module; Code protection; Hypervisor; Virtualization; Obfuscation (software); IEEE Transactions on Information Forensics and Security

A holistic modeling for QoE estimation in live video streaming applications over LTE Advanced technologies with Full and Non Reference approaches

2018

Abstract Current mobile networks provide high-speed Internet access at rates of gigabits per second. In this scenario, services traditionally delivered over wired networks, in particular those based on live video streaming, become an alternative. In this transition, however, several issues must be considered owing to rapidly changing network conditions and the limited resources of mobile devices. These issues should be taken into account to maintain a good Quality of Experience (QoE) for the video, in terms of a high Mean Opinion Score (MOS), a subjective measure of video quality. Our goal is to estimate and predict this subjective metric in a holistic manner. Thus, we have analyzed and measured different varia…
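As a minimal sketch of what "estimating MOS from measured variables" can look like — with entirely synthetic data and a hypothetical linear model, not the paper's actual estimator or feature set — one can fit a least-squares QoE model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic measurements (all hypothetical, not from the paper):
loss = rng.uniform(0.0, 5.0, 200)      # packet loss, %
rate = rng.uniform(1.0, 20.0, 200)     # video bitrate, Mbps
# Assumed "true" MOS on the 1-5 scale, plus measurement noise.
mos = 3.5 - 0.4 * loss + 0.05 * rate + rng.normal(0.0, 0.1, 200)

# Least-squares fit of a linear QoE model: MOS ~ b0 + b1*loss + b2*rate
X = np.column_stack([np.ones_like(loss), loss, rate])
coef, *_ = np.linalg.lstsq(X, mos, rcond=None)

def predict_mos(l, r):
    """Predict MOS for packet loss l (%) and bitrate r (Mbps)."""
    return coef[0] + coef[1] * l + coef[2] * r
```

A no-reference estimator of this kind predicts MOS from network-side measurements alone, without access to the original video, which is the appeal of the NR approach the title mentions.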

Keywords: Computer Networks and Communications; Computer science; Quality of service; Mean opinion score; Real-time computing; Networking & telecommunications; Video quality; LTE Advanced; The Internet; Quality of experience; Mobile device; Subjective video quality; Computer Communications

Energy saving and user satisfaction for a new advanced public lighting system

2019

Abstract The retrofit of urban lighting systems is often an advantageous means of achieving notable energy savings and improvements in the quality of light. User habits, expectations and lifestyle can contribute to the design of these systems, for example in deciding on the most appropriate control strategies or the light quality. The influence of such variables can be extended to the overall system performance. This paper presents a method of street lighting design based on two kinds of analysis carried out in a defined test area: measurements (by means of a monitoring study) and user preferences (by means of a survey). The results of this data analysis create the basis for the final desig…

Keywords: Computer science; Control (management); Energy Engineering and Power Technology; Sample (statistics); User satisfaction; Settore ING-IND/32 - Convertitori, Macchine E Azionamenti Elettrici; Smart lighting; Transport engineering; Quality (business); Rate of return; Information and communication technology integration; Settore ING-IND/11 - Fisica Tecnica Ambientale; Renewable Energy, Sustainability and the Environment; Lighting design; Test (assessment); Light quality; Settore ING-IND/33 - Sistemi Elettrici Per L'Energia; Energy efficiency; Fuel Technology; Nuclear Energy and Engineering; Efficient energy use; Energy Conversion and Management

A chirp-z transform-based synchronizer for power system measurements

2005

In the last few years, increased interest in power and voltage quality has forced international working groups to standardize testing and measurement techniques. IEC 61000-4-30, which defines the characteristics of instrumentation for the measurement of power quality, refers to IEC 61000-4-7 for the evaluation of harmonics and interharmonics. This standard, revised in 2002, requires a synchronous sampling of voltage or current signal, in order to limit errors and to ensure reproducible results even in the presence of nonstationary signals. Therefore, an accurate estimation of the fundamental frequency is required, even in the presence of disturbances. In this paper, an algorithm to detect t…
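The chirp-z transform evaluates the DTFT on a fine frequency grid around a chosen band, which is what makes it suited to locating an off-nominal fundamental. A minimal sketch of that zoomed evaluation (computed here directly for clarity, where the CZT would do it efficiently; sampling rate and signal are illustrative, not from the paper):

```python
import numpy as np

fs = 5000.0                                  # sampling rate, Hz (illustrative)
n = np.arange(4096)
x = np.sin(2 * np.pi * 49.7 * n / fs)        # power signal slightly off 50 Hz

# Evaluate the DTFT on a fine grid around the nominal fundamental --
# this zoomed spectrum is what the chirp-z transform computes fast.
f = np.arange(45.0, 55.0, 0.01)              # 10 Hz band, 0.01 Hz steps
w = np.hanning(len(n))                       # window to limit leakage
E = np.exp(-2j * np.pi * np.outer(f, n) / fs)
X = E @ (w * x)
f0 = f[np.argmax(np.abs(X))]                 # estimated fundamental, Hz
```

Once `f0` is known, the sampling rate can be adjusted so that an integer number of samples covers each fundamental period, giving the synchronous sampling the standard requires.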

Keywords: Computer science; Bluestein's FFT algorithm; Fast Fourier transform; Chirp-z transform; Power quality; Synchronization; Fundamental frequency; Power (physics); Electric power system; Sampling (signal processing); Synchronizer; Harmonics; Electronic engineering; Electrical and Electronic Engineering; Instrumentation; Settore ING-INF/07 - Misure Elettriche E Elettroniche; Interpolation

Methodological considerations for interrupted time series analysis in radiation epidemiology: an overview

2021

Interrupted time series analysis (ITSA) is a method that can be applied to evaluate health outcomes in populations exposed to ionizing radiation following major radiological events. Using aggregated time series data, ITSA evaluates whether the time trend of a health indicator shows a change associated with the radiological event. That is, ITSA checks whether there is a statistically significant discrepancy between the projection of a pre-event trend and the data empirically observed after the event. Conducting ITSA requires one to consider specific methodological issues due to unique threats to internal validity that make ITSA prone to bias. We here discuss the strengths and limitations of …
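The "discrepancy between the projected pre-event trend and the observed post-event data" is conventionally estimated with segmented regression. A minimal sketch on synthetic monthly data (the event time, effect sizes and noise level are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

t = np.arange(60)                 # 60 monthly observations (synthetic)
t0 = 30                           # the month of the radiological event
after = (t >= t0).astype(float)

# Synthetic health indicator: baseline trend, then a level drop of 5
# and a slope change of -0.2 after the event, plus noise.
y = (100 + 0.5 * t - 5 * after - 0.2 * (t - t0) * after
     + rng.normal(0.0, 0.5, 60))

# Segmented (interrupted) regression:
#   y ~ b0 + b1*t + b2*after + b3*(t - t0)*after
X = np.column_stack([np.ones_like(t, float), t, after, (t - t0) * after])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, slope_change = b[2], b[3]
```

Here `level_change` captures the immediate shift at the event and `slope_change` the change in trend; the threats to validity the abstract mentions (confounding co-interventions, autocorrelated errors) are exactly what this naive fit does not handle.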

Keywords: Computer science; Confounding; Public Health, Environmental and Occupational Health; Interrupted Time Series Analysis; Statistical model; General Medicine; Health indicator; Research Design; Data quality; Econometrics; Internal validity; Time series; Spurious relationship; Waste Management and Disposal; Forecasting; Journal of Radiological Protection

University IS Architecture for the Research Evaluation Support

2017

The measurement of research results can be used in different ways, e.g., for the assignment of research grants and, afterwards, for the evaluation of a project's results. It can also be used for recruiting or promoting research institutions' staff. Because such measurements are widely used, the selection of appropriate measures is important. At the same time, there is no common view on which metrics should be used in this field; moreover, many widely used metrics are often misleading for various reasons, e.g., they are computed from incomplete or faulty data, the metric's computation formula may be invalid, or the computation results may be interpreted wrongly. To produce a good framewo…

Keywords: Computer science; Data quality; Information system; Scopus; Selection (linguistics); Research evaluation; Research metrics; Data integration; Metric (unit); Architecture; Data science; Field (computer science); Environment. Technology. Resources.

Executable Data Quality Models

2017

The paper discusses an external solution for data quality management in information systems. In contrast to traditional data quality assurance methods, the proposed approach uses a domain-specific language (DSL) to describe data quality models. Data quality models consist of graphical diagrams whose elements contain requirements for a data object's values and procedures for a data object's analysis. The DSL interpreter makes the data quality model executable, thereby enabling the measurement and improvement of data quality. The described approach can be applied: (1) to check the completeness, accuracy and consistency of accumulated data; (2) to support data migration in c…
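Loosely in the spirit of the abstract — declarative quality requirements plus an interpreter that makes them executable — here is a hypothetical mini-model in plain Python (the paper's DSL is graphical and richer; field names and rules are invented for illustration):

```python
# Hypothetical data quality model: one rule per field, each with a
# completeness flag and an accuracy check.
rules = {
    "name":  {"required": True,
              "check": lambda v: isinstance(v, str) and v != ""},
    "age":   {"required": True,
              "check": lambda v: isinstance(v, int) and 0 <= v < 130},
    "email": {"required": False,
              "check": lambda v: isinstance(v, str) and "@" in v},
}

def evaluate(record, model):
    """Interpret the model against one record; return its violations."""
    problems = []
    for field, rule in model.items():
        if field not in record:
            if rule["required"]:
                problems.append(f"{field}: missing (completeness)")
            continue
        if not rule["check"](record[field]):
            problems.append(f"{field}: invalid value (accuracy)")
    return problems
```

For example, `evaluate({"name": "Ada", "age": 203}, rules)` flags the age as an accuracy violation while accepting the absent, non-required email.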

Keywords: Computer science; Data transformation; Data modeling; Information system; Logical data model; General Environmental Science; Data element; Database; Information quality; Data warehouse; Data mapping; Data model; Data quality; General Earth and Planetary Sciences; Data pre-processing; Data architecture; Data mining; Software architecture; Data migration; Data virtualization; Procedia Computer Science

Response Determination Criteria for ELISPOT: Toward a Standard that Can Be Applied Across Laboratories

2011

ELISPOT assay readout is often dichotomized into positive or negative responses according to prespecified criteria. However, these criteria can vary widely across institutions. The adoption of a common response criterion is a key step toward cross-laboratory comparability. This chapter describes the two main approaches to response determination, identifying the strengths and limitations of each. Nonparametric statistical tests and consideration of data quality are recommended, and instructions are provided for their ready implementation by nonstatisticians and statisticians alike.
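As a minimal sketch of a nonparametric response criterion (the spot counts are invented for illustration, and this particular exact permutation test is one choice among the approaches the chapter compares):

```python
from itertools import combinations

# Hypothetical spot counts per well (illustrative numbers only).
control = [3, 5, 2, 4, 6, 1]           # medium-only wells
antigen = [18, 25, 22, 30, 19, 27]     # antigen-stimulated wells

pooled = control + antigen
observed = sum(antigen) - sum(control)  # statistic: difference of sums
total = sum(pooled)

# Exact one-sided permutation test: over all relabelings of the wells,
# how often is the antigen-minus-control difference >= the observed one?
count = hits = 0
for idx in combinations(range(len(pooled)), len(antigen)):
    s = sum(pooled[i] for i in idx)
    hits += (s - (total - s)) >= observed
    count += 1

p = hits / count                        # one-sided exact p-value
positive = p < 0.05                     # prespecified response criterion
```

Being rank-free and distribution-free, a test like this makes no normality assumption about spot counts, which is why nonparametric procedures are the recommended default for small numbers of replicate wells.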

Keywords: Computer science; ELISPOT; Data quality; Immunology; MEDLINE; Key (cryptography); Econometrics; Nonparametric statistics