Search results for "methodologies"

Showing 10 of 2,106 documents

Restoration and Enhancement of Historical Stereo Photos

2021

Restoration of digital visual media acquired from repositories of historical photographic and cinematographic material is of key importance for the preservation, study and transmission of the legacy of past cultures to the coming generations. In this paper, a fully automatic approach to the digital restoration of historical stereo photographs is proposed, referred to as Stacked Median Restoration plus (SMR+). The approach exploits the content redundancy in stereo pairs for detecting and fixing scratches, dust, dirt spots and many other defects in the original images, as well as improving contrast and illumination. This is done by estimating the optical flow between the images, and using it …
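The per-pixel median step at the heart of the approach described above can be sketched as follows. This is a minimal, hypothetical illustration, not the SMR+ implementation: the actual pipeline first warps the stereo counterpart via optical flow before stacking, which is omitted here, and three pre-aligned frames are used so that the median is well defined.

```python
# Sketch of the stacked-median idea: defects present in only one frame of a
# pre-aligned stack are suppressed by a per-pixel median. Illustrative only.
import numpy as np

def stacked_median(aligned_images):
    """Per-pixel median over a stack of pre-aligned images."""
    stack = np.stack(aligned_images, axis=0)
    return np.median(stack, axis=0)

# A simulated dust spot in one frame of an otherwise identical stack:
clean = np.full((4, 4), 100.0)
damaged = clean.copy()
damaged[1, 1] = 255.0  # the defect
restored = stacked_median([clean, damaged, clean])
```

Because the defect appears in only one of the three aligned frames, the median recovers the underlying pixel value.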

Keywords: image denoising; image restoration; image enhancement; stereo matching; optical flow; gradient filtering; stacked median; guided supersampling; historical photos. Journal: Journal of Imaging.

Space-Frequency Quantization for Image Compression With Directionlets

2007

The standard separable 2-D wavelet transform (WT) has recently achieved a great success in image processing because it provides a sparse representation of smooth images. However, it fails to efficiently capture 1-D discontinuities, like edges or contours. These features, being elongated and characterized by geometrical regularity along different directions, intersect and generate many large magnitude wavelet coefficients. Since contours are very important elements in the visual perception of images, to provide a good visual quality of compressed images, it is fundamental to preserve good reconstruction of these directional features. In our previous work, we proposed a construction of critic…
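The sparsity of the separable 2-D WT on smooth regions (and why edges generate many large coefficients) can be seen with one level of the standard Haar transform. This is a textbook sketch for illustration only, not the directionlet construction from the paper:

```python
import numpy as np

def haar2d_level(img):
    """One level of the separable 2-D Haar wavelet transform.

    Returns the approximation band LL and the detail bands LH, HL, HH.
    """
    a = (img[:, ::2] + img[:, 1::2]) / 2.0   # horizontal averages
    d = (img[:, ::2] - img[:, 1::2]) / 2.0   # horizontal details
    LL = (a[::2] + a[1::2]) / 2.0            # average of averages
    LH = (a[::2] - a[1::2]) / 2.0            # vertical detail
    HL = (d[::2] + d[1::2]) / 2.0            # horizontal detail
    HH = (d[::2] - d[1::2]) / 2.0            # diagonal detail
    return LL, LH, HL, HH

# A perfectly smooth (constant) image yields zero detail coefficients;
# edges and contours are what produce the large-magnitude coefficients.
img = np.full((4, 4), 8.0)
LL, LH, HL, HH = haar2d_level(img)
```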

Keywords: wavelet transforms; directional transforms; nonseparable transforms; directional vanishing moments; multiresolution analysis; rate–distortion theory; quantization; image coding; image compression; data compression; image segmentation. Journal: IEEE Transactions on Image Processing.

Memory-Efficient Sliding Window Progressive Meshes

2007

Progressive mesh is a data structure that encodes a continuous spectrum of mesh approximations. Sliding window progressive meshes (SWPM) minimize data transfers between CPU and GPU by storing mesh data in static on-GPU memory buffers [For01]. The main disadvantages of the original SWPM algorithm are poor vertex cache usage efficiency and large resulting datasets. The connectivity-based algorithm [KT04] achieves good vertex cache coherence but does not address the problem of high memory utilization. In this paper, we describe estimates for the size of memory buffers and methods to reduce the index datasets. We achieve a 20% reduction through the use of hierarchical data structures (clust…

Keywords: progressive meshes; level of detail; independent sets; graphics processing units.

The pivotal role of students’ absorptive capacity in management learning

2022

Within a research context dominated by an increasing interest in innovative learning methodologies in management education, an individual's capacity to establish links between existing and new knowledge, that is, absorptive capacity (AC), has been surprisingly neglected in management (higher) education inquiry. This study helps to close this gap by investigating the role of management students' AC in their academic performance. The study also examines the moderating effect on this relationship of using traditional learning methodologies (such as lectures), innovative learning methodologies (such as interacting with digital platforms), and having a cooperative climate in the classroom. Sec…

Keywords: absorptive capacity; management learning; academic performance; innovative learning methodologies; traditional learning methodologies; cooperative climate. Journal: The International Journal of Management Education.

Monitoring and data quality assessment of the ATLAS liquid argon calorimeter

2014

The liquid argon calorimeter is a key component of the ATLAS detector installed at the CERN Large Hadron Collider. The primary purpose of this calorimeter is the measurement of electron and photon kinematic properties. It also provides a crucial input for measuring jets and missing transverse momentum. An advanced data monitoring procedure was designed to quickly identify issues that would affect detector performance and ensure that only the best quality data are used for physics analysis. This article presents the validation procedure developed during the 2011 and 2012 LHC data-taking periods, in which more than 98% of the proton-proton luminosity recorded by ATLAS at a centre-of-mass ener…

Keywords: ATLAS; LHC; liquid argon calorimeter; calorimeters; large detector systems for particle and astroparticle physics; particle identification methods; data acquisition; data quality; monitoring; calibration; noise; luminosity; experimental high energy physics.

Fast Computation by Subdivision of Multidimensional Splines and Their Applications

2016

We present theory and algorithms for fast explicit computations of uni- and multi-dimensional periodic splines of arbitrary order at triadic rational points and of splines of even order at diadic rational points. The algorithms use the forward and the inverse Fast Fourier transform (FFT). The implementation is as fast as FFT computation. The algorithms are based on binary and ternary subdivision of splines. Interpolating and smoothing splines are used for a sample rate convertor such as resolution upsampling of discrete-time signals and digital images and restoration of decimated images that were contaminated by noise. The performance of the rate conversion based spline is compared with the…
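As a hedged illustration of FFT-based rate conversion (not the authors' spline-subdivision algorithm), a real periodic band-limited signal can be upsampled by zero-padding its spectrum:

```python
import numpy as np

def fft_upsample(x, factor):
    """Upsample a real periodic signal by zero-padding its rFFT spectrum.

    Assumes the signal has no energy at the original Nyquist bin; a full
    implementation would split that bin when padding.
    """
    n = len(x)
    X = np.fft.rfft(x)
    X_padded = np.zeros(factor * n // 2 + 1, dtype=complex)
    X_padded[: len(X)] = X
    # irfft normalizes by the new length, so rescale to keep amplitudes
    return np.fft.irfft(X_padded, n=factor * n) * factor

# Doubling the sample rate of one period of a cosine: the original
# samples are reproduced exactly at the even output indices.
x = np.cos(2 * np.pi * np.arange(8) / 8)
y = fft_upsample(x, 2)
```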

Keywords: periodic splines; interpolating and smoothing splines; subdivision; rate convertor; upsampling; restoration; prolate spheroidal wave functions.

ON SOME GENERALIZATION OF SMOOTHING PROBLEMS

2015

The paper deals with the generalized smoothing problem in abstract Hilbert spaces. This generalized problem includes as particular cases the interpolating problem, the smoothing problem with weights, the smoothing problem with obstacles, the problem of splines in convex sets, and others. A theorem on the existence and characterization of a solution of the generalized problem is proved. It is shown that the theorem yields already-known theorems as special cases as well as some new results.

Keywords: smoothing splines; interpolating splines; mixed splines; splines in convex sets; box spline; Hilbert space; characterization; smoothing. Journal: Mathematical Modelling and Analysis.

Large-scale nonlinear dimensionality reduction for network intrusion detection

2017

Network intrusion detection (NID) is a complex classification problem. In this paper, we combine classification with recent and scalable nonlinear dimensionality reduction (NLDR) methods. Classification and DR are not necessarily adversarial, provided adequate cluster magnification occurs, as in NLDR methods like t-SNE: DR mitigates the curse of dimensionality, while cluster magnification can maintain class separability. We demonstrate the effectiveness of the approach experimentally by analyzing and comparing results on the large KDD99 dataset, using both NLDR quality assessment and classification rate for SVMs and random forests. Since data involves features of mixe…

Keywords: intrusion detection; dimensionality reduction; Gower; machine learning; computer vision and pattern recognition; signal and image processing.

Guaranteed error bounds for linear algebra problems and a class of Picard-Lindelöf iteration methods

2012

This study focuses on iteration methods based on the Banach fixed point theorem and a posteriori error estimates of Ostrowski. Their application for systems of linear simultaneous equations, bounded linear operators, as well as integral and differential equations is considered. The study presents a new version of the Picard–Lindelöf method for ordinary differential equations (ODEs) supplied with guaranteed and explicitly computable upper bounds of the approximation error. The estimates derived in the thesis take into account interpolation and integration errors and, therefore, provide objective information on the accuracy of computed approximations.
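The classical Picard–Lindelöf iteration that the thesis builds on can be sketched numerically as repeated integration of y' = f(t, y). This sketch uses trapezoidal quadrature on a fixed grid and does not compute the guaranteed error bounds that are the thesis's contribution; all names are illustrative:

```python
import numpy as np

def picard_iterate(f, y0, t, n_iter=20):
    """Picard iteration y_{k+1}(t) = y0 + integral_0^t f(s, y_k(s)) ds,
    with the integral approximated by the cumulative trapezoidal rule."""
    y = np.full_like(t, y0, dtype=float)
    for _ in range(n_iter):
        integrand = f(t, y)
        integral = np.concatenate(
            [[0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))]
        )
        y = y0 + integral
    return y

# y' = y, y(0) = 1 has the exact solution e^t.
t = np.linspace(0.0, 1.0, 101)
y = picard_iterate(lambda t, y: y, 1.0, t)
```

On a Lipschitz right-hand side the iterates converge like 1/k!, so 20 iterations leave only the quadrature error of the trapezoidal rule.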

Keywords: iterative method; the Picard–Lindelöf method; guaranteed bounds; error estimates; reliability.

DOCUMENT MANAGEMENT USING CLUSTERING ALGORITHMS

2015

Document management systems are complex systems that offer services such as storage, versioning, metadata, security, and indexing and retrieval capabilities. Large numbers of documents can be automatically grouped into classes of documents that contain similar information. We therefore propose to use clustering methods to group the documents. Clustering is an important process in text mining, used for grouping documents based on their contents in order to extract knowledge. In this paper we present some requirements for clustering algorithms for a document management system.
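As a minimal, hypothetical sketch of the clustering step (the paper does not prescribe this particular algorithm), documents can be grouped greedily by cosine similarity of bag-of-words vectors:

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def greedy_cluster(docs, threshold=0.5):
    """Assign each document to the first cluster whose seed is similar enough."""
    vecs = [Counter(d.lower().split()) for d in docs]
    clusters = []  # list of (seed_vector, member_indices)
    for i, v in enumerate(vecs):
        for seed, members in clusters:
            if cosine(seed, v) >= threshold:
                members.append(i)
                break
        else:
            clusters.append((v, [i]))
    return [members for _, members in clusters]

docs = [
    "invoice payment due date",
    "payment invoice overdue date",
    "meeting agenda project plan",
    "project meeting plan notes",
]
groups = greedy_cluster(docs)
```

A production system would use TF–IDF weighting and a proper clustering algorithm (e.g., k-means), but the grouping-by-content idea is the same.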

Keywords: document management; clustering; cluster validation. JEL: Y80. Journal: Revista Economica.