Search results for "methodologies"
Showing 10 of 2106 documents
Restoration and Enhancement of Historical Stereo Photos
2021
Restoration of digital visual media acquired from repositories of historical photographic and cinematographic material is of key importance for the preservation, study and transmission of the legacy of past cultures to the coming generations. In this paper, a fully automatic approach to the digital restoration of historical stereo photographs is proposed, referred to as Stacked Median Restoration plus (SMR+). The approach exploits the content redundancy in stereo pairs for detecting and fixing scratches, dust, dirt spots and many other defects in the original images, as well as improving contrast and illumination. This is done by estimating the optical flow between the images, and using it …
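The SMR+ pipeline described above aligns the two views with optical flow before fusing them. As a minimal illustration of why per-pixel median stacking suppresses impulsive defects such as dust spots and scratches (a toy sketch with fabricated arrays, not the authors' method):

```python
import numpy as np

# Three pre-aligned "exposures" of the same 4x4 image patch.
# In SMR+ the stack would come from optical-flow-warped stereo views;
# here we simply fabricate it (hypothetical data).
clean = np.full((4, 4), 100.0)

frames = np.stack([clean.copy() for _ in range(3)])
frames[0, 1, 1] = 255.0   # dust spot in frame 0
frames[2, 2, 3] = 0.0     # scratch pixel in frame 2

# Per-pixel median across the stack rejects the outliers,
# because each defect appears in only one frame at that pixel.
restored = np.median(frames, axis=0)
print(np.allclose(restored, clean))  # True: the defects are gone
```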
Space-Frequency Quantization for Image Compression With Directionlets
2007
The standard separable 2-D wavelet transform (WT) has recently achieved a great success in image processing because it provides a sparse representation of smooth images. However, it fails to efficiently capture 1-D discontinuities, like edges or contours. These features, being elongated and characterized by geometrical regularity along different directions, intersect and generate many large magnitude wavelet coefficients. Since contours are very important elements in the visual perception of images, to provide a good visual quality of compressed images, it is fundamental to preserve good reconstruction of these directional features. In our previous work, we proposed a construction of critic…
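The failure mode the abstract describes is easy to see in one dimension: a smooth signal yields small Haar detail coefficients, while a discontinuity that falls inside an analysis pair produces one large-magnitude detail. (A toy sketch of the general wavelet behaviour, not of directionlets themselves.)

```python
import math

def haar_level(x):
    """One level of the orthonormal 1-D Haar transform: split x
    (even length) into approximation and detail coefficients."""
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

smooth = [i / 8.0 for i in range(8)]   # slowly varying ramp
edge = [0.0] * 3 + [1.0] * 5           # step inside the pair (x[2], x[3])

_, d_smooth = haar_level(smooth)
_, d_edge = haar_level(edge)
print(max(abs(d) for d in d_smooth))  # small: 1/(8*sqrt(2)) ~ 0.088
print(max(abs(d) for d in d_edge))    # large: 1/sqrt(2) ~ 0.707
```

In 2-D, a contour crosses many such pairs at every scale, which is why separable wavelets spend many large coefficients on it.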
Memory-Efficient Sliding Window Progressive Meshes
2007
A progressive mesh is a data structure that encodes a continuous spectrum of mesh approximations. Sliding window progressive meshes (SWPM) minimize data transfers between CPU and GPU by storing mesh data in static on-GPU memory buffers [For01]. The main disadvantages of the original SWPM algorithm are poor vertex cache usage efficiency and large resulting datasets. The connectivity-based algorithm [KT04] achieves good vertex cache coherence but does not address the problem of high memory utilization. In this paper, we derive estimates for the size of the memory buffers and describe methods to reduce the index datasets. We achieve a 20% reduction through the use of hierarchical data structures (clust…
The pivotal role of students’ absorptive capacity in management learning
2022
Within a research context dominated by an increasing interest in innovative learning methodologies in management education, an individual’s capacity to establish links between existing and new knowledge, that is, absorptive capacity (AC), has been surprisingly neglected in management (higher) education inquiry. This study helps to close this gap by investigating the role of management students’ AC in their academic performance. The study also examines the moderating effect on this relationship of using traditional learning methodologies (such as lectures), innovative learning methodologies (such as interacting with digital platforms), and having a cooperative climate in the classroom. Sec…
Monitoring and data quality assessment of the ATLAS liquid argon calorimeter
2014
The liquid argon calorimeter is a key component of the ATLAS detector installed at the CERN Large Hadron Collider. The primary purpose of this calorimeter is the measurement of electron and photon kinematic properties. It also provides a crucial input for measuring jets and missing transverse momentum. An advanced data monitoring procedure was designed to quickly identify issues that would affect detector performance and ensure that only the best quality data are used for physics analysis. This article presents the validation procedure developed during the 2011 and 2012 LHC data-taking periods, in which more than 98% of the proton-proton luminosity recorded by ATLAS at a centre-of-mass ener…
Fast Computation by Subdivision of Multidimensional Splines and Their Applications
2016
We present theory and algorithms for fast explicit computation of uni- and multi-dimensional periodic splines of arbitrary order at triadic rational points, and of splines of even order at dyadic rational points. The algorithms use the forward and inverse Fast Fourier transform (FFT), and the implementation is as fast as the FFT computation itself. The algorithms are based on binary and ternary subdivision of splines. Interpolating and smoothing splines are used for a sample-rate converter, for tasks such as resolution upsampling of discrete-time signals and digital images and restoration of decimated images that were contaminated by noise. The performance of the rate-conversion spline is compared with the…
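The binary subdivision the abstract builds on has a classical elementary form, the Lane–Riesenfeld scheme for uniform B-splines: duplicate every control point, then average repeatedly. A small sketch of that neighbouring idea (not the paper's FFT-based algorithm):

```python
def lr_subdivide(points, degree):
    """One Lane-Riesenfeld binary subdivision step for a uniform
    B-spline of the given degree: duplicate every control point,
    then apply `degree` rounds of midpoint averaging."""
    refined = [p for p in points for _ in (0, 1)]   # point doubling
    for _ in range(degree):
        refined = [(a + b) / 2.0 for a, b in zip(refined, refined[1:])]
    return refined

ctrl = [0.0, 0.0, 3.0, 3.0, 0.0, 0.0]
once = lr_subdivide(ctrl, degree=3)   # cubic B-spline refinement

# Averaging is a convex combination, so refinement never overshoots
# the range of the control polygon:
print(min(once) >= min(ctrl) and max(once) <= max(ctrl))  # True
```

Repeating the step converges to the smooth spline curve; each averaging round shortens the list by one, so one step maps n control points to 2n − degree.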
ON SOME GENERALIZATION OF SMOOTHING PROBLEMS
2015
The paper deals with the generalized smoothing problem in abstract Hilbert spaces. This generalized problem involves particular cases such as the interpolating problem, the smoothing problem with weights, the smoothing problem with obstacles, the problem on splines in convex sets and others. The theorem on the existence and characterization of a solution of the generalized problem is proved. It is shown how the theorem gives already known theorems in special cases as well as some new results.
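For orientation, the standard formulation behind the special cases the abstract lists (our notation, not necessarily the paper's): given Hilbert spaces $X$, $Y$, $Z$, bounded operators $T\colon X \to Y$ and $A\colon X \to Z$, data $z \in Z$, and a weight $\rho > 0$, the smoothing problem with weights reads

$$ \min_{x \in X} \; \rho\,\|Tx\|_Y^2 + \|Ax - z\|_Z^2 . $$

The interpolating problem is its constrained limit, minimizing $\|Tx\|_Y^2$ subject to $Ax = z$, and replacing the equality by $Ax \in C$ for a closed convex set $C \subseteq Z$ yields the obstacle and convex-set variants.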
Large-scale nonlinear dimensionality reduction for network intrusion detection
2017
Network intrusion detection (NID) is a complex classification problem. In this paper, we combine classification with recent and scalable nonlinear dimensionality reduction (NLDR) methods. Classification and DR are not necessarily adversarial, provided adequate cluster magnification occurs, as in NLDR methods such as $t$-SNE: DR mitigates the curse of dimensionality, while cluster magnification can maintain class separability. We demonstrate experimentally the effectiveness of the approach by analyzing and comparing results on the large KDD99 dataset, using both NLDR quality assessment and classification rate for SVMs and random forests. Since data involves features of mixe…
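The NLDR-then-classify pipeline the abstract describes can be sketched with off-the-shelf components on toy data (not KDD99; note also that $t$-SNE has no out-of-sample transform, so this sketch embeds all points before splitting):

```python
# Pipeline shape: nonlinear dimensionality reduction (t-SNE)
# followed by a classifier (random forest), on synthetic blobs.
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier
from sklearn.manifold import TSNE
from sklearn.model_selection import train_test_split

X, y = make_blobs(n_samples=120, centers=3, n_features=10, random_state=0)

# Embed into 2-D; t-SNE's cluster magnification tends to preserve
# (and even widen) class separation, as the abstract notes.
emb = TSNE(n_components=2, perplexity=20, random_state=0).fit_transform(X)

Xtr, Xte, ytr, yte = train_test_split(emb, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
print(clf.score(Xte, yte))  # high accuracy on these well-separated blobs
```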
Guaranteed error bounds for linear algebra problems and a class of Picard-Lindelöf iteration methods
2012
This study focuses on iteration methods based on the Banach fixed point theorem and a posteriori error estimates of Ostrowski. Their application for systems of linear simultaneous equations, bounded linear operators, as well as integral and differential equations is considered. The study presents a new version of the Picard–Lindelöf method for ordinary differential equations (ODEs) supplied with guaranteed and explicitly computable upper bounds of the approximation error. The estimates derived in the thesis take into account interpolation and integration errors and, therefore, provide objective information on the accuracy of computed approximations.
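The classical Picard–Lindelöf iteration (the thesis adds guaranteed error bounds on top of it) solves $y' = f(t, y)$, $y(t_0) = y_0$ via $y_{k+1}(t) = y_0 + \int_{t_0}^{t} f(s, y_k(s))\,ds$. For $f(t, y) = y$ the iterates are exactly the Taylor partial sums of $e^t$, which a tiny polynomial-coefficient implementation makes visible (our sketch, unrelated to the thesis code):

```python
from math import factorial

def picard_exp(n_iters):
    """Picard iterates for y' = y, y(0) = 1, represented as coefficient
    lists [c0, c1, ...] meaning c0 + c1*t + c2*t^2 + ...  Each step
    computes y_{k+1}(t) = 1 + integral_0^t y_k(s) ds exactly."""
    coeffs = [1.0]                       # y_0(t) = 1
    for _ in range(n_iters):
        # Antiderivative of sum c_i t^i is sum c_i t^(i+1) / (i+1):
        integral = [0.0] + [c / (i + 1) for i, c in enumerate(coeffs)]
        integral[0] = 1.0                # add the initial value y(0) = 1
        coeffs = integral
    return coeffs

c = picard_exp(5)
# The coefficients are exactly 1/k! -- the Taylor partial sum of e^t:
print(all(abs(ck - 1 / factorial(k)) < 1e-12 for k, ck in enumerate(c)))
```

Each iteration extends agreement with the true solution by one Taylor order, which is the convergence behaviour the a posteriori bounds quantify.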
DOCUMENT MANAGEMENT USING CLUSTERING ALGORITHMS
2015
Document management systems are complex systems that offer services such as storage, versioning, metadata, security, and indexing and retrieval capabilities. Large numbers of documents can be automatically grouped into classes of documents that contain similar information. We therefore propose to use clustering methods to group the documents. Clustering is an important process in text mining, used for grouping documents based on their contents in order to extract knowledge. In this paper we present some requirements for clustering algorithms for a document management system.
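The content-based grouping the abstract proposes can be sketched with bag-of-words vectors and cosine similarity, assigning each document to its nearest seed (one assignment step of k-means; hypothetical mini-corpus, not a full clustering algorithm):

```python
from collections import Counter
from math import sqrt

docs = [
    "invoice payment due amount invoice",
    "payment amount tax invoice",
    "server crash log error stack",
    "error log server restart",
]

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

vecs = [Counter(d.split()) for d in docs]      # bag-of-words vectors
seeds = [vecs[0], vecs[2]]                      # one seed per intended class

labels = [max(range(len(seeds)), key=lambda k: cosine(v, seeds[k]))
          for v in vecs]
print(labels)  # [0, 0, 1, 1]: billing docs together, server docs together
```

A production system would use TF-IDF weighting and iterate seed updates until convergence; the point here is only that similar content yields similar vectors.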