Search results for "Lossless"
Showing 10 of 24 documents
Improving Lossless Image Compression with Contextual Memory
2019
With the increased use of image acquisition devices, including cameras and medical imaging instruments, the amount of information ready for long-term storage is also growing. In this paper we give a detailed description of the state-of-the-art lossless compression software PAQ8PX applied to grayscale image compression. We propose a new online learning algorithm for predicting the probability of bits from a stream. We then proceed to integrate the algorithm into PAQ8PX's …
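The abstract does not reproduce the paper's predictor; as a point of reference, below is a minimal sketch of the general context-mixing idea used by PAQ-family compressors, assuming a logistic mixer with gradient-descent weight updates (the learning rate, the two fixed "models", and the toy bit stream are illustrative, not from the paper):

```python
import math

def stretch(p):
    """Map a probability in (0,1) to the logistic (log-odds) domain."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Inverse of stretch: map log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

class LogisticMixer:
    """Online mixer: combines several model probabilities for the next bit
    and updates its weights by gradient descent once the bit is known."""
    def __init__(self, n_models, lr=0.02):
        self.w = [0.0] * n_models
        self.lr = lr
        self.inputs = [0.0] * n_models

    def predict(self, probs):
        self.inputs = [stretch(min(max(p, 1e-6), 1 - 1e-6)) for p in probs]
        return squash(sum(w * x for w, x in zip(self.w, self.inputs)))

    def update(self, p, bit):
        err = bit - p                       # prediction error in probability space
        for i, x in enumerate(self.inputs):
            self.w[i] += self.lr * err * x  # gradient step on the mixing weights

# toy usage: two fixed "models" disagree; the mixer learns which one to trust
mixer = LogisticMixer(2)
stream = [1, 1, 0, 1, 1, 1, 0, 1]
for bit in stream:
    p = mixer.predict([0.9, 0.3])   # model A says 1 is likely, model B says unlikely
    mixer.update(p, bit)
print("final weights:", mixer.w)
```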
Morse Description and Geometric Encoding of Digital Elevation Maps
2004
Two complementary geometric structures for the topographic representation of an image are developed in this work. The first one computes a description of the Morse-topological structure of the image, while the second one computes a simplified version of its drainage structure. The topographic significance of the Morse and drainage structures of digital elevation maps (DEMs) suggests that they can be used as the basis of an efficient encoding scheme. As an application, we combine this geometric representation with an interpolation algorithm and lossless data compression schemes to develop a compression scheme for DEMs. This algorithm achieves high compression while controlling the maximum …
A WAVELET OPERATOR ON THE INTERVAL IN SOLVING MAXWELL'S EQUATIONS
2011
In this paper, a differential wavelet-based operator defined on an interval is presented and used to evaluate the electromagnetic field described by Maxwell's curl equations in the time domain. The wavelet operator has been generated by using Daubechies wavelets with boundary functions. A spatial differential scheme has been developed and applied to the study of electromagnetic phenomena in a lossless medium. The proposed approach has been successfully tested on a bounded axial-symmetric cylindrical domain.
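For reference, the source-free Maxwell curl equations in a lossless medium (conductivity σ = 0), the system the time-domain wavelet operator is applied to, are:

```latex
% Maxwell's curl equations in a source-free, lossless medium (\sigma = 0)
\[
\begin{aligned}
\nabla \times \mathbf{E} &= -\mu \,\frac{\partial \mathbf{H}}{\partial t},\\
\nabla \times \mathbf{H} &= \varepsilon \,\frac{\partial \mathbf{E}}{\partial t}.
\end{aligned}
\]
```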
The effect of wavelet and discrete cosine transform compression of digital radiographs on the detection of subtle proximal caries. ROC analysis.
2007
The study compared the diagnostic performance of 2 different image compression methods, JPEG (discrete cosine transform; Joint Photographic Experts Group compression standard) versus JPEG2000 (discrete wavelet transform), both compressed at a ratio of 12:1 from the original uncompressed TIFF radiograph, with respect to the detection of non-cavitated carious lesions. To this end, 100 approximal surfaces of 50 tooth pairs were evaluated on the radiographs by 10 experienced observers using a 5-point confidence scale. Observations were carried out on a standardized viewing monitor under subdued light conditions. The proportion of diseased surfaces was balanced to approximately 50% to avoid bias. Tr…
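As a side note, the sketch below shows one common way an empirical ROC curve and its area can be computed from 5-point confidence ratings; the scores and labels are invented toy data, and this is not the study's statistical procedure:

```python
def roc_points(scores, labels):
    """Empirical ROC points from ordinal confidence scores (higher = more
    confident a lesion is present) and ground-truth labels (1 = diseased)."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = [(0.0, 0.0)]
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))   # (false-positive rate, true-positive rate)
    return points

def auc(points):
    """Area under the ROC curve by the trapezoidal rule."""
    pts = sorted(points)
    return sum((x2 - x1) * (y1 + y2) / 2 for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# toy example: confidence ratings for 8 surfaces, 4 of them diseased
scores = [5, 4, 4, 2, 3, 1, 2, 1]
labels = [1, 1, 0, 1, 0, 0, 1, 0]
print(auc(roc_points(scores, labels)))
```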
The rightmost equal-cost position problem.
2013
LZ77-based compression schemes compress the input text by replacing factors in the text with an encoded reference to a previous occurrence, formed by the pair (length, offset). For a given factor, the smaller the offset, the smaller the resulting compression ratio. This is optimally achieved by using the rightmost occurrence of a factor in the previous text. Given a cost function, for instance the minimum number of bits used to represent an integer, we define the Rightmost Equal-Cost Position (REP) problem as the problem of finding one of the occurrences of a factor whose cost is equal to the cost of the rightmost one. We present the Multi-Layer Suffix Tree data structure that, for…
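A naive illustration of the REP problem under the bit-length cost model mentioned above (a brute-force scan, not the paper's Multi-Layer Suffix Tree; the function names and the toy text are illustrative):

```python
def bit_cost(n):
    """Cost model from the abstract's example: minimum number of bits
    needed to represent a positive integer."""
    return n.bit_length()

def rightmost_equal_cost(text, factor, pos):
    """Naive REP: among occurrences of `factor` ending before `pos`, return
    the position of an occurrence whose offset cost equals the cost of the
    rightmost (smallest-offset) occurrence."""
    occs = []
    start = text.find(factor, 0)
    while start != -1 and start + len(factor) <= pos:
        occs.append(start)
        start = text.find(factor, start + 1)
    if not occs:
        return None
    best_cost = bit_cost(pos - occs[-1])     # rightmost occurrence = smallest offset
    for occ in occs:                         # any occurrence with that offset cost will do
        if bit_cost(pos - occ) == best_cost:
            return occ

# toy usage: occurrences of "abra" at 0 and 7; offsets 12 (4 bits) and 5 (3 bits)
text = "abracadabra_abra"
print(rightmost_equal_cost(text, "abra", 12))   # -> 7, same 3-bit cost as the rightmost occurrence
```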
A HARDWARE SOLUTION FOR HEVC INTRA PREDICTION LOSSLESS CODING
2015
The lossless coding mode of the High Efficiency Video Coding (HEVC) main profile, which bypasses transform, quantization, and in-loop filters, is described. Compared to the HEVC non-lossless coding mode, the HEVC lossless coding mode provides perfect fidelity and an average bit-rate reduction of 3.2%–13.2%. It also significantly outperforms existing lossless compression solutions, such as JPEG2000 and JPEG-LS for images and WinRAR for data archiving. A fully parallel solution is presented in this paper to reduce the processing time and computational complexity of intra prediction. Two higher-performance structures are designed to perform …
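As a toy analogue of the lossless path described above (intra prediction residuals kept without transform, quantization, or in-loop filtering), the following sketch applies a horizontal-style prediction to a small block; it is illustrative only and not the paper's hardware architecture:

```python
import numpy as np

def horizontal_predict_residual(block, left_col):
    """Toy analogue of a horizontal intra mode: predict each row from the
    reconstructed column to its left and keep the raw residual. With
    transform and quantization bypassed, the residual reconstructs exactly."""
    pred = np.repeat(left_col.reshape(-1, 1), block.shape[1], axis=1)
    return block - pred                       # residual is entropy-coded as-is

def reconstruct(residual, left_col):
    pred = np.repeat(left_col.reshape(-1, 1), residual.shape[1], axis=1)
    return residual + pred

block = np.array([[10, 12, 13], [20, 21, 19], [30, 28, 27]])
left = np.array([9, 19, 29])
res = horizontal_predict_residual(block, left)
assert np.array_equal(reconstruct(res, left), block)   # bit-exact, i.e. lossless
```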
Lossless and near-lossless image compression based on multiresolution analysis
2013
There are applications in data compression where quality control is of utmost importance. Certain features in the decoded signal must be exactly, or very accurately, recovered, yet one would like to be as economical as possible with respect to storage and speed of computation. In this paper, we present a multi-scale data-compression algorithm within Harten's interpolatory framework for multiresolution that gives a specific estimate of the precise error between the original and the decoded signal, when measured in the L∞ and the Lᵖ (p = 1, 2) discrete norms. The proposed algorithm does not rely on a tensor-product strategy to compress two-dimensional signals, and it provides a priori bound…
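A one-level, one-dimensional sketch of an interpolatory multiresolution scheme with a controlled per-sample error is given below; the prediction rule, quantization step, and toy signal are illustrative assumptions, not the paper's algorithm:

```python
def encode_one_level(signal, q):
    """One level of an interpolatory transform on a 1-D signal: keep even
    samples, predict odd samples by linear interpolation of their even
    neighbours, and quantize the prediction errors (details) with step q."""
    coarse = signal[0::2]
    details = []
    for i, x in enumerate(signal[1::2]):
        left = coarse[i]
        right = coarse[i + 1] if i + 1 < len(coarse) else coarse[i]
        pred = (left + right) / 2.0
        details.append(round((x - pred) / q))       # quantized detail coefficient
    return coarse, details

def decode_one_level(coarse, details, q):
    out = []
    for i, c in enumerate(coarse):
        out.append(c)                               # even samples are exact
        if i < len(details):
            right = coarse[i + 1] if i + 1 < len(coarse) else coarse[i]
            pred = (c + right) / 2.0
            out.append(pred + details[i] * q)       # odd-sample error <= q / 2
    return out

sig = [3.0, 3.4, 4.1, 4.0, 5.2, 5.1, 6.3]
c, d = encode_one_level(sig, q=0.2)
rec = decode_one_level(c, d, q=0.2)
assert all(abs(a - b) <= 0.1 for a, b in zip(sig, rec))   # near-lossless bound q/2
```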
The Engineering of a Compression Boosting Library: Theory vs Practice in BWT Compression
2006
Data Compression is one of the most challenging arenas for both algorithm design and engineering. This is particularly true for Burrows and Wheeler Compression, a technique that is important in itself and for the design of compressed indexes. There has been considerable debate on how to design and engineer compression algorithms based on the BWT paradigm. In particular, Move-to-Front Encoding is generally believed to be an "inefficient" part of the Burrows-Wheeler compression process. However, only recently have two theoretically superior alternatives to Move-to-Front been proposed, namely Compression Boosting and Wavelet Trees. The main contribution of this paper is to provide the first ex…
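For context, a didactic sketch of the baseline BWT + Move-to-Front pipeline that Compression Boosting and Wavelet Trees are positioned against (quadratic-time rotation sort, suitable only for illustration):

```python
def bwt(s):
    """Burrows-Wheeler transform via sorted rotations (didactic, not efficient)."""
    s = s + "\0"                                   # unique sentinel terminator
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def move_to_front(s):
    """Move-to-Front encoding of the BWT output: runs of equal symbols become
    runs of small integers, which a zeroth-order entropy coder handles well."""
    alphabet = sorted(set(s))
    out = []
    for ch in s:
        i = alphabet.index(ch)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))        # move the symbol to the front
    return out

text = "mississippi"
print(bwt(text))                  # clusters equal symbols together
print(move_to_front(bwt(text)))   # mostly small values, ready for entropy coding
```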
From First Principles to the Burrows and Wheeler Transform and Beyond, via Combinatorial Optimization
2007
We introduce a combinatorial optimization framework that naturally induces a class of optimal word permutations with respect to a suitably defined cost function taking into account various measures of relatedness between words. The Burrows and Wheeler transform (bwt) (cf. [M. Burrows, D. Wheeler, A block sorting lossless data compression algorithm, Technical Report 124, Digital Equipment Corporation, 1994]), and its analog for labelled trees (cf. [P. Ferragina, F. Luccio, G. Manzini, S. Muthukrishnan, Structuring labeled trees for optimal succinctness, and beyond, in: Proc. of the 45th Annual IEEE Symposium on Foundations of Computer Science, 2005, pp. 198–207]), are special cases i…
Statistical Modeling of Huffman Tables Coding
2005
An innovative algorithm for the automatic generation of Huffman coding tables for semantic classes of digital images is presented. Collecting statistics over a large dataset of corresponding images, we generated Huffman tables for three image classes: landscape, portrait and document. Comparisons between the new tables and the JPEG standard coding tables, also using different quality settings, have shown the effectiveness of the proposed strategy in terms of final bit size (e.g. compression ratio).
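A minimal sketch of deriving per-class Huffman code lengths from accumulated symbol statistics is shown below; the symbol counts are invented toy data, and this is not the paper's table-generation procedure:

```python
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Build Huffman code lengths from symbol frequencies collected over a
    class of images (the kind of per-class statistics referred to above)."""
    heap = [(f, i, [sym]) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in freqs}
    counter = len(heap)
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)
        f2, _, syms2 = heapq.heappop(heap)
        for sym in syms1 + syms2:
            lengths[sym] += 1                      # symbols move one level deeper in the tree
        heapq.heappush(heap, (f1 + f2, counter, syms1 + syms2))
        counter += 1
    return lengths

# toy "class statistics": symbol counts accumulated over a set of images
stats = Counter({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
print(huffman_code_lengths(stats))
```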