Search results for "COMPRESSION"
Showing 10 of 774 documents
Optimal Parsing for Dictionary Text Compression
2012
Dictionary-based compression algorithms include a parsing strategy that transforms the input text into a sequence of dictionary phrases. For a given text, such a parsing is usually not unique, and for compression purposes it makes sense to find one of the possible parsings that minimizes the final compression ratio. This is the parsing problem. An optimal parsing is a parsing strategy, or a parsing algorithm, that solves the parsing problem while taking into account all the constraints of a compression algorithm or of a class of homogeneous compression algorithms. Compression algorithm constraints are, for instance, the dictionary itself, i.e. the dynamic set of available phrases, and how much a phrase weighs on…
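Since the abstract is cut off, a small illustration may help: for a static dictionary with known per-phrase bit costs (both hypothetical here), the parsing problem reduces to a shortest path over text positions, and dynamic programming finds a minimum-cost parsing that greedy longest-match can miss. A minimal Python sketch:

```python
# A minimal sketch of the parsing problem for a *static* dictionary:
# given phrases with known encoding costs (in bits), find the parsing of
# the text whose total cost is minimum, via dynamic programming over a
# shortest-path graph. Real schemes use dynamic dictionaries; this only
# illustrates the optimization.

def optimal_parse(text, dictionary):
    """dictionary: dict mapping phrase -> cost in bits (hypothetical)."""
    n = len(text)
    INF = float("inf")
    best = [INF] * (n + 1)   # best[i] = min bits to encode text[:i]
    back = [None] * (n + 1)  # phrase chosen to reach position i
    best[0] = 0
    for i in range(n):
        if best[i] == INF:
            continue
        for phrase, cost in dictionary.items():
            j = i + len(phrase)
            if text.startswith(phrase, i) and best[i] + cost < best[j]:
                best[j] = best[i] + cost
                back[j] = phrase
    # Reconstruct the parsing from the back-pointers.
    parsing, i = [], n
    while i > 0:
        parsing.append(back[i])
        i -= len(back[i])
    return parsing[::-1], best[n]

# Greedy longest-match parses "abcab" as "abc"+"ab" costing 12 bits,
# while the DP finds "ab"+"cab" costing 10.
phrases = {"a": 9, "b": 9, "ab": 5, "abc": 7, "cab": 5}
print(optimal_parse("abcab", phrases))  # (['ab', 'cab'], 10)
```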
Cost-driven framework for progressive compression of textured meshes
2019
Recent advances in the digitization of geometry and radiometry routinely generate massive amounts of surface meshes with texture or color attributes. This large amount of data can be compressed using a progressive approach which provides, at decoding time, low-complexity levels of detail (LoDs) that are continuously refined until the original model is retrieved. The goal of such a progressive mesh compression algorithm is to improve the overall quality of the transmission for the user by optimizing the rate-distortion trade-off. In this paper, we introduce a novel meaningful measure for the cost of a progressive transmission of a textured mesh by observing that the rate-distor…
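As a rough illustration of the rate-distortion trade-off mentioned above, a common greedy heuristic orders refinement batches by distortion reduction per transmitted bit; the Refinement fields and numbers below are invented for the demo, and the paper's actual cost measure is more elaborate:

```python
# A toy sketch of rate-distortion driven ordering for a progressive
# transmission: each candidate refinement (geometry or texture LoD
# increment) has a cost in bits and an estimated distortion reduction.
# Sending refinements in decreasing benefit-per-bit order is a common
# greedy heuristic, not the paper's exact method.

from dataclasses import dataclass

@dataclass
class Refinement:           # hypothetical refinement description
    name: str
    bits: int               # rate cost of this batch
    distortion_drop: float  # estimated quality gain

def schedule(refinements):
    """Order refinements by distortion reduction per transmitted bit."""
    return sorted(refinements,
                  key=lambda r: r.distortion_drop / r.bits,
                  reverse=True)

batches = [Refinement("geometry LoD 1", 4000, 0.30),
           Refinement("texture LoD 1",  9000, 0.45),
           Refinement("geometry LoD 2", 7000, 0.10)]
for r in schedule(batches):
    print(r.name, round(r.distortion_drop / r.bits, 6))
```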
Copy-move Forgery Detection via Texture Description
2010
Copy-move forgery is one of the most common types of tampering in digital images. Copy-moves are parts of the image that are copied and pasted onto another part of the same image. Detection methods generally use block matching, which first divides the image into overlapping blocks and then extracts features from each block, assuming that similar blocks will yield similar features. In this paper we present a block-based approach that exploits texture as the feature extracted from blocks. Our goal is to study whether texture is well suited for this specific application, and to compare the performance of several texture descriptors. Tests have been made on both uncompressed and JPEG compressed image…
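The block-matching pipeline the abstract describes can be sketched in a few lines: slide a window over the image, extract a feature per block (here just the coarsely quantized raw block, where the paper compares texture descriptors), sort the features so duplicates become neighbors, and report pairs that match at a non-trivial spatial shift. Block size, quantization step, and the minimum-shift threshold are arbitrary demo choices:

```python
# A minimal block-matching sketch of copy-move detection, assuming a
# grayscale image as a 2-D NumPy array.

import numpy as np

def find_copy_moves(img, block=8, step=1, min_shift=16):
    feats = []
    h, w = img.shape
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            f = (img[y:y+block, x:x+block] // 8).flatten()  # coarse quantization
            feats.append((tuple(f), (y, x)))
    feats.sort(key=lambda t: t[0])  # similar blocks become neighbors
    matches = []
    for (f1, p1), (f2, p2) in zip(feats, feats[1:]):
        if f1 == f2:
            dy, dx = p2[0] - p1[0], p2[1] - p1[1]
            if dy * dy + dx * dx >= min_shift ** 2:  # ignore trivial overlaps
                matches.append((p1, p2))
    return matches

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))
img[40:48, 40:48] = img[8:16, 8:16]   # simulate a copy-move
print(find_copy_moves(img)[:3])       # includes ((8, 8), (40, 40))
```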
New techniques for visualization of losses due to image compression in grayscale medical still images
2003
To evaluate the visual influence of irreversible compression on medical images, changes to the images have to be visualized. The authors have explored alternative techniques to the usual side-by-side comparison, in which the information contained in both images is perceived in a single image, preserving the context between compression errors and image structures and thus allowing fast and easy comparison. These techniques make use of the human ability to perceive information also in the dimensions of color, space, and time. A study was performed with JPEG-compressed coronary angiographic images. Changes in the resulting images for six compression factors from 7 to 30 were sc…
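One way to read "perceived in a single image" is an error overlay: re-encode the image irreversibly, compute the per-pixel difference, and inject it into a color channel on top of the original so errors appear in the context of image structures. The color encoding (error in red), the gain factor, and the input filename below are assumptions, not the paper's exact technique:

```python
# A minimal sketch of an overlay visualization of JPEG compression loss.

import io
import numpy as np
from PIL import Image

def loss_overlay(gray_img, quality=10, gain=8):
    buf = io.BytesIO()
    gray_img.save(buf, format="JPEG", quality=quality)  # irreversible step
    decoded = Image.open(buf)
    orig = np.asarray(gray_img, dtype=np.int16)
    err = np.abs(orig - np.asarray(decoded, dtype=np.int16))
    # Encode error magnitude in the red channel on top of the original.
    rgb = np.stack([np.clip(orig + gain * err, 0, 255),
                    orig, orig], axis=-1).astype(np.uint8)
    return Image.fromarray(rgb, mode="RGB")

img = Image.open("angiogram.png").convert("L")  # hypothetical input file
loss_overlay(img).save("loss_overlay.png")
```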
Boosting Textual Compression in Optimal Linear Time
2005
We provide a general boosting technique for Textual Data Compression. Qualitatively, it takes a good compression algorithm and turns it into an algorithm with a better compression performance guarantee. It displays the following remarkable properties: (a) it can turn any memoryless compressor into a compression algorithm that uses the “best possible” contexts; (b) it is very simple and optimal in terms of time; and (c) it admits a decompression algorithm that is again optimal in time. To the best of our knowledge, this is the first boosting technique displaying these properties. Technically, our boosting technique builds upon three main ingredients: the Burrows–Wheeler Transform, the Suffix Tree d…
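A toy experiment hints at why the Burrows–Wheeler Transform ingredient boosts a memoryless compressor: symbols preceding the same context become adjacent in the transform, so coding each context block with a zero-order model costs fewer total bits than coding the transform as one block. The fixed-length (here length-2) contexts are a simplification; the paper partitions optimally via the suffix tree:

```python
# Compare zero-order coding cost of the whole BWT vs per-context blocks.

from collections import Counter
from math import log2

def bwt(s):
    s += "\0"                                     # unique sentinel
    rots = sorted(s[i:] + s[:i] for i in range(len(s)))
    return [r[-1] for r in rots], rots

def h0(block):
    """Empirical zero-order entropy of a block, in bits/symbol."""
    cnt, n = Counter(block), len(block)
    return -sum(c / n * log2(c / n) for c in cnt.values())

text = "mississippimississippi"
last, rots = bwt(text)
whole = h0(last) * len(last)
blocks = {}
for ch, rot in zip(last, rots):
    blocks.setdefault(rot[:2], []).append(ch)     # context = 2 leading chars
parts = sum(h0(b) * len(b) for b in blocks.values())
print(f"single block: {whole:.1f} bits vs context blocks: {parts:.1f} bits")
```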
On parsing optimality for dictionary-based text compression—the Zip case
2013
Dictionary-based compression schemes have been the most commonly used data compression schemes since they appeared in the foundational paper of Ziv and Lempel in 1977; they are generally referred to as LZ77. Their work is the basis of Zip, gzip, 7-Zip and many other compression software utilities. Some of these compression schemes use variants of the greedy approach to parse the text into dictionary phrases; others have abandoned the greedy approach to improve the compression ratio. Recently, two bit-optimal parsing algorithms have been presented, filling the gap between theory and best practice. We present a survey on the parsing problem for dictionary-based text compression, identifying noticeable results …
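For concreteness, here is a naive greedy LZ77 parser of the kind the Zip family builds on: at each position it takes the longest match in a sliding window, whereas the bit-optimal parsers surveyed here weigh each candidate match by the cost of actually encoding it. Window size and minimum match length are arbitrary, and matches are not allowed to overlap the current position:

```python
# Greedy LZ77 factorization with a naive (quadratic) match search.

def lz77_greedy(text, window=4096):
    i, out = 0, []
    while i < len(text):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            k = 0
            while i + k < len(text) and j + k < i and text[j + k] == text[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= 3:                 # typical minimum match length
            out.append(("copy", best_off, best_len))
            i += best_len
        else:
            out.append(("literal", text[i]))
            i += 1
    return out

print(lz77_greedy("abcabcabcabd"))
# [('literal', 'a'), ('literal', 'b'), ('literal', 'c'),
#  ('copy', 3, 3), ('copy', 6, 5), ('literal', 'd')]
```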
Dictionary-symbolwise flexible parsing
2012
Linear-time optimal parsing algorithms are rare in the dictionary-based branch of data compression theory. A recent result is the Flexible Parsing algorithm of Matias and Sahinalp (1999), which works when the dictionary is prefix-closed and the encoding of dictionary pointers has a constant cost. We present the Dictionary-Symbolwise Flexible Parsing algorithm, which is optimal for prefix-closed dictionaries and any symbolwise compressor under some natural hypotheses. In the case of LZ78-like algorithms with variable costs and any (as usual, linear) symbolwise compressor, we show how to implement our parsing algorithm in linear time. In the case of LZ77-like dictionaries and any symbol…
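The prefix-closed dynamic dictionary in the LZ78 case is just a trie grown while parsing; the compact sketch below parses greedily (longest known phrase plus one fresh character), which is the baseline the optimal flexible parsers improve on:

```python
# A compact greedy LZ78 parser over a dynamically grown phrase trie.

def lz78(text):
    trie = {}                     # phrase trie: char -> (index, subtrie)
    out, nxt, i = [], 1, 0
    while i < len(text):
        node, idx = trie, 0       # index 0 = empty phrase
        while i < len(text) and text[i] in node:
            idx, node = node[text[i]]
            i += 1
        ch = text[i] if i < len(text) else ""
        out.append((idx, ch))     # (longest known phrase, extension char)
        if ch:
            node[ch] = (nxt, {})  # grow the dictionary
            nxt += 1
        i += 1
    return out

print(lz78("abababcabab"))
# [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'c'), (3, 'a'), (2, '')]
```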
Text Compression Using Antidictionaries
1999
We give a new text compression scheme based on Forbidden Words ("antidictionary"). We prove that our algorithms attain the entropy for balanced binary sources, and they run in linear time. Moreover, one of the main advantages of this approach is that it produces very fast decompressors. A second advantage is a synchronization property that is helpful for searching compressed data and allows parallel compression. Our algorithms can also be presented as "compilers" that create compressors dedicated to any previously fixed source. The techniques used in this paper are from Information Theory and Finite Automata.
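The mechanism is easy to demonstrate: whenever appending a bit to the already-processed text would create a forbidden word, that bit is forced to its complement and need not be transmitted. The toy antidictionary below is hand-picked for the demo; constructing one from the source is the real algorithmic content:

```python
# Toy encoder/decoder for binary compression with an antidictionary.

def forced_bit(history, antidict, k):
    """Return the forced next bit, or None if both bits are possible."""
    for b in "01":
        for L in range(1, k + 1):            # check every suffix length
            if (history + b)[-L:] in antidict:
                return str(1 - int(b))       # b is forbidden -> flip it
    return None

def encode(bits, antidict):
    k = max(map(len, antidict))
    out, hist = [], ""
    for b in bits:
        if forced_bit(hist, antidict, k) is None:
            out.append(b)                    # bit not predictable: send it
        hist += b
    return "".join(out), len(bits)

def decode(code, n, antidict):
    k = max(map(len, antidict))
    hist, it = "", iter(code)
    for _ in range(n):
        f = forced_bit(hist, antidict, k)
        hist += f if f is not None else next(it)
    return hist

anti = {"11", "000"}           # forbidden words of the source
text = "0010010100100101"      # contains neither "11" nor "000"
code, n = encode(text, anti)
assert decode(code, n, anti) == text
print(len(text), "->", len(code), "bits")   # 16 -> 7 bits
```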
Kolmogorov superposition theorem for image compression
2012
The authors present a novel approach to image compression based on an unconventional representation of images. The proposed approach differs from most existing techniques in the literature because the compression is not performed directly on the image pixels, but is rather applied to an equivalent monovariate representation of the wavelet-transformed image. More precisely, the authors have considered an adaptation of the Kolmogorov superposition theorem proposed by Igelnik, known as the Kolmogorov spline network (KSN), in which the image is approximated by sums and compositions of specific monovariate functions. Using this representation, the authors trad…
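For reference, the monovariate representation in question is Kolmogorov's superposition theorem; for a bivariate function such as an image it reads as follows (the standard form, rather than Igelnik's exact spline parametrization):

```latex
% Kolmogorov superposition theorem, bivariate case (n = 2): any
% continuous f on the unit square decomposes into sums and compositions
% of monovariate functions. Igelnik's KSN approximates the inner
% \psi_{q,p} and outer \Phi_q functions by splines.
f(x, y) \;=\; \sum_{q=0}^{4} \Phi_q\!\left( \psi_{q,1}(x) + \psi_{q,2}(y) \right),
\qquad (x, y) \in [0,1]^2 .
```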
Learning multiresolution schemes for compression of images
2007
We introduce a new type of multiresolution based on Harten's framework using learning theory. This changes the point of view of classical multiresolution analysis and transforms an approximation problem into a learning problem, opening up great possibilities.
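A one-level sketch of what "transforms an approximation problem into a learning problem" can mean in Harten's setting: keep the standard cell-average decimation, but replace the fixed interpolatory prediction operator with a regressor fitted on training signals, storing the prediction residuals as detail coefficients. The least-squares predictor on three coarse neighbors is an assumption standing in for the paper's learning machinery:

```python
# One level of a learned multiresolution: decimate, predict with a
# fitted linear operator, keep residuals as details (exact reconstruction:
# fine = prediction + detail).

import numpy as np

def decimate(v):
    return 0.5 * (v[0::2] + v[1::2])          # cell-average decimation

def features(c):
    cp = np.pad(c, 1, mode="edge")            # neighbors of each coarse cell
    return np.stack([cp[:-2], cp[1:-1], cp[2:]], axis=1)

def fit_predictor(signals):
    """Learn to predict the two fine cells inside each coarse cell."""
    X = np.vstack([features(decimate(v)) for v in signals])
    Y = np.vstack([v.reshape(-1, 2) for v in signals])
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W                                  # (3, 2) linear predictor

rng = np.random.default_rng(1)
train = [np.cumsum(rng.normal(size=64)) for _ in range(200)]
W = fit_predictor(train)

v = np.cumsum(rng.normal(size=64))            # signal to encode
c = decimate(v)
detail = v.reshape(-1, 2) - features(c) @ W   # small if prediction learned well
print("detail energy:", float(np.sum(detail**2)), "vs", float(np.sum(v**2)))
```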