Search results for "Compres"
Showing 10 of 1107 documents
Compression forces of haptics of freely rotating posterior chamber intraocular lenses.
1998
Abstract Purpose: To measure the compressive forces of the haptics of 28 intraocular lens (IOL) models for different modes of compression and compare the results of two types of measurements. Setting: Department of Ophthalmology, Central Hospital of Central Finland, Jyväskylä, Finland. Methods: The haptics of 28 types of IOLs were compressed to a diameter of 9.0 mm between curved anvils. The compression forces in the plane of compression (i.e., in the plane of the optics) were measured at 0.5 mm intervals. During compression, the optics and the haptics were free to rotate with respect to the anvils. The results were compared with those of earlier measurements in which the optics were held f…
In Reply to the Letter to the Editor: “Comparing the Volume of Brain Metastases in F-18-FET-PET and MRI”
2016
Is it time to consider visual feedback systems the gold standard for chest compression skill acquisition?
2017
The Pseudo-variable Extrathoracic Obstruction Flow Volume Loop Pattern In Double Lung And Heart Lung Transplantation
2010
Lossless and near-lossless image compression based on multiresolution analysis
2013
There are applications in data compression where quality control is of utmost importance. Certain features in the decoded signal must be exactly, or very accurately, recovered, yet one would like to be as economical as possible with respect to storage and speed of computation. In this paper, we present a multi-scale data-compression algorithm within Harten's interpolatory framework for multiresolution that gives a specific estimate of the precise error between the original and the decoded signal, when measured in the L∞ and the Lp (p = 1, 2) discrete norms. The proposed algorithm does not rely on a tensor-product strategy to compress two-dimensional signals, and it provides a priori bound…
The Engineering of a Compression Boosting Library: Theory vs Practice in BWT Compression
2006
Data compression is one of the most challenging arenas for both algorithm design and engineering. This is particularly true for Burrows and Wheeler compression, a technique that is important in itself and for the design of compressed indexes. There has been considerable debate on how to design and engineer compression algorithms based on the BWT paradigm. In particular, Move-to-Front encoding is generally believed to be an "inefficient" part of the Burrows-Wheeler compression process. However, only recently have two theoretically superior alternatives to Move-to-Front been proposed, namely Compression Boosting and Wavelet Trees. The main contribution of this paper is to provide the first ex…
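For context, the BWT-plus-Move-to-Front pipeline this abstract debates can be sketched in a few lines. This is a naive illustration, not the paper's engineered implementation: the rotation sort is O(n² log n) (production code uses suffix arrays), and the sentinel character is an assumption.

```python
# Hedged sketch of the classic Burrows-Wheeler pipeline: BWT followed by
# Move-to-Front, the step whose necessity the paper questions.

def bwt(s: str) -> str:
    s += "\x03"  # sentinel assumed absent from the input; makes BWT invertible
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def move_to_front(s: str) -> list[int]:
    alphabet = sorted(set(s))
    out = []
    for ch in s:
        idx = alphabet.index(ch)
        out.append(idx)
        # move the just-seen symbol to the front of the list
        alphabet.insert(0, alphabet.pop(idx))
    return out

# BWT groups equal symbols into runs; MTF turns those runs into small
# integers, which a final entropy coder then compresses.
encoded = move_to_front(bwt("banana"))  # → [1, 3, 0, 3, 3, 3, 0]
```

Compression Boosting and Wavelet Trees, the alternatives the paper evaluates, replace the `move_to_front` stage with theoretically stronger post-processing of the BWT output.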
From First Principles to the Burrows and Wheeler Transform and Beyond, via Combinatorial Optimization
2007
Abstract: We introduce a combinatorial optimization framework that naturally induces a class of optimal word permutations with respect to a suitably defined cost function taking into account various measures of relatedness between words. The Burrows and Wheeler transform (bwt) (cf. [M. Burrows, D. Wheeler, A block sorting lossless data compression algorithm, Technical Report 124, Digital Equipment Corporation, 1994]), and its analog for labelled trees (cf. [P. Ferragina, F. Luccio, G. Manzini, S. Muthukrishnan, Structuring labeled trees for optimal succinctness, and beyond, in: Proc. of the 45th Annual IEEE Symposium on Foundations of Computer Science, 2005, pp. 198–207]), are special cases i…
Statistical Modeling of Huffman Tables Coding
2005
An innovative algorithm for the automatic generation of Huffman coding tables for semantic classes of digital images is presented. Collecting statistics over a large dataset of corresponding images, we generated Huffman tables for three image classes: landscape, portrait, and document. Comparisons between the new tables and the JPEG standard coding tables, also using different quality settings, have shown the effectiveness of the proposed strategy in terms of final bit size (e.g., compression ratio).
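The core step the abstract relies on, deriving a Huffman table from collected symbol statistics, can be sketched generically. This is a minimal textbook construction, not the paper's method; the toy frequency data is invented for illustration.

```python
# Hedged sketch: build a canonical Huffman code book from symbol
# frequencies, the kind of per-class statistics the paper collects.
import heapq
from collections import Counter

def huffman_code(freqs: dict[str, int]) -> dict[str, str]:
    # heap entries: (total weight, unique tiebreaker, {symbol: partial code})
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        # merging prepends one more bit to every code in each subtree
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# frequent symbols receive shorter codes
codes = huffman_code(Counter("aaaabbc"))
```

Generating one such table per image class (landscape, portrait, document) and selecting the matching table at encode time is, in essence, what lets the class-specific tables beat the generic JPEG defaults.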
Lossless coding of hyperspectral images with principal polynomial analysis
2014
The transform in image coding aims to remove redundancy among data coefficients so that they can be independently coded, and to capture most of the image information in a few coefficients. While the second goal ensures that discarding coefficients will not lead to large errors, the first ensures that simple (point-wise) coding schemes can be applied to the retained coefficients with optimal results. Principal Component Analysis (PCA) provides the best independence and data compaction for Gaussian sources. Yet, non-linear generalizations of PCA may provide better performance for more realistic non-Gaussian sources. Principal Polynomial Analysis (PPA) generalizes PCA by removing the non-li…
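The linear baseline that PPA generalizes can be shown concretely. The sketch below is not the paper's PPA; it applies plain PCA to toy correlated data (standing in for correlated spectral bands, an assumption for illustration) and checks that the transform coefficients are decorrelated.

```python
# Hedged sketch: PCA as a decorrelating transform for coding, the
# linear special case of Principal Polynomial Analysis.
import numpy as np

rng = np.random.default_rng(0)
# toy "bands": the second channel is strongly correlated with the first
x = rng.normal(size=(1000, 2))
x[:, 1] = 0.9 * x[:, 0] + 0.1 * x[:, 1]

centered = x - x.mean(axis=0)
cov = centered.T @ centered / len(x)
eigvals, eigvecs = np.linalg.eigh(cov)   # principal directions
coeffs = centered @ eigvecs              # transform coefficients

# covariance of the coefficients: diagonal, so they can be coded
# independently, with energy compacted into the last component (eigh
# returns eigenvalues in ascending order)
c = coeffs.T @ coeffs / len(x)
```

PPA keeps this decorrelation goal but replaces the straight principal directions with polynomial curves, capturing the curved dependencies that real (non-Gaussian) image sources exhibit.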
Optimal Partitions of Strings: A New Class of Burrows-Wheeler Compression Algorithms
2003
The Burrows-Wheeler transform [1] is one of the mainstays of lossless data compression. In most cases, its output is fed to Move to Front or other variations of symbol-ranking compression. One of the main open problems [2] is to establish whether Move to Front, or, more generally, symbol-ranking compression, is an essential part of the compression process. We settle this question positively by providing a new class of Burrows-Wheeler algorithms that use optimal partitions of strings, rather than symbol ranking, for the additional step. Our technique is a quite surprising specialization to strings of partitioning techniques devised by Buchsbaum et al. [3] for two-dimensional table compression…