Search results for "Compression"
Showing 10 of 400 documents
Solid-State Pyrolyses of Metal Phthalocyanines: A Simple Approach towards Nitrogen-Doped CNTs and Metal/Carbon Nanocables
2006
Solid-state pyrolysis of organometallic precursors has emerged as an alternative method for preparing carbon nanostructures such as carbon nanotubes (CNTs) and carbon onions. The morphology of the tubes can be controlled by the nature of the precursors and the pyrolysis procedure, and micrometre-long nanotubes composed of metal carbide wires encased in a graphitic sheath can be obtained. Cobalt phthalocyanine (CoPc) as well as iron phthalocyanine were pyrolyzed at different temperatures to obtain CNTs. HRTEM and energy-dispersive X-ray analysis disclosed that the core consisted of long, iron-containing single crystals fully surrounded by crystallized graphitic carbon. Iron-filled carbo…
Improving Karhunen-Loeve based transform coding by using square isometries
2002
We propose, for an image compression system based on the Karhunen-Loeve transform implemented by neural networks, to take into account the 8 square isometries of an image block. Applying the proper isometry puts the 8*8 image block into a standard position before the block is fed to the neural network architecture. The standard position is defined from the variances of the block's four 4*4 sub-blocks (quad partition): it brings the sub-block with the greatest variance into a specific corner and the sub-block with the second-greatest variance into a specific adjoining corner (if this is not possible, the third is considered). The use of this "preprocessing" phase was e…
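As a rough illustration of the canonicalization step described above, here is a minimal Python sketch that applies one of the 8 square isometries so the highest-variance 4*4 quadrant lands in a fixed corner; the secondary criterion (placing the second-greatest variance in an adjoining corner) is omitted, and none of this is taken from the paper's implementation.

    # Minimal sketch (assumption, not the paper's code): canonicalize an 8x8 block
    # by choosing the square isometry that puts the highest-variance 4x4 quadrant
    # in the top-left corner.
    import numpy as np

    def to_standard_position(block):
        isometries = [lambda b: b,
                      lambda b: np.rot90(b, 1),
                      lambda b: np.rot90(b, 2),
                      lambda b: np.rot90(b, 3),
                      lambda b: np.fliplr(b),
                      lambda b: np.flipud(b),
                      lambda b: b.T,
                      lambda b: np.rot90(b, 2).T]
        # Pick the transformed block whose top-left quadrant has the largest variance.
        return max((iso(block) for iso in isometries),
                   key=lambda b: b[:4, :4].var())

    block = np.random.rand(8, 8)
    std_block = to_standard_position(block)
    print(std_block[:4, :4].var())   # now the largest of the four quadrant variances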
Comparison of genomic sequences clustering using Normalized Compression Distance and Evolutionary Distance
2008
Genomic sequences are usually compared using evolutionary distance, a procedure that implies the alignment of the sequences. Alignment of long sequences is a long procedure and the obtained dissimilarity results is not a metric. Recently the normalized compression distance was introduced as a method to calculate the distance between two generic digital objects, and it seems a suitable way to compare genomic strings. In this paper the clustering and the mapping, obtained using a SOM, with the traditional evolutionary distance and the compression distance are compared in order to understand if the two distances sets are similar. The first results indicate that the two distances catch differen…
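For reference, the normalized compression distance itself is simple to compute; a minimal sketch follows, assuming zlib as the compressor (the compressor choice and the toy sequences are assumptions, not details from the paper).

    # Minimal NCD sketch: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    # with C() taken here as the zlib-compressed length in bytes.
    import zlib

    def c(data: bytes) -> int:
        return len(zlib.compress(data, 9))

    def ncd(x: bytes, y: bytes) -> float:
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    seq_a = b"ACGTACGTACGTACGT"
    seq_b = b"ACGTACGAACGTACGT"
    print(ncd(seq_a, seq_b))   # values near 0 suggest similarity, near 1 dissimilarity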
Is it time to consider visual feedback systems the gold standard for chest compression skill acquisition?
2017
Lossless and near-lossless image compression based on multiresolution analysis
2013
There are applications in data compression where quality control is of utmost importance. Certain features in the decoded signal must be recovered exactly, or very accurately, yet one would like to be as economical as possible with respect to storage and speed of computation. In this paper, we present a multi-scale data-compression algorithm within Harten's interpolatory framework for multiresolution that gives a specific estimate of the precise error between the original and the decoded signal, when measured in the L^∞ and in the L^p (p=1,2) discrete norms. The proposed algorithm does not rely on a tensor-product strategy to compress two-dimensional signals, and it provides a priori bound…
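To make the error-control idea concrete, here is a minimal one-level, one-dimensional sketch in the spirit of an interpolatory multiresolution scheme; the linear-interpolation predictor, single level, and quantization step are simplifying assumptions and not the paper's algorithm.

    # Minimal sketch (assumed setup): keep the even samples, predict the odd samples
    # by linear interpolation, and quantize the prediction errors with step 2*eps,
    # which bounds the pointwise (L-infinity) reconstruction error by eps.
    import numpy as np

    def decompose(signal, eps):
        coarse = signal[0::2]
        odd = signal[1::2]
        nxt = np.append(coarse[1:], coarse[-1])[:len(odd)]
        details = odd - 0.5 * (coarse[:len(odd)] + nxt)
        return coarse, np.round(details / (2 * eps))

    def reconstruct(coarse, q, eps, n):
        out = np.empty(n)
        out[0::2] = coarse
        m = n // 2
        nxt = np.append(coarse[1:], coarse[-1])[:m]
        out[1::2] = 0.5 * (coarse[:m] + nxt) + q * (2 * eps)
        return out

    x = np.sin(np.linspace(0, 3, 64)) * 100
    c_, d_ = decompose(x, eps=0.5)
    xr = reconstruct(c_, d_, eps=0.5, n=len(x))
    print(np.max(np.abs(x - xr)))   # stays below eps = 0.5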
The Engineering of a Compression Boosting Library: Theory vs Practice in BWT Compression
2006
Data Compression is one of the most challenging arenas both for algorithm design and engineering. This is particularly true for Burrows and Wheeler Compression, a technique that is important in itself and for the design of compressed indexes. There has been considerable debate on how to design and engineer compression algorithms based on the BWT paradigm. In particular, Move-to-Front Encoding is generally believed to be an "inefficient" part of the Burrows-Wheeler compression process. However, only recently have two theoretically superior alternatives to Move-to-Front been proposed, namely Compression Boosting and Wavelet Trees. The main contribution of this paper is to provide the first ex…
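For orientation, the classic pipeline the abstract refers to looks like the following minimal Python sketch: the BWT computed naively from sorted rotations, followed by Move-to-Front encoding; compression boosting and wavelet trees are not shown here.

    # Minimal sketch of the textbook BWT + Move-to-Front pipeline (naive O(n^2 log n)
    # rotation sort, fine only for short strings).
    def bwt(s: str, sentinel: str = "\0") -> str:
        s = s + sentinel
        rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
        return "".join(rot[-1] for rot in rotations)

    def mtf(s: str) -> list:
        alphabet = sorted(set(s))
        out = []
        for ch in s:
            i = alphabet.index(ch)
            out.append(i)
            alphabet.insert(0, alphabet.pop(i))   # move the symbol to the front
        return out

    transformed = bwt("mississippi")
    print(transformed)        # the transform tends to group equal symbols into runs
    print(mtf(transformed))   # runs become small integers, cheap to entropy-code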
From First Principles to the Burrows and Wheeler Transform and Beyond, via Combinatorial Optimization
2007
We introduce a combinatorial optimization framework that naturally induces a class of optimal word permutations with respect to a suitably defined cost function taking into account various measures of relatedness between words. The Burrows and Wheeler transform (bwt) (cf. [M. Burrows, D. Wheeler, A block sorting lossless data compression algorithm, Technical Report 124, Digital Equipment Corporation, 1994]), and its analog for labelled trees (cf. [P. Ferragina, F. Luccio, G. Manzini, S. Muthukrishnan, Structuring labeled trees for optimal succinctness, and beyond, in: Proc. of the 45th Annual IEEE Symposium on Foundations of Computer Science, 2005, pp. 198–207]), are special cases i…
Statistical Modeling of Huffman Tables Coding
2005
An innovative algorithm for automatic generation of Huffman coding tables for semantic classes of digital images is presented. Collecting statistics over a large dataset of images of each class, we generated Huffman tables for three image classes: landscape, portrait and document. Comparisons between the new tables and the JPEG standard coding tables, also using different quality settings, have shown the effectiveness of the proposed strategy in terms of final bit size (i.e., compression ratio).
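The core step, deriving a Huffman code from per-class symbol statistics, can be sketched as follows; the symbol names and counts are placeholders, not statistics from the paper.

    # Minimal sketch: build a Huffman prefix code from symbol frequencies gathered
    # over a set of images of one class (the stats below are made up for illustration).
    import heapq
    from collections import Counter

    def huffman_code(freqs):
        heap = [[w, i, {sym: ""}] for i, (sym, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        tick = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + b for s, b in c1.items()}
            merged.update({s: "1" + b for s, b in c2.items()})
            heapq.heappush(heap, [w1 + w2, tick, merged])
            tick += 1
        return heap[0][2]

    stats = Counter({"EOB": 500, "cat1": 300, "cat2": 120, "cat3": 60, "cat4": 20})
    for sym, bits in sorted(huffman_code(stats).items(), key=lambda kv: len(kv[1])):
        print(sym, bits)   # frequent symbols get the shortest codewords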
Lossless coding of hyperspectral images with principal polynomial analysis
2014
The transform in image coding aims to remove redundancy among data coefficients so that they can be independently coded, and to capture most of the image information in a few coefficients. While the second goal ensures that discarding coefficients will not lead to large errors, the first goal ensures that simple (point-wise) coding schemes can be applied to the retained coefficients with optimal results. Principal Component Analysis (PCA) provides the best independence and data compaction for Gaussian sources. Yet, non-linear generalizations of PCA may provide better performance for more realistic non-Gaussian sources. Principal Polynomial Analysis (PPA) generalizes PCA by removing the non-li…
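As a baseline illustration of the PCA step the abstract builds on, here is a minimal sketch of learning a decorrelating transform from image blocks and keeping only the leading components; the block size and component count are arbitrary choices, and PPA itself is not shown.

    # Minimal PCA transform-coding sketch: decorrelate flattened blocks and measure
    # the energy lost when only k components are retained.
    import numpy as np

    def pca_transform(blocks, k):
        mean = blocks.mean(axis=0)
        centered = blocks - mean
        cov = centered.T @ centered / len(blocks)
        eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
        basis = eigvecs[:, ::-1][:, :k]            # top-k principal directions
        return centered @ basis, basis, mean

    def pca_reconstruct(coeffs, basis, mean):
        return coeffs @ basis.T + mean

    rng = np.random.default_rng(0)
    blocks = rng.normal(size=(1000, 16))           # stand-in for 4x4 image blocks
    coeffs, basis, mean = pca_transform(blocks, k=4)
    approx = pca_reconstruct(coeffs, basis, mean)
    print(np.mean((blocks - approx) ** 2))         # energy lost by discarding components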
Optimal Partitions of Strings: A New Class of Burrows-Wheeler Compression Algorithms
2003
The Burrows-Wheeler transform [1] is one of the mainstays of lossless data compression. In most cases, its output is fed to Move-to-Front or other variations of symbol-ranking compression. One of the main open problems [2] is to establish whether Move-to-Front, or symbol-ranking compression more generally, is an essential part of the compression process. We settle this question positively by providing a new class of Burrows-Wheeler algorithms that use optimal partitions of strings, rather than symbol ranking, for the additional step. Our technique is a rather surprising specialization to strings of the partitioning techniques devised by Buchsbaum et al. [3] for two-dimensional table compression…
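To convey the flavour of partition-based coding, here is a minimal sketch that splits a BWT-like string into consecutive segments by dynamic programming, charging each segment its length times its zero-order entropy plus a flat model cost; this cost model and the example string are illustrative assumptions, not the authors' construction.

    # Minimal sketch: optimal partition of a string into segments under an assumed
    # cost = length * zero-order entropy + fixed per-segment model charge.
    import math
    from collections import Counter

    def segment_cost(counts, length, model_cost=8.0):
        h = -sum(c / length * math.log2(c / length) for c in counts.values())
        return length * h + model_cost

    def optimal_partition(s):
        n = len(s)
        best = [0.0] + [math.inf] * n      # best[i] = min cost of encoding s[:i]
        cut = [0] * (n + 1)
        for i in range(1, n + 1):
            counts = Counter()
            for j in range(i, 0, -1):       # grow the last segment s[j-1:i] leftwards
                counts[s[j - 1]] += 1
                c = best[j - 1] + segment_cost(counts, i - j + 1)
                if c < best[i]:
                    best[i], cut[i] = c, j - 1
        bounds, i = [], n
        while i > 0:
            bounds.append((cut[i], i))
            i = cut[i]
        return best[n], bounds[::-1]

    cost, segments = optimal_partition("aaaabbbbccaa")   # a run-rich, BWT-like string
    print(cost, segments)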