Search results for "Data Compression"

Showing 10 of 99 documents

Optimizing H.264/AVC interprediction on a GPU-based framework

2011

H.264/MPEG-4 Part 10 is the latest standard for video compression and promises a significant rate-distortion advance over the commercial standards currently most in use, such as MPEG-2 and MPEG-4. To achieve this better performance, H.264 adopts a large number of new or improved compression techniques compared with previous standards, albeit at the expense of higher computational complexity. In addition, new hardware accelerators have emerged in recent years, such as graphics processing units (GPUs), which provide a new opportunity to reduce complexity for a large variety of algorithms. However, current GPUs suffer from higher power consumption requirements because of…

Keywords: Reduction (complexity); Computational Theory and Mathematics; Computer Networks and Communications; Computer science; Distortion; Motion estimation; Symmetric multiprocessor system; Energy consumption; Parallel computing; Software; Computer Science Applications; Theoretical Computer Science; Data compression. Published in: Concurrency and Computation: Practice and Experience

Balancing and clustering of words: a combinatorial analysis of the Burrows & Wheeler Transform

2010

The Burrows-Wheeler Transform (denoted by BWT) is a well-founded mathematical transformation on sequences, introduced in 1994, widely used in the context of Data Compression and recently studied also from a combinatorial point of view. The transformation does not itself compress the data: it produces a permutation bwt(w) of an input string w that is easier to compress than the original with fast locally-adaptive algorithms, such as Move-to-Front in combination with Huffman or arithmetic coding. It is well known that in most real texts, characters appearing in the same or similar contexts tend to be the same, so the BWT tends to group together characters which occur adjacent to similar…
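The pipeline this abstract describes — BWT to cluster like characters, then Move-to-Front ahead of an entropy coder — can be sketched in a few lines. This is an illustrative sketch only: the naive rotation-sorting construction below is O(n² log n), whereas production tools build the BWT via suffix arrays.

```python
# Sketch of BWT followed by Move-to-Front, for illustration only.

def bwt(s: str, sentinel: str = "$") -> str:
    """Return the last column of the sorted rotations of s + sentinel."""
    s = s + sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def move_to_front(s: str) -> list[int]:
    """Encode each symbol as its index in a self-adjusting alphabet list."""
    alphabet = sorted(set(s))
    out = []
    for ch in s:
        i = alphabet.index(ch)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))  # move the symbol to the front
    return out

transformed = bwt("banana")
print(transformed)                  # "annb$aa" — like characters clustered
print(move_to_front(transformed))   # small integers, easy to entropy-code
```

The runs of equal characters that the BWT creates turn into runs of zeros after Move-to-Front, which is exactly what makes the subsequent Huffman or arithmetic coding effective.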

Keywords: Rich word; Settore INF/01 - Informatica; Palindrome; Data Compression; Burrows-Wheeler transform; Balanced word; Combinatorics on words

Compressive biological sequence analysis and archival in the era of high-throughput sequencing technologies

2013

High-throughput sequencing technologies produce large collections of data, mainly DNA sequences with additional information, requiring the design of efficient and effective methodologies for both their compression and storage. In this context, we first provide a classification of the main techniques that have been proposed, according to three specific research directions that have emerged from the literature and, for each, we provide an overview of the current techniques. Finally, to make this review useful to researchers and technicians applying the existing software and tools, we include a synopsis of the main characteristics of the described approaches, including details on their impleme…

Keywords: Sequence analysis; Computer science; Computational Biology; High-Throughput Nucleotide Sequencing; Context (language use); Data Compression; Bioinformatics; Data science; DNA sequencing; Software; Metagenomics; State (computer science); Sequence Alignment; Molecular Biology; Algorithms; Information Systems. Published in: Briefings in Bioinformatics

A New Multicast Technique for Video Transmission over ABR Services in ATM Networks

2000

Multicast techniques are the only way to simultaneously provide flows of information from one source to several destinations. This paper studies and evaluates different multicast techniques using a video coder based on an adaptive video compression algorithm with subband coding, over a best-effort network service: ATM with the Available Bit Rate (ABR) service. Such a video transmission can adapt quickly and easily to changing network conditions. We present an evaluation process for a given network configuration, then discuss the results obtained by simulation and propose, for this video transmission, a trade-off between these multicast techniqu…

Keywords: Service (systems architecture); Multicast; Computer science; Quality of service; ATM adaptation layer; Network service; Process (computing); Data coding and information theory; Computer network; Sub-band coding; Data compression

The Myriad Virtues of Wavelet Trees

2009

A new data structure, the wavelet tree, is analysed and discussed, with particular attention to data compression.
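As an illustration of the data structure under discussion, here is a minimal wavelet tree supporting rank(c, i) queries — the number of occurrences of symbol c among the first i symbols — which is the operation that makes wavelet trees useful in compressed text indexing. This is a hypothetical sketch with plain Python lists as bitmaps, not the succinct construction analysed in the paper.

```python
# Minimal wavelet tree sketch: recursively split the alphabet in half,
# store a bitmap per node (0 = symbol routed left, 1 = routed right).

class WaveletTree:
    def __init__(self, s, alphabet=None):
        self.alphabet = sorted(set(s)) if alphabet is None else alphabet
        if len(self.alphabet) == 1:
            self.leaf = True
            self.n = len(s)
            return
        self.leaf = False
        mid = len(self.alphabet) // 2
        left_set = set(self.alphabet[:mid])
        self.bits = [0 if c in left_set else 1 for c in s]
        self.left = WaveletTree([c for c in s if c in left_set],
                                self.alphabet[:mid])
        self.right = WaveletTree([c for c in s if c not in left_set],
                                 self.alphabet[mid:])

    def rank(self, c, i):
        """Occurrences of c in the first i symbols of the sequence."""
        if self.leaf:
            return min(i, self.n)
        mid = len(self.alphabet) // 2
        if c in self.alphabet[:mid]:
            return self.left.rank(c, self.bits[:i].count(0))
        return self.right.rank(c, self.bits[:i].count(1))

wt = WaveletTree("abracadabra")
print(wt.rank("a", 11))  # 5 occurrences of 'a' in the whole string
```

In a real implementation the per-node bitmaps are stored succinctly with O(1)-time rank support, giving rank queries in O(log σ) for an alphabet of size σ.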

Keywords: Settore INF/01 - Informatica; Algorithms; Data Structures; Data Compression

Correction to: FASTA/Q data compressors for MapReduce-Hadoop genomics: space and time savings made easy

2022

Following publication of the original article [1], the authors identified that the affiliations of Giuseppe Cattaneo and Raffaele Giancarlo were interchanged. The correct affiliations are given below. The correct affiliation of Giuseppe Cattaneo is: 2Dipartimento di Informatica, Università di Salerno, Fisciano, Italy. The correct affiliation of Raffaele Giancarlo is: 3Dipartimento di Matematica ed Informatica, Università di Palermo, Palermo, Italy. The original article [1] has been corrected.

Keywords: Settore INF/01 - Informatica; Structural Biology; Applied Mathematics; Data Compression; Molecular Biology; Biochemistry; Computer Science Applications. Published in: BMC Bioinformatics

Method and apparatus for data compression and decompression by which multiple compressors and decomp… can be used efficiently

2010

Keywords: Settore INF/01 - Informatica; optimal parsing; dictionary-based compression; data compression

Optimal Parsing for Dictionary-Based Compression

2013

Dictionary-based compression algorithms include a parsing strategy that transforms the input text into a sequence of dictionary phrases. For a given text, such a parsing is usually not unique and, for compression purposes, it makes sense to find one of the possible parsings that minimises the final compression ratio. This is the parsing problem. In more than thirty years of history of dictionary-based text compression, only a few optimal parsing algorithms have been presented. Most practical dictionary-based compression solutions need, or prefer, to factorise the input data into a sequence of dictionary phrases and symbols. These two output categories are usually encoded via two different encoders, producing …
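The parsing problem described above can be illustrated on a toy static dictionary: among all factorisations of the text into dictionary phrases, dynamic programming over text positions — equivalently, a shortest path in the parse graph — finds one using the fewest phrases, a simple proxy for the best compression ratio. The dictionary and the unit-cost model here are illustrative assumptions, not the setting of the paper.

```python
# Optimal parsing over a toy static dictionary: cost[i] is the fewest
# dictionary phrases needed to cover text[:i]; back[i] records the
# start of the phrase that ends at i, so the parse can be rebuilt.

def optimal_parse(text, dictionary):
    n = len(text)
    INF = float("inf")
    cost = [INF] * (n + 1)
    back = [None] * (n + 1)
    cost[0] = 0
    for i in range(n):
        if cost[i] == INF:
            continue  # position i is unreachable with this dictionary
        for phrase in dictionary:
            j = i + len(phrase)
            if text.startswith(phrase, i) and cost[i] + 1 < cost[j]:
                cost[j] = cost[i] + 1
                back[j] = i
    if cost[n] == INF:
        return None  # text cannot be factorised with this dictionary
    phrases, i = [], n
    while i > 0:
        phrases.append(text[back[i]:i])
        i = back[i]
    return phrases[::-1]

D = {"a", "ab", "ba", "aba"}
print(optimal_parse("ababa", D))  # ['ab', 'aba'] — two phrases
```

A greedy longest-match parser would emit 'aba' first and then need two more phrases ('ba' fails at the end, forcing 'b'-less backtracking in general); the dynamic program sidesteps such traps by examining every phrase boundary.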

Keywords: Settore INF/01 - Informatica; optimal parsing; text compression; data compression; LZ-compression

A new algorithm for bit rate allocation in JPEG2000 tile encoding

2004

A new algorithm for allocating a given bit rate among the image tiles in the JPEG2000 encoding system is proposed. The algorithm outperforms other approaches commonly used in implementations, and it is particularly suitable when the information content is not equally distributed across the image. It is based on the computation of an index of the information content of each tile. To implement the proposed approach, we modified JasPer, a free-software JPEG2000 coder (Adams, M.D. and Kossentini, F., Proc. IEEE Int. Conf. on Image Processing, vol. 2, pp. 53-6, 2000). The experimentation was carried out on a subset of the JPEG2000 test images. Experimental results are reported, show…
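The idea of driving rate allocation by a per-tile information index can be sketched as proportional budget splitting. Everything here is an illustrative assumption — the index values and the largest-remainder rounding rule stand in for the paper's actual index computation and allocation algorithm, which differ in detail.

```python
# Split a total bit budget across tiles in proportion to a per-tile
# "information index" (hypothetical values), rounding so that the
# budget is met exactly.

def allocate_bits(tile_indices, total_bits):
    total = sum(tile_indices)
    if total == 0:
        # uniform fallback when all tiles look equally (un)informative
        return [total_bits // len(tile_indices)] * len(tile_indices)
    raw = [total_bits * x / total for x in tile_indices]
    bits = [int(b) for b in raw]
    # hand leftover bits to the tiles with the largest fractional parts
    order = sorted(range(len(raw)), key=lambda i: raw[i] - bits[i],
                   reverse=True)
    for i in order[: total_bits - sum(bits)]:
        bits[i] += 1
    return bits

print(allocate_bits([4.0, 1.0, 3.0], 800))  # [400, 100, 300]
```

Compared with the uniform per-tile allocation common in implementations, a busy tile (high index) receives more of the budget and a flat tile less, which is what the abstract argues matters when content is unevenly distributed.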

Keywords: Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; image coding; JPEG2000; Computer science; Computation; Image processing and computer vision; Process (computing); Image (mathematics); Software; Encoding (memory); Tile; Algorithm; Data compression. Published in: 12th International Conference on Image Analysis and Processing, 2003, Proceedings

On the Design of Fast Wavelet Transform Algorithms With Low Memory Requirements

2008

In this paper, a new algorithm to efficiently compute the two-dimensional wavelet transform is presented. The algorithm aims at low memory consumption and reduced complexity, meeting these requirements by means of line-by-line processing. We use recursion to automatically determine the order in which the wavelet transform is computed, thereby solving synchronization problems that have not been tackled by previous proposals. Furthermore, unlike other similar proposals, ours can be implemented straightforwardly from the algorithm description. To this end, a general algorithm is given, which is further detailed to allow its implementation with a simple filter bank…
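Line-by-line processing of the kind described can be sketched as a one-level lifting step applied to each row as it arrives, so only the current line is held in memory. A Haar lifting pair is used here purely for brevity; it is not the paper's filter bank, and the interleaved vertical pass is omitted.

```python
# One level of the Haar wavelet via lifting, applied per row: odd
# samples become high-pass details (predict step), even samples become
# the low-pass approximation (update step). Integer-to-integer, so it
# is exactly invertible.

def haar_lifting(line):
    s = list(line)
    # predict: each odd sample minus its left neighbour -> detail
    for i in range(1, len(s), 2):
        s[i] -= s[i - 1]
    # update: push half the detail back so evens approximate pair means
    for i in range(0, len(s) - 1, 2):
        s[i] += s[i + 1] // 2
    return s[0::2], s[1::2]  # (low-pass, high-pass)

rows = [[2, 4, 6, 8], [1, 3, 5, 7]]
subbands = [haar_lifting(row) for row in rows]  # processed line by line
print(subbands[0])  # ([3, 7], [2, 2])
```

The lifting formulation is what makes line-by-line scheduling natural: each output sample depends only on a small, local window of the incoming line, so rows can be consumed and discarded as they stream in.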

Keywords: Signal processing; Lifting scheme; Computer science; Second-generation wavelet transform; Stationary wavelet transform; Wavelet transform; Image processing; Cascade algorithm; Filter bank; Wavelet packet decomposition; Media Technology; Discrete cosine transform; Codec; Electrical and Electronic Engineering; Fast wavelet transform; Algorithm; Encoder; Data compression; Image compression. Published in: IEEE Transactions on Circuits and Systems for Video Technology