Search results for "data compression"

Showing 10 of 99 documents

Bluetooth-3G wireless transmission system for neural signal telemetry

2007

In this contribution, a wireless transmission system for neural signals is developed. The system applies data compression at the information source, namely neural signals recorded by micro-electrode arrays. The signals are transmitted over Bluetooth to a mobile device that, without any processing or storage, retransmits them over 3G to a remote server where signal post-processing and analysis are performed. The overall transmission rate of the system is limited by the Bluetooth link between the information source and the mobile phone, by the limited processing capabilities of the mobile device, and by the 3G link. Data compression allows the transmission of up to 7 n…
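
The relay role of the mobile device (receive over Bluetooth, forward unchanged over 3G) can be illustrated with a minimal sketch. The endpoint addresses, the chunk size, and the use of plain TCP sockets as stand-ins for the Bluetooth and 3G links are assumptions for illustration, not details taken from the paper.

import socket
# Hypothetical endpoints standing in for the Bluetooth link (local) and the 3G uplink (remote).
BT_SOURCE = ("localhost", 9000)        # where the recording front-end streams compressed samples
REMOTE_SERVER = ("example.org", 9001)  # server that performs post-processing and analysis
CHUNK = 4096                           # bytes forwarded per read; assumed value
def relay():
    """Forward compressed neural-signal packets without processing or storing them."""
    with socket.create_connection(BT_SOURCE) as src, socket.create_connection(REMOTE_SERVER) as dst:
        while True:
            data = src.recv(CHUNK)
            if not data:               # source closed the stream
                break
            dst.sendall(data)          # retransmit unchanged, as the mobile device does
if __name__ == "__main__":
    relay()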

Subjects: Signal processing; Computer science; Computer-communication networks; Coding and information theory; Bluetooth; Transmission (telecommunications); Mobile phone; Embedded system; Wireless; Mobile device; Computer hardware; Data compression
Published in: 2007 Wireless Telecommunications Symposium

Implementation of JPEG2000 arithmetic decoder using dynamic reconfiguration of FPGA

2005

This paper describes the implementation of part of the JPEG2000 algorithm (the MQ-decoder and arithmetic decoder) on an FPGA board using dynamic reconfiguration. A comparison between static and dynamic reconfiguration is presented, and new analysis criteria (time performance, logic cost, spatio-temporal efficiency) are defined. The MQ-decoder and arithmetic decoder fall into the most attractive case for a dynamic-reconfiguration implementation: applications without parallelism between functions. The implementation is carried out on an architecture designed to study dynamic reconfiguration of FPGAs: the ARDOISE architecture. The implementation obtained, based on four partial configurations of the arithmetic decoder…
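
One way to read the "spatio-temporal efficiency" criterion is as an area-time product that can be compared between a static design (all decoder stages resident at once) and a dynamically reconfigured one (stages loaded in turn, paying reconfiguration time). The formula and the figures below are illustrative assumptions, not values or definitions from the paper.

# Illustrative comparison of static vs. dynamic FPGA reconfiguration; all numbers are made up.
def spatio_temporal_cost(area_luts, exec_time_ms, reconfig_time_ms=0.0):
    """Area-time product: smaller is better under this (assumed) criterion."""
    return area_luts * (exec_time_ms + reconfig_time_ms)
static_cost = spatio_temporal_cost(area_luts=4000, exec_time_ms=10.0)
dynamic_cost = spatio_temporal_cost(area_luts=1200, exec_time_ms=10.0, reconfig_time_ms=2.5)
print(f"static : {static_cost:.0f} LUT*ms")
print(f"dynamic: {dynamic_cost:.0f} LUT*ms")  # dynamic wins when the area saving outweighs reconfiguration time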

Subjects: Soft-decision decoder; Computer science; JPEG 2000; Control reconfiguration; Arithmetic and logic structures; Arithmetic; Field-programmable gate array; Decoding methods; Data compression
Published in: 2004 International Conference on Image Processing (ICIP '04)

Reducing complexity in H.264/AVC motion estimation by using a GPU

2011

H.264/AVC applies a complex mode decision technique that has high computational complexity in order to reduce the temporal redundancies of video sequences. Several algorithms have been proposed in the literature in recent years with the aim of accelerating this part of the encoding process. Recently, with the emergence of many-core processors or accelerators, a new approach can be adopted for reducing the complexity of the H.264/AVC encoding algorithm. This paper focuses on reducing the inter prediction complexity adopted in H.264/AVC and proposes a GPU-based implementation using CUDA. Experimental results show that the proposed approach reduces the complexity by as much as 99% (100x of spe…
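
The inter-prediction step that is offloaded to the GPU is, at its core, block matching: for each block of the current frame, search the reference frame for the candidate with the smallest distortion, typically the sum of absolute differences (SAD). The sketch below is a sequential NumPy illustration of a full search over a small window, not the CUDA kernel proposed in the paper; the block and window sizes are assumed.

import numpy as np
def best_motion_vector(cur_block, ref_frame, top, left, search=8):
    """Full-search block matching: return the (dy, dx) with the lowest SAD."""
    h, w = cur_block.shape
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref_frame.shape[0] or x + w > ref_frame.shape[1]:
                continue                      # candidate falls outside the reference frame
            cand = ref_frame[y:y + h, x:x + w]
            sad = np.abs(cur_block.astype(int) - cand.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad
# Toy usage with a random 8-bit frame and a 16x16 block.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = ref.copy()
print(best_motion_vector(cur[16:32, 16:32], ref, 16, 16))  # -> ((0, 0), 0)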

Subjects: Speedup; Computational complexity theory; Computer science; Networking and telecommunications; Coding and information theory; Parallel computing; CUDA; Algorithmic efficiency; Worst-case complexity; Context-adaptive binary arithmetic coding; Context-adaptive variable-length coding; Data compression

Textual data compression in computational biology: a synopsis.

2009

Motivation: Textual data compression, and the associated techniques coming from information theory, are often perceived as being of interest for data communication and storage. However, they are also deeply related to classification and data mining and analysis. In recent years, a substantial effort has been made for the application of textual data compression techniques to various computational biology tasks, ranging from storage and indexing of large datasets to comparison and reverse engineering of biological networks. Results: The main focus of this review is on a systematic presentation of the key areas of bioinformatics and computational biology where compression has been use…
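
One concrete example of compression used for classification, in the spirit of this review, is the normalized compression distance (NCD): two sequences that compress well together are judged similar. The sketch below uses zlib as an off-the-shelf compressor; the choice of compressor and the toy sequences are assumptions for illustration, not the specific tools surveyed.

import zlib
def C(x: bytes) -> int:
    """Compressed size of x under an off-the-shelf compressor."""
    return len(zlib.compress(x, 9))
def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for similar strings, near 1 for unrelated ones."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)
a = b"ACGTACGTACGTACGTACGT" * 20
b = b"ACGTACGAACGTACGTACGT" * 20   # a with a single substitution per repeat
c = b"TTTTGGGGCCCCAAAATTTT" * 20   # unrelated composition
print(round(ncd(a, b), 3), round(ncd(a, c), 3))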

Subjects: Statistics and probability; Databases (factual); Computer science; Search engine indexing; Data compression; Computational biology; Information storage and retrieval; Biochemistry; Data science; Computational mathematics; Benchmark (computing); Molecular biology; Biological networks; Software
Published in: Bioinformatics (Oxford, England)

Adaptive reference-free compression of sequence quality scores

2014

Motivation: Rapid technological progress in DNA sequencing has stimulated interest in compressing the vast datasets that are now routinely produced. Relatively little attention has been paid to compressing the quality scores that are assigned to each sequence, even though these scores may be harder to compress than the sequences themselves. By aggregating a set of reads into a compressed index, we find that the majority of bases can be predicted from the sequence of bases that are adjacent to them and hence are likely to be less informative for variant calling or other applications. The quality scores for such bases are aggressively compressed, leaving a relatively small number at full reso…
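
A drastically simplified sketch of the idea: decide from sequence context which bases are "predictable", and coarsely re-bin the quality scores at those positions while leaving the rest at full resolution. The paper aggregates the reads into a compressed (BWT-based) index; the plain k-mer majority table, the context length, and the coarse bin value below are stand-in assumptions.

from collections import Counter, defaultdict
K = 4                 # context length; assumed value
COARSE_BIN = 20       # quality assigned to predictable positions; assumed value
def smooth_qualities(reads, quals):
    """Return quality strings where predictable bases get a single coarse value."""
    # Count which base most often follows each k-mer context across the read set.
    follows = defaultdict(Counter)
    for r in reads:
        for i in range(K, len(r)):
            follows[r[i - K:i]][r[i]] += 1
    out = []
    for r, q in zip(reads, quals):
        new_q = list(q)
        for i in range(K, len(r)):
            ctx = follows.get(r[i - K:i])
            if ctx and ctx.most_common(1)[0][0] == r[i]:   # base agrees with the majority prediction
                new_q[i] = chr(COARSE_BIN + 33)            # Phred+33 encoding of the coarse bin
        out.append("".join(new_q))
    return out
reads = ["ACGTACGTAC", "ACGTACGTAA", "ACGTACGTAC"]
quals = ["IIIIIIIIII", "IIIIIIIIII", "IIIIIIIIII"]
print(smooth_qualities(reads, quals))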

Subjects: Statistics and probability; Computer science; DNA sequencing; High-throughput nucleotide sequencing; Redundancy (information theory); Burrows-Wheeler transform (BWT); Reference-free compression; Quality scores; Genomics; Metagenomics; Sequence analysis (DNA); Caenorhabditis elegans; Reference genome; Data structures and algorithms; Data mining; Data compression

Parallel Construction and Query of Index Data Structures for Pattern Matching on Square Matrices

1999

We describe fast parallel algorithms for building index data structures that can be used to gather various statistics on square matrices. The main data structure is the Lsuffix tree, which is a generalization of the classical suffix tree for strings. Given an n×n text matrix A, we build our data structures in O(log n) time with n^2 processors on a CRCW PRAM, so that we can quickly process A in parallel as follows: (i) report some statistical information about A, e.g., find the largest repeated square submatrices that appear at least twice in A or determine, for each position in A, the smallest submatrix that occurs only there; (ii) given, on-line, an m×m pattern matrix PAT, check whether it occurs i…
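
The on-line query in (ii) asks whether an m×m pattern occurs inside the n×n text matrix. A brute-force sequential check is shown below as a reference for what the Lsuffix tree answers much faster; it illustrates the query only, not the O(log n)-time CRCW PRAM construction described in the paper, and the toy matrices are assumed.

def occurs(text, pat):
    """Return all top-left positions where the m x m pattern occurs in the n x n text."""
    n, m = len(text), len(pat)
    hits = []
    for i in range(n - m + 1):
        for j in range(n - m + 1):
            if all(text[i + r][j:j + m] == pat[r] for r in range(m)):
                hits.append((i, j))
    return hits
A = ["abab",
     "baba",
     "abab",
     "baba"]
PAT = ["ab",
       "ba"]
print(occurs(A, PAT))   # every aligned occurrence of the 2x2 pattern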

Subjects: Statistics and probability; Numerical analysis; Control and optimization; Algebra and number theory; Applied mathematics; Suffix trees; Parallel algorithms; Data structures; Square matrices; Pattern matching; Data compression
Published in: Journal of Complexity

Repetitiveness Measures based on String Attractors and Burrows-Wheeler Transform: Properties and Applications

2023

Subjects: String attractors; Measures of repetitiveness; Burrows-Wheeler transform; Compressed data structures; Combinatorics on words; Stringology; Data compression
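
Two of the repetitiveness measures named here can be computed directly on toy inputs: r, the number of equal-letter runs of the Burrows-Wheeler transform, and the size of a string attractor, i.e. a set of positions such that every distinct substring has an occurrence crossing at least one of them. The sketch below uses a naive rotation-sort BWT and a brute-force attractor check; both are illustrations on assumed toy strings, not the constructions studied in the paper.

def bwt(s: str) -> str:
    """Burrows-Wheeler transform via sorted rotations of s + sentinel."""
    s += "$"
    rots = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rots)
def bwt_runs(s: str) -> int:
    """r: number of maximal equal-letter runs in BWT(s)."""
    b = bwt(s)
    return 1 + sum(1 for i in range(1, len(b)) if b[i] != b[i - 1])
def is_attractor(s: str, positions: set) -> bool:
    """Check that every distinct substring of s has an occurrence covering a position in the set."""
    n = len(s)
    for length in range(1, n + 1):
        for sub in {s[i:i + length] for i in range(n - length + 1)}:
            covered = any(
                any(i <= p < i + length for p in positions)
                for i in range(n - length + 1) if s[i:i + length] == sub
            )
            if not covered:
                return False
    return True
s = "abababab"
print(bwt_runs(s))                  # small r signals high repetitiveness
print(is_attractor(s, {0, 1}))      # {0, 1} covers an occurrence of every distinct substring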

Cost-driven framework for progressive compression of textured meshes

2019

Recent advances in the digitization of geometry and radiometry routinely generate massive amounts of surface meshes with texture or color attributes. This large amount of data can be compressed using a progressive approach which provides, at decoding, low-complexity levels of detail (LoDs) that are continuously refined until the original model is retrieved. The goal of such a progressive mesh compression algorithm is to improve the overall quality of the transmission for the user by optimizing the rate-distortion trade-off. In this paper, we introduce a novel meaningful measure for the cost of a progressive transmission of a textured mesh by observing that the rate-distor…
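
A simple way to picture a "cost of progressive transmission" is as the area under the distortion-versus-rate curve traced by the successive LoDs: the longer the user is left looking at a distorted model, the higher the cost. The trapezoidal sketch below and its toy (rate, distortion) points are assumptions for illustration, not the measure defined in the paper.

def transmission_cost(points):
    """Area under a piecewise-linear distortion-vs-rate curve (lower is better)."""
    cost = 0.0
    for (r0, d0), (r1, d1) in zip(points, points[1:]):
        cost += (r1 - r0) * (d0 + d1) / 2.0      # trapezoid between consecutive LoDs
    return cost
# Toy decoded LoDs: (bits received, distortion after decoding that LoD).
lods_a = [(0, 1.00), (10_000, 0.40), (30_000, 0.10), (60_000, 0.00)]
lods_b = [(0, 1.00), (20_000, 0.70), (40_000, 0.20), (60_000, 0.00)]
print(transmission_cost(lods_a) < transmission_cost(lods_b))   # ordering A delivers quality earlier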

Subjects: Texture atlas; Decimation; Adaptive quantization; Multiplexing; Computer science; Geometry compression; Image processing and computer vision; Surface meshes; Textures; Progressive vs. single-rate coding; Computational geometry; Computer graphics; Polygon mesh; Quantization (image processing); Decoding methods; Data compression

New techniques for visualization of losses due to image compression in grayscale medical still images

2003

To evaluate the visual influence of irreversible compression on medical images, the changes to the images have to be visualized. The authors have explored alternative techniques to replace the usual side-by-side comparison, in which the information contained in both images is perceived within a single image, preserving the context between compression errors and image structures and thus enabling fast and easy comparison. These techniques make use of the human ability to perceive information also in the dimensions of color, space, and time. A study was performed with JPEG-compressed coronary angiographic images. Changes in the resulting images for six compression factors from 7 to 30 were sc…
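
A minimal sketch of one such "single image" idea: map the magnitude of the compression error into the color dimension on top of the original grayscale content, so losses are seen in place rather than by switching between two images. NumPy is used here, and the red-overlay encoding and gain factor are assumptions, not necessarily one of the techniques evaluated in the study.

import numpy as np
def error_overlay(original: np.ndarray, compressed: np.ndarray, gain: float = 4.0) -> np.ndarray:
    """Return an RGB image: grayscale content with compression error emphasized in red."""
    gray = original.astype(np.float32)
    err = np.abs(gray - compressed.astype(np.float32)) * gain    # amplify small losses so they are visible
    rgb = np.stack([np.clip(gray + err, 0, 255),                 # red channel carries the error
                    gray,
                    gray], axis=-1)
    return rgb.astype(np.uint8)
# Toy 8-bit images standing in for an angiographic frame and its lossy-decoded version.
rng = np.random.default_rng(1)
orig = rng.integers(0, 256, (64, 64), dtype=np.uint8)
comp = np.clip(orig + rng.integers(-3, 4, (64, 64)), 0, 255).astype(np.uint8)
print(error_overlay(orig, comp).shape)   # (64, 64, 3)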

Subjects: Texture compression; Standard test image; Computer science; Image processing; Lossy compression; JPEG; Computer vision; Quantization (image processing); Image compression; Data compression
Published in: Proceedings Computers in Cardiology

Boosting Textual Compression in Optimal Linear Time

2005

We provide a general boosting technique for Textual Data Compression. Qualitatively, it takes a good compression algorithm and turns it into an algorithm with a better compression performance guarantee. It displays the following remarkable properties: (a) it can turn any memoryless compressor into a compression algorithm that uses the “best possible” contexts; (b) it is very simple and optimal in terms of time; and (c) it admits a decompression algorithm again optimal in time. To the best of our knowledge, this is the first boosting technique displaying these properties. Technically, our boosting technique builds upon three main ingredients: the Burrows-Wheeler Transform, the Suffix Tree d…
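
A heavily simplified sketch of why the Burrows-Wheeler Transform helps a context-oblivious (memoryless) compressor: after the transform, symbols sharing the same following context are grouped together, so a move-to-front pass turns the string into mostly small values that a zero-order coder handles well. The pipeline below (naive BWT, then move-to-front, on an assumed toy string) is the classic illustration of this effect, not the optimal boosting procedure of the paper.

def bwt(s: str) -> str:
    """Naive Burrows-Wheeler transform via sorted rotations (sentinel-terminated)."""
    s += "\x00"
    rots = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rots)
def move_to_front(s: str) -> list:
    """Move-to-front coding: locally repeated symbols become small integers."""
    alphabet = sorted(set(s))
    out = []
    for c in s:
        i = alphabet.index(c)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))
    return out
text = "mississippi" * 8
plain_mtf = move_to_front(text)
bwt_mtf = move_to_front(bwt(text))
# After the BWT, far more of the MTF output is 0, i.e. easier for a zero-order coder.
print(sum(v == 0 for v in plain_mtf), sum(v == 0 for v in bwt_mtf))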

Subjects: Theoretical computer science; Burrows-Wheeler transform; Suffix tree; Strings; Substrings; Arithmetic coding; Lempel-Ziv compressors; Text compression; Empirical entropy; Greedy algorithms; Time complexity; Data compression