Search results for "data compression"
Showing 10 of 99 documents
Data Compression Using Wavelet and Local Cosine Transforms
2015
The chapter describes an algorithm that compresses two-dimensional data arrays which are piecewise smooth in one direction and have oscillating events in the other direction. Seismic, hyperspectral, and fingerprint data, for example, have such a mixed structure. The transform part of the compression process is an algorithm that combines the wavelet and the local cosine transform (LCT). The quantization and entropy coding parts of the compression are taken from the SPIHT codec. To apply the SPIHT codec efficiently to a mixed coefficient array, the LCT coefficients are reordered. On data arrays that have this mixed structure, the algorithm outperforms other algorithms that a…
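The combination can be illustrated with a toy hybrid transform. The sketch below is a rough stand-in for the chapter's method, not the method itself: it applies a one-level Haar wavelet along the piecewise-smooth axis and a blockwise DCT-II (standing in for the local cosine transform) along the oscillatory axis. The block size, the Haar filter, and the synthetic data are all illustrative assumptions.

```python
import numpy as np
from scipy.fft import dct

def haar_1d(x):
    """One level of the orthonormal Haar transform along the last axis."""
    even, odd = x[..., ::2], x[..., 1::2]
    return np.concatenate([even + odd, even - odd], axis=-1) / np.sqrt(2)

def hybrid_transform(a, block=8):
    """Haar along axis 0 (piecewise-smooth direction), blockwise DCT along
    axis 1 (oscillatory direction)."""
    coeffs = haar_1d(a.T).T                       # wavelet step on the columns
    out = np.empty_like(coeffs)
    for j in range(0, a.shape[1], block):         # "local cosine" step on the rows
        out[:, j:j + block] = dct(coeffs[:, j:j + block], axis=1, norm="ortho")
    return out

rng = np.random.default_rng(0)
smooth = np.cumsum(rng.normal(size=(64, 1)), axis=0)    # smooth along axis 0
oscillating = np.sin(np.linspace(0, 40 * np.pi, 64))    # oscillatory along axis 1
data = smooth + oscillating                             # the mixed structure
c = hybrid_transform(data)
top = np.sort(np.abs(c).ravel())[-c.size // 20:]        # largest 5% of coefficients
print("top-5% coefficient share:", top.sum() / np.abs(c).sum())
```

On mixed-structure input like this, most of the coefficient mass concentrates in a few entries, which is what makes the subsequent SPIHT coding effective.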
The effect of wavelet and discrete cosine transform compression of digital radiographs on the detection of subtle proximal caries. ROC analysis.
2007
The study compared the diagnostic performance of two image compression methods: JPEG (discrete cosine transform; Joint Photographic Experts Group standard) versus JPEG2000 (discrete wavelet transform), both at a compression ratio of 12:1 relative to the original uncompressed TIFF radiographs, with respect to the detection of non-cavitated carious lesions. To this end, 100 approximal surfaces of 50 tooth pairs were evaluated on the radiographs by 10 experienced observers using a 5-point confidence scale. Observations were carried out on a standardized viewing monitor under subdued light conditions. The proportion of diseased surfaces was balanced to approximately 50% to avoid bias. Tr…
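For readers unfamiliar with the ROC methodology used here, the sketch below estimates the area under the ROC curve from 5-point confidence ratings via the Mann-Whitney formulation. The ratings and ground-truth labels are invented for illustration and are not the study's data.

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic (ties count 1/2)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]   # truly diseased surfaces
    neg = [s for s, l in zip(scores, labels) if l == 0]   # truly sound surfaces
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical 5-point ratings (1 = definitely sound ... 5 = definitely carious)
scores = [5, 4, 4, 3, 2, 5, 1, 2, 3, 1]
labels = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]    # invented ground truth
print(f"AUC = {roc_auc(scores, labels):.3f}")
```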
Compression Methods for Microclimate Data Based on Linear Approximation of Sensor Data
2019
Edge computing is currently one of the main research topics in the field of the Internet of Things. It requires lightweight, computationally simple algorithms for sensor data analytics. Sensing edge devices are often battery powered and use a wireless connection, so energy efficiency needs to be taken into account in their design. Pre-processing the data locally on the edge device reduces the amount of data and thus decreases the energy consumption of wireless data transmission. The sensor data compression algorithms presented in this paper are mainly based on data linearity: microclimate data is nearly linear within a short time window, and thus simple linear approximation based …
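A minimal sketch of the linearity idea, not the paper's exact algorithm: greedily extend a segment while every sample stays within a tolerance of the line through the segment's endpoints, and transmit only the breakpoints. The tolerance value and the endpoint-fit rule are assumptions made for illustration.

```python
def compress(samples, tol=0.1):
    """Greedy piecewise-linear compression: keep only segment breakpoints.
    Decompression is linear interpolation between consecutive breakpoints."""
    points = [(0, samples[0])]
    start = 0
    for end in range(2, len(samples)):
        x0, y0 = start, samples[start]
        slope = (samples[end] - y0) / (end - x0)
        # break the segment if any interior sample strays from the line
        if any(abs(y0 + slope * (i - x0) - samples[i]) > tol
               for i in range(start + 1, end)):
            points.append((end - 1, samples[end - 1]))
            start = end - 1
    points.append((len(samples) - 1, samples[-1]))
    return points

temps = [21.0, 21.1, 21.2, 21.4, 21.5, 23.0, 24.5, 24.6, 24.7]
bp = compress(temps, tol=0.15)
print(f"{len(temps)} samples -> {len(bp)} breakpoints: {bp}")
```

Transmitting 4 breakpoints instead of 9 samples is exactly the kind of reduction that lowers the radio's energy cost on a battery-powered node.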
Multimedia applications in industrial networks: integration of image processing in profibus
2003
This paper analyzes the transmission of images over Profibus for applications that integrate image processing into control systems, by means of a detailed study of two real cases with different bandwidth requirements. We analyze the special features of an artificial vision system that uses Profibus as its transport system; the scheduling of the traffic so that both the images and the control traffic are delivered before their deadlines; the protocol used; and the compression techniques usable in this kind of system, which reduce the bandwidth required by the applications without degrading the operation of the image processing application. We p…
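As a rough illustration of the bandwidth constraint, the sketch below fragments a compressed image into Profibus DP telegrams. The 244-byte user-data limit is the commonly cited DP maximum, and the 2-byte sequence header for reassembly is our own assumption, not something taken from the paper.

```python
MAX_USER_DATA = 244    # commonly cited user-data limit of a Profibus DP telegram
HEADER = 2             # hypothetical fragment counter added for reassembly

def fragment(payload: bytes):
    """Split a compressed image into telegram-sized fragments."""
    chunk = MAX_USER_DATA - HEADER
    return [seq.to_bytes(HEADER, "big") + payload[off:off + chunk]
            for seq, off in enumerate(range(0, len(payload), chunk))]

image = bytes(30_000)  # stand-in for a 30 kB compressed image
frames = fragment(image)
print(f"{len(image)} B -> {len(frames)} telegrams of <= {MAX_USER_DATA} B user data")
```

With roughly 124 telegrams per image, compression directly determines how many bus cycles an image occupies alongside the control traffic.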
Efficiency improvement of DC* through a Genetic Guidance
2017
DC∗ is a method for generating interpretable fuzzy information granules from pre-classified data. It is based on the sequential application of LVQ1 for data compression and an ad hoc procedure based on A∗ to represent the data with the minimum number of fuzzy information granules satisfying some interpretability constraints. While efficient on several problems, the A∗ procedure included in DC∗ may require a long computation time, because the A∗ algorithm has exponential time complexity in the worst case. In this paper, we approach the problem of driving the search process of A∗ by suggesting a close-to-optimal solution produced by a Genetic Algorithm (GA). E…
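The guidance idea can be sketched independently of DC∗: a quickly obtained close-to-optimal solution (produced by a GA in the paper; a fixed incumbent below) supplies a cost bound, and A∗ discards every node whose f = g + h already meets or exceeds it. The toy graph and heuristic are illustrative assumptions, not the paper's granulation problem.

```python
import heapq

graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)],
         "C": [("D", 2)], "D": []}
h = {"A": 3, "B": 2, "C": 1, "D": 0}       # admissible heuristic to goal "D"

def a_star(start, goal, bound):
    """A* that prunes nodes whose f = g + h meets or exceeds the bound."""
    frontier = [(h[start], 0, start)]
    best_g = {start: 0}
    pruned = 0
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if node == goal:
            return g, pruned
        for nxt, w in graph[node]:
            g2 = g + w
            if g2 + h[nxt] >= bound:       # the incumbent solution prunes here
                pruned += 1
                continue
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h[nxt], g2, nxt))
    return None, pruned

incumbent = 6    # cost of a quickly found solution A-B-D (the GA's role in DC*)
print(a_star("A", "D", incumbent))         # -> (4, 1): optimal path A-B-C-D
```

Pruning with `>=` is safe because the incumbent solution is already in hand; the search only needs to find strictly better ones.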
P2D: a self-supervised method for depth estimation from polarimetry
2021
Monocular depth estimation is a recurring subject in the field of computer vision. Its ability to describe scenes via a depth map, while reducing the constraints related to the formulation of perspective geometry, tends to favor its use. However, despite the constant improvement of algorithms, most methods exploit only colorimetric information. Consequently, robustness to events to which the modality is not sensitive, such as specularity or transparency, is neglected. In response to this phenomenon, we propose using polarimetry as an input for a self-supervised monodepth network. Therefore, we propose exploiting polarization cues to encourage accurate reconstruction of scenes. Furthermore, we…
Novel Results on the Number of Runs of the Burrows-Wheeler-Transform
2021
The Burrows-Wheeler Transform (BWT), a reversible string transformation, is one of the fundamental components of many current data structures in string processing. It is central in data compression, as well as in efficient query algorithms for sequence data, such as webpages, genomic and other biological sequences, or indeed any textual data. The BWT lends itself well to compression because its number of equal-letter runs (usually referred to as $r$) is often considerably lower than that of the original string; in particular, it is well suited for strings with many repeated factors. In fact, much attention has been paid to the $r$ parameter as a measure of repetitiveness, especially to evalua…
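A minimal sketch of the two quantities involved, $n$ and $r$: build the BWT naively from sorted rotations (quadratic time, fine for illustration; real indexes use suffix-array construction) and count the maximal equal-letter runs.

```python
def bwt(s, sentinel="$"):
    """Naive BWT from sorted rotations (quadratic; for illustration only)."""
    s += sentinel                                 # unique end-of-string marker
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def runs(t):
    """Number of maximal equal-letter runs: the parameter r."""
    return 1 + sum(t[i] != t[i - 1] for i in range(1, len(t)))

s = "banana" * 4                                  # a string with repeated factors
t = bwt(s)
print(t)
print(f"n = {len(t)}, r = {runs(t)}")             # r stays far below n here
```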
The rightmost equal-cost position problem.
2013
LZ77-based compression schemes compress the input text by replacing factors in the text with an encoded reference to a previous occurrence, formed by the pair (length, offset). For a given factor, the smaller the offset, the smaller the resulting encoding; this is optimally achieved by using the rightmost occurrence of the factor in the previous text. Given a cost function, for instance the minimum number of bits used to represent an integer, we define the Rightmost Equal-Cost Position (REP) problem as the problem of finding one of the occurrences of a factor whose cost is equal to the cost of the rightmost one. We present the Multi-Layer Suffix Tree data structure that, for…
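To make the cost function concrete: with the cost of an offset taken as its bit length, any earlier occurrence whose offset has the same bit length as the rightmost one encodes equally well. The naive scan below only illustrates the definition; the paper's Multi-Layer Suffix Tree is what answers such queries efficiently.

```python
def bit_cost(offset):
    """Cost of an offset: the minimum number of bits needed to represent it."""
    return offset.bit_length()

def equal_cost_occurrence(text, pos, factor):
    """Any occurrence of `factor` ending by `pos` whose offset cost equals
    the cost of the rightmost occurrence (naive scan; illustration only)."""
    occs = [i for i in range(pos) if i + len(factor) <= pos
            and text[i:i + len(factor)] == factor]
    if not occs:
        return None
    target = bit_cost(pos - max(occs))       # cost of the rightmost offset
    return next(i for i in occs if bit_cost(pos - i) == target)

text = "abracadabra_abra_abra"
print(equal_cost_occurrence(text, 17, "abra"))   # -> 12: offset 5, 3 bits
```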
Constructing Antidictionaries in Output-Sensitive Space
2021
A word $x$ that is absent from a word $y$ is called minimal if all its proper factors occur in $y$. Given a collection of $k$ words $y_1,y_2,\ldots,y_k$ over an alphabet $\Sigma$, we are asked to compute the set $\mathrm{M}^{\ell}_{y_{1}\#\ldots\#y_{k}}$ of minimal absent words of length at most $\ell$ of the word $y=y_1\#y_2\#\ldots\#y_k$, $\#\notin\Sigma$. In data compression, this corresponds to computing the antidictionary of $k$ documents. In bioinformatics, it corresponds to computing words that are absent from a genome of $k$ chromosomes. This computation generally requires $\Omega(n)$ space for $n=|y|$ using any of the many available $\mathcal{O}(n)$-time algorithms. This is because a…
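The definition can be checked with a brute-force enumeration, exponential in $\ell$ and useful only to illustrate what the output-sensitive algorithms compute. It relies on the standard characterization that a word is minimal absent iff it is absent while its longest proper prefix and suffix both occur.

```python
from itertools import product

def maws(y, sigma, ell):
    """All minimal absent words of y with length in [2, ell] (brute force)."""
    out = []
    for m in range(2, ell + 1):
        for w in map("".join, product(sigma, repeat=m)):
            # w is a MAW iff w is absent but both "edge" factors occur in y
            if w not in y and w[1:] in y and w[:-1] in y:
                out.append(w)
    return out

y = "abaab"
print(maws(y, "ab", 3))    # -> ['bb', 'aaa', 'bab']
```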
Large-scale compression of genomic sequence databases with the Burrows-Wheeler transform
2012
Motivation: The Burrows-Wheeler transform (BWT) is the foundation of many algorithms for compression and indexing of text data, but the cost of computing the BWT of very large string collections has prevented these techniques from being widely applied to the large sets of sequences often encountered as the outcome of DNA sequencing experiments. In previous work, we presented a novel algorithm that allows the BWT of human-genome-scale data to be computed on very moderate hardware, thus enabling us to investigate the BWT as a tool for the compression of such datasets.
Results: We first used simulated reads to explore the relationship between the level of compression and the error rate, the leng…
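A toy illustration of why the BWT compresses read collections well: reads sampled from the same sequence produce long equal-letter runs in the BWT of the concatenated collection, which run-length encoding then exploits. The single shared sentinel and the naive rotation sort are simplifications; practical collection BWTs use distinct end markers and the scalable construction the paper describes.

```python
from itertools import groupby

def collection_bwt(reads):
    """Naive BWT of a read collection, one shared sentinel per read."""
    s = "".join(r + "$" for r in reads)
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def rle(t):
    """Run-length encode the transform."""
    return [(c, len(list(g))) for c, g in groupby(t)]

genome = "ACGTACGTTTAGC"
reads = [genome[i:i + 6] for i in range(8)]   # overlapping simulated reads
t = collection_bwt(reads)
print(f"n = {len(t)}, runs = {len(rle(t))}")
```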