Merging the transform step and the quantization step for Karhunen-Loeve transform based image compression
Authors: Macarie Breazu, I. P. Mihu, Gavril Toderean, B. J. Beggs

Subjects: business.industry, Fractal transform, Vector quantization, Top-hat transform, Pattern recognition, Artificial intelligence, business, Quantization (image processing), S transform, Transform coding, Fractional Fourier transform, Data compression, Mathematics
Description: Transform coding is one of the most important methods for lossy image compression. The optimum linear transform, known as the Karhunen-Loeve transform (KLT), was difficult to implement in the classic way. Now, due to continuous improvements in neural network performance, the KLT approach has become more topical than ever. We propose a new scheme in which the quantization step is merged with the transform step during the learning phase. The new method is tested for different levels of quantization and for different types of quantizers. Experimental results presented in the paper show that the proposed scheme consistently gives better results than the state-of-the-art solution.
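The abstract only sketches the idea, so the following is a minimal illustrative example, not the authors' implementation: a linear autoencoder (which learns a KLT-like transform under an MSE loss) trained with a uniform scalar quantizer placed between encoder and decoder, so the transform adapts to the quantizer during learning. The toy data, the matrices `We`/`Wd`, the quantizer step, the straight-through gradient treatment, and all hyperparameters are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for image blocks: N samples of dimension D with unequal
# variances, so a few linear components carry most of the energy.
# The data model and all sizes/hyperparameters below are illustrative.
N, D, K = 2000, 16, 4                       # samples, block size, retained coefficients
A = rng.normal(size=(D, D)) * np.linspace(1.5, 0.1, D)
X = rng.normal(size=(N, D)) @ A
X -= X.mean(axis=0)                         # the KLT assumes zero-mean data
X /= X.std()                                # normalize the overall scale

def quantize(y, step=0.25):
    """Uniform scalar quantizer applied to the transform coefficients."""
    return step * np.round(y / step)

# Linear autoencoder (encoder We, decoder Wd). With an MSE loss and no
# quantizer it converges to the principal (KLT) subspace; here the quantizer
# sits between encoder and decoder *during training*, so the learned
# transform adapts to the quantization it will later be paired with.
We = rng.normal(scale=0.1, size=(D, K))
Wd = rng.normal(scale=0.1, size=(K, D))
lr = 0.02

for epoch in range(1500):
    Y = X @ We                              # transform step
    Yq = quantize(Y)                        # quantization step, inside the learning loop
    Xhat = Yq @ Wd                          # reconstruction from quantized coefficients
    E = Xhat - X                            # (N, D) reconstruction error

    # Straight-through assumption: round() is treated as the identity when
    # backpropagating, so the encoder still receives a useful gradient.
    gWd = Yq.T @ E / N                      # exact gradient for the decoder
    gWe = X.T @ (E @ Wd.T) / N              # straight-through gradient for the encoder
    We -= lr * gWe
    Wd -= lr * gWd

mse = np.mean((quantize(X @ We) @ Wd - X) ** 2)
print(f"reconstruction MSE with quantized coefficients: {mse:.4f}")
```

A tied-weight or vector-quantized variant would follow the same pattern; the point illustrated here is only that the quantizer appears inside the training loop rather than being applied after the transform has been fixed.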
| year | journal | country | edition | language |
|---|---|---|---|---|
| 2000-01-01 | Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium | | | |