Search results for "Compression"

Showing 10 of 400 documents

Image Compression by 2D Motif Basis

2011

Approaches to image compression and indexing based on 2D extensions of some of the Lempel-Ziv incremental parsing techniques have been proposed in the recent past. In these approaches, an image is decomposed into a number of patches, each consisting of a solid square or rectangular block. This paper proposes image compression techniques based on patches that are not necessarily solid blocks, but instead contain a controlled number of undetermined or don't-care pixels. Such patches are chosen from a set of candidate motifs that are extracted in turn from the image's 2D motif basis, the latter consisting of a compact set of patterns that result from the autocorrelation of the image w…
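As a hedged illustration of the patch model described above (not the paper's extraction algorithm), the following sketch shows how a motif with don't-care pixels matches an image region: a hypothetical `DONT_CARE` sentinel marks undetermined positions, and a match requires agreement only on the determined pixels.

```python
import numpy as np

DONT_CARE = -1  # hypothetical sentinel for undetermined pixels

def motif_matches(image, motif, top, left):
    """True if every determined motif pixel equals the image pixel under it."""
    h, w = motif.shape
    region = image[top:top + h, left:left + w]
    mask = motif != DONT_CARE          # only determined pixels constrain the match
    return bool(np.all(region[mask] == motif[mask]))

image = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
motif = np.array([[1, DONT_CARE],
                  [4, 5]])
print(motif_matches(image, motif, 0, 0))  # -> True: the don't-care pixel is free
```

Because a motif constrains fewer pixels than a solid block of the same size, it can cover more image area per patch, which is the trade-off the paper controls.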

Keywords: Pixel; Image Processing and Computer Vision; Pattern recognition; Coding and Information Theory; JPEG; Image (mathematics); Compression (functional analysis); Motif extraction; Pattern discovery; Artificial intelligence; Algorithm; Image compression; Data compression; Mathematics; Color Cell Compression; Block (data storage)
Published in: 2011 Data Compression Conference

Non-consistent cell-average multiresolution operators with application to image processing

2016

In recent years, different techniques for signal and image processing have been designed and developed. In particular, multiresolution representations of data have been studied and used successfully for several applications such as compression, denoising, or inpainting. A general framework for multiresolution representation was presented by Harten (1996). Harten's schemes are based on two operators: decimation, D, and prediction, P, which satisfy the consistency property DP = I, where I is the identity operator. Recently, some new classes of multiresolution operators have been designed using statistical learning tools and weighted local polynomial regression methods, obtaining filters…
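The consistency property can be made concrete with a minimal sketch (an illustrative piecewise-constant scheme, not the operators studied in the paper): decimation D averages pairs of fine-cell values, prediction P maps coarse values back, and DP = I holds because decimating a prediction recovers the coarse data exactly.

```python
import numpy as np

def decimation(fine: np.ndarray) -> np.ndarray:
    """D: each coarse cell average is the mean of its two child fine-cell averages."""
    return 0.5 * (fine[0::2] + fine[1::2])

def prediction(coarse: np.ndarray) -> np.ndarray:
    """P: piecewise-constant prediction (each coarse value fills two fine cells)."""
    return np.repeat(coarse, 2)

coarse = np.array([1.0, 4.0, 2.0])
assert np.allclose(decimation(prediction(coarse)), coarse)  # D P = I

# In a full scheme, the details d = fine - P(D(fine)) are stored alongside the
# coarse data and thresholded, which is what enables compression or denoising.
fine = np.array([1.0, 3.0, 4.0, 4.0, 0.0, 2.0])
details = fine - prediction(decimation(fine))
```

Non-consistent operators, the subject of the paper, relax DP = I, so the decimated prediction no longer reproduces the coarse data exactly.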

Keywords: Polynomial regression; Decimation; Theoretical computer science; Applied mathematics; Inpainting; Image processing; Numerical and computational mathematics; Computational mathematics; Operator (computer programming); Consistency (statistics); Representation (mathematics); Algorithm; Mathematics; Image compression
Published in: Applied Mathematics and Computation

Postpartum Hemorrhage: Conservative Treatments

2023

Postpartum hemorrhage (PPH) is an obstetric emergency and the leading cause of obstetric mortality, as well as a frequent cause of severe maternal morbidity. It can complicate vaginal or cesarean deliveries and accounts for 25% of all maternal deaths worldwide, as reported by the World Health Organization (WHO). Primary PPH is defined as blood loss from the genital tract of at least 500 ml after vaginal delivery or 1000 ml following cesarean delivery within 24 h postpartum, whereas secondary PPH is defined as any significant bleeding from the birth canal occurring between 24 h and 12 weeks postnatally. Uterine atony is the main cause of PPH, accounting for 75%–90% of primary PPH. When ut…

Keywords: Postpartum hemorrhage; Uterotonic agent; Uterine tamponade procedure; Vascular ligation; Selective arterial embolization; Uterine compression suture; Uterine atony; Settore MED/40 - Ginecologia e Ostetricia

Long-term fracture load of all-ceramic crowns: Effects of veneering ceramic thickness, application techniques, and cooling protocol

2020

Background: To evaluate, in vitro, the effects of the cooling protocol, application technique, and veneering ceramic thickness on the fracture resistance of ceramic crowns with Y-TZP frameworks. Material and Methods: 80 frameworks were made from zirconia by the CAD/CAM technique and divided into 8 groups (n = 10) according to the factors "application technique" (stratified, L, and pressed, P), "thickness" (1 mm and 2 mm), and "cooling protocol" (slow, S, and fast, F) of the feldspathic veneering ceramic. Afterwards, all crowns were cemented onto G10 preparations with resin cement (Panavia F, …

Keywords: Prosthetic dentistry; Materials science; All-ceramic; Fracture load; Veneering ceramic; Ceramics; Thickness; Application technique; Cooling protocol; Axial compression; Zirconia; Veneer; Cubic zirconia; Ceramic; Composite material; General dentistry; UNESCO: Ciencias Médicas

Computer Simulation to Optimize the VFA Alpha Prototype with a Hydraulic Piston Compressor and an Integrated Booster

2020

The research has been supported by the European Regional Development Fund project "Competence Centre of Mechanical Engineering", contract No. 1.2.1.1/18/A/008, signed between the Competence Centre of Mechanical Engineering and the Central Finance and Contracting Agency, Research No. 3.1, "Additional research and integration of the technology of hydraulic piston, aiming to develop and demonstrate an economically efficient compressed natural gas smart commercial vehicle fuelling appliance". Our special gratitude to Gaspard Bouteau, PhD, Research Engineer, who conducted research in Engie Lab CRIGEN. Scientific co-authorship of the Laboratory of Materials for Energy Harvesting and Storage, ISSP UL h…

Keywords: Physics (QC1-999); Other engineering and technologies; General physics and astronomy; Reservoir engineering and simulation; Clean energy; Automotive engineering; Matlab simulation; Gas compression; Gas thermodynamics; Booster (rocketry); Gas storage; Alpha (navigation); Hydraulic cylinder; Gas refuelling; Environmental science; Gas compressor; General engineering

Two-Higgs leptonic minimal flavour violation

2011

We construct extensions of the Standard Model with two Higgs doublets, where there are flavour changing neutral currents both in the quark and leptonic sectors, with their strength fixed by the fermion mixing matrices $V_{CKM}$ and $V_{PMNS}$. These models are an extension to the leptonic sector of the class of models previously considered by Branco, Grimus and Lavoura, for the quark sector. We consider both the cases of Dirac and Majorana neutrinos and identify the minimal discrete symmetry required in order to implement the models in a natural way.

Keywords: Quark; Physics; Nuclear and high energy physics; Particle physics; Dirac; Flavour; Fermion; Majorana; Standard Model; High Energy Physics - Phenomenology (hep-ph); Higgs boson; Discrete symmetry

Accelerating H.264 inter prediction in a GPU by using CUDA

2010

H.264/AVC defines a very efficient algorithm for inter prediction, but it is very time-consuming. With the emergence of General-Purpose Graphics Processing Units (GPGPUs), a new door has been opened to supporting this video algorithm on these processing units. In this paper, a forward step is taken towards an implementation of the H.264/AVC inter prediction algorithm on a GPU using the Compute Unified Device Architecture (CUDA). The results show a negligible rate-distortion drop with an average time reduction of up to 93.6%.
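The core computation being accelerated is block matching. The following sketch (an illustrative full-search motion estimation in Python, not the paper's CUDA kernel) finds, for one block, the motion vector minimizing the Sum of Absolute Differences (SAD); on a GPU, each block's search can run in its own thread block in parallel.

```python
import numpy as np

def best_motion_vector(cur, ref, bx, by, bsize=8, search=4):
    """Full search: return the (dy, dx) offset with minimal SAD for one block."""
    block = cur[by:by + bsize, bx:bx + bsize].astype(np.int32)
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > ref.shape[0] or x + bsize > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            cand = ref[y:y + bsize, x:x + bsize].astype(np.int32)
            sad = np.abs(block - cand).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (32, 32), dtype=np.uint8)
cur = np.roll(ref, (1, 2), axis=(0, 1))  # current frame = reference shifted by (1, 2)
mv, sad = best_motion_vector(cur, ref, bx=8, by=8)
print(mv, sad)  # -> (-1, -2) 0: the block is found at its pre-shift position
```

The exhaustive candidate loop is embarrassingly parallel, which is why this stage maps so well onto CUDA.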

Keywords: Reduction (complexity); CUDA; Coprocessor; Computer science; Image processing; Parallel computing; General-purpose computing on graphics processing units; Graphics; Data compression
Published in: 2010 Digest of Technical Papers, International Conference on Consumer Electronics (ICCE)

Optimizing H.264/AVC interprediction on a GPU-based framework

2011

H.264/MPEG-4 Part 10 is the latest standard for video compression and promises a significant advance in terms of quality and distortion compared with the commercial standards currently most in use, such as MPEG-2 or MPEG-4. To achieve this better performance, H.264 adopts a large number of new or improved compression techniques compared with previous standards, albeit at the expense of higher computational complexity. In addition, in recent years new hardware accelerators have emerged, such as graphics processing units (GPUs), which provide a new opportunity to reduce complexity for a large variety of algorithms. However, current GPUs suffer from higher power consumption requirements because of…

Keywords: Reduction (complexity); Computational theory and mathematics; Computer networks and communications; Computer science; Distortion; Motion estimation; Symmetric multiprocessor system; Energy consumption; Parallel computing; Software; Computer science applications; Theoretical computer science; Data compression
Published in: Concurrency and Computation: Practice and Experience

Balancing and clustering of words: a combinatorial analysis of the Burrows & Wheeler Transform

2010

The Burrows-Wheeler Transform (BWT) is a well-founded mathematical transformation on sequences, introduced in 1994, widely used in the context of data compression and recently studied also from a combinatorial point of view. The transformation does not itself compress the data, but it produces a permutation bwt(w) of an input string w that is easier to compress than the original with some fast locally adaptive algorithms, such as Move-to-Front in combination with Huffman or arithmetic coding. It is well known that in most real texts, characters with the same or similar contexts tend to be the same. So the BWT tends to group together characters which occur adjacent to similar…
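The pipeline described above can be sketched in a few lines of Python (a naive rotation-sort BWT with a "$" sentinel; production tools use suffix arrays instead of the O(n² log n) sort shown here):

```python
def bwt(s: str, sentinel: str = "$") -> str:
    """Burrows-Wheeler Transform: last column of the sorted rotation matrix."""
    s += sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def move_to_front(s: str) -> list[int]:
    """Encode each symbol as its index in a self-organizing alphabet list."""
    alphabet = sorted(set(s))
    out = []
    for ch in s:
        i = alphabet.index(ch)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))  # move the just-seen symbol to the front
    return out

print(bwt("banana"))  # -> "annb$aa": equal characters are grouped together
codes = move_to_front(bwt("banana"))
# The BWT's runs of equal characters become runs of small integers under MTF,
# which Huffman or arithmetic coding then compresses well.
```

The transform is invertible, which is what makes it usable inside a lossless compressor rather than only as an analysis tool.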

Keywords: Rich word; Settore INF/01 - Informatica; Palindrome; Data compression; Burrows-Wheeler transform; Balanced word; Combinatorics on words

Compressive biological sequence analysis and archival in the era of high-throughput sequencing technologies

2013

High-throughput sequencing technologies produce large collections of data, mainly DNA sequences with additional information, requiring the design of efficient and effective methodologies for both their compression and storage. In this context, we first provide a classification of the main techniques that have been proposed, according to three specific research directions that have emerged from the literature and, for each, we provide an overview of the current techniques. Finally, to make this review useful to researchers and technicians applying the existing software and tools, we include a synopsis of the main characteristics of the described approaches, including details on their impleme…
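As a generic illustration of why DNA sequence data invites specialized compression (this is not a method from the review): the four-letter alphabet already packs into 2 bits per base, a 4x saving over 8-bit ASCII, before any of the statistical or reference-based techniques the review classifies are applied.

```python
# Naive 2-bit packing of a DNA string; the encoding table is an assumption
# for illustration, not a standard format.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq: str) -> bytes:
    """Pack four bases per byte, left-aligning any trailing partial group."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        group = seq[i:i + 4]
        byte = 0
        for ch in group:
            byte = (byte << 2) | CODE[ch]
        byte <<= 2 * (4 - len(group))  # pad a short final group with zero bits
        out.append(byte)
    return bytes(out)

packed = pack("ACGTACGT")
print(len(packed))  # -> 2: eight bases fit in two bytes
```

Real tools go much further, exploiting repeats, quality scores, and reference genomes, which is exactly the design space the review surveys.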

Keywords: Sequence analysis; Computational biology; High-throughput nucleotide sequencing; Data compression; Bioinformatics; Data science; DNA sequencing; Software; Metagenomics; Sequence alignment; Molecular biology; Algorithms; Information systems
Published in: Briefings in Bioinformatics