Search results for "IONT"

Showing 10 of 382 documents

Step-by-Step Control of the Dynamics of a Superconducting QED-like System

2007

We discuss the modus operandi of a theoretical scalable coupling scheme to control step by step the time evolution of a pair of flux qubits embedded in a lossy resonant cavity. The sequential interaction of each qubit with the quantized cavity mode is controlled by externally applied magnetic fluxes. Our analysis indicates that indirect qubit-qubit interactions, with the electromagnetic mode acting as a data bus, can be selectively performed and exploited both for the implementation of entangling gates and for the generation of states with a priori known characteristics.
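A standard way to model the qubit–cavity interaction sketched above (the usual circuit-QED assumption; the abstract itself does not write the model out) is a Jaynes–Cummings-type Hamiltonian for each flux qubit k, with a flux-tunable coupling g_k(t):

```latex
H_k \;=\; \hbar\omega_c\, a^{\dagger}a
      \;+\; \frac{\hbar\,\omega_{q,k}}{2}\,\sigma_z^{(k)}
      \;+\; \hbar\, g_k(t)\left(a^{\dagger}\sigma_-^{(k)} + a\,\sigma_+^{(k)}\right)
```

Switching g_k(t) on and off via the external fluxes realizes the sequential qubit–mode interactions, with the cavity mode mediating the indirect qubit–qubit coupling used for the entangling gates.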

Statistics and Probability; Coupling; Physics; Superconductivity; Flux qubit; Complex system; Time evolution; Statistical and Nonlinear Physics; Data_CODINGANDINFORMATIONTHEORY; Quantum Physics; Lossy compression; coupling scheme; Topology; Computer Science::Emerging Technologies; Control theory; Qubit; Hardware_ARITHMETICANDLOGICSTRUCTURES; Mathematical Physics; System bus; Open Systems & Information Dynamics

Groundwater biodiversity in Europe

2009

18 pages, 7 figures, 4 tables.

Stygobionts; Ecology; Rarity; Biodiversity; Species diversity; Introduced species; Aquatic Science; Biology; Taxon; Genus; Spatial ecology; Hotspots; Species richness; Endemism; Freshwater Biology

18F-FDG PET imaging of breast cancer : evaluation of the metabolic behaviour of the different breast cancer subtypes and prediction of the tumor resp…

2015

Positron Emission Tomography (PET) with 18F-fluorodeoxyglucose (18F-FDG) is the reference imaging examination for in vivo quantification of the glucose metabolism of tumour cells, and it allows tumour metabolic changes to be monitored during chemotherapy. Breast cancer comprises several distinct genomic entities with different biological characteristics and clinical behaviours, leading to different tailored treatments. The aim of this doctoral thesis was to evaluate the relationship between the different biological entities of breast cancer and the tumour metabolic behaviour during neoadjuvant chemotherapy. We have also retrieved, among the various metabolic parameters on PET images, the …

Subtypes; [SDV.IB] Life Sciences [q-bio]/Bioengineering; FDG; Positron Emission Tomography; Tumor response; Neoadjuvant chemotherapy; Phenotypes; PET; Breast cancer; [SDV.CAN] Life Sciences [q-bio]/Cancer; Evaluation; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing

Development of antimigraine transdermal delivery systems of pizotifen malate.

2015

The aim of this study was to develop and evaluate a transdermal delivery system of pizotifen malate. Pizotifen is frequently used in the preventive treatment of migraine, but is also indicated in eating disorders. In the course of the project, the effects of chemical enhancers such as ethanol, 1,8-cineole, limonene, azone and different fatty acids (decanoic, decenoic, dodecanoic, linoleic and oleic acids) were determined, first using a pizotifen solution. Steady state flux, diffusion and partition parameters were estimated by fitting the Scheuplein equation to the data obtained. Among the chemical enhancers studied, decenoic acid showed the highest enhancement activity, which seeme…
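The fitting step described above can be sketched numerically. The following is a minimal illustration, not the authors' actual analysis: the parameter values and units are hypothetical, and the steady-state flux and lag time are recovered from the linear late-time portion of a simulated permeation curve, as is standard for the Scheuplein membrane-diffusion model.

```python
import numpy as np

def scheuplein_q(t, p1, p2, n_terms=50):
    """Cumulative amount permeated per unit area in the Scheuplein
    membrane model: p1 = K*L*C0 (partition coeff. * thickness * donor
    concentration), p2 = D/L**2 (diffusivity over thickness squared)."""
    n = np.arange(1, n_terms + 1)[:, None]
    series = ((-1.0) ** n / n**2) * np.exp(-(n**2) * np.pi**2 * p2 * t)
    return p1 * (p2 * t - 1.0 / 6.0 - (2.0 / np.pi**2) * series.sum(axis=0))

# hypothetical permeation profile (arbitrary units, t in hours)
p1_true, p2_true = 120.0, 0.15
t = np.linspace(0.0, 48.0, 200)
q = scheuplein_q(t, p1_true, p2_true)

# steady-state flux and lag time from the linear late-time portion
late = t > 3.0 / p2_true                  # well past the lag phase
slope, intercept = np.polyfit(t[late], q[late], 1)
flux_est = slope                          # J_ss = K*D*C0/L = p1*p2
lag_est = -intercept / slope              # t_lag = L**2/(6*D) = 1/(6*p2)
```

In practice one would fit the full curve (e.g. by nonlinear least squares) rather than only its asymptote, but the asymptote already yields the two parameters reported in such studies.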

Swine; Migraine Disorders; Skin Absorption; Pharmaceutical Science; Absorption (skin); Pizotifen; In Vitro Techniques; Administration, Cutaneous; Drug Delivery Systems; Cyclohexenes; Organic chemistry; Animals; Transdermal; Degree of unsaturation; Pizotyline; Eucalyptol; Iontophoresis; Ethanol; Terpenes; Fatty Acids; Azepines; Analgesics, Non-Narcotic; Cyclohexanols; Oleic acid; Monoterpenes; Decenoic Acid; Azone; Limonene; International Journal of Pharmaceutics

Cost-driven framework for progressive compression of textured meshes

2019

International audience; Recent advances in the digitization of geometry and radiometry routinely generate massive amounts of surface meshes with texture or color attributes. This large amount of data can be compressed using a progressive approach, which provides at decoding time low-complexity levels of detail (LoDs) that are continuously refined until the original model is retrieved. The goal of such a progressive mesh compression algorithm is to improve the overall quality of the transmission for the user by optimizing the rate-distortion trade-off. In this paper, we introduce a novel meaningful measure for the cost of a progressive transmission of a textured mesh by observing that the rate-distor…
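As a toy illustration of the rate-distortion trade-off described above (a hedged sketch with hypothetical numbers, not the cost measure the paper introduces), one can greedily order candidate geometry and texture refinement batches by distortion reduction per bit:

```python
from dataclasses import dataclass

@dataclass
class Refinement:
    """A candidate LoD refinement batch (hypothetical numbers below)."""
    name: str
    bits: int        # rate cost of transmitting the batch
    d_gain: float    # distortion reduction once it is decoded

def schedule(batches):
    """Greedy multiplexing of geometry/texture batches: send the batch
    with the best distortion-reduction-per-bit ratio first."""
    return sorted(batches, key=lambda r: r.d_gain / r.bits, reverse=True)

batches = [
    Refinement("geometry LoD 1", bits=1000, d_gain=5.0),
    Refinement("texture LoD 1", bits=4000, d_gain=8.0),
    Refinement("geometry LoD 2", bits=2000, d_gain=6.0),
]
order = [r.name for r in schedule(batches)]
```

The greedy ratio rule is only a proxy; an actual codec must also respect dependencies between LoDs (a finer level cannot be decoded before its coarser parent).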

Texture atlas; Decimation; adaptive quantization; multiplexing; Computer science; Geometry compression; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Inverse; surface meshes; Data_CODINGANDINFORMATIONTHEORY; textures; progressive vs single-rate; [INFO.INFO-CG] Computer Science [cs]/Computational Geometry [cs.CG]; CCS CONCEPTS • Computing methodologies → Computer graphics; Polygon mesh; Quantization (image processing); Algorithm; Decoding methods; Data compression; ComputingMethodologies_COMPUTERGRAPHICS

Boosting Textual Compression in Optimal Linear Time

2005

We provide a general boosting technique for Textual Data Compression. Qualitatively, it takes a good compression algorithm and turns it into an algorithm with a better compression performance guarantee. It displays the following remarkable properties: (a) it can turn any memoryless compressor into a compression algorithm that uses the “best possible” contexts; (b) it is very simple and optimal in terms of time; and (c) it admits a decompression algorithm again optimal in time. To the best of our knowledge, this is the first boosting technique displaying these properties. Technically, our boosting technique builds upon three main ingredients: the Burrows--Wheeler Transform, the Suffix Tree d…
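The first ingredient can be illustrated concretely. The sketch below shows a naive Burrows--Wheeler transform plus a move-to-front pass, the classic front end such boosters build on; the boosting technique itself additionally partitions the BWT output into optimal contexts via the suffix tree, which is not reproduced here.

```python
def bwt(s):
    """Burrows-Wheeler transform via sorted rotations (naive
    O(n^2 log n); real implementations use suffix arrays)."""
    s = s + "\x00"                                   # unique end sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def mtf(s):
    """Move-to-front: turns the local symbol clustering produced by
    the BWT into runs of small integers, easy to compress order-0."""
    alphabet = sorted(set(s))
    out = []
    for c in s:
        i = alphabet.index(c)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))
    return out
```

For example, `bwt("banana")` groups the repeated symbols together, and a memoryless (order-0) coder applied to the MTF output then benefits from that clustering.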

Theoretical computer science; Burrows–Wheeler transform; Suffix tree; String (computer science); Data_CODINGANDINFORMATIONTHEORY; Substring; Arithmetic coding; Lempel-Ziv compressors; Artificial Intelligence; Hardware and Architecture; Control and Systems Engineering; text compression; empirical entropy; Greedy algorithm; Time complexity; Algorithm; Software; Information Systems; Mathematics; Data compression

On parsing optimality for dictionary-based text compression—the Zip case

2013

Dictionary-based compression schemes have been the most commonly used data compression schemes since they appeared in the foundational 1977 paper of Ziv and Lempel, and they are generally referred to as LZ77. Their work is the basis of Zip, gZip, 7-Zip and many other compression software utilities. Some of these compression schemes use variants of the greedy approach to parse the text into dictionary phrases; others have abandoned the greedy approach to improve the compression ratio. Recently, two bit-optimal parsing algorithms have been presented, filling the gap between theory and best practice. We present a survey on the parsing problem for dictionary-based text compression, identifying noticeable results …
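A minimal greedy LZ77 parser (a didactic sketch, not the tuned parsers inside Zip or 7-Zip) makes the greedy-vs-optimal distinction concrete: at each position it always takes the longest match into the sliding window, even when a shorter match would lead to a cheaper parse overall.

```python
def lz77_greedy(text, window=4096):
    """Greedy LZ77 parse: at each position emit the longest match
    (offset, length) into the sliding window, else a literal."""
    i, out = 0, []
    while i < len(text):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            k = 0
            # matches may overlap the lookahead, as in classic LZ77
            while i + k < len(text) and text[j + k] == text[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= 3:               # typical minimum match length
            out.append(("match", best_off, best_len))
            i += best_len
        else:
            out.append(("lit", text[i]))
            i += 1
    return out
```

On `"abcabcabc"` this emits three literals followed by a single overlapping match of length 6 at offset 3; bit-optimal parsers instead weigh the encoded size of each candidate token before choosing.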

Theoretical computer science; Computer science; Data_CODINGANDINFORMATIONTHEORY; Top-down parsing; Theoretical Computer Science; Parsing optimality; Compression (functional analysis); Discrete Mathematics and Combinatorics; Lossless compression; Parsing; LZ77 algorithm; Settore INF/01 - Informatica; Deflate algorithm; Dictionary-based text compression; Computational Theory and Mathematics; DEFLATE; Compression ratio; Artificial intelligence; Natural language processing; Bottom-up parsing; Data compression; Journal of Discrete Algorithms

Dictionary-symbolwise flexible parsing

2012

Linear-time optimal parsing algorithms are rare in the dictionary-based branch of data compression theory. A recent result is the Flexible Parsing algorithm of Matias and Sahinalp (1999), which works when the dictionary is prefix-closed and the encoding of dictionary pointers has a constant cost. We present the Dictionary-Symbolwise Flexible Parsing algorithm, which is optimal for prefix-closed dictionaries and any symbolwise compressor under some natural hypotheses. In the case of LZ78-like algorithms with variable costs and any (as usual, linear) symbolwise compressor, we show how to implement our parsing algorithm in linear time. In the case of LZ77-like dictionaries and any symbol…
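The standard view this line of work builds on is parsing as a shortest path on a DAG whose nodes are text positions and whose edges are dictionary phrases or single literals, weighted by their encoding cost. A minimal sketch with hypothetical, fixed costs (real schemes have variable, pointer-dependent costs):

```python
def optimal_parse_cost(text, dictionary, word_cost, literal_cost=9):
    """Minimum-cost parse of `text` into dictionary phrases and literals.
    Nodes 0..n are text positions (a DAG already in topological order);
    dist[i] is the cheapest encoding of text[:i] in bits."""
    n = len(text)
    dist = [float("inf")] * (n + 1)
    dist[0] = 0
    for i in range(n):
        if dist[i] == float("inf"):
            continue
        dist[i + 1] = min(dist[i + 1], dist[i] + literal_cost)  # literal edge
        for w in dictionary:                                    # phrase edges
            if text.startswith(w, i):
                j = i + len(w)
                dist[j] = min(dist[j], dist[i] + word_cost)
    return dist[n]
```

With dictionary `{"abab", "ab"}` and a 12-bit phrase cost, the text `"ababab"` parses for 24 bits ("abab" + "ab") rather than 36 bits (three copies of "ab"); the cited results show how to traverse such graphs in linear time.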

Theoretical computer science; Computer science; [INFO.INFO-DS] Computer Science [cs]/Data Structures and Algorithms [cs.DS]; Data_CODINGANDINFORMATIONTHEORY; Directed acyclic graph; Theoretical Computer Science; Constant (computer programming); Encoding (memory); Optimal parsing; Discrete Mathematics and Combinatorics; Stringology; Symbolwise text compression; Time complexity; Lossless compression; Parsing; Settore INF/01 - Informatica; Dictionary-based compression; Optimal Parsing Lossless Data Compression DAG; Prefix; Computational Theory and Mathematics; Text compression; Algorithm; Bottom-up parsing; Data compression; Journal of Discrete Algorithms

Text Compression Using Antidictionaries

1999

International audience; We give a new text compression scheme based on Forbidden Words ("antidictionary"). We prove that our algorithms attain the entropy for balanced binary sources. They run in linear time. Moreover, one of the main advantages of this approach is that it produces very fast decompressors. A second advantage is a synchronization property that is helpful to search compressed data and allows parallel compression. Our algorithms can also be presented as "compilers" that create compressors dedicated to any previously fixed source. The techniques used in this paper are from Information Theory and Finite Automata.
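The core erasure idea can be sketched in a few lines (a simplified illustration assuming a consistent antidictionary over the binary alphabet; the real scheme compiles the forbidden words into a finite automaton for speed): whenever the text read so far ends with a forbidden word minus its last bit, that last bit cannot occur, so the actual next bit is forced and need not be transmitted.

```python
def dca_compress(text, antidict):
    """Erase every bit that a forbidden word makes predictable."""
    out = []
    for i, bit in enumerate(text):
        # if some forbidden word w = u + b has u ending the prefix read
        # so far, bit b cannot occur: the actual bit is forced, erase it
        if any(text[:i].endswith(w[:-1]) for w in antidict):
            continue
        out.append(bit)
    return "".join(out)

def dca_decompress(comp, antidict, n):
    """Rebuild n bits, re-inserting the forced (erased) bits."""
    text, j = "", 0
    while len(text) < n:
        forced = next((("1" if w[-1] == "0" else "0")
                       for w in antidict if text.endswith(w[:-1])), None)
        if forced is not None:
            text += forced
        else:
            text += comp[j]
            j += 1
    return text
```

For instance, with the single forbidden word "11", every bit following a "1" is forced to be "0" and is erased; the decompressor re-inserts it using the same rule, which is also what gives the scheme its synchronization property.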

Theoretical computer science; Finite-state machine; Computer science; [INFO.INFO-DS] Computer Science [cs]/Data Structures and Algorithms [cs.DS]; forbidden word; Data_CODINGANDINFORMATIONTHEORY; Information theory; finite automaton; Parallel compression; pattern matching; Entropy (information theory); Time complexity; Algorithm; Data compression

Analog Multiple Description Joint Source-Channel Coding Based on Lattice Scaling

2015

Joint source-channel coding schemes based on analog mappings for point-to-point channels have recently gained attention for their simplicity and low delay. In this paper, these schemes are extended to scenarios with or without side information at the decoders, in order to transmit multiple descriptions of a Gaussian source over independent parallel channels. They are based on a lattice scaling approach together with bandwidth-reduction analog mappings adapted to this multiple-description scenario. The rationale behind lattice scaling is to improve performance through bandwidth expansion. Another important contribution of this paper is the proof of the separation theorem for the communication …
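A much simpler relative of the schemes above (a toy 1:1 sketch, not the paper's multiple-description lattice construction) already shows the role of power scaling and MMSE decoding: uncoded analog transmission of a Gaussian source over an AWGN channel, which is known to be optimal in this matched 1:1 case.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
sigma2_x, sigma2_n, P = 1.0, 0.1, 1.0     # source var., noise var., power

x = rng.normal(0.0, np.sqrt(sigma2_x), n)
beta = np.sqrt(P / sigma2_x)              # scale to meet the power constraint
y = beta * x + rng.normal(0.0, np.sqrt(sigma2_n), n)    # AWGN channel

# linear MMSE estimate of x from y (optimal here: everything is Gaussian)
g = beta * sigma2_x / (beta**2 * sigma2_x + sigma2_n)
x_hat = g * y

mse = np.mean((x - x_hat) ** 2)
mse_theory = sigma2_x * sigma2_n / (P + sigma2_n)       # achieved distortion
```

The paper's lattice scaling plays an analogous role to `beta` here, but over structured multi-dimensional mappings rather than a scalar gain.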

Theoretical computer science; Gaussian; Bandwidth (signal processing); Data_CODINGANDINFORMATIONTHEORY; Topology; Additive white Gaussian noise; Bandwidth expansion; Signal Processing; Separation theorem; Electrical and Electronic Engineering; Scaling; Decoding methods; Computer Science::Information Theory; Mathematics; Coding; IEEE Transactions on Signal Processing