Search results for "Crypt"

Showing 10 of 1111 documents

Cloud Computing e protezione dei dati nel web 3.0

2014

Cloud computing is a technology designed for the storage, processing and use of data on remote computers, through which users have nearly unlimited processing power at their disposal, are not required to invest large amounts of capital to meet their needs, and can access their data wherever an Internet connection is available. This paper aims to contribute to a better understanding of the new technology and to a reconstruction of the current (admittedly sparse) legal framework, which can help overcome the main problems the cloud poses for private law, attributable primarily to data protection, to the law of …

Settore IUS/14 - Diritto dell'Unione Europea; Settore IUS/01 - Diritto Privato; cloud; cloud computing; data; data protection; privacy; security; encryption; EU law; European private law; Italian law; comparative law; cyberlaw; ICT law; technology; information and communications technology; law & technology
researchProduct

Coherent Conditional Previsions and Proper Scoring Rules

2012

In this paper we study the relationship between the notion of coherence for conditional prevision assessments on a family of finite conditional random quantities and the notion of admissibility with respect to bounded strictly proper scoring rules. Our work extends recent results given by the last two authors of this paper on the equivalence between coherence and admissibility for conditional probability assessments. In order to prove that admissibility implies coherence a key role is played by the notion of Bregman divergence.
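The "bounded strictly proper scoring rule" the abstract refers to can be illustrated with the quadratic (Brier) rule. The sketch below is not from the paper; it only checks strict propriety numerically for a single binary event, on an illustrative grid of forecasts.

```python
import numpy as np

def brier_score(forecast, outcome):
    """Quadratic (Brier) penalty: bounded in [0, 1] and strictly proper."""
    return (forecast - outcome) ** 2

def expected_score(forecast, true_p):
    """Expected penalty when the event occurs with probability true_p."""
    return true_p * brier_score(forecast, 1) + (1 - true_p) * brier_score(forecast, 0)

true_p = 0.7
grid = np.linspace(0.0, 1.0, 1001)
scores = np.array([expected_score(q, true_p) for q in grid])
# Strict propriety: the expected penalty is uniquely minimised at the true probability.
best = grid[scores.argmin()]
```

The same check is the starting point for the admissibility arguments the abstract mentions: any forecast other than `true_p` is strictly dominated in expectation.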

Settore MAT/06 - Probabilità e Statistica Matematica; Bregman divergence; proper scoring rules; conditional prevision assessments; conditional scoring rules; strong dominance; weak dominance; conditional probability; coherence (statistics); admissibility; bounded function; key (cryptography); equivalence (measure theory); mathematical economics; mathematics
researchProduct

A multidimensional hydrodynamic code for structure evolution in cosmology

1996

A cosmological multidimensional hydrodynamic code is described and tested. The code is based on modern high-resolution shock-capturing techniques. It can use either linear or parabolic cell reconstruction as well as an approximate Riemann solver, and has been specifically designed for cosmological applications. Two tests involving shocks are considered: a standard shock tube and a spherically symmetric shock. Various additional cosmological tests are also presented, demonstrating the performance of the code. The usefulness of the code is discussed; in particular, this powerful tool is expected to be useful in order to study…
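As a hedged illustration of the shock-capturing, Riemann-solver-based updates the abstract describes (the actual code solves the full cosmological hydrodynamic equations, which are not reproduced here), consider a first-order Godunov scheme for 1D linear advection, where the exact interface Riemann solution is simply the upwind state:

```python
import numpy as np

def godunov_advection(u, c, dt, dx, steps):
    """First-order Godunov (upwind) scheme for u_t + c u_x = 0 with c > 0
    on a periodic domain. For linear advection the Riemann solution at each
    cell interface is just the upwind state, so the flux is F = c * u_left."""
    nu = c * dt / dx                       # CFL number; stability needs nu <= 1
    u = u.copy()
    for _ in range(steps):
        u -= nu * (u - np.roll(u, 1))      # conservative upwind update
    return u

nx = 100
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u0 = np.where((x > 0.2) & (x < 0.4), 1.0, 0.0)  # square pulse: two discontinuities
u1 = godunov_advection(u0, c=1.0, dt=0.005, dx=1.0 / nx, steps=100)
```

The discontinuities advect without spurious oscillations (the scheme is monotone), at the cost of numerical diffusion; higher-order reconstruction as in the paper sharpens the fronts.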

Shock wave; Physics; Astrophysics (astro-ph); Structure (category theory); FOS: Physical sciences; Astronomy and Astrophysics; Astrophysics; Cosmology; Riemann solver; Shock (mechanics); Space and Planetary Science; Component (UML); Code (cryptography); Statistical physics; Shock tube
researchProduct

A Spatial-Temporal Correlation Approach for Data Reduction in Cluster-Based Sensor Networks

2019

International audience; In resource-constrained Wireless Sensor Networks (WSNs), optimizing the sampling and transmission rates of each individual node is a crucial issue. A high volume of redundant data transmitted through the network results in collisions, data loss, and energy dissipation. This paper proposes a novel data reduction scheme that exploits the spatial-temporal correlation among sensor data to determine the optimal sampling strategy for the deployed sensor nodes. This strategy reduces the overall sampling/transmission rates while preserving data quality. Moreover, a back-end reconstruction algorithm is deployed on the workstation (Sink)…
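The paper's full spatial-temporal model is not reproduced in the abstract; a minimal send-on-delta filter sketches the temporal half of the idea, with all names and thresholds hypothetical: a node transmits only when a reading deviates from the last transmitted value, and the sink reconstructs by holding that value.

```python
def send_on_delta(readings, delta):
    """Transmit a reading only when it differs from the last transmitted value
    by more than delta (illustrative temporal filter, not the paper's scheme)."""
    transmitted = []                # (index, value) pairs actually sent
    last = None
    for i, v in enumerate(readings):
        if last is None or abs(v - last) > delta:
            transmitted.append((i, v))
            last = v
    return transmitted

def reconstruct(transmitted, n):
    """Sink-side reconstruction: hold the last received value."""
    out, last, j = [], None, 0
    for i in range(n):
        if j < len(transmitted) and transmitted[j][0] == i:
            last = transmitted[j][1]
            j += 1
        out.append(last)
    return out

readings = [20.0, 20.1, 20.05, 21.0, 21.1, 23.0, 23.05, 23.1]
sent = send_on_delta(readings, delta=0.5)       # 3 of 8 readings transmitted
approx = reconstruct(sent, len(readings))
```

The reconstruction error stays bounded by the threshold, which is the basic quality/rate trade-off the abstract describes.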

Signal Processing (eess.SP); FOS: Computer and information sciences; Adaptive sampling; General Computer Science; Computer science; spatial-temporal correlation; Real-time computing; 02 engineering and technology; Data loss; [INFO.INFO-SE] Computer Science [cs]/Software Engineering [cs.SE]; data reconstruction; QA76; Computer Science - Networking and Internet Architecture; [INFO.INFO-IU] Computer Science [cs]/Ubiquitous Computing; [INFO.INFO-CR] Computer Science [cs]/Cryptography and Security [cs.CR]; FOS: Electrical engineering, electronic engineering, information engineering; 0202 electrical engineering, electronic engineering, information engineering; General Materials Science; Electrical Engineering and Systems Science - Signal Processing; Networking and Internet Architecture (cs.NI); General Engineering; Sampling (statistics); 020206 networking & telecommunications; Reconstruction algorithm; Dissipation; [INFO.INFO-MO] Computer Science [cs]/Modeling and Simulation; Wireless sensor networks; [INFO.INFO-MA] Computer Science [cs]/Multiagent Systems [cs.MA]; data reduction; 020201 artificial intelligence & image processing; [INFO.INFO-ET] Computer Science [cs]/Emerging Technologies [cs.ET]; lcsh:Electrical engineering. Electronics. Nuclear engineering; [INFO.INFO-DC] Computer Science [cs]/Distributed, Parallel and Cluster Computing [cs.DC]; lcsh:TK1-9971; Wireless sensor network; Data reduction
researchProduct

Design of SCMA Codebooks using Differential Evolution

2020

Non-orthogonal multiple access (NOMA) is a promising technology that meets the demands of massive connectivity in future wireless networks. Sparse code multiple access (SCMA) is a popular code-domain NOMA technique. The effectiveness of SCMA comes from (1) multi-dimensional sparse codebooks offering high shaping gain and (2) sophisticated multi-user detection based on the message passing algorithm (MPA). The users' codebooks play the main role in determining the performance of an SCMA system. This paper presents a framework to design the codebooks by taking into account the entire system, including the SCMA encoder and the MPA-based detector. The symbol-error rate (SER) is considered as…
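The SER-based codebook objective is not given in the abstract; a generic DE/rand/1/bin loop (shown here minimizing a toy sphere function, with all parameter values illustrative) sketches the differential evolution machinery such a design framework would drive:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, iters=300, seed=1):
    """Minimal DE/rand/1/bin minimiser (illustrative sketch; the paper
    optimises an SER-based codebook objective, not reproduced here)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)            # one index forced to mutate
            trial = []
            for j in range(dim):
                if j == jr or rng.random() < CR:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)    # clip to the search box
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:                   # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)       # placeholder for the SER objective
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 4)
```

In the paper's setting, each candidate vector would encode codebook points and `f` would be a simulated or analytical SER under MPA detection.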

Signal Processing (eess.SP); FOS: Computer and information sciences; Computer science; Wireless network; Information Theory (cs.IT); Computer Science - Information Theory; 05 social sciences; Message passing; Detector; 050801 communication & media studies; Noma; 0508 media and communications; Computer engineering; Differential evolution; 0502 economics and business; FOS: Electrical engineering, electronic engineering, information engineering; Code (cryptography); 050211 marketing; Minification; Electrical Engineering and Systems Science - Signal Processing; Encoder; 2020 IEEE International Conference on Communications Workshops (ICC Workshops)
researchProduct

Blind multi-user detection by fast fixed point algorithm without prior knowledge of symbol-level timing

2003

We consider the estimation of the source process of the desired user on the downlink of a code-division multiple access (CDMA) communication system. In downlink signal processing, only the code of the mobile telephone user is known, while the codes of the interfering users are unknown. Blind source separation, or independent component analysis, is an approach offering a solution to this problem. In this work we apply the fast fixed point algorithm to the separation problem. The algorithm is based on fourth-order statistics optimization. The symbol-level timing needs to be known only coarsely.
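A one-unit, kurtosis-based fixed-point iteration on whitened mixtures is consistent with the "fast fixed point algorithm based on fourth-order statistics" the abstract names; the sources, mixing matrix, and iteration count below are illustrative assumptions, not the paper's CDMA setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
s1 = np.sign(rng.standard_normal(n))      # BPSK-like source (sub-Gaussian)
s2 = rng.laplace(size=n)                  # super-Gaussian interferer
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # hypothetical mixing matrix
X = A @ S

# Whiten the observations (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit kurtosis-based fixed-point update: w <- E[z (w^T z)^3] - 3w.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(50):
    y = w @ Z
    w = (Z * y ** 3).mean(axis=1) - 3 * w
    w /= np.linalg.norm(w)

y = w @ Z   # recovered source, up to sign and scale
```

The recovered signal matches one of the independent sources up to sign; in the downlink application, the known spreading code would then identify the desired user's component.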

Signal processing; Code division multiple access; Computer science; Real-time computing; Telecommunications link; Computer Science::Networking and Internet Architecture; Code (cryptography); Detection theory; Communications system; Independent component analysis; Blind signal separation; Computer Science::Information Theory; Proceedings of the IEEE Signal Processing Workshop on Higher-Order Statistics. SPW-HOS '99
researchProduct

Optical encryption with compressive ghost imaging

2011

Ghost imaging (GI) is a novel technique in which the optical information of an object is encoded in the correlation of the intensity fluctuations of a light source. Computational GI (CGI) is a variant of the standard procedure that uses a single bucket detector. Recently, we proposed to use CGI to encrypt and transmit the object information to a remote party [1]. The optical encryption scheme shows compressibility and robustness to eavesdropping attacks. However, the reconstruction algorithm yields relatively low-quality images and requires long acquisition times. A procedure to overcome these limitations is to combine CGI with compressive sampling (CS), an advanced signal processing theory that expl…
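The compressive-sampling reconstruction the abstract proposes is not reproduced here; the sketch below simulates only conventional CGI, recovering an assumed binary object from single-pixel bucket values via the standard intensity correlation G = &lt;IB&gt; - &lt;I&gt;&lt;B&gt;:

```python
import numpy as np

rng = np.random.default_rng(42)
side = 8
obj = np.zeros((side, side))
obj[2:6, 3:5] = 1.0                        # hypothetical binary object
T = obj.ravel()

m = 4000                                   # number of computed speckle patterns
patterns = rng.random((m, side * side))    # known illumination patterns I
bucket = patterns @ T                      # bucket-detector values B (no spatial info)

# Conventional CGI estimate, per pixel: G = <I * B> - <I><B>.
G = (patterns * bucket[:, None]).mean(axis=0) - patterns.mean(axis=0) * bucket.mean()
img = G.reshape(side, side)
```

Only the pattern seed and the bucket values need to be shared, which is the basis of the encryption scheme; CS replaces this correlation estimate with a sparse reconstruction to cut the number of acquisitions `m`.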

Signal processing; Light intensity; Compressed sensing; Computer science; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Image processing; Reconstruction algorithm; Iterative reconstruction; Ghost imaging; Encryption; Algorithm; 2011 Conference on Lasers and Electro-Optics Europe and 12th European Quantum Electronics Conference (CLEO EUROPE/EQEC)
researchProduct

Progressive transmission of secured images with authentication using decompositions into monovariate functions

2014

International audience; We propose a progressive transmission approach for an image authenticated using an overlapping subimage that can be removed to restore the original image. Our approach differs from most visible watermarking approaches that allow one to later remove the watermark, because the mark is not directly introduced in the two-dimensional image space. Instead, it is applied to an equivalent monovariate representation of the image. Precisely, the approach is based on our progressive transmission approach that relies on a modified Kolmogorov spline network, and therefore inherits its advantages: resilience to packet losses during transmission and support of hetero…

Signal processing; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; 02 engineering and technology; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image Processing; 01 natural sciences; Image encryption; 010309 optics; 0103 physical sciences; 0202 electrical engineering, electronic engineering, information engineering; Computer vision; Electrical and Electronic Engineering; Digital watermarking; Image resolution; Mathematics; Authentication; Network packet; Watermark; Atomic and Molecular Physics and Optics; Computer Science Applications; Spline (mathematics); Binary data; Kolmogorov superposition theorem; 020201 artificial intelligence & image processing; Artificial intelligence; Visible watermarking
researchProduct

A cryptochrome-based photosensory system in the siliceous sponge Suberites domuncula (Demospongiae)

2010

Based on the light-reactive behavior of siliceous sponges, their intriguing quartz glass-based spicular system and the existence of a light-generating luciferase [Muller WEG et al. (2009) Cell Mol Life Sci 66, 537–552], a protein potentially involved in light reception has been identified, cloned and recombinantly expressed from the demosponge Suberites domuncula. Its sequence displays two domains characteristic of cryptochrome, the N-terminal photolyase-related region and the C-terminal FAD-binding domain. The expression level of S. domuncula cryptochrome depends on the animal's exposure to light and is highest in tissue regions rich in siliceous spicules; in the dark, no cryptochrome transcri…

Siliceous sponge; Biology; Cell Biology; Anatomy; Biochemistry; Cell biology; Suberites domuncula; Demosponge; Sponge spicule; Light source; Cryptochrome; Luciferase; Molecular Biology; FEBS Journal
researchProduct

unitas: the universal tool for annotation of small RNAs

2017

Background: Next generation sequencing is a key technique in small RNA biology research that has led to the discovery of functionally different classes of small non-coding RNAs in the past years. However, reliable annotation of the extensive amounts of small non-coding RNA data produced by high-throughput sequencing is time-consuming and requires robust bioinformatics expertise. Moreover, existing tools have a number of shortcomings, including a lack of sensitivity under certain conditions and a limited number of supported species or detectable sub-classes of small RNAs. Results: Here we introduce unitas, an out-of-the-box-ready software for complete annotation of small RNA sequence datasets, …

Small RNA; tRNA-derived fragments (tRFs); Computational biology; piRNA; Biology; DNA sequencing; 570 Life sciences; Annotation; Ensembl; Humans; RNA-seq data analysis; miRNA; Genetics; phasiRNA; RNA; High-Throughput Nucleotide Sequencing; Usability; Molecular Sequence Annotation; Non-coding RNA; Key (cryptography); RNA, Small Untranslated; Small non-coding RNAs; Software; HeLa Cells; 570 Biowissenschaften
researchProduct