Search results for "Scalability"

Showing 10 of 221 documents

GROMEX: A Scalable and Versatile Fast Multipole Method for Biomolecular Simulation

2020

Atomistic simulations of large biomolecular systems with chemical variability, such as constant-pH dynamic protonation, pose multiple challenges in high performance computing. One of them is the correct treatment of the involved electrostatics in an efficient and highly scalable way. Here we review and assess two of the main building blocks that will permit such simulations: (1) an electrostatics library based on the Fast Multipole Method (FMM) that treats local alternative charge distributions with minimal overhead, and (2) a λ-dynamics module working in tandem with the FMM that enables various types of chemical transitions during the simulation. Our λ-dynamics and FMM implementations d…
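
As a rough illustration of the λ-dynamics idea summarized above, here is a minimal Python sketch of interpolating the partial charges of two protonation states with a coupling parameter λ; the function name and charge values are illustrative assumptions, not GROMEX code.

```python
import numpy as np

def interpolated_charges(q_state_a, q_state_b, lam):
    """Linearly mix the partial charges of two protonation states.

    lam = 0 recovers state A, lam = 1 recovers state B; intermediate
    values give the alchemical mixture that lambda-dynamics propagates.
    """
    q_state_a = np.asarray(q_state_a, dtype=float)
    q_state_b = np.asarray(q_state_b, dtype=float)
    return (1.0 - lam) * q_state_a + lam * q_state_b

# Example: a titratable site with three partial charges per state (made-up values).
q_protonated   = [0.35, -0.55, 0.20]
q_deprotonated = [0.10, -0.80, -0.30]
print(interpolated_charges(q_protonated, q_deprotonated, lam=0.25))
```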

Computer science; Fast multipole method; Fast Fourier transform; Supercomputer; Electrostatics; Biomolecules; Computational science; Molecular dynamics; CUDA; Particle mesh; Scalability; Overhead (computing); Simulation; SIMD

A new Adaptive and Progressive Image Transmission Approach using Function Superpositions

2010

We present a novel approach to adaptive and progressive image transmission, based on the decomposition of an image into compositions and superpositions of monovariate functions. The monovariate functions are iteratively constructed and transmitted, one after the other, to progressively reconstruct the original image: the progressive transmission is performed directly in the 1D space of the monovariate functions and independently of any statistical properties of the image. Each monovariate function contains only a fraction of the pixels of the image, and each newly transmitted monovariate function adds data to the previously transmitted ones. After each tra…
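
The monovariate-function construction itself is not reproduced here, but the toy Python sketch below illustrates the progressive-refinement idea from the abstract: each transmitted 1D signal carries a fraction of the pixels, and the reconstruction improves as more of them arrive. The interleaving scheme and all names are illustrative assumptions, not the paper's Kolmogorov-style decomposition.

```python
import numpy as np

def split_into_monovariate_parts(image, n_parts):
    """Toy decomposition: interleave the flattened pixels into n_parts 1D signals."""
    flat = image.ravel()
    return [(slice(k, None, n_parts), flat[k::n_parts]) for k in range(n_parts)]

def progressive_reconstruct(shape, received_parts):
    """Rebuild an approximation from however many 1D parts have arrived so far."""
    flat = np.zeros(np.prod(shape))
    mask = np.zeros(np.prod(shape), dtype=bool)
    for sl, values in received_parts:
        flat[sl] = values
        mask[sl] = True
    if mask.any():
        flat[~mask] = flat[mask].mean()   # crude fill for not-yet-received pixels
    return flat.reshape(shape)

image = np.random.rand(8, 8)
parts = split_into_monovariate_parts(image, n_parts=4)
for k in range(1, 5):                     # error shrinks with each transmitted part
    approx = progressive_reconstruct(image.shape, parts[:k])
    print(k, np.abs(image - approx).mean())
```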

Computer science; Image quality; Image processing and computer vision; Iterative reconstruction; Multidimensional function decomposition; Superposition principle; Robustness (computer science); Computer vision; Signal processing; Spatial scalability; Image resolution; Image restoration; Pixel; Progressive image transmission; Networking & telecommunications; Atomic and molecular physics and optics; Functional representation; Kolmogorov superposition theorem; Artificial intelligence; Tomography; Digital filter; Algorithm; Image compression

Embedded Processing and Compression of 3D Sensor Data for Large Scale Industrial Environments

2019

This paper presents a scalable embedded solution for processing and transferring 3D point cloud data. Sensors based on the time-of-flight principle generate data which are processed on a local embedded computer and compressed using an octree-based scheme. The compressed data are transferred to a central node, where the individual point clouds from several nodes are decompressed and filtered using a novel method for generating intensity values for sensors that do not natively produce such a value. The paper presents experimental results from a relatively large industrial robot cell with an approximate size of 10 m ×…
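
A minimal sketch of the octree idea mentioned above, assuming points are quantized to a fixed-depth grid and only occupied leaves are kept; the Morton-code encoding and all parameters are illustrative, not the paper's serialization format.

```python
import numpy as np

def octree_occupancy_codes(points, depth, origin, size):
    """Encode a point cloud as occupied leaves of a 2^depth octree grid (lossy)."""
    cells = np.floor((points - origin) / size * (2 ** depth)).astype(np.int64)
    cells = np.clip(cells, 0, 2 ** depth - 1)
    # Interleave x/y/z bits into a Morton code so each leaf gets one integer key.
    keys = np.zeros(len(cells), dtype=np.int64)
    for bit in range(depth):
        keys |= ((cells[:, 0] >> bit) & 1) << (3 * bit)
        keys |= ((cells[:, 1] >> bit) & 1) << (3 * bit + 1)
        keys |= ((cells[:, 2] >> bit) & 1) << (3 * bit + 2)
    return np.unique(keys)          # points falling in the same leaf are merged

points = np.random.rand(10_000, 3) * 10.0          # e.g. a ~10 m robot cell
codes = octree_occupancy_codes(points, depth=8, origin=np.zeros(3), size=10.0)
print(len(points), "points ->", len(codes), "occupied leaves")
```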

Computer science; Point cloud; Time-of-flight; Computational science; Industrial robot; Octree; Denoising; Electrical and electronic engineering; Instrumentation; Lidar; Scalability; Local area network; Networking & telecommunications; Software engineering; 3D sensors; Compression; Atomic and molecular physics and optics; Sensors (Basel, Switzerland)

Multimode entanglement in reconfigurable graph states using optical frequency combs

2017

Multimode entanglement is an essential resource for quantum information processing and quantum metrology. However, multimode entangled states are generally constructed by targeting a specific graph configuration, which leads to a fixed experimental setup with reduced versatility and scalability. Here we demonstrate an on-demand, reconfigurable optical multimode entangled state, using an intrinsically multimode quantum resource and a homodyne detection apparatus. Without altering either the initial squeezing source or the experimental architecture, we realize the construction of thirteen cluster states of various sizes and connectivities as well as the implementation of a secr…

Computer science; Science; General physics and astronomy; Quantum entanglement; Topology; Homodyne detection; Quantum metrology; Quantum; Multimode optical fiber; Quantum physics; One-way quantum computer; Nanoscience & nanotechnology; Scalability; Graph (abstract data type); Nature Communications

A Comparison Study of Metaheuristic Techniques for Providing QoS to Avatars in DVE Systems

2004

Network-server architecture has become a de facto standard for Distributed Virtual Environment (DVE) systems, in which a large set of remote users share a 3D virtual scene. In order to design scalable DVE systems, different approaches have been proposed to keep the system working below its saturation point, maximizing system throughput. Also, in order to provide quality of service to avatars in DVE systems, avatars should be assigned to servers taking into account, among other factors, system throughput and system latency. This highly complex problem is called the quality of service (QoS) problem in DVE systems. This paper proposes two different approaches for solving the QoS…
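
A hedged sketch of the underlying assignment problem: avatars are mapped to servers so that latency stays low and no server exceeds its saturation point. The simple hill-climbing search below stands in for the metaheuristics compared in the paper; all names, weights and data are illustrative.

```python
import random

def assignment_cost(assignment, latency, capacity, alpha=1.0, beta=1.0):
    """Toy QoS cost: total latency plus a penalty for servers pushed past saturation."""
    load = [0] * len(capacity)
    lat = 0.0
    for avatar, server in enumerate(assignment):
        load[server] += 1
        lat += latency[avatar][server]
    overload = sum(max(0, l - c) for l, c in zip(load, capacity))
    return alpha * lat + beta * overload

def local_search(latency, capacity, iters=5000, seed=0):
    """Greedy hill-climbing over single avatar->server moves."""
    rng = random.Random(seed)
    n_avatars, n_servers = len(latency), len(capacity)
    best = [rng.randrange(n_servers) for _ in range(n_avatars)]
    best_cost = assignment_cost(best, latency, capacity)
    for _ in range(iters):
        cand = best[:]
        cand[rng.randrange(n_avatars)] = rng.randrange(n_servers)
        cost = assignment_cost(cand, latency, capacity)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

latency = [[random.random() for _ in range(3)] for _ in range(30)]  # 30 avatars, 3 servers
capacity = [12, 12, 12]
print(local_search(latency, capacity)[1])
```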

Computer science; Virtual machine; Quality of service; Server; Distributed computing; Scalability; Heuristics; Metaheuristic

PV-Alert: A fog-based architecture for safeguarding vulnerable road users

2017

Pedestrians, cyclists and other vulnerable road users (VRUs) have much higher casualty rates per mile, which is not surprising given their lack of protection in an accident. To alleviate the problem, the sensing capabilities of smartphones can be used to detect, warn and safeguard these road users. In this research we propose an infrastructure-less, fog-based architecture named PV-Alert (Pedestrian-Vehicle Alert) in which fog nodes process delay-sensitive data obtained from smartphones to alert pedestrians and drivers before sending the data to the cloud for further analysis. Fog computing is considered in developing the architecture since it is an emer…
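
As a hedged illustration of the kind of delay-sensitive check a fog node might run on smartphone position reports, the sketch below estimates when a vehicle first comes within a warning radius of a pedestrian; it is not PV-Alert's actual algorithm, and all names and values are made up.

```python
import math

def time_to_conflict(ped, veh, threshold_m=3.0):
    """Time at which the vehicle first comes within threshold_m of the pedestrian.

    ped/veh are dicts with 'pos' (x, y in metres) and 'vel' (vx, vy in m/s), as a
    fog node might receive from smartphone GPS/IMU reports. Returns None if the
    two never get that close under constant velocities.
    """
    rx, ry = veh["pos"][0] - ped["pos"][0], veh["pos"][1] - ped["pos"][1]
    vx, vy = veh["vel"][0] - ped["vel"][0], veh["vel"][1] - ped["vel"][1]
    a = vx * vx + vy * vy
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - threshold_m ** 2
    if a == 0:
        return 0.0 if c <= 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else (0.0 if c <= 0 else None)

pedestrian = {"pos": (0.0, 0.0), "vel": (1.0, 0.0)}
vehicle = {"pos": (40.0, 1.0), "vel": (-12.0, 0.0)}
t = time_to_conflict(pedestrian, vehicle)
print("alert both parties" if t is not None and t < 4.0 else "no alert", t)
```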

Computer science; Reliability (computer networking); Mobile computing; Latency; Traffic engineering computing; Cloud computing; Fog computing; Computer security; Road vehicles; Low latency; Pedestrian safety; Wireless; Computer architecture; Wireless LAN; Edge computing; Location awareness; Road traffic; Networking & telecommunications; Vehicles; Smartphones; Vulnerable road users; Roads; Road accidents; Crowd sensing; Accidents; Scalability; Mobile radio; Safety; Road safety

A survey on data center network topologies

2018

Data centers are the infrastructures that support cloud computing services, so their topologies play an important role in the performance of these services. Designing an efficient topology with high scalability and good network performance is one of the most important challenges in data centers. This paper surveys recent research advances related to data center network topologies. We review some representative topologies and discuss their properties in detail, comparing them in terms of average path length, network fault tolerance, scalability and connection pattern techniques.
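
One of the comparison metrics mentioned above, average path length, is straightforward to compute; the sketch below does so by BFS over a toy adjacency list. The example topology (four servers dual-homed to two switches) is illustrative and not one of the surveyed designs.

```python
from collections import deque

def average_path_length(adj):
    """Mean shortest-path hop count over all ordered node pairs (BFS per source)."""
    nodes = list(adj)
    total, pairs = 0, 0
    for src in nodes:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs

adj = {
    "s1": ["h1", "h2", "h3", "h4"],
    "s2": ["h1", "h2", "h3", "h4"],
    "h1": ["s1", "s2"], "h2": ["s1", "s2"],
    "h3": ["s1", "s2"], "h4": ["s1", "s2"],
}
print(round(average_path_length(adj), 3))
```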

Computer science; Cloud computing services; Data center topology; Fault tolerance; Cloud computing; Topology (electrical circuits); Network topology; Average path length; Network performance; Scalability; Data center; Survey; Computer network

PTNet: A parameterizable data center network

2016

This paper presents PTNet, a new data center topology specifically designed to offer high, parameterizable scalability with a single-layer architecture. Despite its high scalability, PTNet also provides reduced latency and high performance in terms of capacity and fault tolerance. Consequently, compared to widely known data center networks, the new topology shows better capacity, robustness and cost-effectiveness with lower power consumption. Experiments and theoretical analyses illustrate the performance of the proposed system.

Computer science; Distributed computing; Logical topology; Power consumption; Networking & telecommunications; Fault tolerance; Network topology; Robustness (computer science); Information systems; Server; Cost; Scalability; Data center; Latency; 2016 IEEE Wireless Communications and Networking Conference

DSMAV: An improved solution for multi-attribute search based on load capacities

2016

DHTs (Distributed Hash Tables) such as Chord or Pastry facilitate information searching in scalable systems. Two popular DHT-based approaches for range or multi-attribute search rely either on an attribute-value tree or on a combination of attributes and values. However, the tradeoff between load balancing and query efficiency remains a challenge for such information searching systems. In this paper, we propose improved algorithms for a system called DSMAV in which information resources are distributed fairly among nodes and located by multi-attribute queries within a small number of hops. Our system creates identifiers from resource names, each of which is a combination of attrib…
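
A minimal sketch, assuming a Chord-like identifier ring, of how one identifier can be derived from a combination of attribute/value pairs and resolved to the responsible node; the hashing and encoding details are illustrative, not DSMAV's exact scheme.

```python
import hashlib
from bisect import bisect_left

RING_BITS = 16  # tiny ring for illustration; Chord typically uses 160-bit identifiers

def ring_id(text):
    """Hash a string onto the identifier ring (SHA-1 truncated to RING_BITS)."""
    return int(hashlib.sha1(text.encode()).hexdigest(), 16) % (2 ** RING_BITS)

def resource_id(attributes):
    """Build one identifier from a sorted attribute=value combination."""
    key = "&".join(f"{a}={v}" for a, v in sorted(attributes.items()))
    return ring_id(key)

def successor(node_ids, key):
    """Chord-style lookup: the first node clockwise from the key owns it."""
    node_ids = sorted(node_ids)
    i = bisect_left(node_ids, key)
    return node_ids[i % len(node_ids)]

nodes = [ring_id(f"node-{k}") for k in range(8)]
rid = resource_id({"cpu": "4", "os": "linux", "ram": "16GB"})
print("resource", rid, "-> node", successor(nodes, rid))
```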

Computer science; Distributed computing; Scalability; Networking & telecommunications; Artificial intelligence & image processing; Load balancing (computing); Chord (peer-to-peer); Computer network; Distributed hash table; 2016 IEEE Sixth International Conference on Communications and Electronics (ICCE)

Quality-preserving low-cost probabilistic 3D denoising with applications to Computed Tomography

2021

We propose a pipeline for the synthetic generation of personalized Computed Tomography (CT) images, with a radiation exposure evaluation and a lifetime attributable risk (LAR) assessment. We perform a patient-specific performance evaluation for a broad range of denoising algorithms (including the most popular deep learning denoising approaches, wavelet-based methods, methods based on Mumford-Shah denoising, etc.), focusing both on assessing the capability to reduce the patient-specific CT-induced LAR and on computational cost scalability. We introduce a parallel probabilistic Mumford-Shah denoising model (PMS), showing that it markedly outperforms the compared common denoising methods…
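
As a hedged illustration of the per-case denoising evaluation described above (and assuming SciPy is available), the sketch below corrupts a synthetic 3D volume with Gaussian noise and scores a simple Gaussian-filter baseline by PSNR; it does not implement the paper's PMS model, and all values are made up.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def psnr(reference, estimate):
    """Peak signal-to-noise ratio, a standard denoising quality metric."""
    mse = np.mean((reference - estimate) ** 2)
    peak = reference.max() - reference.min()
    return 10.0 * np.log10(peak ** 2 / mse)

# Synthetic 3D "volume": a smooth phantom corrupted with Gaussian noise.
rng = np.random.default_rng(0)
x, y, z = np.mgrid[0:32, 0:32, 0:32]
clean = np.exp(-((x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2) / 200.0)
noisy = clean + rng.normal(0.0, 0.05, clean.shape)

# Score each candidate per-volume, as one would for each patient-specific case.
for name, den in [("noisy", noisy), ("gaussian filter", gaussian_filter(noisy, sigma=1.0))]:
    print(f"{name:16s} PSNR = {psnr(clean, den):.2f} dB")
```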

Computer science; Gaussian; Pipeline (computing); Deep learning; Noise reduction; Probabilistic logic; Pattern recognition; Reduction (complexity); Wavelet; Scalability; Artificial intelligence