Search results for "processing"

Showing 10 of 7,549 documents

Efficient FPGA Implementation of an Adaptive Noise Canceller

2006

A hardware implementation of an adaptive noise canceller (ANC) is presented. It has been synthesized within an FPGA, using a modified version of the least mean square (LMS) error algorithm. The results obtained so far show a significant decrease in the required gate count compared with a standard LMS implementation, while increasing the ANC bandwidth and signal-to-noise (S/N) ratio. This novel adaptive noise canceller is therefore useful for enhancing the S/N ratio of data collected from sensors (or sensor arrays) working in noisy environments or dealing with potentially weak signals.
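The LMS adaptation at the heart of such a canceller can be sketched in a few lines. This is a minimal floating-point software model, not the paper's modified fixed-point FPGA design; the function name and parameters (`mu`, `taps`) are illustrative.

```python
import numpy as np

def lms_noise_canceller(primary, reference, mu=0.01, taps=8):
    """Cancel noise in `primary` that is correlated with `reference`.

    Floating-point LMS sketch; the paper's modified fixed-point FPGA
    variant is not reproduced here.
    """
    w = np.zeros(taps)                    # adaptive filter weights
    out = np.zeros(len(primary))          # error signal = cleaned output
    for n in range(taps - 1, len(primary)):
        x = reference[n - taps + 1 : n + 1][::-1]  # current + past reference samples
        y = w @ x                         # estimate of the noise component
        e = primary[n] - y                # cleaned sample (the LMS "error")
        w = w + 2 * mu * e * x            # steepest-descent weight update
        out[n] = e
    return out
```

The update cost is O(taps) per sample, which is why a reduced-gate-count variant matters for hardware.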

Computer science, Bandwidth (signal processing), Real-time computing, Signal synthesis, Electroencephalography, Bioelectric potentials, Least mean squares filter, Signal-to-noise ratio, Gate count, Error analysis, Electronic engineering, Field-programmable gate array, Evoked potentials, Active noise control

Efficient linear fusion of partial estimators

2018

Abstract Many signal processing applications require performing statistical inference on large datasets, where computational and/or memory restrictions become an issue. In this big data setting, computing an exact global centralized estimator is often either unfeasible or impractical. Hence, several authors have considered distributed inference approaches, where the data are divided among multiple workers (cores, machines or a combination of both). The computations are then performed in parallel and the resulting partial estimators are finally combined to approximate the intractable global estimator. In this paper, we focus on the scenario where no communication exists among the workers, de…
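As a concrete instance of linear fusion in the no-communication setting, per-worker sample means can be combined with weights proportional to each worker's data size; for the sample mean this recovers the centralized estimator exactly. A minimal sketch (the function name is illustrative, and this is not the paper's general fusion rule):

```python
import numpy as np

def fuse_partial_means(chunks):
    """Fuse per-worker sample means into a global estimate, weighting
    each worker by its sample size (a simple linear fusion rule)."""
    means = np.array([np.mean(c) for c in chunks])
    sizes = np.array([len(c) for c in chunks], dtype=float)
    w = sizes / sizes.sum()       # linear fusion weights, summing to 1
    return w @ means              # weighted combination of partial estimators
```

For estimators other than the mean, the optimal weights generally depend on the partial estimators' (co)variances rather than only on sample sizes.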

Computer science, Bayesian probability, Inference, Asymptotic distribution, Signal and Image Processing, Artificial Intelligence, Statistical inference, Fusion rules, Electrical and Electronic Engineering, Minimum mean square error, Applied Mathematics, Constrained optimization, Estimator, Computational Theory and Mathematics, Signal Processing, Computer Vision and Pattern Recognition, Statistics Probability and Uncertainty, Algorithm, Digital Signal Processing

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors combine optimally all of the information about the unknowns in the observations with the information that is present in their …
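The basic adaptive importance sampling loop surveyed here is: draw from a proposal, compute self-normalized importance weights against the unnormalized target, and adapt the proposal from the weighted sample. A minimal one-dimensional Gaussian-proposal sketch; the mean-recentring adaptation rule and all names are illustrative, not a specific scheme from the paper:

```python
import numpy as np

def adaptive_is(log_target, n_iters=20, n_samples=500, mu0=0.0, sigma=1.0, rng=None):
    """Minimal AIS sketch: a Gaussian proposal whose mean is re-centred
    each iteration on the self-normalised IS estimate of the target mean.
    `log_target` is any unnormalised log-density on the real line."""
    rng = np.random.default_rng(rng)
    mu = mu0
    for _ in range(n_iters):
        x = rng.normal(mu, sigma, n_samples)            # draw from the proposal
        log_prop = (-0.5 * ((x - mu) / sigma) ** 2
                    - np.log(sigma * np.sqrt(2 * np.pi)))
        logw = log_target(x) - log_prop                 # log importance weights
        w = np.exp(logw - logw.max())
        w /= w.sum()                                    # self-normalise
        mu = np.sum(w * x)                              # adapt the proposal mean
    return mu
```

Real AIS schemes also adapt the proposal scale (or use mixtures of proposals), which this sketch omits.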

Computer science, Bayesian probability, Posterior probability, Inference, Machine learning, Multidimensional signal processing, Signal and Image Processing, Prior probability, Electrical and Electronic Engineering, Applied Mathematics, Approximate inference, Signal Processing, Probability distribution, Artificial intelligence, Algorithm, Importance sampling

Manulex-infra: Distributional characteristics of grapheme—phoneme mappings, and infralexical and lexical units in child-directed written material

2007

It is well known that the statistical characteristics of a language, such as word frequency or the consistency of the relationships between orthography and phonology, influence literacy acquisition. Accordingly, linguistic databases play a central role by compiling quantitative and objective estimates about the principal variables that affect reading and writing acquisition. We describe a new set of Web-accessible databases of French orthography whose main characteristic is that they are based on frequency analyses of words occurring in reading books used in the elementary school grades. Quantitative estimates were made for several infralexical variables (syllable, grapheme-to-phoneme mappi…
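Frequency estimates of the kind such databases tabulate are, at bottom, normalized corpus counts (commonly reported per million words). A toy sketch, assuming a pre-tokenized corpus; this is not Manulex-infra's actual pipeline:

```python
from collections import Counter

def frequency_table(corpus_tokens):
    """Per-million-word frequency estimates from a tokenized corpus
    (illustrative of what a lexical database tabulates)."""
    counts = Counter(corpus_tokens)
    total = sum(counts.values())
    return {w: c * 1_000_000 / total for w, c in counts.items()}
```

The same counting scheme extends to infralexical units (syllables, bigrams, grapheme-to-phoneme mappings) by tokenizing at that level instead of at the word level.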

Computer science, Bigram, Experimental and Cognitive Psychology, Homophony, Vocabulary, Manuals as Topic, Arts and Humanities (miscellaneous), Phonetics, Reading (process), Developmental and Educational Psychology, Humans, Child, General Psychology, Psycholinguistics, Phonology, Linguistics, Word lists by frequency, Written language, Psychology (miscellaneous), Artificial intelligence, Syllable, Natural language processing, Orthography, Behavior Research Methods

Parallelizing Epistasis Detection in GWAS on FPGA and GPU-Accelerated Computing Systems

2015

This is a post-peer-review, pre-copyedit version of an article published in IEEE/ACM Transactions on Computational Biology and Bioinformatics. The final authenticated version is available online at: http://dx.doi.org/10.1109/TCBB.2015.2389958 [Abstract] High-throughput genotyping technologies (such as SNP arrays) allow the rapid collection of up to a few million genetic markers of an individual. Detecting epistasis (based on 2-SNP interactions) in genome-wide association studies is an important but time-consuming operation, since statistical computations have to be performed for each pair of measured markers. Computational methods to detect epistasis therefore suffer from prohibitively lon…
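The time-consuming kernel is the O(m²) sweep over SNP pairs, building a joint-genotype/phenotype contingency table per pair before any statistic is computed. A naive CPU sketch of that kernel (the accelerated FPGA/GPU versions the paper describes are not reproduced):

```python
import numpy as np
from itertools import combinations

def pairwise_counts(genotypes, phenotype):
    """Build the 9x2 joint-genotype/phenotype contingency table for every
    SNP pair -- the O(m^2) kernel that epistasis detection accelerates.
    `genotypes` is (n_individuals, m_snps) with values {0, 1, 2};
    `phenotype` is 0 (control) or 1 (case) per individual."""
    n, m = genotypes.shape
    tables = {}
    for i, j in combinations(range(m), 2):
        cell = genotypes[:, i] * 3 + genotypes[:, j]   # 9 joint genotypes
        t = np.zeros((9, 2), dtype=int)
        for k in range(9):
            mask = cell == k
            t[k, 0] = np.sum(mask & (phenotype == 0))  # controls
            t[k, 1] = np.sum(mask & (phenotype == 1))  # cases
        tables[(i, j)] = t
    return tables
```

A test statistic (e.g. chi-squared) is then evaluated on each table; with millions of markers the pair count, not the per-table statistic, dominates the runtime.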

Computer science, Bioinformatics, DNA Mutational Analysis, Genome-wide association study, Parallel computing, Single-nucleotide polymorphism, Sensitivity and Specificity, Computational biology, Computer Graphics, Genetics, Computer architecture, Field-programmable gate array, Random access memory, Applied Mathematics, Chromosome Mapping, High-Throughput Nucleotide Sequencing, Reproducibility of Results, Epistasis, Signal Processing (Computer-Assisted), Equipment Design, Computing systems, Reconfigurable computing, Equipment Failure Analysis, Task (computing), Host (network), Graphics processing units, Biotechnology

Real Time Image Rotation Using Dynamic Reconfiguration

2002

Abstract Field-programmable gate array (FPGA) components are widely used nowadays to implement various algorithms, such as digital filtering, in real time. The emergence of dynamically reconfigurable FPGAs has made it possible to reduce the number of resources necessary to carry out an image-processing task (or task chain). In this article, an image-processing application, image rotation, that exploits the FPGA's dynamic reconfiguration method is presented. This paper shows that the choice of an implementation, static or dynamic reconfiguration, depends on the nature of the application. A comparison is carried out between dynamic and static reconfiguration using two criteria: cost and perfo…
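The arithmetic of the rotation itself is inverse coordinate mapping with interpolation; a static hardware datapath would pipeline exactly this per-pixel computation. A nearest-neighbour software sketch (the dynamic-reconfiguration strategy the paper studies is a resource-management layer on top of this, not shown):

```python
import numpy as np

def rotate_nearest(img, theta):
    """Rotate a 2-D image by `theta` radians about its centre using
    inverse mapping with nearest-neighbour sampling."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    ys, xs = np.mgrid[0:h, 0:w]
    c, s = np.cos(theta), np.sin(theta)
    # Inverse rotation: for each output pixel, find its source coordinates.
    sx = c * (xs - cx) + s * (ys - cy) + cx
    sy = -s * (xs - cx) + c * (ys - cy) + cy
    sxi, syi = np.rint(sx).astype(int), np.rint(sy).astype(int)
    valid = (sxi >= 0) & (sxi < w) & (syi >= 0) & (syi < h)
    out = np.zeros_like(img)
    out[valid] = img[syi[valid], sxi[valid]]
    return out
```

Inverse (rather than forward) mapping guarantees every output pixel gets exactly one value, which is why it is the natural formulation for a streaming hardware pipeline.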

Computer science, Block diagram, Control reconfiguration, Image processing, Task (computing), Computer engineering, Signal Processing, Computer Vision and Pattern Recognition, Electrical and Electronic Engineering, Field-programmable gate array, Dynamic method, Real-time operating system, Image restoration, Simulation, Real-Time Imaging

A chirp-z transform-based synchronizer for power system measurements

2005

In the last few years, increased interest in power and voltage quality has forced international working groups to standardize testing and measurement techniques. IEC 61000-4-30, which defines the characteristics of instrumentation for the measurement of power quality, refers to IEC 61000-4-7 for the evaluation of harmonics and interharmonics. This standard, revised in 2002, requires a synchronous sampling of voltage or current signal, in order to limit errors and to ensure reproducible results even in the presence of nonstationary signals. Therefore, an accurate estimation of the fundamental frequency is required, even in the presence of disturbances. In this paper, an algorithm to detect t…
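The appeal of a chirp-z transform here is that it evaluates the spectrum on a narrow band around the nominal mains frequency with arbitrarily fine spacing. Below is a direct O(n·m) evaluation of that zoomed DTFT for clarity (a real CZT computes the same samples efficiently via FFTs); the function names and the simple peak-picking estimator are illustrative, not the paper's algorithm:

```python
import numpy as np

def czt_zoom(x, fs, f_lo, f_hi, m):
    """Evaluate the DTFT of `x` at `m` points on [f_lo, f_hi] Hz -- the
    'zoomed' spectrum a chirp-z transform computes; done directly here."""
    n = len(x)
    freqs = np.linspace(f_lo, f_hi, m)
    t = np.arange(n) / fs
    spectrum = np.exp(-2j * np.pi * np.outer(freqs, t)) @ x
    return freqs, spectrum

def estimate_fundamental(x, fs, f_nom=50.0, span=1.0, m=2001):
    """Estimate the fundamental near a nominal value (e.g. 50 Hz mains)
    by peak-picking the zoomed spectrum."""
    freqs, spec = czt_zoom(x, fs, f_nom - span, f_nom + span, m)
    return freqs[np.argmax(np.abs(spec))]
```

The fine estimate can then drive synchronous sampling of the voltage or current signal, as IEC 61000-4-7 requires.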

Computer science, Bluestein's FFT algorithm, Fast Fourier transform, Chirp-z transform, Power quality, Synchronization, Fundamental frequency, Power (physics), Electric power system, Sampling (signal processing), Synchronizer, Harmonics, Electronic engineering, Electrical and Electronic Engineering, Instrumentation, ING-INF/07 (Electrical and Electronic Measurements), Interpolation

Knowledge acquisition through introspection in Human-Robot Cooperation

2018

Abstract When cooperating with a team including humans, robots have to understand and update semantic information concerning the state of the environment. The run-time evaluation and acquisition of new concepts fall within the scope of critical-mass learning. It is a cognitive skill that enables the robot to show environmental awareness and complete its tasks successfully. A kind of self-consciousness emerges: the robot activates introspective mental processes to infer whether it owns a domain concept or not, and correctly blends the conceptual meaning of new entities. Many works attempt to simulate human brain functions, leading to neural-network implementations of consciousness; regrettably, some of thes…

Computer science, Cognitive Neuroscience, Experimental and Cognitive Psychology, Cognitive agent, Ontology (information science), Human–robot interaction, Artificial Intelligence, Human–computer interaction, Cognitive skill, Teamwork, Introspection, Cognitive architecture, Knowledge acquisition, Robot, Biologically Inspired Cognitive Architectures

2D virtual texture on 3D real object with coded structured light

2008

Augmented reality is used to improve color segmentation on the human body or on precious artifacts that must not be touched. We propose a technique to project a synthesized texture onto a real object without contact. Our technique can be used in medical or archaeological applications. By projecting a suitable set of light patterns onto the surface of a 3D real object and by capturing images with a camera, a large number of correspondences can be found and the 3D points can be reconstructed. We aim to determine these points of correspondence between camera and projector from a scene without explicit points and normals. We then project an adjusted texture onto the real object surface. We propose a global and automat…
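With temporally coded patterns, each projector column is identified by the sequence of bright/dark values a camera pixel observes across the pattern stack. A sketch of decoding a plain binary code (coded structured light commonly uses Gray codes for robustness; the plain-binary simplification here is mine):

```python
import numpy as np

def decode_binary_patterns(captured):
    """Decode per-pixel stripe indices from a stack of captured binary
    pattern images, shape (n_patterns, H, W) with values 0/1, MSB first."""
    n = captured.shape[0]
    weights = 2 ** np.arange(n - 1, -1, -1)      # MSB-first bit weights
    # Weighted sum over the pattern axis gives the stripe index per pixel.
    return np.tensordot(weights, captured, axes=1)
```

The decoded stripe index plus the pixel's camera coordinates yields the camera-projector correspondence from which the 3D point is triangulated.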

Computer science, Color image, Epipolar geometry, Image processing, Image segmentation, Object (computer science), Projector, Computer graphics (images), Augmented reality, Segmentation, Computer vision, Artificial intelligence, Structured light, Image Processing: Machine Vision Applications

A self-adaptable distributed CBR version of the EquiVox system

2016

Three-dimensional (3D) voxel phantoms are numerical representations of human bodies, used by physicians in very different contexts. In the controlled context of hospitals, where from 2 to 10 subjects may arrive per day, phantoms are used to verify computations before therapeutic exposure to radiation of cancerous tumors. In addition, 3D phantoms are used to diagnose the gravity of accidental exposure to radiation. In such cases, there may be from 10 to more than 1000 subjects to be diagnosed simultaneously. In all of these cases, computation accuracy depends on a single such representation. In this paper, we present EquiVox, a tool composed of several distributed functions and enab…
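The retrieval step of a case-based reasoning (CBR) system like this amounts to nearest-neighbour search over stored cases. A minimal sketch, assuming cases described by numeric anthropometric feature vectors; the distributed, self-adaptable aspects of EquiVox are not modeled, and all names are illustrative:

```python
import numpy as np

def retrieve_cases(case_base, query, k=3):
    """Return the k stored cases most similar to `query` by Euclidean
    distance on their feature vectors -- the basic CBR retrieval step."""
    feats = np.array([c["features"] for c in case_base])
    d = np.linalg.norm(feats - np.asarray(query, dtype=float), axis=1)
    return [case_base[i] for i in np.argsort(d)[:k]]
```

In a full CBR cycle the retrieved phantoms would then be adapted (e.g. interpolated) to the subject and the result retained as a new case.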

Computer science, Computation, Biomedical Engineering, Biophysics, Therapeutic exposure, Bioengineering, Context (language use), Software Engineering, Machine learning, Ubiquitous Computing, Cryptography and Security, Voxel, Computer vision, Representation (mathematics), Adaptation (computer science), Multi-agent system, Modeling and Simulation, Multiagent Systems, Key (cryptography), Emerging Technologies, Artificial intelligence, Distributed Parallel and Cluster Computing