Search results for "signal processing"

Showing 10 of 2,451 documents

Simple computation of the approximated modulation transfer function (MTF) using spreadsheet-software: method and evaluation in five maxillofacial CBC…

2019

OBJECTIVES: To develop a simple way to compute the approximated modulation transfer function (MTF) manually using conventional spreadsheet software. METHODS: Based on an edge image, a method was developed that facilitates computation of the edge spread and line spread functions in open-source spreadsheet software (Gnumeric; http://projects.gnome.org/gnumeric/downloads.shtml). By means of the integrated fast Fourier transform, Fourier coefficients are obtained from the line spread function, which can then be plotted against spatial frequency to obtain MTF plots. For the experimental evaluation, an edge test object was exposed in five commercial CBCT devices for maxillofacial applications. RESULTS: …
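The pipeline the abstract describes (edge spread function → line spread function → FFT → MTF) is easy to illustrate outside a spreadsheet. The following sketch uses a synthetic sigmoid edge and illustrative function names, not the paper's phantom data or implementation:

```python
import numpy as np

def mtf_from_edge(esf):
    """Approximate the MTF from a 1-D edge spread function (ESF):
    differentiate to get the line spread function (LSF), then take the
    magnitude of its Fourier transform, normalised at DC."""
    lsf = np.diff(esf)                    # ESF -> LSF
    spectrum = np.abs(np.fft.rfft(lsf))   # Fourier coefficients
    return spectrum / spectrum[0]         # normalise so MTF(0) = 1

# Synthetic edge profile: a smoothed step, standing in for a measured edge
x = np.linspace(-5, 5, 256)
esf = 1.0 / (1.0 + np.exp(-2.0 * x))      # sigmoid edge
mtf = mtf_from_edge(esf)
```

Plotting `mtf` against spatial frequency reproduces the kind of MTF plot the method yields from the spreadsheet's built-in FFT.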

Keywords: Computer science; 030218 nuclear medicine & medical imaging; 03 medical and health sciences; symbols.namesake; 0302 clinical medicine; Technical Report; Simple (abstract algebra); Optical transfer function; Radiography, Dental; Humans; Radiology, Nuclear Medicine and Imaging; General Dentistry; Digital signal processing; Simple computation; business.industry; Spreadsheet software; Phantoms, Imaging; Computer Science::Software Engineering; Reproducibility of Results; 030206 dentistry; General Medicine; Spiral Cone-Beam Computed Tomography; Radiographic Image Enhancement; Fourier transform; Otorhinolaryngology; symbols; business; Algorithm; Software
researchProduct

Efficient FPGA Implementation of an Adaptive Noise Canceller

2006

A hardware implementation of an adaptive noise canceller (ANC) is presented. It has been synthesized within an FPGA, using a modified version of the least mean square (LMS) error algorithm. The results obtained so far show a significant decrease in the required gate count compared with a standard LMS implementation, while increasing the ANC bandwidth and signal-to-noise (S/N) ratio. This novel adaptive noise canceller is therefore useful for enhancing the S/N ratio of data collected from sensors (or sensor arrays) working in noisy environments, or dealing with potentially weak signals.
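The LMS update at the heart of an adaptive noise canceller is compact. This sketch shows the textbook algorithm in Python (not the modified variant the paper synthesizes on the FPGA); the tap count, step size, and signals are illustrative:

```python
import numpy as np

def lms_anc(primary, reference, n_taps=8, mu=0.01):
    """Textbook LMS adaptive noise canceller: the filter learns to predict
    the noise in `primary` from the correlated `reference`; the residual
    error e is the enhanced signal."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # most recent sample first
        y = w @ x                                  # noise estimate
        e = primary[n] - y                         # cleaned sample
        w += 2 * mu * e * x                        # LMS weight update
        out[n] = e
    return out

rng = np.random.default_rng(0)
t = np.arange(4000)
signal = np.sin(2 * np.pi * 0.01 * t)
noise = rng.normal(size=t.size)
primary = signal + noise
cleaned = lms_anc(primary, noise)   # here the reference is the noise itself
```

After the weights converge, the residual `cleaned` tracks `signal` far more closely than `primary` does, which is the S/N improvement the abstract refers to.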

Keywords: Computer science; Bandwidth (signal processing); Real-time computing; Signal synthesis; Electroencephalography; Bioelectric potentials; Least mean squares filter; Signal-to-noise ratio; Gate count; Error analysis; Electronic engineering; Hardware_ARITHMETICANDLOGICSTRUCTURES; Field-programmable gate array; Evoked Potentials; Active noise control
researchProduct

Efficient linear fusion of partial estimators

2018

Abstract Many signal processing applications require performing statistical inference on large datasets, where computational and/or memory restrictions become an issue. In this big data setting, computing an exact global centralized estimator is often either unfeasible or impractical. Hence, several authors have considered distributed inference approaches, where the data are divided among multiple workers (cores, machines or a combination of both). The computations are then performed in parallel and the resulting partial estimators are finally combined to approximate the intractable global estimator. In this paper, we focus on the scenario where no communication exists among the workers, de…
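The no-communication setting can be illustrated with the simplest linear fusion rule: each worker returns a partial estimate, and the fused estimator weights them by sample size (a stand-in for inverse variance). This is a generic sketch, not the fusion rules derived in the paper:

```python
import numpy as np

def fuse_linear(estimates, weights):
    """Linear fusion rule: a weighted combination of partial estimators.
    With weights proportional to each worker's sample size (a proxy for
    inverse variance), the fused mean matches the centralized mean."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return float(np.dot(w, estimates))

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=2.0, size=10_000)
chunks = np.array_split(data, 7)          # 7 workers, no communication
partial = [c.mean() for c in chunks]      # each worker's local estimator
sizes = [len(c) for c in chunks]
fused = fuse_linear(partial, sizes)       # equals data.mean() exactly
```

For the sample mean this recovers the centralized estimator exactly; for nonlinear estimators the fused result only approximates it, which is the gap the paper's fusion rules address.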

Keywords: Computer science; Bayesian probability; Inference; Asymptotic distribution; 02 engineering and technology; 01 natural sciences; 010104 statistics & probability; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Artificial Intelligence; 0202 electrical engineering, electronic engineering, information engineering; Statistical inference; Fusion rules; 0101 mathematics; Electrical and Electronic Engineering; ComputingMilieux_MISCELLANEOUS; Minimum mean square error; Applied Mathematics; Constrained optimization; Estimator; 020206 networking & telecommunications; Computational Theory and Mathematics; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Algorithm; Digital Signal Processing
researchProduct

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors optimally combine all of the information about the unknowns in the observations with the information that is present in their …
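A minimal adaptive importance sampler makes the idea concrete: draw from a proposal, weight samples by target over proposal, and re-fit the proposal to the weighted samples. The Gaussian proposal and moment-matching adaptation below are illustrative choices (a basic population-Monte-Carlo-style scheme), not the article's taxonomy of methods:

```python
import numpy as np

def adaptive_is(log_target, n_iter=20, n_samples=500, seed=0):
    """Minimal adaptive importance sampling: a Gaussian proposal whose
    mean and scale are re-fitted each iteration to the weighted samples."""
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 3.0                       # deliberately poor start
    for _ in range(n_iter):
        x = rng.normal(mu, sigma, n_samples)
        # log importance weights: log target - log proposal
        # (additive constants cancel after normalisation)
        logw = log_target(x) - (-0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        mu = float(np.sum(w * x))                                # adapt mean
        sigma = float(np.sqrt(np.sum(w * (x - mu) ** 2)) + 1e-6)  # adapt scale
    return mu, sigma

# Unnormalised target: Gaussian with mean 4 and unit standard deviation
log_target = lambda x: -0.5 * (x - 4.0) ** 2
mu_hat, sigma_hat = adaptive_is(log_target)
```

After a few iterations the proposal concentrates on the target, so later weight sets are nearly uniform and the estimates stabilize.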

Keywords: Computer science; Bayesian probability; Posterior probability; Inference; 02 engineering and technology; Machine learning; computer.software_genre; 01 natural sciences; 010104 statistics & probability; Multidimensional signal processing; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Prior probability; 0202 electrical engineering, electronic engineering, information engineering; 0101 mathematics; Electrical and Electronic Engineering; ComputingMilieux_MISCELLANEOUS; business.industry; Applied Mathematics; 020206 networking & telecommunications; Approximate inference; Signal Processing; Probability distribution; Artificial intelligence; business; Algorithm; computer; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Importance sampling
researchProduct

Parallelizing Epistasis Detection in GWAS on FPGA and GPU-Accelerated Computing Systems

2015

This is a post-peer-review, pre-copyedit version of an article published in IEEE/ACM Transactions on Computational Biology and Bioinformatics. The final authenticated version is available online at: http://dx.doi.org/10.1109/TCBB.2015.2389958 [Abstract] High-throughput genotyping technologies (such as SNP arrays) allow the rapid collection of up to a few million genetic markers of an individual. Detecting epistasis (based on 2-SNP interactions) in genome-wide association studies is an important but time-consuming operation, since statistical computations have to be performed for each pair of measured markers. Computational methods to detect epistasis therefore suffer from prohibitively lon…
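The per-pair test that makes the exhaustive scan expensive can be sketched directly: for each SNP pair, tabulate the nine joint genotypes against a binary phenotype and score the table. The chi-square scoring and planted interaction below are illustrative; the paper's contribution is the FPGA/GPU parallelization that makes this quadratic loop tractable at GWAS scale:

```python
import numpy as np
from itertools import combinations

def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table."""
    table = table.astype(float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    mask = expected > 0
    return float(((table - expected) ** 2 / np.where(mask, expected, 1))[mask].sum())

def epistasis_scan(genotypes, phenotype):
    """Exhaustive 2-SNP scan: build a 9 x 2 table of joint genotypes
    (each SNP coded 0/1/2) against a binary phenotype and keep the pair
    with the largest chi-square statistic."""
    n_snps = genotypes.shape[1]
    best = (None, -1.0)
    for i, j in combinations(range(n_snps), 2):
        joint = genotypes[:, i] * 3 + genotypes[:, j]   # joint genotype 0..8
        table = np.zeros((9, 2))
        for g, p in zip(joint, phenotype):
            table[g, p] += 1
        stat = chi2_stat(table)
        if stat > best[1]:
            best = ((i, j), stat)
    return best

rng = np.random.default_rng(2)
n, m = 600, 10
geno = rng.integers(0, 3, size=(n, m))
# Plant a pure 2-SNP interaction between SNPs 2 and 5
pheno = ((geno[:, 2] == 2) & (geno[:, 5] == 2)).astype(int)
best_pair, stat = epistasis_scan(geno, pheno)
```

With m SNPs the loop runs m(m-1)/2 times; at a few million markers that is ~10^12 tests, which is exactly why hardware acceleration is needed.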

Keywords: Computer science; Bioinformatics; DNA Mutational Analysis; Genome-wide association study; Parallel computing; Polymorphism, Single Nucleotide; Sensitivity and Specificity; Computational biology; Computer Graphics; Genetics; Computer architecture; Field-programmable gate array; Random access memory; Applied Mathematics; Chromosome Mapping; High-Throughput Nucleotide Sequencing; Reproducibility of Results; Field programmable gate arrays; Epistasis, Genetic; Signal Processing, Computer-Assisted; Equipment Design; Computing systems; Reconfigurable computing; Equipment Failure Analysis; Task (computing); Epistasis; Host (network); Graphics processing units; Biotechnology
researchProduct

Real Time Image Rotation Using Dynamic Reconfiguration

2002

Abstract Field programmable gate array (FPGA) components are widely used nowadays to implement various algorithms, such as digital filtering, in real time. The emergence of dynamically reconfigurable FPGAs has made it possible to reduce the number of resources necessary to carry out an image-processing task (task chain). In this article, an image-processing application, image rotation, that exploits the FPGA's dynamic reconfiguration method is presented. This paper shows that the choice of an implementation, static or dynamic reconfiguration, depends on the nature of the application. A comparison is carried out between the dynamic and the static reconfiguration using two criteria: cost and perfo…
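The image-rotation kernel itself is a per-pixel inverse mapping, which is what makes it amenable to a hardware pipeline. This numpy sketch (nearest-neighbour sampling, illustrative function name) shows the computation, not the FPGA design:

```python
import numpy as np

def rotate_nn(img, theta):
    """Rotate an image by theta radians about its centre using inverse
    mapping with nearest-neighbour sampling: for every output pixel,
    compute which source pixel it comes from."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse rotation of each output coordinate back into the source image
    sx = cos_t * (xs - cx) + sin_t * (ys - cy) + cx
    sy = -sin_t * (xs - cx) + cos_t * (ys - cy) + cy
    sx = np.rint(sx).astype(int)
    sy = np.rint(sy).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out[valid] = img[sy[valid], sx[valid]]
    return out

img = np.zeros((8, 8), dtype=np.uint8)
img[3, :] = 255                   # horizontal bar
rot = rotate_nn(img, np.pi / 2)   # the bar becomes a vertical column
```

Because each output pixel is computed independently from two multiply-accumulates and a memory read, the kernel maps naturally onto a fixed-point FPGA datapath.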

Keywords: Computer science; Block diagram; Control reconfiguration; Image processing; Task (computing); Computer engineering; Signal Processing; Computer Vision and Pattern Recognition; Electrical and Electronic Engineering; Field-programmable gate array; Dynamic method; Real-time operating system; Image restoration; Simulation; Real-Time Imaging
researchProduct

A chirp-z transform-based synchronizer for power system measurements

2005

In the last few years, increased interest in power and voltage quality has forced international working groups to standardize testing and measurement techniques. IEC 61000-4-30, which defines the characteristics of instrumentation for the measurement of power quality, refers to IEC 61000-4-7 for the evaluation of harmonics and interharmonics. This standard, revised in 2002, requires synchronous sampling of the voltage or current signal in order to limit errors and to ensure reproducible results even in the presence of nonstationary signals. Therefore, an accurate estimation of the fundamental frequency is required, even in the presence of disturbances. In this paper, an algorithm to detect t…
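The chirp-z transform is attractive here because it evaluates the spectrum on an arbitrarily fine frequency grid around the nominal fundamental, far finer than the coarse FFT bin spacing. The sketch below computes the equivalent zoomed DFT directly with numpy (the CZT computes the same values efficiently); the test signal and band edges are illustrative:

```python
import numpy as np

def zoom_dft(x, fs, f_lo, f_hi, n_bins=2000):
    """Evaluate the DFT of x on a fine grid between f_lo and f_hi (the
    zoom that a chirp-z transform provides efficiently) and return the
    frequency of the spectral peak."""
    n = len(x)
    freqs = np.linspace(f_lo, f_hi, n_bins)
    t = np.arange(n) / fs
    spectrum = np.abs(np.exp(-2j * np.pi * np.outer(freqs, t)) @ x)
    return freqs[np.argmax(spectrum)]

fs = 3200.0
t = np.arange(1024) / fs
# Off-nominal 50 Hz fundamental plus a harmonic-like disturbance
x = np.sin(2 * np.pi * 49.7 * t) + 0.1 * np.sin(2 * np.pi * 250.0 * t)
# The coarse FFT bin width is fs/N ~ 3.1 Hz; zoom around the nominal 50 Hz
f0 = zoom_dft(x, fs, 45.0, 55.0)
```

Once `f0` is known, the sampling clock (or a resampling stage) can be adjusted so that an integer number of fundamental periods fits in the analysis window, which is the synchronization IEC 61000-4-7 calls for.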

Keywords: Computer science; Bluestein's FFT algorithm; Fast Fourier transform; Chirp-z transform; power quality; synchronization; Fundamental frequency; Power (physics); Electric power system; Sampling (signal processing); Synchronizer; Harmonics; Electronic engineering; Electrical and Electronic Engineering; Instrumentation; Settore ING-INF/07 - Misure Elettriche e Elettroniche; Interpolation
researchProduct

Logical Key Hierarchy for Group Management in Distributed Online Social Networks

2016

Distributed Online Social Networks (DOSNs) have recently been proposed to shift control over user data from a single entity to the users of the DOSN themselves. In this paper, we focus on the problem of privacy-preserving content sharing with a large group of DOSN users. Several solutions based on cryptographic techniques have recently been proposed. The main challenge here is the definition of a scalable and decentralized approach that: i) minimizes the re-encryption of the contents published in a group when the composition of the group changes, and ii) enables a fast distribution of the cryptographic keys to all the members (n) of a group, each time a new user is …
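The scalability argument behind tree-based schemes such as the Logical Key Hierarchy is that a membership change refreshes only the keys on one leaf-to-root path. This toy heap-indexed sketch (illustrative function name and message accounting) counts the key-distribution messages for a leave event, which grow as ~2·log2(n) rather than n:

```python
import math

def lkh_rekey_on_leave(n_members, leaving):
    """Sketch of LKH rekeying on a member leave, over a complete binary
    tree indexed heap-style (node 1 = root, leaves at the last level).
    Returns (new_key_node, encrypted_under_node) messages: each refreshed
    ancestor key is sent encrypted under its sibling subtree's key, and
    also under the refreshed child key on the path itself."""
    depth = math.ceil(math.log2(n_members))
    leaf = (1 << depth) + leaving          # heap index of the leaving leaf
    messages = []
    node = leaf
    while node > 1:
        parent = node // 2
        for child in (2 * parent, 2 * parent + 1):
            if child != node:              # never under the leaver's old keys
                messages.append((parent, child))
        if node != leaf:                   # path key refreshed below: reuse it
            messages.append((parent, node))
        node = parent
    return messages

msgs = lkh_rekey_on_leave(1024, leaving=5)
# 2 * log2(1024) - 1 = 19 messages, versus n - 1 = 1023 for naive rekeying
```

The same logarithmic bound is what makes LKH-style group management attractive when group composition changes frequently, as in the DOSN setting above.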

Keywords: Computer science; Computer Networks and Communications; Cryptography; 02 engineering and technology; Encryption; Set (abstract data type); Public-key cryptography; 0202 electrical engineering, electronic engineering, information engineering; Mathematics (all); Distributed Online Social Network; Privacy; Secure Group Communication; Focus (computing); Social network; Settore INF/01 - Informatica; business.industry; Group (mathematics); 020206 networking & telecommunications; Computer Science Applications; 1707 Computer Vision and Pattern Recognition; Scalability; Signal Processing; Key (cryptography); 020201 artificial intelligence & image processing; business; Software; Computer network
researchProduct

Efficient and Accurate OTU Clustering with GPU-Based Sequence Alignment and Dynamic Dendrogram Cutting.

2015

De novo clustering is a popular technique to perform taxonomic profiling of a microbial community by grouping 16S rRNA amplicon reads into operational taxonomic units (OTUs). In this work, we introduce a new dendrogram-based OTU clustering pipeline called CRiSPy. The key idea used in CRiSPy to improve clustering accuracy is the application of an anomaly detection technique to obtain a dynamic distance cutoff instead of using the de facto value of 97 percent sequence similarity as in most existing OTU clustering pipelines. This technique works by detecting an abrupt change in the merging heights of a dendrogram. To produce the output dendrograms, CRiSPy employs the OTU hierarchical clusterin…
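The dynamic-cutoff idea (detect an abrupt jump in the dendrogram's merge heights instead of fixing a 97 percent similarity threshold) can be sketched with a simple gap test. The k-times-median rule below is an illustrative stand-in for CRiSPy's anomaly detector, not its actual algorithm:

```python
import numpy as np

def dynamic_cutoff(merge_heights, k=3.0):
    """Pick a dendrogram cut height by detecting an abrupt jump in the
    sorted merge heights: flag the first gap exceeding k times the
    median of the preceding gaps, and cut inside that jump."""
    h = np.sort(np.asarray(merge_heights, dtype=float))
    gaps = np.diff(h)
    for i in range(1, len(gaps)):
        baseline = np.median(gaps[:i])
        if baseline > 0 and gaps[i] > k * baseline:
            return (h[i] + h[i + 1]) / 2.0   # cut height inside the jump
    return h[-1]                              # no jump found: keep everything

# Merge heights with a clear jump between the within-OTU merges (~0.03)
# and the between-OTU merges (~0.5)
heights = [0.01, 0.015, 0.02, 0.025, 0.03, 0.5, 0.55, 0.6]
cut = dynamic_cutoff(heights)
```

Cutting the dendrogram at `cut` separates the tight within-OTU merges from the large between-OTU ones, adapting the distance cutoff to the data rather than assuming a fixed similarity level.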

Keywords: Computer science; Correlation clustering; Single-linkage clustering; Molecular Sequence Data; Machine learning; computer.software_genre; Pattern Recognition, Automated; CURE data clustering algorithm; RNA, Ribosomal, 16S; Genetics; Computer Graphics; Cluster analysis; Base Sequence; business.industry; Applied Mathematics; Dendrogram; High-Throughput Nucleotide Sequencing; Pattern recognition; Signal Processing, Computer-Assisted; Equipment Design; Hierarchical clustering; Equipment Failure Analysis; RNA, Bacterial; Canopy clustering algorithm; Artificial intelligence; Hierarchical clustering of networks; business; computer; Sequence Alignment; Algorithms; Biotechnology; IEEE/ACM Transactions on Computational Biology and Bioinformatics
researchProduct

Improving topological mapping on NoCs

2010

Networks-on-chip (NoCs) have been proposed as an efficient solution to the complex communications on systems-on-chip (SoCs). The design flow of NoCs includes several key issues, one of which is deciding where cores should be topologically mapped. This thesis proposes a new approach to the topological mapping strategy for NoCs. Concretely, we propose a new topological mapping technique for regular and irregular NoC platforms, and its application to optimizing application-specific NoCs based on distributed and source routing.

Keywords: Computer science; Distributed computing; Design flow; Bandwidth (signal processing); Hardware_PERFORMANCEANDRELIABILITY; Integrated circuit design; Source routing; Network topology; Computer Science::Hardware Architecture; Computer Science::Emerging Technologies; Network on a chip; Hardware_INTEGRATEDCIRCUITS; System on a chip; Routing (electronic design automation); 2010 IEEE International Symposium on Parallel & Distributed Processing, Workshops and PhD Forum (IPDPSW)
researchProduct