Search results for "image processing"

Showing 10 of 3,285 documents

Discrete wavelet transform implementation in Fourier domain for multidimensional signal

2002

Wavelet transforms are often calculated using the Mallat algorithm, in which a signal is decomposed by a cascade of filtering and downsampling operations. The computing time can be significant, but the filtering operations can be sped up by using fast Fourier transform (FFT)-based convolutions. Since it is necessary to work in the Fourier domain when large filters are used, we present some results on Fourier-based optimization of the sampling operations. Acceleration can be obtained by expressing the samplings in the Fourier domain. The general equations for the down- and upsampling of digital multidimensional signals are given. It is shown that for special cases such as the separab…
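The two ideas in the abstract can be sketched in a few lines: circular convolution becomes a pointwise product of spectra, and downsampling by two can be expressed directly on the spectrum by folding it. This is a minimal NumPy sketch with hypothetical function names, not the paper's implementation.

```python
import numpy as np

def fft_circular_convolve(x, h):
    """Circular convolution computed as a pointwise product of FFTs."""
    H = np.zeros(len(x))
    H[:len(h)] = h  # zero-pad the filter to the signal length
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(H)))

def downsample2_fourier(X):
    """Spectrum of x[2n] obtained by folding the spectrum X of x:
    Y[k] = (X[k] + X[k + N/2]) / 2 for k = 0 .. N/2 - 1."""
    n = len(X) // 2
    return (X[:n] + X[n:]) / 2
```

Taking the inverse FFT of `downsample2_fourier(np.fft.fft(x))` recovers `x[::2]`, which is what lets a Mallat-style cascade stay entirely in the Fourier domain between decomposition levels.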

Keywords: Non-uniform discrete Fourier transform; Discrete-time Fourier transform; Mathematical analysis; Prime-factor FFT algorithm; Networking & telecommunications; Atomic and Molecular Physics and Optics; Fractional Fourier transform; Discrete Fourier transform; Computer Science Applications; Multidimensional signal processing; Discrete Fourier series; Artificial intelligence & image processing; Electrical and Electronic Engineering; Harmonic wavelet transform; Algorithm; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Mathematics

Verifications of Primal Energy Identities for Variational Problems with Obstacles

2018

We discuss error identities for two classes of free boundary problems generated by obstacles. The identities suggest the true forms of the respective error measures, which consist of two parts: the standard energy norm and a certain nonlinear measure. The latter controls (in a weak sense) the approximation of free boundaries. Numerical tests confirm the sharpness of the error identities and show that, in different examples, one or the other part of the error measure may be dominant.

Keywords: Nonlinear system; Norm (mathematics); General mathematics; Applied mathematics; Artificial intelligence & image processing; Numerical tests; Mathematics; Natural sciences

Reaction-diffusion electrical network for image processing

2006

We consider an experimental setup modelling the FitzHugh-Nagumo equation without a recovery term, composed of a 1D nonlinear electrical network of resistively coupled discrete bistable cells. We first study the propagation of topological fronts in the continuum limit, then in the more discrete case. We propose to apply these results to the domain of signal processing, showing that erosion and dilation of a binary signal can be obtained. Finally, we extend the study to 2D lattices and show that the approach can be of great interest for image processing techniques.
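The erosion and dilation mentioned above are the two basic morphological primitives. A minimal NumPy sketch for 1-D binary signals follows; the function names and window parameter are my own, not the paper's.

```python
import numpy as np

def erode(signal, k=3):
    """1-D binary erosion: keep a 1 only if the whole k-sample window is 1s."""
    r = k // 2
    pad = np.pad(signal, r, constant_values=0)
    return np.array([int(pad[i:i + k].all()) for i in range(len(signal))])

def dilate(signal, k=3):
    """1-D binary dilation: set a 1 if any sample in the window is 1."""
    r = k // 2
    pad = np.pad(signal, r, constant_values=0)
    return np.array([int(pad[i:i + k].any()) for i in range(len(signal))])
```

Erosion removes features narrower than the structuring element and dilation grows features back, which is the pair of operations the electrical lattice is shown to reproduce.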

Keywords: Nonlinear system; Signal processing; Multidimensional signal processing; Bistability; Computer science; Optical engineering; Electrical network; Electronic engineering; Dilation (morphology); Image processing; Topology
Published in: SPIE Proceedings

Normalization of T2W-MRI Prostate Images using Rician a priori

2016

Prostate cancer is reported to be the second most frequently diagnosed cancer in men worldwide. In practice, diagnosis can be affected by multiple factors that reduce the chance of detecting potential lesions. In recent decades, new imaging techniques, mainly based on MRI, have been developed in conjunction with Computer-Aided Diagnosis (CAD) systems to help radiologists with this diagnosis. CAD systems are usually designed as a sequential process consisting of four stages: pre-processing, segmentation, registration and classification. As a pre-processing step, image normalization is a critical and important part of the chain for designing a robust classifier and over…

Keywords: Normalization (statistics); Normalization (image processing); Computer science; T2W-MRI; Prostate cancer; Prostate; Rician fading; Computer vision; Segmentation; Image segmentation; Pre-processing; Magnetic resonance imaging; Computer-aided diagnosis; A priori and a posteriori; Artificial intelligence; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing

Affine compensation of illumination in hyperspectral remote sensing images

2009

A problem when working with optical satellite or airborne images is the need to compensate for changes in the illumination conditions at the time of acquisition. This is particularly critical when working with time series of data. Atmospheric correction strategies based on radiative transfer codes may provide a rigorous solution, but they may not be the best option when a huge number of hyperspectral images must be processed and computation time is a critical factor. The GMES ("Global Monitoring for Environment and Security") initiative has promoted the creation of a new generation of satellites (the SENTINEL series) with "ultra-high resolution" and "superspectral im…
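An affine illumination compensation of this kind can be sketched as a least-squares fit of a per-band gain and offset mapping one acquisition onto a reference. Everything below (the names, the use of `lstsq`) is illustrative under that assumption, not the authors' code.

```python
import numpy as np

def fit_affine_compensation(target, reference):
    """Least-squares gain a and offset b such that reference ≈ a*target + b."""
    A = np.column_stack([target, np.ones_like(target)])
    (a, b), *_ = np.linalg.lstsq(A, reference, rcond=None)
    return a, b

def compensate(target, a, b):
    """Apply the fitted affine correction to the target band."""
    return a * target + b
```

The same fit would be repeated per spectral band, which keeps the cost trivial compared with a full radiative transfer correction.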

Keywords: Normalization (statistics); Normalization (image processing); Computer science; Multispectral image; Atmospheric correction; Hyperspectral imaging; Data acquisition; Radiance; Radiative transfer; Computer vision; Artificial intelligence; Affine transformation; Image resolution; Remote sensing
Published in: 2009 IEEE International Geoscience and Remote Sensing Symposium

Combining Inter-Subject Modeling with a Subject-Based Data Transformation to Improve Affect Recognition from EEG Signals

2019

Existing correlations between features extracted from electroencephalography (EEG) signals and emotional states have motivated the development of a diversity of EEG-based affect detection methods. Both intra-subject and inter-subject approaches have been used in this context. Intra-subject approaches generally suffer from the small-sample problem and require the collection of exhaustive data for each new user before the detection system is usable. Conversely, inter-subject models do not account for the personality and physiological influences on how an individual feels and expresses emotions. In this paper, we analyze both modeling approaches using three public repositories. T…
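A common subject-based data transformation in this setting (the excerpt does not show which one the paper uses, so this is an assumption) is to standardize each subject's features with that subject's own statistics before pooling data for an inter-subject model:

```python
import numpy as np

def per_subject_zscore(X, subject_ids):
    """Z-score each feature within each subject before inter-subject modeling."""
    Z = np.empty_like(X, dtype=float)
    for s in np.unique(subject_ids):
        m = subject_ids == s
        mu = X[m].mean(axis=0)
        sd = X[m].std(axis=0)
        Z[m] = (X[m] - mu) / np.where(sd == 0, 1.0, sd)  # guard constant features
    return Z
```

After this step every subject contributes features on the same scale, which is what lets an inter-subject classifier sidestep per-subject baseline differences.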

Keywords: Normalization (statistics); Data analysis; Support vector machine; Computer science; Emotions; Data transformation (statistics); Valence detection; Arousal detection; Electroencephalography (EEG); Affect (psychology); Machine learning; Personality; Signal processing, computer-assisted; Artificial intelligence
Published in: Sensors

A Comparative Analysis of Residual Block Alternatives for End-to-End Audio Classification

2020

Residual learning is known as a framework that facilitates the training of very deep neural networks. Residual blocks, or units, are made up of a set of stacked layers, where the inputs are added back to their outputs with the aim of creating identity mappings. In practice, such identity mappings are accomplished by means of so-called skip or shortcut connections. However, multiple implementation alternatives arise with respect to where such skip connections are applied within the set of stacked layers that make up a residual block. While residual networks for image classification using convolutional neural networks (CNNs) have been widely discussed in the literature, their a…
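Two classic placements of the skip connection can be contrasted in a few lines; dense layers stand in for the convolutional ones, and the function names are mine, not the paper's.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def post_activation_block(x, W1, W2):
    """Original residual unit: skip added first, activation applied last."""
    y = relu(x @ W1) @ W2
    return relu(x + y)

def pre_activation_block(x, W1, W2):
    """Pre-activation variant: the identity path is left untouched."""
    y = relu(x) @ W1
    y = relu(y) @ W2
    return x + y
```

With all weights at zero, the pre-activation block reduces exactly to the identity mapping while the post-activation block still clips negative inputs; where the skip connection sits relative to the nonlinearity is precisely the kind of alternative the paper compares.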

Keywords: Normalization (statistics); Computer science; Feature extraction; ESC; Residual learning; Convolutional neural networks; UrbanSound8K; Audio signal processing; Block (data storage); Audio classification; Image classification; Data mining
Published in: IEEE Access

Rank-order and morphological enhancement of image details with an optoelectronic processor.

2010

In all-optical processors, enhancement of image details is the result of high-pass filtering. We describe an optoelectronic processor in which detail enhancement results from the digitally calculated difference between an original input image and its low-pass filtered version. The low-pass filtering is realized through the rank-order median and the morphological opening and closing operations calculated by use of the optical convolver. It is shown that the normalized difference between the morphological white and black top hats enhances bright and dark image details analogously to the rank-order unsharp masking.
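The digitally calculated difference described above amounts to unsharp masking with a rank-order low-pass filter. Below is a minimal 1-D NumPy sketch under that reading (the optical convolver is obviously not modelled, and the names are mine):

```python
import numpy as np

def median_lowpass(x, k=3):
    """Rank-order (median) low-pass filter with edge replication."""
    r = k // 2
    pad = np.pad(x, r, mode='edge')
    return np.array([np.median(pad[i:i + k]) for i in range(len(x))])

def enhance_details(x, gain=1.0, k=3):
    """Add back the difference between the signal and its low-pass version."""
    return x + gain * (x - median_lowpass(x, k))
```

Narrow details survive the subtraction (the median suppresses them in the low-pass branch) and are amplified, which is the behaviour the optoelectronic processor realizes in hardware.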

Keywords: Normalization (statistics); Point spread function; Computer science; Binary image; Top-hat transform; Image processing; Filter (signal processing); Edge enhancement; Optical transfer function; Optoelectronics; Closing (morphology); Opening (morphology); Unsharp masking
Published in: Applied Optics

The Regression Tsetlin Machine: A Tsetlin Machine for Continuous Output Problems

2019

The recently introduced Tsetlin Machine (TM) has provided competitive pattern classification accuracy in several benchmarks, composing patterns from easy-to-interpret conjunctive clauses in propositional logic. In this paper, we go beyond pattern classification by introducing a new type of TM, the Regression Tsetlin Machine (RTM). In brief, we modify the inner inference mechanism of the TM so that input patterns are transformed into a single continuous output rather than into distinct categories. We achieve this by: (1) using the conjunctive clauses of the TM to capture arbitrarily complex patterns; (2) mapping these patterns to a continuous output through a novel voting and n…

Keywords: Normalization (statistics); Scheme (programming language); Computer science; Inference; Probability density function; Propositional calculus; Regression; Pattern recognition; Noise (video); Algorithm

Testing the effects of pre-processing on voxel based morphometry analysis

2015

Voxel-based morphometry (VBM) is an automated analysis technique that allows voxel-wise comparison of (mainly grey-matter) volumes between magnetic resonance images (MRI). Two main analysis processes are possible in VBM. One is cross-sectional data analysis, where one group is compared with another to see the regions of the brain that show changes in grey-matter volume. The second is longitudinal data analysis, where MRIs taken at different time points are compared to see the regions of the brain whose grey-matter volume changes from one time point to another. Both types of analyses require pre-processing steps before performing the statis…

Keywords: Normalization (statistics); Pattern recognition; Magnetic resonance imaging; Voxel-based morphometry; Grey matter; Cross-sectional studies; Image processing, computer-assisted; Computer vision; Artificial intelligence; Smoothing
Published in: 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)