Search results for "Dimensionality Reduction"

Showing 10 of 120 documents

Dimension reduction: additional benefit of an optimal filter for independent component analysis to extract event-related potentials.

2011

The present study addresses the benefits of a linear optimal filter (OF) for independent component analysis (ICA) in extracting brain event-related potentials (ERPs). A filter, such as a digital filter, is usually regarded as a denoising tool. In fact, when ERP recordings are filtered by an OF, the ERP topography should not be changed by the filter, and the output should still be describable by a linear transformation. Moreover, an OF designed for a specific ERP source or component may remove noise, reduce the overlap of sources, and even reject some non-targeted sources in the ERP recordings. The OF can thus accomplish both denoising and dimension reduction (reducing the n…
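
A minimal sketch of the general idea (not the paper's exact optimal filter): linearly filtering multichannel recordings before ICA, on synthetic data standing in for ERP recordings. The sampling rate, cutoff, and signals below are all assumptions for illustration.

```python
# Sketch: low-pass filter multichannel recordings channel-wise before ICA.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
fs = 250.0                      # hypothetical sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)

# Two "sources": a slow ERP-like wave and fast oscillatory noise.
slow = np.exp(-((t - 0.5) ** 2) / 0.01)
fast = np.sin(2 * np.pi * 60 * t)
S = np.c_[slow, fast]
A = rng.normal(size=(4, 2))     # mixing to 4 "channels"
X = S @ A.T + 0.05 * rng.normal(size=(len(t), 4))

# A linear low-pass filter applied channel-wise does not change the mixing
# topography, so the filtered data can still be unmixed by ICA.
b, a = butter(4, 20 / (fs / 2), btype="low")
X_f = filtfilt(b, a, X, axis=0)

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X_f)
print(S_est.shape)  # (500, 2)
```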

Adult, Male, Underdetermined system, Speech recognition, Noise reduction, Young Adult, Humans, Child, Evoked Potentials, Mathematics, Principal Component Analysis, General Neuroscience, Dimensionality reduction, Pattern recognition, Electroencephalography, Filter (signal processing), Independent component analysis, Noise, Principal component analysis, Linear Models, Female, Artificial intelligence, Digital filter, Photic Stimulation
Journal of neuroscience methods
researchProduct

Detection of steering direction using EEG recordings based on sample entropy and time-frequency analysis.

2016

Monitoring drivers' intentions in advance is an ambitious aim that could have a huge impact on society by preventing traffic accidents. Hence, in this preliminary study we recorded high-resolution electroencephalography (EEG) from 5 subjects while they drove a car under real conditions, along with an accelerometer that detects the onset of steering. Two sensor-level analyses, sample entropy and time-frequency analysis, were implemented to observe the dynamics before the onset of steering. To classify the steering direction, we then applied a machine learning pipeline consisting of dimensionality reduction and classification using principal-component-analysis (PCA) and sup…
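
A minimal sketch of the stated pipeline (PCA for dimensionality reduction followed by an SVM classifier), on synthetic features standing in for the EEG data; dimensions, labels, and hyperparameters are assumptions.

```python
# Sketch: PCA + SVM classification pipeline with cross-validation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 64))          # 120 trials x 64 channel features
y = rng.integers(0, 2, size=120)        # left/right steering labels (dummy)
X[y == 1, :4] += 1.5                    # inject a weak class difference

clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```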

Automobile Driving, Support Vector Machine, Computer science, Speech recognition, Entropy, Electroencephalography, Entropy (classical thermodynamics), Accelerometry, Entropy (information theory), Humans, Entropy (energy dispersal), Entropy (arrow of time), Principal Component Analysis, Entropy (statistical thermodynamics), Dimensionality reduction, Pattern recognition, Time–frequency analysis, Support vector machine, Sample entropy, Artificial intelligence, Algorithms, Entropy (order and disorder)
Annual International Conference of the IEEE Engineering in Medicine and Biology Society
researchProduct

Sparse Manifold Clustering and Embedding to discriminate gene expression profiles of glioblastoma and meningioma tumors.

2013

The Sparse Manifold Clustering and Embedding (SMCE) algorithm was recently proposed for simultaneous clustering and dimensionality reduction of data on nonlinear manifolds using sparse representation techniques. In this work, the SMCE algorithm is applied to the differential discrimination of glioblastoma and meningioma tumors by means of their gene expression profiles. Our purpose was to evaluate the robustness of this nonlinear manifold method for classifying gene expression profiles, which are characterized by the high dimensionality of their representations and the low discriminative power of most genes. To this end, we used SMCE to reduce the dimensionality of a preprocessed dataset of 35 single…
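
SMCE itself has no scikit-learn implementation; as a rough stand-in, this sketch pairs spectral embedding with k-means to embed and then cluster high-dimensional points, on synthetic data with the same sample count as the abstract (35 profiles). Everything else here is an assumption, not the paper's method.

```python
# Sketch: nonlinear embedding + clustering (a stand-in for SMCE).
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# 35 "samples" x 2000 "genes", two classes with a mean shift in a few genes.
X = rng.normal(size=(35, 2000))
labels_true = np.array([0] * 18 + [1] * 17)
X[labels_true == 1, :50] += 2.0

emb = SpectralEmbedding(n_components=2, n_neighbors=10, random_state=0)
Z = emb.fit_transform(X)               # low-dimensional embedding
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
print(Z.shape)  # (35, 2)
```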

Bioinformatics, Health Informatics, Microarray data analysis, Robustness (computer science), Genetic databases, Cluster analysis, Humans, Manifolds, Mathematics, Oligonucleotide Array Sequence Analysis, Dimensionality reduction, Gene Expression Profiling, Computational Biology, Discriminant Analysis, Pattern recognition, Sparse approximation, Linear discriminant analysis, Manifold, Computer Science Applications, Applied Physics, Embedding, Automatic classification, Artificial intelligence, Glioblastoma, Meningioma, Transcriptome, Algorithms, Curse of dimensionality
Computers in biology and medicine
researchProduct

Improving clustering of Web bot and human sessions by applying Principal Component Analysis

2019

This paper addresses the problem of modeling Web sessions of bots and legitimate users (humans) as feature vectors for use as input to classification models. Many different features discriminating bots' and humans' navigational patterns have been considered in session models so far, but very few studies have been devoted to feature selection and dimensionality reduction in the context of bot detection. We propose applying Principal Component Analysis (PCA) to develop improved session models based on predictor variables that are efficient discriminants of Web bots. The proposed models are used in session clustering, whose performance is evaluated in terms of the purity …
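
A minimal sketch of PCA-based session models evaluated by cluster purity, on synthetic feature vectors standing in for bot/human session features; the feature counts, labels, and the purity helper are assumptions for illustration.

```python
# Sketch: PCA + k-means session clustering, scored by cluster purity.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 20))          # 200 sessions x 20 features
y = rng.integers(0, 2, size=200)        # 0 = human, 1 = bot (dummy labels)
X[y == 1, :3] += 2.0                    # bots differ on a few features

Z = PCA(n_components=5).fit_transform(X)
pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)

def purity(y_true, y_pred):
    """Fraction of sessions falling in the majority true class of their cluster."""
    total = 0
    for c in np.unique(y_pred):
        counts = np.bincount(y_true[y_pred == c])
        total += counts.max()
    return total / len(y_true)

print(purity(y, pred))
```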

Bot detection, Principal Component Analysis (PCA), Log analysis, Computer science, k-means, Internet robot, Classification, Web bot, Dimensionality reduction, Clustering, Web server, Feature selection, Data mining, Cluster analysis
Communications of the ECMS
researchProduct

An efficient functional magnetic resonance imaging data reduction strategy using neighborhood preserving embedding algorithm

2021

High-dimensional data have become common in neuroimaging, especially in group-level functional magnetic resonance imaging (fMRI) datasets. fMRI connectivity analysis is a widely used, powerful technique for studying functional brain networks to probe the underlying mechanisms of brain function and neuropsychological disorders. However, data-driven techniques such as independent component analysis (ICA) can yield unstable and inconsistent results, confounding the true effects of interest and hindering the understanding of brain functionality and connectivity. A key contributing factor to this instability is the information loss that occurs during fMRI data reduction. Data reduction of high …
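
A compact sketch of Neighborhood Preserving Embedding (NPE) in the abstract: compute LLE-style local reconstruction weights, then solve for a linear projection that preserves them. The data, neighborhood size, and regularization below are assumptions, not the paper's fMRI pipeline.

```python
# Sketch: Neighborhood Preserving Embedding (linear, LLE-weight-preserving).
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import NearestNeighbors

def npe(X, n_components=2, n_neighbors=10, reg=1e-3):
    n, d = X.shape
    nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    idx = nbrs.kneighbors(X, return_distance=False)[:, 1:]   # drop self
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[idx[i]] - X[i]                  # centred neighbourhood
        G = Z @ Z.T
        G += reg * np.trace(G) * np.eye(n_neighbors)  # regularized Gram
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, idx[i]] = w / w.sum()            # reconstruction weights
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    # Generalized eigenproblem X^T M X a = lambda X^T X a; keep smallest lambdas.
    A = X.T @ M @ X
    B = X.T @ X + reg * np.eye(d)
    vals, vecs = eigh(A, B)
    return X @ vecs[:, :n_components]         # linear projection of the data

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 30))
Y = npe(X, n_components=2)
print(Y.shape)  # (100, 2)
```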

Brain Mapping, Principal Component Analysis, Radiological and Ultrasound Technology, signal processing, fMRI, Brain, Magnetic Resonance Imaging, functional magnetic resonance imaging, Neurology, Humans, Radiology Nuclear Medicine and Imaging, Neurology (clinical), ICA, Anatomy, Algorithms, NPE, dimensionality reduction
researchProduct

Making nonlinear manifold learning models interpretable: The manifold grand tour

2015

Highlights: Smooth nonlinear topographic maps of the data distribution guide a Grand Tour visualisation. Linear views of the data most consistent with the data structure in the maps are prioritised. The result is useful visualisations that cannot be obtained by other, more classical approaches.

Dimensionality reduction is required to produce visualisations of high-dimensional data. In this framework, one of the most straightforward approaches to visualising high-dimensional data is to reduce complexity and apply linear projections while tumbling the projection axes in a defined sequence, which generates a Grand Tour of the data. We propose using smooth nonlinear topographic maps of the data distribution to…
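
A minimal sketch of a grand-tour step in isolation: smoothly moving a 2-D projection plane through high-dimensional data by interpolating between orthonormal frames. This is illustrative only, not the paper's topographic-map-guided tour; all shapes and frame counts are assumptions.

```python
# Sketch: a few frames of a grand tour (rotating 2-D linear projections).
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 6))            # 6-D data

def orthonormal_frame(d, k, rng):
    """Random d x k orthonormal projection frame via QR."""
    Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
    return Q

F0 = orthonormal_frame(6, 2, rng)
F1 = orthonormal_frame(6, 2, rng)
frames = []
for t in np.linspace(0, 1, 5):
    F = (1 - t) * F0 + t * F1            # naive interpolation...
    Q, _ = np.linalg.qr(F)               # ...re-orthonormalized each step
    frames.append(X @ Q)                 # one 2-D view of the tour
print(len(frames), frames[0].shape)
```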

Clustering high-dimensional data, Nonlinear dimensionality reduction, Discriminative clustering, Computer science, Information visualization, Data visualization, Projection (mathematics), Artificial Intelligence, Dimensionality reduction, Grand tour, General Engineering, Topographic map, Data structure, Computer Science Applications, Visualization, Manifold learning, Data mining, Generative topographic mapping, Linear projections
researchProduct

Dimensionality reduction via regression on hyperspectral infrared sounding data

2014

This paper introduces a new method for dimensionality reduction via regression (DRR). The method generalizes Principal Component Analysis (PCA) in such a way that the variance of the PCA scores is reduced. To do so, DRR relies on a deflationary process in which a nonlinear regression removes redundancy between the PC scores. Unlike other nonlinear dimensionality reduction methods, DRR is easy to apply, has an out-of-sample extension, is invertible, and learns a volume-preserving transformation. These properties make the method useful for a wide range of applications, especially for very high-dimensional data in general and hyperspectral image processing in particular…
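
A sketch of the DRR idea: take ordinary PCA scores, then deflate each score by a nonlinear regression on the preceding scores and keep the residual. Kernel ridge regression and the curved synthetic data are assumed choices here; the paper's regressor may differ.

```python
# Sketch: PCA scores deflated by nonlinear regression on earlier scores (DRR idea).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(6)
# Curved data: second coordinate depends nonlinearly on the first.
x = rng.normal(size=500)
X = np.c_[x, x ** 2 + 0.1 * rng.normal(size=500), rng.normal(size=500)]

S = PCA(n_components=3).fit_transform(X)      # ordinary PCA scores
R = S.copy()
for i in range(1, S.shape[1]):
    kr = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5)
    kr.fit(S[:, :i], S[:, i])
    R[:, i] = S[:, i] - kr.predict(S[:, :i])  # residual after deflation

# Deflation should not increase the variance of later scores.
print(R[:, 1].var() <= S[:, 1].var() + 1e-9)
```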

Clustering high-dimensional data, Redundancy (information theory), Dimensionality reduction, Principal component analysis, Feature extraction, Nonlinear dimensionality reduction, Hyperspectral imaging, Pattern recognition, Artificial intelligence, Mathematics, Curse of dimensionality
2014 6th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)
researchProduct

The impact of sample reduction on PCA-based feature extraction for supervised learning

2006

"The curse of dimensionality" is pertinent to many learning algorithms, and it denotes the drastic raise of computational complexity and classification error in high dimensions. In this paper, different feature extraction (FE) techniques are analyzed as means of dimensionality reduction, and constructive induction with respect to the performance of Naive Bayes classifier. When a data set contains a large number of instances, some sampling approach is applied to address the computational complexity of FE and classification processes. The main goal of this paper is to show the impact of sample reduction on the process of FE for supervised learning. In our study we analyzed the conventional PC…

Computer science, Covariance matrix, Dimensionality reduction, Feature extraction, Supervised learning, Nonparametric statistics, Sampling (statistics), Pattern recognition, Stratified sampling, Naive Bayes classifier, Sample size determination, Artificial intelligence, Eigenvalues and eigenvectors, Parametric statistics, Curse of dimensionality
Proceedings of the 2006 ACM symposium on Applied computing
researchProduct

Feature Extraction and Selection for Pain Recognition Using Peripheral Physiological Signals.

2019

In pattern recognition, the selection of appropriate features is paramount to both the performance and the robustness of a system. Over-reliance on machine learning-based feature selection methods can therefore be problematic, especially when conducted using small snapshots of data. The results of these methods, if adopted without proper interpretation, can lead to sub-optimal system design or, worse, the abandonment of otherwise viable and important features. In this work, a deep exploration of pain-based emotion classification was conducted to better understand differences in the results of the related literature. In total, 155 different time domain and frequency domain features were e…
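
A minimal sketch of filter-based feature selection over a large candidate pool, on synthetic stand-ins for the 155 time- and frequency-domain physiological features mentioned above; the F-test criterion, labels, and k are assumptions.

```python
# Sketch: univariate feature selection over many candidate features.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(8)
X = rng.normal(size=(300, 155))          # 300 windows x 155 features
y = rng.integers(0, 2, size=300)         # pain / no-pain (dummy labels)
X[y == 1, :6] += 0.8                     # make a few features informative

selector = SelectKBest(f_classif, k=10).fit(X, y)
top = np.sort(selector.get_support(indices=True))
print(top)                               # indices of the 10 selected features
```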

Computer science, Feature vector, Feature extraction, Feature selection, physiological signals, EMG, Chart, emotion recognition, affective computing, Neurosciences, Biological psychiatry, Neuropsychiatry, heat pain, multimodal analysis, General Neuroscience, Deep learning, Dimensionality reduction, Pattern recognition, Feature (computer vision), Pattern recognition (psychology), Artificial intelligence, Neuroscience
Frontiers in neuroscience
researchProduct

Improved Statistically Based Retrievals via Spatial-Spectral Data Compression for IASI Data

2019

In this paper, we analyze the effect of spatial and spectral compression on the performance of statistically based retrieval. Although the information is not completely preserved during the coding process, experiments reveal that a certain amount of compression can have a positive impact on the accuracy of retrievals. We identify two strategies, both with interesting benefits: applying very high compression, which still maintains the same retrieval performance as that obtained for uncompressed data; or applying moderate to high compression, which improves the performance. As a second contribution of this paper, we focus on the origins of these benefits. On the one…
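
A sketch of the compression-then-retrieval setup using PCA truncation as a stand-in for spectral compression and ridge regression as the statistically based retrieval; all data here are synthetic and the "bit rates" are just decreasing component counts.

```python
# Sketch: retrieval accuracy as "compression" (PCA truncation) increases.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
n, d = 400, 200                          # spectra with d "channels"
latent = rng.normal(size=(n, 5))
spectra = latent @ rng.normal(size=(5, d)) + 0.5 * rng.normal(size=(n, d))
temp = latent[:, 0] + 0.1 * rng.normal(size=n)   # variable to retrieve

Xtr, Xte, ytr, yte = train_test_split(spectra, temp, random_state=0)
for k in (200, 20, 5):                   # decreasing "bit rate"
    if k < d:
        pca = PCA(n_components=k).fit(Xtr)
        Ztr, Zte = pca.transform(Xtr), pca.transform(Xte)
    else:
        Ztr, Zte = Xtr, Xte              # uncompressed baseline
    r2 = Ridge(alpha=1.0).fit(Ztr, ytr).score(Zte, yte)
    print(k, round(r2, 3))
```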

Computer science, Infrared Atmospheric Sounding Interferometer (IASI), Spectral transforms, Lossy compression, Kernel methods, Electrical and Electronic Engineering, Transform coding, Dimensionality reduction, JPEG 2000, Pattern recognition, Regression, Uncompressed video, Statistically based retrieval, General Earth and Planetary Sciences, Artificial intelligence, Smoothing
IEEE Transactions on Geoscience and Remote Sensing
researchProduct