Search results for "Dimension Reduction"

Showing 5 of 15 documents

Linear Feature Extraction for Ranking

2018

We address the feature extraction problem for document ranking in information retrieval. We propose LifeRank, a Linear feature extraction algorithm for Ranking. In LifeRank, we regard each document collection for ranking as a matrix, referred to as the original matrix. We try to optimize a transformation matrix, so that a new matrix (dataset) can be generated as the product of the original matrix and the transformation matrix. The transformation matrix projects high-dimensional document vectors into lower dimensions. Theoretically, there could be many possible transformation matrices, each leading to a different generated matrix. In LifeRank, we produce a transformation matrix so that the generat…
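The projection step described in the abstract can be sketched in a few lines of NumPy. All shapes here are illustrative, and the transformation matrix below is a random orthonormal one purely to show the mechanics; LifeRank's actual contribution, the optimization of that matrix for ranking quality, is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical document collection: 100 documents, each a 50-dimensional
# feature vector (the "original matrix" in LifeRank's terminology).
X = rng.standard_normal((100, 50))

# A transformation matrix projecting 50-dimensional document vectors down
# to 10 dimensions.  LifeRank optimizes this matrix; here it is simply a
# random orthonormal basis to illustrate the shapes involved.
W, _ = np.linalg.qr(rng.standard_normal((50, 10)))

# The new, lower-dimensional dataset is the product of the two matrices.
X_new = X @ W
print(X_new.shape)  # (100, 10)
```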

dimension reduction; feature extraction; feature selection; learning to rank; ranking (information retrieval); matrix (mathematics); transformation matrix; pattern recognition; benchmark (computing); artificial intelligence; information retrieval systems; information retrieval; algorithms; machine learning; computer science; mathematics; library and information sciences; information systems

Beyond Tandem Analysis: Joint Dimension Reduction and Clustering in R

2019

We present the R package clustrd which implements a class of methods that combine dimension reduction and clustering of continuous or categorical data. In particular, for continuous data, the package contains implementations of factorial K-means and reduced K-means; both methods combine principal component analysis with K-means clustering. For categorical data, the package provides MCA K-means, i-FCB and cluster correspondence analysis, which combine multiple correspondence analysis with K-means. Two examples on real data sets are provided to illustrate the usage of the main functions.
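The reduced K-means idea mentioned above can be sketched with NumPy. This is an illustrative toy implementation, not the clustrd package (which is written in R): the algorithm alternates between running K-means in a q-dimensional subspace and re-fitting that subspace to the current cluster means, so the reduction and the clustering are optimized jointly rather than in tandem.

```python
import numpy as np

def kmeans(Y, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on the rows of Y."""
    rng = np.random.default_rng(seed)
    centers = Y[rng.choice(len(Y), k, replace=False)]
    for _ in range(iters):
        dists = ((Y[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = Y[labels == j].mean(0)
    return labels

def reduced_kmeans(X, k, q, iters=10):
    """Alternate clustering in a q-dimensional subspace with updating that
    subspace from the cluster means (a sketch of reduced K-means)."""
    X = X - X.mean(0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    A = Vt[:q].T                       # initial loadings: first q PCs
    for _ in range(iters):
        labels = kmeans(X @ A, k)      # cluster in the subspace
        Xbar = np.zeros_like(X)        # each row -> its cluster mean
        for j in range(k):
            mask = labels == j
            if mask.any():
                Xbar[mask] = X[mask].mean(0)
        _, _, Vt = np.linalg.svd(Xbar, full_matrices=False)
        A = Vt[:q].T                   # subspace best fitting the means
    return labels, A

# Toy data: two well-separated hypothetical clusters in 5 dimensions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (30, 5)),
               rng.normal(3.0, 0.1, (30, 5))])
labels, A = reduced_kmeans(X, k=2, q=2)
print(A.shape)  # (5, 2)
```

Factorial K-means differs from reduced K-means only in the loss it minimizes (distances in the subspace rather than in the full space); clustrd implements both, plus the categorical-data variants.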

dimension reduction; clustering; principal component analysis; multiple correspondence analysis; K-means; factorial methods; cluster analysis; categorical data; data mining; statistics and probability; software

Journal of Statistical Software

Dimension reduction for $-\Delta_1$

2012

A 3D-2D dimension reduction for $-\Delta_1$ is obtained. A power-law approximation from $-\Delta_p$ as $p \to 1$ is also provided, in terms of $\Gamma$-convergence, duality, and asymptotics for least gradient functions.
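For orientation, the operators involved are the following (standard definitions, not taken from the paper): the p-Laplacian and its formal limit as $p \to 1$,

```latex
-\Delta_p u = -\operatorname{div}\!\left(|\nabla u|^{p-2}\nabla u\right),
\qquad
-\Delta_1 u = -\operatorname{div}\!\left(\frac{\nabla u}{|\nabla u|}\right).
```

The $1$-Laplacian is only formal where $\nabla u = 0$; the abstract's $\Gamma$-convergence and duality results are what make the $p \to 1$ limit rigorous, in the setting of functions of bounded variation and least gradient functions.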

dimension reduction; gamma convergence; duality; functions of bounded variation; 1-Laplacian; Mathematics - Analysis of PDEs

Signal dimension estimation in BSS models with serial dependence

2022

Many modern multivariate time series datasets contain a large amount of noise, and the first step of the data analysis is to separate the noise channels from the signals of interest. A crucial part of this dimension reduction is determining the number of signals. In this paper we approach this problem by considering a noisy latent variable time series model which comprises many popular blind source separation models. We propose a general framework for the estimation of the signal dimension that is based on testing for sub-sphericity and give examples of different tests suitable for time series settings. In the inference we rely on bootstrap null distributions. Several simulation studies are…
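The sub-sphericity idea above can be sketched with NumPy: under the model, the smallest eigenvalues of the covariance matrix belong to the noise subspace and are all equal, so the signal dimension is the smallest k for which the trailing eigenvalues look spherical. The data, model sizes, and threshold below are all illustrative; the paper uses formal tests with bootstrap null distributions rather than the ad hoc cutoff shown here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical noisy latent time series: q = 2 signal channels mixed into
# p = 6 observed channels plus white noise (all values illustrative).
n, p, q = 2000, 6, 2
t = np.arange(n)
signals = np.column_stack([np.sin(0.05 * t), np.sign(np.sin(0.11 * t))])
A = rng.standard_normal((p, q))
X = signals @ A.T + 0.1 * rng.standard_normal((n, p))

# Eigenvalues of the sample covariance, largest first.
evals = np.sort(np.linalg.eigvalsh(np.cov(X.T)))[::-1]

def subsphericity(evals, k):
    """Squared relative spread of the p - k smallest eigenvalues; it is
    near zero when the candidate noise subspace is spherical."""
    tail = evals[k:]
    return np.mean((tail / tail.mean() - 1.0) ** 2)

# Estimate the signal dimension as the smallest k whose trailing
# eigenvalues look spherical (ad hoc threshold in place of a formal test).
d_hat = next(k for k in range(p) if subsphericity(evals, k) < 0.05)
print(d_hat)  # -> 2
```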

nonstationary source separation; dimension reduction; signal processing; time series; sub-sphericity; second order source separation; block bootstrap; time series analysis

2022 International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME)

Combining PCA and multiset CCA for dimension reduction when group ICA is applied to decompose naturalistic fMRI data

2015

An extension of group independent component analysis (GICA) is introduced, where multi-set canonical correlation analysis (MCCA) is combined with principal component analysis (PCA) for three-stage dimension reduction. The method is applied to naturalistic functional MRI (fMRI) images acquired during a task-free continuous music-listening experiment, and the results are compared with the outcome of the conventional GICA. The extended GICA resulted in slightly faster ICA convergence and, more interestingly, extracted more stimulus-related components than its conventional counterpart. Therefore, we consider the extension a beneficial enhancement to GICA, especially when applied to challenging fMRI d…
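The reduction pipeline can be sketched as follows. This shows the conventional two-stage GICA shapes (subject-level PCA, temporal concatenation, group-level PCA), with a comment marking where the paper's MCCA stage would be inserted; all sizes are made up, and the ICA step itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical group fMRI data: 4 subjects, each 200 time points x 50 voxels.
subjects = [rng.standard_normal((200, 50)) for _ in range(4)]

def pca_reduce(X, k):
    """Keep the top-k principal components of X (rows = time points),
    returned as k component maps over the columns (voxels) of X."""
    Xc = X - X.mean(0)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return s[:k, None] * Vt[:k]

# Stage 1: subject-level PCA, reducing each subject's temporal dimension.
stage1 = [pca_reduce(X, 25) for X in subjects]   # each (25, 50)

# The extended method inserts multiset CCA here, to align the subjects'
# reduced datasets before the group stage; conventional GICA instead
# concatenates them temporally, as sketched below.
concat = np.vstack(stage1)                        # (100, 50)

# Final stage: group-level PCA; spatial ICA would then be run on `group`.
group = pca_reduce(concat, 10)
print(group.shape)  # (10, 50)
```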

PCA; multiset CCA; dimension reduction; dimensionality reduction; naturalistic fMRI; group independent component analysis; principal component analysis; canonical correlation; temporal concatenation; pattern recognition; music listening; speech recognition; artificial intelligence; mathematics