Search results for "Dimension Reduction"
Showing 5 of 15 documents
Linear Feature Extraction for Ranking
2018
We address the feature extraction problem for document ranking in information retrieval and propose LifeRank, a Linear feature extraction algorithm for Ranking. In LifeRank, we regard each document collection for ranking as a matrix, referred to as the original matrix, and optimize a transformation matrix so that a new matrix (dataset) can be generated as the product of the original matrix and the transformation matrix. The transformation matrix projects high-dimensional document vectors into lower dimensions. Theoretically, there is a very large number of candidate transformation matrices, each leading to a new generated matrix. In LifeRank, we produce a transformation matrix so that the generat…
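As a minimal sketch of the projection step described in the abstract (with a placeholder random orthonormal matrix standing in for the transformation matrix that LifeRank would actually optimize against a ranking objective):

```python
# Illustrative only: the projection step from the abstract, with a random
# orthonormal matrix standing in for the optimized LifeRank transformation.
import numpy as np

rng = np.random.default_rng(0)
n_docs, n_features, n_reduced = 1000, 50, 10

# "Original matrix": one row per document, one column per ranking feature.
X = rng.standard_normal((n_docs, n_features))

# Placeholder transformation matrix with orthonormal columns (via QR).
W, _ = np.linalg.qr(rng.standard_normal((n_features, n_reduced)))

# Generated matrix: the product of the original matrix and the transformation
# matrix, projecting documents into the lower-dimensional space.
X_new = X @ W
print(X_new.shape)  # (1000, 10)
```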
Beyond Tandem Analysis: Joint Dimension Reduction and Clustering in R
2019
We present the R package clustrd which implements a class of methods that combine dimension reduction and clustering of continuous or categorical data. In particular, for continuous data, the package contains implementations of factorial K-means and reduced K-means; both methods combine principal component analysis with K-means clustering. For categorical data, the package provides MCA K-means, i-FCB and cluster correspondence analysis, which combine multiple correspondence analysis with K-means. Two examples on real data sets are provided to illustrate the usage of the main functions.
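The clustrd package itself is written in R; as a hedged illustration only, here is a small Python sketch of the reduced K-means idea it implements for continuous data, alternating between K-means in the reduced space and an update of the projection, rather than running PCA and K-means in two separate ("tandem") steps:

```python
# Not the clustrd package: a minimal sketch of reduced K-means, which seeks a
# low-dimensional projection and a K-means partition jointly.
import numpy as np
from sklearn.cluster import KMeans

def reduced_kmeans(X, n_clusters=3, n_dims=2, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xc = X - X.mean(axis=0)                                  # center the data
    A, _ = np.linalg.qr(rng.standard_normal((p, n_dims)))    # initial loadings
    for _ in range(n_iter):
        # Step 1: K-means on the data projected into the current subspace.
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(Xc @ A)
        U = np.eye(n_clusters)[km.labels_]                   # cluster indicator matrix
        # Step 2: update the loadings given the partition: top eigenvectors of
        # X' P X, where P projects onto the cluster-indicator space.
        P = U @ np.linalg.pinv(U)                            # U (U'U)^{-1} U'
        _, vecs = np.linalg.eigh(Xc.T @ P @ Xc)
        A = vecs[:, -n_dims:]                                # largest eigenvalues
    return km.labels_, A

# Toy usage: three well-separated groups in six dimensions.
X = np.vstack([np.random.default_rng(i).normal(i * 3, 1, size=(50, 6)) for i in range(3)])
labels, loadings = reduced_kmeans(X, n_clusters=3, n_dims=2)
```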
Dimension reduction for $-\Delta_1$
2012
A 3D-2D dimension reduction for $-\Delta_1$ is obtained. A power-law approximation from $-\Delta_p$ as $p \to 1$ is also provided, in terms of $\Gamma$-convergence, duality, and asymptotics for least gradient functions.
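For readers unfamiliar with the notation, the operators involved are (standard definitions, not taken from the paper itself):

```latex
\[
  -\Delta_p u \;=\; -\operatorname{div}\!\bigl(|\nabla u|^{p-2}\,\nabla u\bigr),
  \qquad
  -\Delta_1 u \;=\; -\operatorname{div}\!\Bigl(\tfrac{\nabla u}{|\nabla u|}\Bigr).
\]
% As p -> 1, the associated energies \(\tfrac{1}{p}\int_\Omega |\nabla u|^p\,dx\)
% formally approach the total variation \(\int_\Omega |\nabla u|\,dx\), the
% functional minimized by least gradient functions; this is the sense in which
% the power-law approximation and the Gamma-convergence enter.
```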
Signal dimension estimation in BSS models with serial dependence
2022
Many modern multivariate time series datasets contain a large amount of noise, and the first step of the data analysis is to separate the noise channels from the signals of interest. A crucial part of this dimension reduction is determining the number of signals. In this paper we approach this problem by considering a noisy latent variable time series model which comprises many popular blind source separation models. We propose a general framework for the estimation of the signal dimension that is based on testing for sub-sphericity and give examples of different tests suitable for time series settings. In the inference we rely on bootstrap null distributions. Several simulation studies are…
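As an illustration of the general idea only (not the authors' tests, which are tailored to serially dependent time series), a naive i.i.d. sketch of a bootstrap sub-sphericity test for the signal dimension might look like this:

```python
# Naive sketch: estimate the signal dimension by testing, for increasing k,
# whether the p-k smallest covariance eigenvalues are equal (sub-sphericity),
# with a simple bootstrap null distribution.
import numpy as np

def subsphericity_stat(eigvals, k):
    """Scale-free spread of the p-k smallest eigenvalues (zero if all equal)."""
    lam = np.sort(eigvals)[: len(eigvals) - k]
    return np.mean((lam / lam.mean() - 1.0) ** 2)

def estimate_signal_dim(X, n_boot=200, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    lam, V = np.linalg.eigh(np.cov(X, rowvar=False))   # ascending eigenvalues
    for k in range(p):                                 # H0: exactly k signals
        stat = subsphericity_stat(lam, k)
        # Null dataset: equalize the p-k candidate noise eigenvalues.
        lam_null = lam.copy()
        lam_null[: p - k] = lam[: p - k].mean()
        scale = np.sqrt(lam_null / np.maximum(lam, 1e-12))
        X0 = (((X - X.mean(axis=0)) @ V) * scale) @ V.T
        # Bootstrap the statistic under the null by resampling rows of X0.
        boot = np.empty(n_boot)
        for b in range(n_boot):
            Xb = X0[rng.integers(0, n, n)]
            boot[b] = subsphericity_stat(np.linalg.eigvalsh(np.cov(Xb, rowvar=False)), k)
        if np.mean(boot >= stat) > alpha:              # H0 not rejected -> k signals
            return k
    return p

# Toy usage: 3 signal directions plus isotropic noise in 8 channels.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3)) @ rng.standard_normal((3, 8)) \
    + 0.3 * rng.standard_normal((500, 8))
print(estimate_signal_dim(X))   # typically prints 3
```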
Combining PCA and multiset CCA for dimension reduction when group ICA is applied to decompose naturalistic fMRI data
2015
An extension of group independent component analysis (GICA) is introduced, in which multi-set canonical correlation analysis (MCCA) is combined with principal component analysis (PCA) for three-stage dimension reduction. The method is applied to naturalistic functional MRI (fMRI) images acquired during a task-free continuous music listening experiment, and the results are compared with the outcome of conventional GICA. The extended GICA resulted in slightly faster ICA convergence and, more interestingly, extracted more stimulus-related components than its conventional counterpart. Therefore, we think the extension is a beneficial enhancement to GICA, especially when applied to challenging fMRI d…
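A schematic stand-in for the three-stage dimension reduction described above (toy data and dimensions, a MAXVAR-style MCCA step, and scikit-learn's PCA/FastICA in place of a real fMRI pipeline; not the authors' implementation):

```python
# Simplified sketch: subject-level PCA, a MAXVAR-style multiset CCA across
# subjects, and a final PCA before group-level ICA.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
n_subjects, n_time, n_voxels = 5, 200, 400   # toy sizes, not real fMRI dimensions
r_pca, r_mcca, n_ics = 20, 10, 5

# Toy multi-subject data sharing a few stimulus-driven time courses.
shared = rng.standard_normal((n_time, n_ics))
subjects = [shared @ rng.standard_normal((n_ics, n_voxels))
            + 0.5 * rng.standard_normal((n_time, n_voxels))
            for _ in range(n_subjects)]

# Stage 1: subject-level PCA (reduces the voxel dimension).
reduced = [PCA(n_components=r_pca).fit_transform(Y) for Y in subjects]

# Stage 2: MAXVAR-style MCCA. PCA scores have orthogonal columns, so
# normalizing each column whitens them; the leading left singular vectors of
# the column-wise concatenation are the variates most shared across subjects.
whitened = [Z / np.linalg.norm(Z, axis=0) for Z in reduced]
U, _, _ = np.linalg.svd(np.hstack(whitened), full_matrices=False)
group_variates = U[:, :r_mcca]                    # n_time x r_mcca

# Stage 3: final PCA followed by ICA at the group level.
group_scores = PCA(n_components=n_ics).fit_transform(group_variates)
ics = FastICA(n_components=n_ics, random_state=0).fit_transform(group_scores)
print(ics.shape)                                  # (200, 5) group-level components
```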