Search results for "tucker"
Showing 7 of 7 documents
Multi-domain feature extraction for small event-related potentials through nonnegative multi-way array decomposition from low dense array EEG
2013
Non-negative Canonical Polyadic decomposition (NCPD) and non-negative Tucker decomposition (NTD) were compared for extracting the multi-domain feature of visual mismatch negativity (vMMN), a small event-related potential (ERP), for cognitive research. Since the signal-to-noise ratio of vMMN is low, NTD outperformed NCPD. Moreover, we proposed an approach to select the multi-domain feature of an ERP among all extracted features and discussed how to determine the numbers of extracted components in NCPD and NTD in the ERP context.
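As a rough illustration of the comparison described above (not the authors' pipeline), the following sketch fits both models to a toy channel x frequency x time array, assuming TensorLy's non_negative_parafac and non_negative_tucker interfaces; the tensor shape, ranks, and iteration counts are arbitrary assumptions.

import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac, non_negative_tucker

rng = np.random.default_rng(0)
erp = tl.tensor(rng.random((32, 20, 100)))   # stand-in for a channel x frequency x time ERP tensor

# NCPD: a single number of components shared by all modes.
weights, cp_factors = non_negative_parafac(erp, rank=4, n_iter_max=200)

# NTD: a separate number of components for the spatial, spectral and temporal modes.
core, ntd_factors = non_negative_tucker(erp, rank=[4, 3, 5], n_iter_max=200)

# Relative reconstruction errors, one crude way to compare the two models on the same data.
err_cp = tl.norm(erp - tl.cp_to_tensor((weights, cp_factors))) / tl.norm(erp)
err_ntd = tl.norm(erp - tl.tucker_to_tensor((core, ntd_factors))) / tl.norm(erp)
print(f"NCPD error: {err_cp:.3f}, NTD error: {err_ntd:.3f}")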
On the WGSC Property in Some Classes of Groups
2009
The property of quasi-simple filtration (or qsf) for groups was introduced into the literature more than 10 years ago by S. Brick. For groups, it is equivalent to weak geometric simple connectivity (or wgsc). The main interest of these notions is that it is still not known whether all finitely presented groups are wgsc (qsf). The present note deals with the wgsc property for solvable groups and generalized FC-groups. Moreover, a relation between the almost-convexity condition and the Tucker property, which is related to the wgsc property, is considered for 3-manifold groups.
Nonnegative Tensor Train Decompositions for Multi-domain Feature Extraction and Clustering
2016
Tensor train (TT) is one of the modern tensor decomposition models for low-rank approximation of high-order tensors. For nonnegative multiway array data analysis, we propose a nonnegative TT (NTT) decomposition algorithm for the NTT model and a hybrid model called the NTT-Tucker model. Employing the hierarchical alternating least squares approach, the algorithm efficiently optimizes each fiber vector of the core tensors at each iteration. We compared the performance of the proposed method with a standard nonnegative Tucker decomposition (NTD) algorithm on benchmark data sets, including event-related potential data and facial image data, in multi-domain feature extraction and clustering tasks. It i…
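For readers unfamiliar with the TT format itself, the sketch below builds a small nonnegative third-order tensor from nonnegative TT cores and reconstructs it; it illustrates only the model structure, not the authors' HALS-based NTT or NTT-Tucker algorithms, and all sizes are assumptions.

import numpy as np

I, J, K = 4, 5, 6          # tensor dimensions (toy sizes)
r1, r2 = 2, 3              # TT ranks; the boundary ranks are fixed to 1
rng = np.random.default_rng(0)

# Nonnegative TT cores: G1 is 1 x I x r1, G2 is r1 x J x r2, G3 is r2 x K x 1.
G1 = rng.random((1, I, r1))
G2 = rng.random((r1, J, r2))
G3 = rng.random((r2, K, 1))

# TT reconstruction: X[i, j, k] = G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :]
X = np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)
assert X.shape == (I, J, K) and (X >= 0).all()   # nonnegative cores give a nonnegative tensor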
Tensor decomposition of EEG signals: A brief review
2015
Electroencephalography (EEG) is one fundamental tool for functional brain imaging. EEG signals tend to be represented by a vector or a matrix to facilitate data processing and analysis with widely understood methodologies such as time-series analysis, spectral analysis and matrix decomposition. Indeed, EEG signals naturally carry more than the two modes of time and space, and they can be represented by a multi-way array called a tensor. This review summarizes the current progress of tensor decomposition of EEG signals in three aspects. The first concerns the existing modes and tensors of EEG signals. Second, two fundamental tensor decomposition models, canonical polyadic decomposit…
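As a small sketch of the first aspect (how a multi-channel EEG recording becomes a third-order tensor), the code below stacks per-channel time-frequency maps; scipy.signal.spectrogram stands in here for the wavelet-type transforms usually used in this literature, and the sampling rate and sizes are assumptions.

import numpy as np
from scipy.signal import spectrogram

fs = 250                                   # assumed sampling rate in Hz
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 10 * fs))   # stand-in for 32 channels x 10 s of EEG

slices = []
for channel in eeg:
    f, t, Sxx = spectrogram(channel, fs=fs, nperseg=fs)   # frequency x time power map
    slices.append(Sxx)

eeg_tensor = np.stack(slices)              # channels x frequencies x time frames
print(eeg_tensor.shape)                    # a third-order tensor ready for CP or Tucker decomposition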
On the proper homotopy invariance of the Tucker property
2006
A non-compact polyhedron P is Tucker if, for any compact subset K ⊂ P, the fundamental group π1(P − K) is finitely generated. The main result of this note is that a manifold which is proper homotopy equivalent to a Tucker polyhedron is Tucker. We use Poenaru’s theory of the equivalence relations forced by the singularities of a non-degenerate simplicial map.
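For reference, the abstract's definition and main statement can be restated as follows (a LaTeX rendering of the same statements; the theorem environments are assumed).

\begin{definition}
A non-compact polyhedron $P$ has the Tucker property if, for every compact subset
$K \subset P$, the fundamental group $\pi_1(P \setminus K)$ is finitely generated.
\end{definition}

\begin{theorem}
If a manifold $M$ is proper homotopy equivalent to a Tucker polyhedron, then $M$ is Tucker.
\end{theorem}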
Low-Rank Tucker-2 Model for Multi-Subject fMRI Data Decomposition with Spatial Sparsity Constraint
2022
Tucker decomposition can provide an intuitive summary for understanding brain function by decomposing multi-subject fMRI data into a core tensor and multiple factor matrices, and it has mostly been used to extract functional connectivity patterns across time/subjects using orthogonality constraints. However, these algorithms are unsuitable for extracting common spatial and temporal patterns across subjects due to distinct characteristics of fMRI data such as high levels of noise. Motivated by a successful application of Tucker decomposition to image denoising and the intrinsic sparsity of spatial activations in fMRI, we propose a low-rank Tucker-2 model with a spatial sparsity constraint to analyze multi-subject fMRI dat…
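To make the model structure concrete, the sketch below computes a one-pass (HOSVD-style) Tucker-2 decomposition of a toy voxels x time x subjects array, with a crude soft-threshold on the spatial factor as a stand-in for the paper's sparsity constraint; this is not the authors' algorithm, and all sizes and the threshold value are assumptions.

import numpy as np

V, T, S = 200, 50, 8            # voxels, time points, subjects (toy sizes)
R1, R2 = 5, 4                   # numbers of spatial and temporal components
rng = np.random.default_rng(0)
X = rng.standard_normal((V, T, S))

# Tucker-2: factor matrices on the voxel and time modes only; the subject mode is
# left uncompressed and absorbed into the core.
U1, _, _ = np.linalg.svd(X.reshape(V, T * S), full_matrices=False)
U2, _, _ = np.linalg.svd(np.moveaxis(X, 1, 0).reshape(T, V * S), full_matrices=False)
A = U1[:, :R1]                                   # V x R1 shared spatial maps
B = U2[:, :R2]                                   # T x R2 shared time courses
G = np.einsum('vts,vi,tj->ijs', X, A, B)         # R1 x R2 x S core with subject-specific loadings

# Crude sparsity step: soft-threshold small spatial weights toward zero.
tau = 0.01
A_sparse = np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

X_hat = np.einsum('ijs,vi,tj->vts', G, A_sparse, B)    # sparse Tucker-2 reconstruction
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))   # relative reconstruction error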
The Tucker tensor decomposition for data analysis: capabilities and advantages
2022
Tensors are powerful multi-dimensional mathematical objects that easily embed various data models such as relational, graph, and time-series data. Furthermore, tensor decomposition operators are of great utility for revealing hidden patterns and complex relationships in data. In this article, we propose to study the analytical capabilities of the Tucker decomposition, as well as the differences brought by its major algorithms. We demonstrate these differences through practical examples on several datasets having a ground truth. It is preliminary work to add the Tucker decomposition to the Tensor Data Model, a model aiming to make tensors data-centric and to optimize operators in order to enable…
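As a minimal sketch of how such algorithmic differences can be probed, the code below runs Tucker decomposition twice on the same toy tensor with different initialisations and compares the fits, assuming TensorLy's tucker interface (HOOI); the data, the ranks, and the choice of initialisation as the varied ingredient are assumptions, not the article's experimental setup.

import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
X = tl.tensor(rng.random((30, 40, 20)))

for init in ("svd", "random"):
    core, factors = tucker(X, rank=[5, 5, 5], init=init, random_state=0)
    err = tl.norm(X - tl.tucker_to_tensor((core, factors))) / tl.norm(X)
    print(f"init={init}: relative reconstruction error {err:.4f}")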