Search results for "Component"
Showing 10 of 1,682 documents
Optimized Kernel Entropy Components
2016
This work addresses two main issues of the standard Kernel Entropy Component Analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to sorting kernel eigenvectors by entropy rather than by variance, as in Kernel Principal Component Analysis. Here we propose an extension of KECA, named Optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy, compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular…
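The distinction the abstract draws, ranking kernel eigenpairs by entropy instead of by variance, can be sketched in a few lines of NumPy. This is a generic illustration of the standard KECA ranking rule, not the authors' OKECA implementation; the function name and kernel choice are assumptions.

```python
import numpy as np

def keca_components(K, n_components):
    """Rank kernel eigenpairs by their Renyi-entropy contribution."""
    lam, E = np.linalg.eigh(K)            # symmetric kernel matrix, ascending order
    lam, E = lam[::-1], E[:, ::-1]        # descending, as in KPCA
    lam = np.clip(lam, 0.0, None)         # guard against tiny negative eigenvalues
    # Entropy contribution of eigenpair i: (sqrt(lambda_i) * 1^T e_i)^2
    scores = (np.sqrt(lam) * E.sum(axis=0)) ** 2
    order = np.argsort(scores)[::-1][:n_components]
    # Project onto the selected axes, scaled as in KPCA
    return E[:, order] * np.sqrt(lam[order])
```

With a Gaussian (RBF) kernel the entropy ordering can differ from the eigenvalue ordering, which is exactly why KECA can concentrate the information in fewer components than KPCA.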
A General Framework for Complex Network-Based Image Segmentation
2019
With recent advances in complex network theory, graph-based techniques for image segmentation have attracted great attention. To segment an image into meaningful connected components, this paper proposes a general image segmentation framework based on community detection algorithms from complex networks. If regions are regarded as communities, applying community detection algorithms directly can lead to an over-segmented image. To address this problem, we start by splitting the image into small regions using an initial segmentation. The obtained regions are then used to build the complex network. To produce meaningful connected components and detect …
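The regions-as-communities idea can be made concrete with weighted modularity, the score most community detection algorithms optimize: merging similar regions into one community scores far better than keeping one community per region (the over-segmented case). A minimal self-contained sketch; the toy graph and weights are invented for illustration, not taken from the paper.

```python
def modularity(edges, part):
    """Weighted Newman modularity of a partition.

    edges: {(u, v): weight} for an undirected graph; part: {node: community}.
    """
    m2 = 2.0 * sum(edges.values())                      # 2m
    deg, s_in, s_tot = {}, {}, {}
    for (u, v), w in edges.items():
        deg[u] = deg.get(u, 0.0) + w
        deg[v] = deg.get(v, 0.0) + w
        if part[u] == part[v]:                          # intra-community edge
            s_in[part[u]] = s_in.get(part[u], 0.0) + 2.0 * w
    for node, d in deg.items():
        s_tot[part[node]] = s_tot.get(part[node], 0.0) + d
    return sum(s_in.get(c, 0.0) / m2 - (s_tot[c] / m2) ** 2 for c in s_tot)

# Two triangles of regions joined by a single bridge edge.
edges = {(0, 1): 1.0, (0, 2): 1.0, (1, 2): 1.0,
         (3, 4): 1.0, (3, 5): 1.0, (4, 5): 1.0, (2, 3): 1.0}
merged = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}   # one community per triangle
over_segmented = {n: n for n in range(6)}       # one community per region
```

Here `modularity(edges, merged)` is 5/14 ≈ 0.357, while the over-segmented partition scores negative, which is why the paper starts from an initial segmentation before detecting communities.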
A novel approach to integration by parts reduction
2015
Integration by parts reduction is a standard component of most modern multi-loop calculations in quantum field theory. We present a novel strategy constructed to overcome the limitations of currently available reduction programs based on Laporta's algorithm. The key idea is to construct algebraic identities from numerical samples obtained from reductions over finite fields. We expect the method to be highly amenable to parallelization, show a low memory footprint during the reduction step, and allow for significantly better run-times.
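The key idea, recovering exact algebraic identities from numerical samples over finite fields, rests on rational reconstruction: after combining samples modulo primes, one recovers a fraction p/q from its modular image. A compact sketch of the standard algorithm (Wang's truncated extended Euclid); this is generic illustration, not the authors' reduction code.

```python
from math import isqrt

def rational_reconstruct(a, m):
    """Recover p/q with p = a*q (mod m) and |p|, q <= sqrt(m/2)."""
    r0, r1 = m, a % m
    t0, t1 = 0, 1
    bound = isqrt(m // 2)
    while r1 > bound:                    # truncated extended Euclidean algorithm
        q = r0 // r1
        r0, r1 = r1, r0 - q * r1
        t0, t1 = t1, t0 - q * t1
    if abs(t1) > bound:
        return None                      # no admissible fraction exists
    if t1 < 0:
        r1, t1 = -r1, -t1                # normalize to a positive denominator
    return r1, t1                        # (numerator, denominator)
```

For example, the image of 22/7 modulo the prime 10007 is 7151, and `rational_reconstruct(7151, 10007)` returns `(22, 7)`. In a real multi-loop reduction the same recovery is applied coefficient by coefficient after Chinese-remainder lifting over several primes.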
Diffusion map for clustering fMRI spatial maps extracted by Independent Component Analysis
2013
Functional magnetic resonance imaging (fMRI) produces data about activity inside the brain, from which spatial maps can be extracted by independent component analysis (ICA). Such a dataset contains n spatial maps of p voxels each, where p is very large compared with n. Clustering of the spatial maps is usually based on correlation matrices. This generally works well, although such a similarity matrix can inherently explain only a certain amount of the total variance contained in the high-dimensional data, where n is relatively small but p is large. For such a high-dimensional space, it is reasonable to perform dimensionality reduction before clustering.…
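The dimensionality-reduction step the abstract motivates can be sketched as a basic diffusion-map embedding of the map-by-map affinity matrix: build a Gaussian affinity, normalize it into a random-walk matrix, and embed with its leading nontrivial eigenvectors. The bandwidth `eps` and function names are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def diffusion_map(X, n_components=2, eps=1.0, t=1):
    """Diffusion-map embedding of the rows of X (one row per spatial map)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / eps)                       # Gaussian affinity matrix
    P = K / K.sum(axis=1, keepdims=True)        # row-stochastic random walk
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    lam = vals.real[order][1:n_components + 1]  # drop the trivial eigenvalue 1
    psi = vecs.real[:, order][:, 1:n_components + 1]
    return psi * lam ** t                       # diffusion coordinates
```

On two well-separated groups of 1-D "maps", the first diffusion coordinate separates the groups by sign, after which any standard clustering method can be applied in the low-dimensional space.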
Upper bounds on the probability of finding marked connected components using quantum walks
2019
Quantum walk search may exhibit phenomena beyond the intuition drawn from conventional random walk theory. One such example is the exceptional configuration phenomenon: it may be much harder to find any of two or more marked vertices than if only one of them is marked. In this paper, we analyze the probability of finding any of the marked vertices in such scenarios and prove upper bounds for various sets of marked vertices. We apply these upper bounds to a large collection of graphs and show that quantum search may be slow even on real-world networks.
Extracting Backbones in Weighted Modular Complex Networks
2020
Network science provides effective tools to model and analyze complex systems. However, the increasing size of real-world networks is a major hurdle to understanding their structure and topological features. Therefore, mapping the original network onto a smaller one while preserving its information is an important issue. Extracting the so-called backbone of a network is a very challenging problem that is generally handled either by coarse-graining or by filter-based methods. Coarse-graining methods reduce the network size by grouping similar nodes, while filter-based methods prune the network by discarding nodes or edges based on a statistical property. In this paper, we pro…
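As an example of the filter-based family the abstract contrasts with coarse-graining, the classical disparity filter (Serrano et al.) keeps an edge only if its weight is statistically significant against a null model that spreads a node's strength uniformly over its neighbors. A compact sketch of that filter, not the method proposed in this paper:

```python
def disparity_backbone(edges, alpha=0.05):
    """Keep edges significant at level alpha under the uniform null model.

    edges: {(u, v): weight} of an undirected weighted network.
    """
    nbrs = {}
    for (u, v), w in edges.items():
        nbrs.setdefault(u, {})[v] = w
        nbrs.setdefault(v, {})[u] = w
    keep = {}
    for (u, v), w in edges.items():
        for node in (u, v):              # significant from either endpoint
            k = len(nbrs[node])
            if k <= 1:
                continue                 # too few neighbors to assess
            p = w / sum(nbrs[node].values())
            if (1.0 - p) ** (k - 1) < alpha:   # p-value of the edge weight
                keep[(u, v)] = w
                break
    return keep
```

On a star whose center has one heavy edge (weight 100) and nine light edges (weight 1), only the heavy edge survives the filter, illustrating how pruning preserves the salient structure.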
PRINCIPAL POLYNOMIAL ANALYSIS
2014
© 2014 World Scientific Publishing Company. This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves instead of straight lines. Contrary to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA has a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, such an inverse can be obtained…
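The construction described, a leading principal direction plus simple univariate regressions on the orthogonal residuals, can be sketched for a single component as follows. This is a simplified illustration under assumed names and a fixed polynomial degree, not the authors' full PPA (which fits a sequence of such polynomials).

```python
import numpy as np

def ppa_first_component(X, degree=2):
    """Fit one principal polynomial: a PCA axis plus polynomial residual fits."""
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    t = Xc @ Vt[0]                       # projection onto the leading PCA axis
    resid = Xc - np.outer(t, Vt[0])      # residual orthogonal to that axis
    # One simple univariate polynomial regression per residual coordinate
    coefs = [np.polyfit(t, resid[:, j], degree) for j in range(X.shape[1])]
    curve = np.column_stack([np.polyval(c, t) for c in coefs])
    return t, curve + np.outer(t, Vt[0]) + mu   # points on the fitted curve
```

On data lying exactly on a parabola, the fitted curve reproduces the data to machine precision, whereas a straight PCA line cannot, which is the generalization the abstract claims.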
Metastable memristive lines for signal transmission and information processing applications
2016
Traditional studies of memristive devices have mainly focused on their applications in nonvolatile information storage and information processing. Here, we demonstrate that the third fundamental component of information technologies, the transfer of information, can also be implemented with memristive devices. For this purpose, we introduce a metastable memristive circuit. Combining metastable memristive circuits into a line, one obtains an architecture capable of transferring a signal edge from one spatial location to another. We emphasize that the suggested metastable memristive lines employ only resistive circuit components. Moreover, their networks (for example, Y-connected lines) have an info…
Corrigendum: ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density
2018
The study of reaction times and their underlying cognitive processes is an important field in psychology. Reaction times are usually modeled with the ex-Gaussian distribution, because it provides a good fit to a wide range of empirical data. The complexity of this distribution makes computational tools an essential element in the field, so there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools written in Python for the numerical analysis of data involving the ex
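The ex-Gaussian is simply the sum of a Gaussian and an independent exponential variate, which makes both sampling and its closed-form density easy to sketch. This is generic textbook code in the common (mu, sigma, tau) parameterization, not the ExGUtils API.

```python
import math
import random

def exgauss_pdf(x, mu, sigma, tau):
    """Ex-Gaussian density: Gaussian(mu, sigma) convolved with Exp(mean tau).

    f(x) = (1/tau) * exp(sigma^2/(2 tau^2) - (x - mu)/tau) * Phi((x - mu)/sigma - sigma/tau)
    """
    z = (x - mu) / sigma - sigma / tau
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    return (1.0 / tau) * math.exp(sigma ** 2 / (2.0 * tau ** 2) - (x - mu) / tau) * phi

def exgauss_sample(mu, sigma, tau, n, rng):
    """Draw n ex-Gaussian variates: normal component plus exponential component."""
    return [rng.gauss(mu, sigma) + rng.expovariate(1.0 / tau) for _ in range(n)]
```

Because the two components are independent, the mean is mu + tau and the positive skew comes entirely from the exponential part, which is what makes the distribution a good model for reaction times.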
Conditional Bias Robust Estimation of the Total of Curve Data by Sampling in a Finite Population: An Illustration on Electricity Load Curves
2020
For marketing or power grid management purposes, many studies based on the analysis of total electricity consumption curves of groups of customers are now carried out by electricity companies. Aggregated totals or mean load curves are estimated from individual curves measured on a fine time grid and collected according to some sampling design. Due to the skewness of the electricity consumption distribution, these samples often contain outlying curves, which may have an important impact on the usual estimation procedures. We introduce several robust estimators of the total consumption curve that are not sensitive to such outlying curves. These estimators are based on the conditio…
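The design-based setting the abstract works in can be illustrated with the basic Horvitz-Thompson estimator of a total curve, together with a crude winsorized variant that caps outlying weighted contributions pointwise. This is illustrative only; the paper's conditional-bias-based estimators are more refined than this cap.

```python
import numpy as np

def ht_total_curve(curves, pi):
    """Horvitz-Thompson estimator of the population total curve.

    curves: sampled load curves, one row per customer; pi: inclusion probabilities.
    """
    w = 1.0 / np.asarray(pi)[:, None]           # design weights 1/pi_i
    return (np.asarray(curves) * w).sum(axis=0)

def winsorized_total_curve(curves, pi, cap):
    """Robust variant: cap each weighted contribution pointwise at `cap`."""
    contrib = np.asarray(curves) / np.asarray(pi)[:, None]
    return np.minimum(contrib, cap).sum(axis=0)
```

With one outlying curve in the sample, the capped estimator limits its influence on the total, which is the basic motivation for the robust estimators the paper develops.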