Search results for "Normal"
Showing 10 of 2,571 documents
A saturated strategy robustly ensures stability of the cooperative equilibrium for Prisoner's dilemma
2016
We study the diffusion of cooperation in a two-population game in continuous time. At each instant, the game involves two random individuals, one from each population. The game has the structure of a Prisoner's dilemma in which each player can choose either to cooperate (c) or to defect (d), and is reframed within the field of approachability in two-player repeated games with vector payoffs. We turn the game into a dynamical system, which is positive, and propose a saturated strategy that ensures local asymptotic stability of the equilibrium (c, c) for any possible choice of the payoff matrix. We show that there exists a rectangle, in the space of payoffs, which is positively invariant for the syst…
Nonholonomic Interpolation for Kinematic Problems, Entropy and Complexity
2008
Here we present the main lines of a theory we developed in a series of previous papers, about the motion planning problem in robotics. We illustrate the theory with a few academic examples.
Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment.
2007
Background: Similarity of sequences is a key mathematical notion for classification and phylogenetic studies in biology. It is currently handled primarily using alignments. However, alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and appear to be confined to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov complexity, and universality is its most striking novel feature. Since it can only be approximated via data compression, USM is a methodology rath…
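In practice, the USM mentioned in this abstract is approximated by the Normalized Compression Distance (NCD), which replaces Kolmogorov complexity with the output size of a real compressor. A minimal sketch using zlib (the choice of compressor and the toy sequences are illustrative assumptions, not the paper's setup):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a practical approximation of the
    Universal Similarity Metric using a real compressor (zlib here)."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar sequences compress well together, giving a smaller distance.
a = b"ACGTACGTACGTACGT" * 20
b_similar = b"ACGTACGTACGTACGT" * 20
c_different = b"TTGACGGTACCAGTTA" * 20
print(ncd(a, b_similar), ncd(a, c_different))
```

Because the distance is driven only by compressibility, the same code applies unchanged to DNA, protein, or arbitrary byte sequences, which is the scalability argument made in the abstract.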
A comparison of STARFM and an unmixing-based algorithm for Landsat and MODIS data fusion
2015
The focus of the current study is to compare data fusion methods applied to sensors with medium and high spatial resolutions. Two documented methods are applied: the spatial and temporal adaptive reflectance fusion model (STARFM) and an unmixing-based method that uses a Bayesian formulation to incorporate prior spectral information. Furthermore, the strengths of both algorithms are combined in a novel data fusion method: the Spatial and Temporal Reflectance Unmixing Model (STRUM). The potential of each method is demonstrated using simulation imagery and Landsat and MODIS imagery. The theoretical basis of the algorithms causes STARFM and STRUM to produce Landsat-like reflecta…
On the Computation of Symmetrized M-Estimators of Scatter
2016
This paper focuses on the computational aspects of symmetrized M-estimators of scatter, i.e., multivariate M-estimators of scatter computed on the pairwise differences of the data. Such estimators do not require a location estimate and, more importantly, possess the important block and joint independence properties. These properties are needed, for example, when solving the independent component analysis problem. Classical and recently developed algorithms for computing the M-estimators and the symmetrized M-estimators are discussed. The effect of parallelization is considered, as well as a new computational approach based on using only a subset of pairwise differences. Efficiencies and…
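The key construction in this abstract, an M-estimator applied to pairwise differences rather than to centered data, can be sketched as follows. This is a minimal illustration using Tyler-type weights with a trace normalization; it is an assumed stand-in, not the paper's exact algorithms:

```python
import numpy as np

def pairwise_differences(X):
    """All pairwise differences x_i - x_j, i < j, of the rows of X.
    Symmetrized scatter estimators work on these differences, so no
    location (mean) estimate is needed."""
    i, j = np.triu_indices(X.shape[0], k=1)
    return X[i] - X[j]

def symmetrized_scatter(X, iters=50, eps=1e-6):
    """Fixed-point iteration for a Tyler-type M-estimator of scatter
    computed on the pairwise differences (illustrative sketch)."""
    D = pairwise_differences(X)
    m, p = D.shape
    V = np.eye(p)
    for _ in range(iters):
        Vinv = np.linalg.inv(V)
        # Squared Mahalanobis-type norms of each difference under V.
        d2 = np.einsum('ij,jk,ik->i', D, Vinv, D) + eps
        V_new = p * (D / d2[:, None]).T @ D / m
        V_new /= np.trace(V_new)  # fix the scale; Tyler's estimator is scale-free
        if np.linalg.norm(V_new - V) < eps:
            V = V_new
            break
        V = V_new
    return V

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3)) @ np.diag([3.0, 1.0, 0.5])
print(symmetrized_scatter(X))
```

Note the quadratic cost visible here: n samples produce n(n-1)/2 differences, which is exactly what motivates the paper's subset-of-differences approach.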
Co-citation Percentile Rank and JYUcite: a new network-standardized output-level citation influence metric and its implementation using Dimensions A…
2022
Judging the value of scholarly outputs quantitatively remains a difficult but unavoidable challenge. Most of the proposed solutions suffer from three fundamental shortcomings: they involve (i) the concept of the journal, in one way or another, (ii) calculating arithmetic averages from extremely skewed distributions, and (iii) binning data by calendar year. Here, we introduce a new metric, Co-citation Percentile Rank (CPR), which relates the current citation rate of the target output, taken at a resolution of days since first citable, to the distribution of current citation rates of outputs in its co-citation set, as its percentile rank in that set. We explore some of its properties with an examp…
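The core of the CPR definition is a plain percentile rank of one citation rate within the rates of its co-citation set. A minimal sketch (the rates and the half-credit tie-handling convention are illustrative assumptions; the paper's day-resolution rate computation is not reproduced here):

```python
def percentile_rank(target_rate, cocitation_rates):
    """Percentile rank of the target output's citation rate within the
    citation-rate distribution of its co-citation set (sketch of the
    CPR idea)."""
    below = sum(r < target_rate for r in cocitation_rates)
    ties = sum(r == target_rate for r in cocitation_rates)
    # Standard percentile-rank convention: half credit for ties.
    return 100.0 * (below + 0.5 * ties) / len(cocitation_rates)

# Hypothetical citations-per-day rates of outputs in a co-citation set.
rates = [0.1, 0.4, 0.4, 0.9, 1.3, 2.0]
print(percentile_rank(0.4, rates))  # one rate below, two ties -> 33.33...
```

Because the rank is taken within each output's own co-citation set, the metric is self-normalizing across fields, which is how it avoids journal-based and calendar-year-based comparisons.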
Divisive normalization image quality metric revisited.
2010
Structural similarity metrics and information-theory-based metrics have been proposed as completely different alternatives to the traditional metrics based on error visibility and human vision models. Three basic criticisms were raised against the traditional error visibility approach: (1) it is based on near-threshold performance, (2) its geometric meaning may be limited, and (3) stationary pooling strategies may not be statistically justified. These criticisms and the good performance of structural and information-theory-based metrics have popularized the idea of their superiority over the error visibility approach. In this work we experimentally or analytically show that the above critic…
Geometric Algebra Rotors for Sub-symbolic Coding of Natural Language Sentences
2007
A sub-symbolic encoding methodology for natural language sentences is presented. The procedure is based on the creation of an LSA-inspired semantic space and associates rotation operators derived from Geometric Algebra to word bigrams of the sentence. The operators are subsequently applied to an orthonormal standard basis of the created semantic space according to the order in which words appear in the sentence. The final rotated basis is then coded as a vector and its orthogonal part constitutes the sub-symbolic coding of the sentence. Preliminary experimental results for a classification task, compared with the traditional LSA methodology, show the effectiveness of the approach.
Coarse to fine: toward an intelligent 3D acquisition system
2015
The 3D acquisition-compression-processing chain is, most of the time, sequenced into independent stages. As a result, a large number of 3D points are acquired regardless of the geometry of the object and of the processing to be done in further steps. It appears, particularly in 3D modeling of mechanical parts and in CAD, that the acquisition of such an amount of data is not always necessary. We propose a method aiming at minimizing the number of 3D points to be acquired with respect to the local geometry of the part, and therefore at compressing the cloud of points during the acquisition stage. The method we propose is based on a new coarse-to-fine approach in which, from a coa…
Parallelization strategies for density matrix renormalization group algorithms on shared-memory systems
2003
Shared-memory parallelization (SMP) strategies for density matrix renormalization group (DMRG) algorithms enable the treatment of complex systems in solid state physics. We present two different approaches by which parallelization of the standard DMRG algorithm can be accomplished in an efficient way. The methods are illustrated with DMRG calculations of the two-dimensional Hubbard model and the one-dimensional Holstein-Hubbard model on contemporary SMP architectures. The parallelized code shows good scalability up to at least eight processors and allows us to solve problems which exceed the capability of sequential DMRG calculations.