An efficient method for clustered multi-metric learning
Francesc J. Ferri, Carlos Morell, Bac Nguyen, Bernard De Baets

Subjects: Information Systems and Management; Theoretical Computer Science; Computer Science; Computer Science Applications; Artificial Intelligence; Control and Systems Engineering; Software; Disjoint sets; Regularization (mathematics); Field (computer science); Kernel (linear algebra); Kernel (statistics); Metric space; Metric (mathematics); Simple (abstract algebra); Embedding
Abstract: Distance metric learning, which aims at finding a distance metric that separates examples of one class from examples of the other classes, is key to the success of many machine learning tasks. Although there has been increasing interest in this field, learning a single global distance metric is insufficient to obtain satisfactory results when dealing with heterogeneously distributed data. A simple way to tackle such data is to use kernel embedding methods, but these quickly become computationally intractable as the number of examples grows. In this paper, we propose an efficient method that learns multiple local distance metrics instead of a single global one. More specifically, the training examples are divided into several disjoint clusters, and a distance metric is trained within each cluster to separate the data locally. In addition, a global regularization term is introduced to preserve common properties across clusters in the learned metric space. By learning multiple distance metrics jointly within a single unified optimization framework, our method consistently outperforms single-metric learning methods while being more efficient than other state-of-the-art multi-metric learning methods.
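The abstract describes the overall recipe (a disjoint partition of the training set, one local metric per cluster, and a global regularizer that couples the local metrics) without giving the exact objective. The Python sketch below only illustrates that structure under stated assumptions: k-means for the partition, a hinge-style triplet loss inside each cluster, and a Frobenius-norm term tying each local matrix to a shared matrix `M0`. The function names (`fit_clustered_metrics`, `project_psd`, `sample_triplets`), the loss, and the solver are illustrative choices, not the formulation used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans


def project_psd(M):
    """Project a symmetric matrix onto the positive semidefinite cone."""
    w, V = np.linalg.eigh((M + M.T) / 2.0)
    return (V * np.clip(w, 0.0, None)) @ V.T


def sample_triplets(X, y, idx, rng, n_triplets=200):
    """Sample (anchor, positive, negative) index triplets inside one cluster."""
    triplets, labels = [], y[idx]
    if len(idx) < 3:
        return triplets
    for _ in range(n_triplets):
        a = rng.integers(len(idx))
        pos = np.where(labels == labels[a])[0]
        neg = np.where(labels != labels[a])[0]
        if len(pos) < 2 or len(neg) == 0:
            continue
        p = rng.choice(pos[pos != a])
        n = rng.choice(neg)
        triplets.append((idx[a], idx[p], idx[n]))
    return triplets


def fit_clustered_metrics(X, y, n_clusters=3, lam=0.1, lr=1e-3, n_iter=50, seed=0):
    """Jointly learn one local Mahalanobis metric per cluster plus a shared matrix M0."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Disjoint partition of the training set (k-means is an assumption here).
    assign = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
    M = [np.eye(d) for _ in range(n_clusters)]  # local Mahalanobis matrices
    M0 = np.eye(d)                              # shared matrix used for coupling

    for _ in range(n_iter):
        # Gradient of the coupling term lam/2 * ||M_c - M0||_F^2 for each cluster.
        grads = [lam * (M[c] - M0) for c in range(n_clusters)]
        for c in range(n_clusters):
            idx = np.where(assign == c)[0]
            for a, p, n in sample_triplets(X, y, idx, rng):
                dp, dn = X[a] - X[p], X[a] - X[n]
                # Hinge triplet loss: want d_M(a, n) > d_M(a, p) + 1.
                if 1.0 + dp @ M[c] @ dp - dn @ M[c] @ dn > 0:
                    grads[c] += np.outer(dp, dp) - np.outer(dn, dn)
        # Descent step on each local metric, then project back onto the PSD cone.
        for c in range(n_clusters):
            M[c] = project_psd(M[c] - lr * grads[c])
        # Closed-form update of the shared matrix: the average of the local
        # metrics minimizes the sum of Frobenius coupling terms.
        M0 = sum(M) / n_clusters
    return M, M0, assign
```

The alternating scheme (gradient steps on the local metrics, then a closed-form update of the shared matrix, with a PSD projection so each local matrix remains a valid Mahalanobis metric) is one plausible way to realize the joint optimization the abstract mentions. At test time a query would presumably be routed to its nearest cluster (e.g., by centroid distance) and compared under that cluster's metric.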
| year | journal | country | edition | language |
|---|---|---|---|---|
| 2019-01-01 | Information Sciences | | | |