AUTHOR
Hector Gonzalez
Accelerated Proximal Gradient Descent in Metric Learning for Kernel Regression
The purpose of this paper is to learn a specific distance function for the Nadaraya-Watson estimator to be applied as a non-linear classifier. The idea of transforming the predictor variables and learning a kernel function based on a Mahalanobis pseudo-distance through a low-rank structure in the distance function guides the development of this problem. In the context of metric learning for kernel regression, we introduce an Accelerated Proximal Gradient method to solve the non-convex optimization problem with a better convergence rate than gradient descent. An extensive experiment and the corresponding discussion aim to show that our strategy is a competitive solution in relation to p…
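A minimal sketch of the setting described above, not the paper's implementation: a Nadaraya-Watson estimator whose Gaussian kernel uses a low-rank Mahalanobis pseudo-distance d(x, x') = ||A x - A x'||^2, trained by a FISTA-style accelerated (proximal) gradient loop on a leave-one-out loss. The finite-difference gradient and the absence of an explicit penalty (so the proximal step is the identity) are assumptions made for brevity.

import numpy as np

def nw_predict(A, X_train, y_train, X_query):
    """Nadaraya-Watson estimate with a Gaussian kernel on the learned metric."""
    Z_train, Z_query = X_train @ A.T, X_query @ A.T            # project with low-rank A
    d2 = ((Z_query[:, None, :] - Z_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2)                                             # kernel weights
    return (K @ y_train) / np.maximum(K.sum(axis=1), 1e-12)

def loo_loss(A_flat, X, y, rank):
    """Leave-one-out squared error of the estimator, used as training objective."""
    A = A_flat.reshape(rank, X.shape[1])
    Z = X @ A.T
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2)
    np.fill_diagonal(K, 0.0)                                    # exclude the point itself
    y_hat = (K @ y) / np.maximum(K.sum(axis=1), 1e-12)
    return ((y_hat - y) ** 2).mean()

def fista(A0, X, y, step=1e-2, iters=200, eps=1e-5):
    """Accelerated gradient with a finite-difference gradient as a stand-in."""
    rank, d = A0.shape
    x = v = A0.ravel().copy()
    t = 1.0
    for _ in range(iters):
        g = np.array([(loo_loss(v + eps * e, X, y, rank) -
                       loo_loss(v - eps * e, X, y, rank)) / (2 * eps)
                      for e in np.eye(v.size)])                 # numerical gradient
        x_new = v - step * g                                    # prox is identity: no penalty here
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        v = x_new + ((t - 1) / t_new) * (x_new - x)             # Nesterov extrapolation
        x, t = x_new, t_new
    return x.reshape(rank, d)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
A = fista(rng.normal(scale=0.1, size=(2, 5)), X, y)
print(nw_predict(A, X, y, X[:3]))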
Improving Nearest Neighbor Based Multi-target Prediction Through Metric Learning
The purpose of this work is to learn specific distance functions to be applied to multi-target regression problems using nearest neighbors. The idea of preserving the order relation between input and output vectors, considering their corresponding distances, is used along with a maximal margin criterion to formulate a specific metric learning problem. Extensive experiments and the corresponding discussion put forward the advantages of the proposed algorithm, which can be considered a generalization of previously proposed approaches. Preliminary results suggest that this line of work can lead to very competitive algorithms with convenient properties.
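For illustration only, a sketch of the prediction side of this setting, assuming a learned linear metric L is already available (the order-preservation and maximal-margin learning of L itself is omitted): k-nearest-neighbour multi-target prediction under d(x, x') = ||L x - L x'||.

import numpy as np

def knn_multi_target(L, X_train, Y_train, X_query, k=5):
    """Average the k nearest neighbours' target vectors in the learned space."""
    Z_train, Z_query = X_train @ L.T, X_query @ L.T
    d2 = ((Z_query[:, None, :] - Z_train[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]           # indices of the k closest training points
    return Y_train[idx].mean(axis=1)              # (n_query, n_targets) prediction

rng = np.random.default_rng(0)
X, Y = rng.normal(size=(100, 8)), rng.normal(size=(100, 3))
L = np.eye(8)                                      # identity metric = plain Euclidean kNN
print(knn_multi_target(L, X, Y, X[:2]).shape)      # -> (2, 3)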
Generalized Multitarget Linear Regression with Output Dependence Estimation
Multitarget regression has recently received attention in the context of modern, large-scale problems in which finding good enough solutions in a timely manner is crucial. Different proposed alternatives use a combination of regularizers that lead to different ways of solving the problem. In this work, we introduce a general formulation with several regularizers. This leads to a biconvex minimization problem, and we use an alternating procedure with accelerated proximal gradient steps to solve it. We show that our formulation is equivalent to, but more efficient than, some previously proposed approaches. Moreover, we introduce two new variants. The experimental validation carried out suggests th…
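A minimal sketch of an alternating accelerated proximal gradient scheme of this kind, under an assumed objective (not the paper's exact formulation): minimize 0.5*||Y - X W S||_F^2 + lam_w*||W||_1 + 0.5*lam_s*||S||_F^2 over (W, S), which is biconvex. FISTA steps with a soft-thresholding prox update W for fixed S, and a ridge-style closed form updates the output-dependence matrix S for fixed W; all names here are illustrative.

import numpy as np

def soft_threshold(M, tau):
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def fista_W(W, S, X, Y, lam_w, iters=50):
    """Accelerated proximal gradient on W with S fixed."""
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 * max(np.linalg.norm(S, 2) ** 2, 1e-12))
    Z, t = W.copy(), 1.0
    for _ in range(iters):
        grad = X.T @ (X @ Z @ S - Y) @ S.T                 # gradient of the smooth part
        W_new = soft_threshold(Z - step * grad, step * lam_w)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        Z = W_new + ((t - 1) / t_new) * (W_new - W)        # Nesterov extrapolation
        W, t = W_new, t_new
    return W

def update_S(W, X, Y, lam_s):
    """Closed-form ridge update of the output-dependence matrix S with W fixed."""
    P = X @ W
    return np.linalg.solve(P.T @ P + lam_s * np.eye(P.shape[1]), P.T @ Y)

rng = np.random.default_rng(0)
X, Y = rng.normal(size=(200, 10)), rng.normal(size=(200, 4))
W, S = np.zeros((10, 4)), np.eye(4)
for _ in range(10):                                        # outer alternating loop
    W = fista_W(W, S, X, Y, lam_w=0.1)
    S = update_S(W, X, Y, lam_s=0.1)
print(np.linalg.norm(Y - X @ W @ S))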