

Accelerated Proximal Gradient Descent in Metric Learning for Kernel Regression

Francesc J. Ferri, Carlos Morell, Hector Gonzalez

subject

Mahalanobis distance, Optimization problem, Computer science, Estimator, Rate of convergence, Metric (mathematics), Kernel regression, Artificial intelligence, Gradient descent, Algorithm, Classifier (UML)

description

The purpose of this paper is to learn a task-specific distance function for the Nadaraya-Watson estimator so that it can be applied as a non-linear classifier. The idea of transforming the predictor variables and learning a kernel function based on a Mahalanobis pseudo-distance with a low-rank structure guides the development of this problem. In the context of metric learning for kernel regression, we introduce an accelerated proximal gradient method to solve the non-convex optimization problem with a better convergence rate than plain gradient descent. An extensive experiment and the corresponding discussion show that our strategy is competitive with previously proposed approaches. Preliminary results suggest that this line of work can accommodate other regularization approaches to further improve the kernel regression problem.
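The pipeline the abstract describes can be sketched compactly. The Python below is not the authors' code: it assumes the standard leave-one-out squared-error objective of metric learning for kernel regression (MLKR, Weinberger & Tesauro, 2007) as the loss, a low-rank matrix A with M = AᵀA inducing the Mahalanobis pseudo-distance, an elementwise soft-threshold as a stand-in proximal operator (the abstract does not fix a regularizer), and illustrative values for the rank, step size, and regularization strength; the function names (nw_predict, loss_and_grad, apg_mlkr) are also illustrative.

```python
# Minimal sketch: Nadaraya-Watson regression with a learned low-rank
# Mahalanobis metric, fit by a FISTA-style accelerated proximal gradient
# loop. Loss, prox, and hyperparameters are assumptions, not the paper's.
import numpy as np

def nw_predict(A, X_train, y_train, X_query):
    """Nadaraya-Watson estimate with kernel k(x, x') = exp(-||Ax - Ax'||^2)."""
    d2 = (((X_query @ A.T)[:, None, :] - (X_train @ A.T)[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2)
    return (K @ y_train) / (K.sum(axis=1) + 1e-12)

def loss_and_grad(A, X, y):
    """Leave-one-out squared error of the estimator and its gradient in A."""
    Z = X @ A.T                                   # low-rank projection
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2)
    np.fill_diagonal(K, 0.0)                      # leave each point out
    S = K.sum(axis=1) + 1e-12
    y_hat = (K @ y) / S
    r = y_hat - y
    # dL/dA = 4 A * sum_ij W_ij (x_i - x_j)(x_i - x_j)^T with
    # W_ij = r_i (y_hat_i - y_j) K_ij / S_i, written in graph-Laplacian form.
    W = (r / S)[:, None] * (y_hat[:, None] - y[None, :]) * K
    Ws = W + W.T
    Lap = np.diag(Ws.sum(axis=1)) - Ws
    return (r ** 2).sum(), 4.0 * A @ (X.T @ Lap @ X)

def soft_threshold(A, t):
    """Prox of t * ||A||_1; a placeholder for whatever regularizer is used."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def apg_mlkr(X, y, rank=2, eta=1e-3, lam=1e-4, iters=200, seed=0):
    """FISTA-style accelerated proximal gradient on the low-rank matrix A."""
    rng = np.random.default_rng(seed)
    A = rng.normal(scale=0.1, size=(rank, X.shape[1]))
    A_prev, Y, t = A.copy(), A.copy(), 1.0
    for _ in range(iters):
        _, g = loss_and_grad(Y, X, y)
        A = soft_threshold(Y - eta * g, eta * lam)   # proximal gradient step
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        Y = A + ((t - 1.0) / t_next) * (A - A_prev)  # Nesterov extrapolation
        A_prev, t = A.copy(), t_next
    return A

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # one relevant dimension
    Xtr, ytr, Xte, yte = X[:150], y[:150], X[150:], y[150:]
    A = apg_mlkr(Xtr, ytr)
    print("test MSE:", np.mean((nw_predict(A, Xtr, ytr, Xte) - yte) ** 2))
```

The Nesterov extrapolation step is what yields the improved convergence rate the abstract claims over plain gradient descent (O(1/k²) versus O(1/k) in the convex case); since this objective is non-convex, that guarantee is heuristic here, as the paper itself notes.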

https://doi.org/10.1007/978-3-030-01132-1_25