
RESEARCH PRODUCT

Extreme minimal learning machine: Ridge regression with distance-based basis

Tommi Kärkkäinen

subject

0209 industrial biotechnology; Computer science; Cognitive Neuroscience; neural computation (neuraalilaskenta); neural networks (neuroverkot); 02 engineering and technology; randomized learning machines; Set (abstract data type); extreme learning machine; 020901 industrial engineering & automation; Artificial Intelligence; extreme minimal learning machine; 0202 electrical engineering, electronic engineering, information engineering; ta113; Training set; Basis (linear algebra); Model selection; minimal learning machine; Overlearning; Computer Science Applications; machine learning (koneoppiminen); Transformation (function); 020201 artificial intelligence & image processing; Algorithm

description

The extreme learning machine (ELM) and the minimal learning machine (MLM) are nonlinear and scalable machine learning techniques with a randomly generated basis. Both techniques start with a step in which a matrix of weights for the linear combination of the basis is recovered. In the MLM, the feature mapping in this step corresponds to distance calculations between the training data and a set of reference points, whereas in the ELM, a transformation using a radial or sigmoidal activation function is commonly used. With the ELM, computing the model output for prediction or classification is straightforward after the first step. In the original MLM, one needs to solve an additional multilateration problem to estimate the distance-regression-based output. A natural combination of these two techniques is proposed and experimented with here: to use the distance-based basis characteristic of the MLM within the learning framework of the regularized ELM. In other words, we conduct ridge regression using a distance-based basis. The experimental results characterize the basic features of the proposed technique and, surprisingly, indicate that overlearning with the distance-based basis is in practice avoided in classification problems. This makes model selection for the proposed method trivial, at the expense of computational cost.

peerReviewed
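The combination described above admits a very compact implementation: build a basis from distances between the data and a set of reference points, then solve a regularized least-squares problem for the output weights. The following is a minimal sketch of that idea, not the author's reference implementation; the Euclidean distance, the sampling of reference points from the training data, and the values of `n_ref` and `alpha` are illustrative assumptions.

```python
# Minimal sketch: ridge regression on a distance-based basis.
# Hypothetical class name and default parameters; not the paper's code.
import numpy as np
from scipy.spatial.distance import cdist


class ExtremeMinimalLearningMachine:
    def __init__(self, n_ref=50, alpha=1e-3, random_state=0):
        self.n_ref = n_ref            # number of reference points (basis size)
        self.alpha = alpha            # ridge regularization parameter
        self.random_state = random_state

    def fit(self, X, Y):
        rng = np.random.default_rng(self.random_state)
        # Reference points sampled from the training data, as in the MLM.
        idx = rng.choice(len(X), size=min(self.n_ref, len(X)), replace=False)
        self.refs_ = X[idx]
        # Distance-based basis: H[i, j] = ||x_i - r_j||.
        H = cdist(X, self.refs_)
        # Regularized least squares (ridge regression) for the output weights.
        m = H.shape[1]
        self.W_ = np.linalg.solve(H.T @ H + self.alpha * np.eye(m), H.T @ Y)
        return self

    def predict(self, X):
        # No multilateration step is needed: the distance basis is applied
        # directly, as in the ELM's output computation.
        return cdist(X, self.refs_) @ self.W_
```

For classification, a common convention in ELM-style methods is to encode the labels in Y as one-hot vectors and assign each test point to the class with the largest predicted output component.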

https://doi.org/10.1016/j.neucom.2018.12.078