Feature selection for support vector regression via kernel penalization

Maldonado S.; Weber R.

Keywords: feature selection, support vector regression, feature extraction, non-linear regression, dual formulation, anisotropic RBF kernel, gradient descent, iterative algorithms, stopping criteria, neural networks, benchmark problems

Abstract

This paper presents a novel feature selection approach (KP-SVR) that determines a non-linear regression function with minimal error while simultaneously minimizing the number of features by penalizing their use in the dual formulation of SVR. The approach optimizes the widths of an anisotropic RBF kernel using an iterative gradient-descent algorithm, eliminating features with low relevance for the regression model. It also provides an explicit stopping criterion, indicating clearly when eliminating further features begins to degrade the model's performance. Experiments on two real-world benchmark problems demonstrate that our approach achieves the best performance compared with well-known feature selection methods, consistently using a small number of features. © 2010 IEEE.
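The record contains no code, but the abstract describes the algorithm at a high level: per-feature widths of an anisotropic RBF kernel are adjusted by gradient descent on a penalized objective, and features whose width shrinks toward zero are eliminated. The Python sketch below illustrates that general idea only, under loudly stated assumptions: the l1 penalty on the widths, the finite-difference gradient, the hyperparameters, and the names `anisotropic_rbf`, `penalized_loss`, and `kp_svr_select` are all hypothetical stand-ins, not the paper's formulation, which penalizes feature use in the dual SVR problem and includes an explicit stopping criterion that this sketch does not reproduce.

```python
# Illustrative sketch only; all objective choices and names below are
# assumptions, not the KP-SVR formulation from the paper.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

def anisotropic_rbf(X, Z, widths):
    """Anisotropic RBF kernel: K(x, z) = exp(-sum_j widths_j^2 (x_j - z_j)^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2 * widths ** 2).sum(axis=-1)
    return np.exp(-d2)

def penalized_loss(widths, X_tr, y_tr, X_va, y_va, lam):
    """Validation MSE of a precomputed-kernel SVR plus an l1 width penalty
    (a crude surrogate for penalizing the number of active features)."""
    K_tr = anisotropic_rbf(X_tr, X_tr, widths)
    model = SVR(kernel="precomputed", C=10.0, epsilon=0.1).fit(K_tr, y_tr)
    K_va = anisotropic_rbf(X_va, X_tr, widths)
    mse = np.mean((model.predict(K_va) - y_va) ** 2)
    return mse + lam * np.abs(widths).sum()

def kp_svr_select(X, y, lam=0.05, lr=0.1, steps=50, tol=1e-2, seed=0):
    """Gradient descent on the kernel widths; zeroed widths drop features."""
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3,
                                              random_state=seed)
    widths = np.ones(X.shape[1])
    for _ in range(steps):
        # Finite-difference gradient of the penalized loss w.r.t. each width.
        base = penalized_loss(widths, X_tr, y_tr, X_va, y_va, lam)
        grad = np.zeros_like(widths)
        for j in range(len(widths)):
            w = widths.copy()
            w[j] += 1e-3
            grad[j] = (penalized_loss(w, X_tr, y_tr, X_va, y_va, lam) - base) / 1e-3
        widths = np.clip(widths - lr * grad, 0.0, None)
        widths[widths < tol] = 0.0  # eliminate features with negligible width
    return widths  # nonzero entries mark the retained features
```

A toy usage example, where only the first two of six features carry signal:

```python
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=120)
print("retained features:", np.nonzero(kp_svr_select(X, y))[0])
```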

More information

Journal title: 2010 International Joint Conference on Neural Networks (IJCNN 2010)
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Publication date: 2010
URL: http://www.scopus.com/inward/record.url?eid=2-s2.0-77958098707&partnerID=q2rCbXpz