Imbalanced data classification using second-order cone programming support vector machines
Abstract
Learning from imbalanced data sets is an important machine learning challenge, particularly for Support Vector Machines (SVMs), which assume equal misclassification costs and treat each training object independently. Second-order cone programming SVM (SOCP-SVM) instead models each class separately, which makes it well suited to imbalanced classification. This work presents a novel second-order cone programming (SOCP) formulation based on the LP-SVM principle: the bound on the VC dimension is suitably relaxed using the ℓ∞-norm, and the margin is maximized directly through two margin variables, one associated with each class. A regularization parameter C controls the trade-off between the maximization of these two margin variables. The proposed method has two advantages: it achieves better results, since it is designed specifically for imbalanced classification, and it reduces computational complexity, since one conic constraint is eliminated. Experiments on benchmark imbalanced data sets demonstrate that our approach achieves the best classification performance compared with the traditional SOCP-SVM formulation and with cost-sensitive formulations of linear SVM.
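The cost-sensitive linear SVM that the abstract uses as a comparison point weights errors on the minority class more heavily than errors on the majority class. Below is a minimal sketch of that baseline idea (not the authors' SOCP formulation), trained with subgradient descent on a class-weighted hinge loss; the inverse-frequency weighting, hyperparameters, and toy data are illustrative assumptions:

```python
import numpy as np

def cost_sensitive_linear_svm(X, y, C=1.0, lr=0.1, epochs=500):
    """Minimize 0.5*||w||^2 + C * mean(c_i * hinge(y_i*(w.x_i + b))),
    where c_i is larger for the minority class (here: inverse
    class-frequency weights, an illustrative choice)."""
    n, d = X.shape
    n_pos = np.sum(y == 1)
    n_neg = n - n_pos
    # each class contributes equally to the weighted loss
    cost = np.where(y == 1, n / (2.0 * n_pos), n / (2.0 * n_neg))
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1            # points violating the margin
        coef = cost[active] * y[active]
        grad_w = w - (C / n) * (coef @ X[active])
        grad_b = -(C / n) * coef.sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# toy imbalanced problem: 90 majority vs 10 minority points
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (90, 2)), rng.normal(2, 1, (10, 2))])
y = np.r_[-np.ones(90), np.ones(10)]
w, b = cost_sensitive_linear_svm(X, y)
pred = np.sign(X @ w + b)
```

With uniform weights, the learned hyperplane tends to drift toward the minority class to reduce majority-class errors; the per-class costs counteract that bias, which is the effect the paper's class-wise margin variables pursue within a conic program instead.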
More information
Title according to WOS: Imbalanced data classification using second-order cone programming support vector machines
Journal title: PATTERN RECOGNITION
Volume: 47
Issue: 5
Publisher: ELSEVIER SCI LTD
Publication date: 2014
Start page: 2070
End page: 2079
Language: English
URL: http://linkinghub.elsevier.com/retrieve/pii/S0031320313005074
DOI: 10.1016/j.patcog.2013.11.021
Notes: ISI