Regularized minimax probability machine
Abstract
In this paper, we propose novel second-order cone programming formulations for binary classification, by extending the Minimax Probability Machine (MPM) approach. Inspired by Support Vector Machines, a regularization term is included in the MPM and Minimum Error Minimax Probability Machine (MEMPM) methods. This inclusion reduces the risk of obtaining ill-posed estimators, stabilizing the problem and therefore improving generalization performance. Our approaches are first derived as linear methods and subsequently extended to kernel-based strategies for nonlinear classification. Experiments on well-known binary classification datasets demonstrate the virtues of the regularized formulations in terms of predictive performance. (C) 2019 Elsevier B.V. All rights reserved.
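Since the abstract describes the regularized formulation only at a high level, the following is a minimal sketch of the idea, assuming a simple l2 penalty added to the classical linear MPM objective of Lanckriet et al. and modeled with cvxpy. The function name `regularized_mpm`, the penalty form, and the parameter `lam` are illustrative assumptions, not the paper's exact second-order cone programming formulation.

```python
import numpy as np
import cvxpy as cp

def regularized_mpm(X_pos, X_neg, lam=1e-2):
    """Illustrative linear MPM with an assumed l2 regularization term.

    Solves (a sketch, not the paper's exact model):
        min_w  sqrt(w' S+ w) + sqrt(w' S- w) + lam * ||w||_2
        s.t.   w' (mu+ - mu-) = 1
    where S+/S- and mu+/mu- are the class sample covariances and means.
    """
    mu_p, mu_n = X_pos.mean(axis=0), X_neg.mean(axis=0)
    S_p = np.cov(X_pos, rowvar=False)
    S_n = np.cov(X_neg, rowvar=False)
    d = X_pos.shape[1]

    # Factor the covariances so that w' S w = ||L' w||^2 (small jitter keeps them PD).
    L_p = np.linalg.cholesky(S_p + 1e-8 * np.eye(d))
    L_n = np.linalg.cholesky(S_n + 1e-8 * np.eye(d))

    w = cp.Variable(d)
    objective = cp.Minimize(cp.norm(L_p.T @ w) + cp.norm(L_n.T @ w) + lam * cp.norm(w))
    constraints = [(mu_p - mu_n) @ w == 1]
    cp.Problem(objective, constraints).solve()

    w_opt = w.value
    s_p = np.sqrt(w_opt @ S_p @ w_opt)
    s_n = np.sqrt(w_opt @ S_n @ w_opt)
    kappa = 1.0 / (s_p + s_n)            # worst-case margin parameter of the MPM
    b = w_opt @ mu_p - kappa * s_p       # threshold placed between the class means
    return w_opt, b
```

Prediction would then be sign(X @ w - b), with nonnegative scores assigned to the positive class; the kernel-based variants mentioned in the abstract typically replace w by an expansion over training points.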
More information
| Title according to WOS: | Regularized minimax probability machine |
| Title according to SCOPUS: | Regularized minimax probability machine |
| Journal title: | KNOWLEDGE-BASED SYSTEMS |
| Volume: | 177 |
| Publisher: | Elsevier |
| Publication date: | 2019 |
| Start page: | 127 |
| End page: | 135 |
| Language: | English |
| DOI: | 10.1016/j.knosys.2019.04.016 |
| Notes: | ISI, SCOPUS |