Embedded feature selection for robust probability learning machines
Keywords: feature selection, support vector machines, Cobb-Douglas, second-order cone programming, Minimax Probability Machine, Minimum Error Minimax Probability Machine
Abstract
Methods: Feature selection is essential for building effective machine learning models in binary classification. Eliminating unnecessary features can reduce the risk of overfitting and improve classification performance. Moreover, the data we handle typically contains a stochastic component, making it important to develop robust models that are insensitive to data perturbations. Although there are numerous methods and tools for feature selection, relatively few studies address embedded feature selection within robust classification models using penalization techniques. Objective: In this work, we introduce robust classifiers with integrated feature selection capabilities, utilizing probability machines based on different penalization techniques, such as the ?
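The abstract describes embedding feature selection directly in classifier training via penalization. As a minimal, generic illustration of that idea (using an l1-penalized linear SVM from scikit-learn, not the paper's minimax-probability-machine formulation), an l1 penalty drives the weights of uninformative features to exactly zero, so selection happens during fitting itself:

```python
# Hedged sketch: embedded feature selection via an l1-penalized
# linear SVM. This illustrates the general penalization idea only;
# it is NOT the robust probability-machine method of the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Synthetic binary data: 20 features, only 5 of them informative.
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

# The l1 penalty shrinks irrelevant coefficients to exactly zero,
# so feature selection is embedded in the training problem.
clf = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=5000)
clf.fit(X, y)

selected = np.flatnonzero(np.abs(clf.coef_.ravel()) > 1e-8)
print(f"{len(selected)} of {X.shape[1]} features selected")
```

Features with nonzero coefficients are the ones retained; tightening the penalty (smaller `C`) selects fewer features.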
More information
| Title according to WOS: | Embedded feature selection for robust probability learning machines |
| Title according to SCOPUS: | Embedded feature selection for robust probability learning machines |
| Journal title: | Pattern Recognition |
| Volume: | 159 |
| Publisher: | Elsevier Ltd. |
| Publication date: | 2025 |
| Language: | English |
| DOI: | 10.1016/j.patcog.2024.111157 |
| Notes: | ISI, SCOPUS |