Bayes-optimal minimax probability machines

Maldonado S.; Lopez J.; Carrasco M.; Bosch P.

Keywords: support vector machines, Gaussian distribution, second-order cone programming, Minimax Probability Machine, Minimum Error Minimax Probability Machine

Abstract

The Minimax Probability Machine (MPM) model, a robust machine learning method, has been proposed to address the stochastic nature of data distributions by minimizing the upper bound on classification error probabilities. However, the worst-case scenario assumption in traditional MPM models may not be ideal for datasets that more closely follow a multivariate Gaussian distribution. This paper seeks to bridge this gap by introducing Gaussian adaptations to two MPM variants: the Robust Maximum Margin Classifier (RMMC) and the Cobb-Douglas Learning Machine (CD-LeMa), resulting in three novel methodologies that function as optimal Bayes classifiers. An extensive empirical comparison of various MPM variants, considering both robust and Gaussian cases, provides valuable insights into their predictive capabilities. The proposed Gaussian versions of the MPM models achieve the best performance in 71% of cases, with the RMMC variant emerging as the top approach (an average rank of 2.32 among 10 classifiers, with 1 being the top-ranked method on a given dataset). The Gaussian variants excel primarily in medium-sized datasets (1000 samples or more) and, for the MEMPM, RMMC, and CD-LeMa methods, perform better than their robust counterparts in 90% of cases.
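The classical MPM referenced above separates two classes using only their means and covariances, choosing a hyperplane that minimizes the worst-case misclassification probability over all distributions sharing those moments; the worst-case error is bounded by 1/(1+κ²), where κ is maximized. The sketch below is a minimal illustration of that baseline formulation, not the Gaussian variants proposed in the paper; it estimates moments from data and, for simplicity, uses SciPy's general-purpose constrained solver rather than the dedicated second-order cone programming formulation named in the keywords. The function name `mpm_fit` and the ridge term are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

def mpm_fit(Xp, Xn):
    """Linear Minimax Probability Machine (baseline worst-case form).

    Finds w, b so that sign(w @ x - b) separates the classes while
    maximizing kappa, with worst-case error bounded by 1 / (1 + kappa^2).
    """
    # Plug-in estimates of class means and covariances
    # (small ridge added for numerical stability; an illustrative choice).
    mp, mn = Xp.mean(axis=0), Xn.mean(axis=0)
    Sp = np.cov(Xp, rowvar=False) + 1e-6 * np.eye(Xp.shape[1])
    Sn = np.cov(Xn, rowvar=False) + 1e-6 * np.eye(Xn.shape[1])
    d = mp - mn

    # Equivalent convex form: minimize sqrt(w'Sp w) + sqrt(w'Sn w)
    # subject to w'(mp - mn) = 1; then kappa = 1 / objective.
    def obj(w):
        return np.sqrt(w @ Sp @ w) + np.sqrt(w @ Sn @ w)

    cons = {"type": "eq", "fun": lambda w: w @ d - 1.0}
    w0 = d / (d @ d)  # feasible starting point: w0 @ d == 1
    res = minimize(obj, w0, constraints=cons)
    w = res.x

    kappa = 1.0 / obj(w)
    # Threshold sits kappa standard deviations from each class mean.
    b = w @ mp - kappa * np.sqrt(w @ Sp @ w)
    return w, b, kappa
```

For data that is genuinely Gaussian, this worst-case bound is conservative, which is precisely the gap the abstract's Bayes-optimal Gaussian variants target.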

More information

Title according to WOS: Bayes-optimal minimax probability machines
Volume: 171
Publication date: 2026
Language: English
DOI: 10.1016/j.patcog.2025.112068

Notes: ISI