LocalBoost: A Parallelizable Approach to Boosting Classifiers
Keywords: classification, parallel algorithms, AdaBoost, ensemble learning, learning architectures
Abstract
Ensemble learning is an active field of research with applications to a broad range of problems. AdaBoost is a widely used ensemble method, but its computational burden is high because it relies on an explicit diversity mechanism for building the individual learners. To address this issue, we present a variant of AdaBoost in which the learners can be trained in parallel, exchanging information over a sparse collaborative communication topology that restricts the visibility among them. Experiments on 12 UCI datasets show that this approach is competitive in terms of generalization error while being more efficient than AdaBoost and two other parallel approximations of that algorithm.
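The abstract only describes the idea at a high level. The toy Python sketch below shows one way such a scheme could look: several AdaBoost-style learners run concurrently, and each shares its sample-weight vector only with its neighbours on a sparse ring graph, so no learner sees the full state of the ensemble. The ring topology, the weight-mixing rule, the per-learner initialisation, and the names local_boost and ring_neighbours are illustrative assumptions, not the published LocalBoost algorithm.

# Hypothetical sketch of a "local" parallel boosting scheme in the spirit of the
# abstract: concurrent AdaBoost-style learners with sparse weight exchange.
# This is NOT the authors' algorithm; topology and mixing rule are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier


def ring_neighbours(k, n_learners):
    """Sparse topology: learner k communicates only with its two ring neighbours."""
    return [(k - 1) % n_learners, (k + 1) % n_learners]


def local_boost(X, y, n_learners=4, n_rounds=10):
    n = len(y)
    # Each learner keeps its own AdaBoost-style weight vector; random
    # initialisation (an illustrative assumption) makes the learners diverse.
    weights = [np.random.default_rng(k).dirichlet(np.ones(n)) for k in range(n_learners)]
    ensembles = [[] for _ in range(n_learners)]
    for _ in range(n_rounds):
        new_weights = []
        # Learners are independent within a round, so this loop could run in parallel.
        for k in range(n_learners):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=weights[k])
            pred = stump.predict(X)
            err = np.clip(weights[k][pred != y].sum(), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)
            ensembles[k].append((alpha, stump))
            # Standard AdaBoost re-weighting on this learner's local view.
            w = weights[k] * np.exp(-alpha * y * pred)
            new_weights.append(w / w.sum())
        # Sparse collaboration: each learner averages its weights with its ring
        # neighbours only, restricting visibility among learners.
        weights = [
            np.mean([new_weights[j] for j in [k] + ring_neighbours(k, n_learners)], axis=0)
            for k in range(n_learners)
        ]
    return ensembles


def predict(ensembles, X):
    # Final decision: sign of the sum of all weighted weak-learner votes.
    score = sum(alpha * clf.predict(X) for ens in ensembles for alpha, clf in ens)
    return np.sign(score)


if __name__ == "__main__":
    X, y = make_classification(n_samples=500, random_state=0)
    y = 2 * y - 1  # labels in {-1, +1}, as in standard AdaBoost
    print("training accuracy:", np.mean(predict(local_boost(X, y), X) == y))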
More information
Title according to WOS: LocalBoost: A Parallelizable Approach to Boosting Classifiers
Journal: NEURAL PROCESSING LETTERS
Volume: 50
Issue: 1
Publisher: Springer
Publication date: 2019
Start page: 19
End page: 41
Language: English
DOI: 10.1007/s11063-018-9924-3
Notes: ISI