Robust learning algorithm for the mixture of experts

Allende, H.; Torres, R.; Salas, R.; Moraga, C.

Abstract

The Mixture of Experts (ME) model is a type of modular artificial neural network (MANN) whose architecture is composed of different kinds of networks that compete to learn different aspects of the problem. This model is used when the search space is stratified. The learning algorithm of the ME model consists of estimating the network parameters to achieve a desired performance. To estimate the parameters, some distributional assumptions are made, so the learning algorithm, and consequently the parameters obtained, depends on that distribution. When the data are contaminated with outliers, however, the assumption is no longer valid and the model is affected: it is very sensitive to the data, as is shown in this work. We propose a robust learning estimator by means of a generalization of the maximum likelihood estimator called the M-estimator. Finally, a simulation study is presented in which the robust estimator shows better performance than the maximum likelihood estimator (MLE). © Springer-Verlag Berlin Heidelberg 2003.
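
The abstract does not spell out the estimating equations, but as a rough, non-authoritative sketch of the idea, the Python fragment below shows how a Huber-type M-estimation criterion could replace the Gaussian negative log-likelihood when fitting a mixture of linear experts. Everything here (the function names robust_me_loss and huber_rho, the tuning constant k = 1.345, the linear experts and the softmax gate) is an illustrative assumption, not the authors' implementation.

import numpy as np

def huber_rho(r, k=1.345):
    # Huber rho: quadratic for small residuals, linear for large ones,
    # so isolated outliers cannot dominate the fitting criterion.
    a = np.abs(r)
    return np.where(a <= k, 0.5 * r**2, k * a - 0.5 * k**2)

def softmax(z):
    # Row-wise softmax used for the gating network outputs.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def robust_me_loss(X, y, expert_W, gate_V, k=1.345):
    # Gate-weighted M-estimation criterion for a mixture of linear experts:
    # the squared-error term implied by a Gaussian likelihood is replaced
    # by the Huber rho function.
    gates = softmax(X @ gate_V)        # (n, m) mixing proportions
    preds = X @ expert_W               # (n, m) one prediction per expert
    resid = y[:, None] - preds         # residuals of each expert
    return np.sum(gates * huber_rho(resid, k)) / len(y)

# Toy usage with random data and two hypothetical linear experts.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)
W = rng.normal(size=(3, 2))   # expert weights
V = rng.normal(size=(3, 2))   # gating weights
print(robust_me_loss(X, y, W, V))

Minimizing such a criterion (by gradient descent or an EM-style update) is what distinguishes the robust learning rule from plain maximum likelihood: large residuals are down-weighted instead of squared.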

Further information

Title according to WOS: Robust learning algorithm for the mixture of experts
Title according to SCOPUS: Robust learning algorithm for the mixture of experts
Journal title: LEARNING AND INTELLIGENT OPTIMIZATION, LION 15
Volume: 2652
Publisher: SPRINGER INTERNATIONAL PUBLISHING AG
Publication date: 2003
Start page: 19
End page: 27
Language: English
Notes: ISI, SCOPUS