Incremental Ensemble of Deep Learning with Pseudorehearsal

Mellado, Diego; Salas, Rodrigo; Chabert, Stéren; Veloz, Alejandro; Saavedra, Carolina

Keywords: Deep Learning, Incremental Learning, Rehearsal, Big Data

Abstract

Deep learning models have grown in popularity in recent years. Yet despite their many advantages over other machine learning algorithms in the way they interpret data, deep learning models still share a common difficulty: learning new, previously unseen data. To address the catastrophic forgetting phenomenon, a large number of solutions have been proposed to improve model generalization, such as random dropout of neurons, neural network ensembles, and non-parametric models. We propose a model able to learn local and global features of new tasks by incrementally adding neurons within each hidden layer, together with a study of the effects of rehearsing small volumes of already learned data while training on new tasks. We trained on the MNIST database and compared the proposed model's performance with another incremental learning model. The proposed model reaches an accuracy of 68.34% after learning only new data from 7 of a total of 10 classes. After presenting a few instances of already learned data along with the new learning tasks, the model's accuracy increases considerably. We conclude that it is possible to train a model on new data along with very few instances of old data and thereby improve its learning performance considerably.
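
The rehearsal idea described in the abstract can be illustrated with a short sketch. What follows is a minimal, hypothetical PyTorch example, assuming a plain MLP on MNIST-shaped inputs; the buffer size, the function names (train_task, make_buffer), and the model itself are illustrative assumptions, not the paper's actual ensemble or neuron-growing architecture. It shows only the core mechanism: mixing a handful of stored old-class examples into the training data for a new task.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, ConcatDataset

class MLP(nn.Module):
    """Simple stand-in network; the paper's model grows neurons per layer."""
    def __init__(self, hidden=256, classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, hidden),
            nn.ReLU(),
            nn.Linear(hidden, classes),
        )

    def forward(self, x):
        return self.net(x)

def make_buffer(dataset, per_class=10, classes=range(10)):
    """Keep only a few examples per class for later rehearsal."""
    xs, ys = dataset.tensors
    keep = torch.cat([(ys == c).nonzero(as_tuple=True)[0][:per_class]
                      for c in classes])
    return TensorDataset(xs[keep], ys[keep])

def train_task(model, new_data, buffer=None, epochs=1, lr=1e-3):
    """Train on a new task, rehearsing the stored old examples if given."""
    datasets = [new_data] + ([buffer] if buffer is not None else [])
    loader = DataLoader(ConcatDataset(datasets), batch_size=64, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

# Toy usage with MNIST-shaped random stand-in data (1x28x28 inputs):
old = TensorDataset(torch.randn(700, 1, 28, 28), torch.randint(0, 7, (700,)))
new = TensorDataset(torch.randn(300, 1, 28, 28), torch.randint(7, 10, (300,)))
model = MLP()
train_task(model, old)                      # learn the initial classes 0-6
train_task(model, new, make_buffer(old))    # learn 7-9 while rehearsing 0-6

The key design point is that the rehearsed volume stays deliberately small (here, ten examples per old class), so most of the training signal comes from the new task while the buffer keeps the old decision boundaries from being overwritten.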

More information

Publication date: 2017
Start/end date: 9-11 March
Language: English