Long short-term cognitive networks
Abstract
In this paper, we present a recurrent neural system named long short-term cognitive networks (LSTCNs) as a generalization of the short-term cognitive network (STCN) model. Such a generalization is motivated by the difficulty of forecasting very long time series efficiently. The LSTCN model can be defined as a collection of STCN blocks, each processing a specific time patch of the (multivariate) time series being modeled. In this neural ensemble, each block passes information to the subsequent one in the form of weight matrices representing the prior knowledge. As a second contribution, we propose a deterministic learning algorithm to compute the learnable weights while preserving the prior knowledge resulting from previous learning processes. As a third contribution, we introduce a feature influence score as a proxy for explaining the forecasting process in multivariate time series. Simulations on three case studies show that our neural system reports small forecasting errors while being significantly faster than state-of-the-art recurrent models.
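To make the architecture described in the abstract concrete, the following is a minimal NumPy sketch of the chained-block idea: the series is split into consecutive time patches, each block fits its learnable weights in closed form (a deterministic, non-iterative step, here realized as ridge regression over a sigmoid hidden state), and the weights learned on one patch become the frozen prior knowledge of the next block. All function names, the sigmoid transfer choice, and the ridge-regression fit are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_block(X, Y, W_prior, b_prior, ridge=1e-3):
    """One STCN-style block (illustrative): the frozen prior weights produce a
    hidden state, then the learnable weights are obtained in closed form via
    ridge regression -- a deterministic fit with no gradient iterations."""
    H = sigmoid(X @ W_prior + b_prior)              # hidden state from prior knowledge
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])   # append a bias column
    A = Hb.T @ Hb + ridge * np.eye(Hb.shape[1])     # regularized normal equations
    W = np.linalg.solve(A, Hb.T @ Y)
    return W[:-1], W[-1]                            # learned weights and bias

def lstcn_fit(series, n_blocks, ridge=1e-3):
    """Chain of blocks over consecutive time patches; each block's learned
    weights are passed on as the next block's prior knowledge."""
    T, d = series.shape
    patch = T // n_blocks
    rng = np.random.default_rng(0)
    W_prior, b_prior = rng.normal(size=(d, d)), np.zeros(d)  # initial prior (assumed random)
    for k in range(n_blocks):
        X = series[k * patch:(k + 1) * patch - 1]   # inputs within the patch
        Y = series[k * patch + 1:(k + 1) * patch]   # one-step-ahead targets
        W_prior, b_prior = fit_block(X, Y, W_prior, b_prior, ridge)
    return W_prior, b_prior
```

Because every block is fitted in a single linear-algebra step, training cost grows linearly with the number of patches, which is consistent with the speed advantage the abstract claims over gradient-trained recurrent models.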
More information
Title (per WOS): Long short-term cognitive networks
Journal: NEURAL COMPUTING & APPLICATIONS
Volume: 34
Issue: 19
Publisher: SPRINGER LONDON LTD
Publication date: 2022
First page: 16959
Last page: 16971
DOI: 10.1007/s00521-022-07348-5
Notes: ISI