Learning of Quasi-nonlinear Long-term Cognitive Networks using iterative numerical methods
Keywords: recurrent neural networks, Fuzzy Cognitive Maps, Long-term cognitive networks, Quasi-nonlinear reasoning
Abstract
Quasi-nonlinear Long-term Cognitive Networks (LTCNs) are an extension of Fuzzy Cognitive Maps (FCMs) for simulation and prediction problems ranging from regression and pattern classification to time series forecasting. In this extension, the quasi-nonlinear reasoning allows the model to escape from unique fixed-point attractors, while the unbounded weights equip the network with improved approximation capabilities. However, training these neural systems continues to be challenging due to their recurrent nature. Existing error-driven learning algorithms (metaheuristic-based, regression-based, and gradient-based) are either computationally demanding, fail to fine-tune the recurrent connections, or suffer from vanishing/exploding gradient issues. To bridge this gap, this paper presents a learning procedure that employs iterative numerical optimizers to solve a regularized least squares problem, aiming to enhance the precision and generalization of LTCN models. These optimizers do not require analytical knowledge of the Jacobian or the Hessian and were carefully chosen to address the inherent challenges of training recurrent neural networks. They solve nonlinear optimization problems using trust regions, linear or quadratic approximations, and interpolations between the Gauss-Newton and gradient descent methods. In addition, we explore the model's performance for several activation functions, including piecewise, sigmoid, and hyperbolic variants. The empirical studies indicate that the proposed learning procedure outperforms state-of-the-art algorithms to a significant extent.
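As a rough illustration of the kind of optimization the abstract describes, the sketch below fits the recurrent weights of an assumed LTCN-style quasi-nonlinear reasoning rule by solving a regularized least-squares problem with SciPy's `least_squares` solver, which approximates the Jacobian by finite differences and offers trust-region ("trf") and Levenberg-Marquardt ("lm", an interpolation between Gauss-Newton and gradient descent) methods. The reasoning rule, the regularization term, and the toy data are assumptions made for illustration only; this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed LTCN-style quasi-nonlinear reasoning (illustrative form, not the paper's exact rule):
#   A(t+1) = phi * f(A(t) @ W) + (1 - phi) * A(0)
def reasoning(W_flat, A0, n_steps, phi, f):
    n = A0.shape[1]
    W = W_flat.reshape(n, n)
    A = A0
    for _ in range(n_steps):
        A = phi * f(A @ W) + (1.0 - phi) * A0
    return A

def residuals(W_flat, A0, Y, n_steps, phi, f, lam):
    # Prediction error plus a ridge-style penalty, both flattened so that
    # least_squares minimizes the regularized sum of squares.
    pred = reasoning(W_flat, A0, n_steps, phi, f)
    err = (pred - Y).ravel()
    penalty = np.sqrt(lam) * W_flat
    return np.concatenate([err, penalty])

# Toy data standing in for concept activations and targets.
rng = np.random.default_rng(0)
n_samples, n_concepts = 50, 4
A0 = rng.uniform(size=(n_samples, n_concepts))
Y = rng.uniform(size=(n_samples, n_concepts))

w0 = rng.normal(scale=0.1, size=n_concepts * n_concepts)
sol = least_squares(
    residuals, w0,
    args=(A0, Y, 3, 0.8, np.tanh, 1e-2),
    method="trf",  # trust-region reflective; "lm" gives Levenberg-Marquardt
)
W_learned = sol.x.reshape(n_concepts, n_concepts)
print("final cost:", sol.cost)
```

Because the Jacobian is estimated numerically, no analytical derivatives of the recurrent reasoning rule are needed, which matches the derivative-free flavor of the optimizers discussed in the abstract.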
More information
| Title according to WOS: | Learning of Quasi-nonlinear Long-term Cognitive Networks using iterative numerical methods |
| Journal title: | KNOWLEDGE-BASED SYSTEMS |
| Volume: | 317 |
| Publisher: | Elsevier |
| Publication date: | 2025 |
| Language: | English |
| DOI: | 10.1016/j.knosys.2025.113464 |
| Notes: | ISI |