Enhancing quasi-nonlinear long-term cognitive networks with temporal attention for pattern classification

Napoles G.; Salgueiro Y.

Keywords: Recurrent neural networks, Long-term cognitive networks, Quasi-nonlinear reasoning, Temporal attention

Abstract

Long-term Cognitive Networks (LTCNs) are knowledge-based recurrent neural networks that hold significant promise in machine learning settings, particularly in structured pattern classification. These neural systems enable hybrid intelligence by allowing domain experts to encode knowledge into the network through a non-trainable weight matrix. However, LTCN-based classifiers face two key limitations that restrict their approximation capabilities. (i) Both neural concepts and weights must map to specific components of the physical system. (ii) Temporal states are not fully exploited when deriving class labels. This paper presents an enhanced LTCN-based classifier that addresses these limitations. First, we introduce a tunable quasi-nonlinear reasoning rule. Each neural concept has independent learnable unbounded parameters that evolve over iterations. Second, we propose a temporal attention mechanism that projects hidden states to a new state space and assigns different attention weights to each iteration based on its relevance. We provide theoretical evidence that this temporal attention mechanism outperforms residual-like learnable connections. Finally, we formalize a gradient-based learning algorithm to fine-tune both the quasi-nonlinear parameters and the projection matrices used by the temporal attention mechanism. Numerical simulations on real-world datasets confirm that our classifier achieves superior classification accuracy compared to tree ensembles, neurosymbolic methods, transformer-based methods, and kernel machines.
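The two mechanisms described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the update rule, the `tanh` transfer function, and all shapes and names (`phi`, `P`, `v`) are assumptions chosen to show the idea of per-concept quasi-nonlinear mixing and attention-weighted pooling over reasoning iterations.

```python
import numpy as np

def quasi_nonlinear_step(a, W, phi):
    """One LTCN reasoning step (sketch): blend the nonlinear update with
    the previous state using per-concept mixing coefficients phi."""
    nonlinear = np.tanh(a @ W)            # transfer function on the W-driven update
    return phi * nonlinear + (1.0 - phi) * a

def temporal_attention(states, P, v):
    """Project each hidden state with P, score it with v, and softmax the
    scores into attention weights over iterations (sketch)."""
    projected = np.tanh(states @ P)       # (T, d) projected state space
    scores = projected @ v                # (T,) relevance score per iteration
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # attention weights sum to 1
    return weights @ states               # attention-weighted summary state

rng = np.random.default_rng(0)
n, T, d = 5, 4, 3                         # concepts, iterations, projection size
W = rng.standard_normal((n, n)) * 0.1     # fixed, expert-provided weight matrix
phi = rng.uniform(size=n)                 # learnable per-concept coefficients
P = rng.standard_normal((n, d)) * 0.1     # learnable projection matrix
v = rng.standard_normal(d)                # learnable scoring vector

a = rng.uniform(size=n)                   # initial activation (input pattern)
states = []
for _ in range(T):
    a = quasi_nonlinear_step(a, W, phi)
    states.append(a)
summary = temporal_attention(np.array(states), P, v)
print(summary.shape)                      # one summary activation per concept
```

In this sketch the summary vector would feed a downstream classifier head; `phi`, `P`, and `v` are the quantities a gradient-based learner would tune while `W` stays fixed.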

More information

Title according to WOS: Enhancing quasi-nonlinear long-term cognitive networks with temporal attention for pattern classification
Volume: 659
Publication date: 2026
Language: English
DOI: 10.1016/j.neucom.2025.131830

Notes: ISI