Hypercomplex-valued recurrent correlation neural networks
Abstract
Recurrent correlation neural networks (RCNNs), introduced by Chiueh and Goodman as an improved version of the bipolar correlation-based Hopfield neural network, can be used to implement high-capacity associative memories. In this paper, we extend bipolar RCNNs to process hypercomplex-valued data. Precisely, we present the mathematical background for a broad class of hypercomplex-valued RCNNs. Then, we address the stability of the new hypercomplex-valued RCNNs under synchronous and asynchronous update modes. Examples with bipolar, complex, hyperbolic, quaternion, and octonion-valued RCNNs illustrate the theoretical results. Finally, computational experiments confirm the potential of hypercomplex-valued RCNNs as associative memories for the storage and recall of gray-scale images.

(c) 2020 Elsevier B.V. All rights reserved.
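To make the abstract concrete, the following is a minimal sketch of the bipolar RCNN of Chiueh and Goodman in the exponential variant that the hypercomplex models generalize: each update computes the correlations between the current state and the stored patterns, applies an excitation function (here `f(z) = alpha**z`, with `alpha` a free parameter of this sketch), and takes the sign of the resulting weighted sum of patterns. The function name and normalization are illustrative choices, not the paper's implementation.

```python
import numpy as np

def rcnn_recall(patterns, probe, alpha=2.0, max_iter=100):
    """Recall a stored bipolar pattern with an exponential RCNN (sketch).

    patterns : sequence of +/-1 vectors of length n (the fundamental memories)
    probe    : +/-1 vector, a possibly corrupted version of a stored pattern
    """
    U = np.asarray(patterns, dtype=float)   # shape (m, n)
    x = np.asarray(probe, dtype=float)
    n = U.shape[1]
    for _ in range(max_iter):
        # normalized correlations <x, u^m>/n, passed through f(z) = alpha**z
        w = alpha ** (U @ x / n)
        # synchronous update: sign of the excitation-weighted pattern sum
        # (assumes no component sums to exactly zero, where np.sign gives 0)
        x_new = np.sign(w @ U)
        if np.array_equal(x_new, x):        # reached a fixed point
            break
        x = x_new
    return x

# Usage: store two patterns, corrupt one bit of the first, and recall it.
u1 = np.array([1, 1, 1, 1, -1, -1, -1, -1.0])
u2 = np.array([1, -1, 1, -1, 1, -1, 1, -1.0])
probe = u1.copy()
probe[0] = -1
recalled = rcnn_recall([u1, u2], probe)     # converges back to u1
```

The hypercomplex-valued networks studied in the paper replace the bipolar states, the real inner product, and the sign activation with their hypercomplex counterparts, but the recall loop has this same shape.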
More information
Title according to WOS: ID WOS:000620905000011 (not found in local WOS DB)
Journal title: NEUROCOMPUTING
Volume: 432
Publisher: Elsevier
Publication date: 2021
First page: 111
Last page: 123
DOI: 10.1016/j.neucom.2020.12.034
Notes: ISI