On the computational power of max-min propagation neural networks
Abstract
We investigate the computational power of max-min propagation (MMP) neural networks, composed of neurons with maximum (Max) or minimum (Min) activation functions applied over the weighted sums of their inputs. The main results presented are that a single-layer MMP network can represent exactly any pseudo-Boolean function F: {0,1}^n → [0,1], and that two-layer MMP neural networks are universal approximators. In addition, it is shown that several well-known fuzzy min-max (FMM) neural networks, such as Simpson's FMM, are representable by MMP neural networks.
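The sketch below is only an illustrative reading of the architecture described in the abstract, not the paper's exact formulation: each Max (Min) neuron is assumed to take the maximum (minimum) over several weighted sums of its inputs, and the helper names `mmp_neuron` and `mmp_layer`, along with their parameter shapes, are hypothetical.

```python
import numpy as np

def mmp_neuron(x, W, b, mode="max"):
    """One MMP neuron (assumed form): forms several weighted sums of the
    input vector x (rows of W plus biases b) and returns their max or min."""
    sums = W @ x + b                      # one weighted sum per row of W
    return sums.max() if mode == "max" else sums.min()

def mmp_layer(x, weights, biases, modes):
    """A single MMP layer: each unit is a Max or Min neuron with its own
    set of weighted sums over the shared input x."""
    return np.array([mmp_neuron(x, W, b, m)
                     for W, b, m in zip(weights, biases, modes)])

# Tiny usage example on a Boolean input vector in {0,1}^3.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.array([1.0, 0.0, 1.0])
    weights = [rng.normal(size=(2, 3)) for _ in range(4)]
    biases = [rng.normal(size=2) for _ in range(4)]
    modes = ["max", "min", "max", "min"]
    print(mmp_layer(x, weights, biases, modes))
```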
More information
| Title according to WOS: | On the computational power of max-min propagation neural networks |
| Title according to SCOPUS: | On the Computational Power of Max-Min Propagation Neural Networks |
| Journal: | NEURAL PROCESSING LETTERS |
| Volume: | 19 |
| Issue: | 1 |
| Publisher: | Springer |
| Publication date: | 2004 |
| Start page: | 11 |
| End page: | 23 |
| Language: | English |
| DOI: | 10.1023/B:NEPL.0000016837.13436.d3 |
| Notes: | ISI, SCOPUS |