Title
Function Approximation With Spiked Random Networks
Keywords
Function approximation; Random neural networks; Spiked neural networks
Abstract
This paper examines the function approximation properties of the "random neural network model," or GNN. The output of the GNN can be computed from the firing probabilities of selected neurons. We consider a feedforward Bipolar GNN (BGNN) model which has both "positive and negative neurons" in the output layer, and prove that the BGNN is a universal function approximator. Specifically, for any f ∈ C([0, 1]^s) and any ε > 0, we show that there exists a feedforward BGNN which approximates f uniformly with error less than ε. We also show that, after an appropriate clamping operation on its output, the feedforward GNN is also a universal function approximator. © 1999 IEEE.
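Stated formally, the uniform approximation guarantee described in the abstract can be written as below; this is only a restatement for clarity, and the symbol y_BGNN for the network's input-output map is notation assumed here, not taken from the paper itself.

  % Uniform approximation claim from the abstract.
  % y_{\mathrm{BGNN}} denotes the input-output map of the feedforward BGNN
  % (assumed notation; the paper's own symbol may differ).
  \forall f \in C([0,1]^s),\ \forall \varepsilon > 0,\
  \exists\ \text{a feedforward BGNN with output map } y_{\mathrm{BGNN}}
  \text{ such that}\quad
  \sup_{x \in [0,1]^s} \bigl| f(x) - y_{\mathrm{BGNN}}(x) \bigr| < \varepsilon .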
Publication Date
12-1-1999
Publication Title
IEEE Transactions on Neural Networks
Volume
10
Issue
1
Number of Pages
3-9
Document Type
Article
Personal Identifier
scopus
DOI Link
https://doi.org/10.1109/72.737488
Copyright Status
Unknown
Scopus ID
0032762126 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/0032762126
STARS Citation
Gelenbe, Erol; Mao, Zhi Hong; and Li, Yan Da, "Function Approximation With Spiked Random Networks" (1999). Scopus Export 1990s. 4268.
https://stars.library.ucf.edu/scopus1990/4268