Title
Enhancing ES-HyperNEAT to Evolve More Complex Regular Neural Networks
Keywords
HyperNEAT; NEAT; Neuroevolution
Abstract
The recently introduced evolvable-substrate HyperNEAT algorithm (ES-HyperNEAT) demonstrated that the placement and density of hidden nodes in an artificial neural network can be determined from implicit information in an infinite-resolution pattern of weights, thereby avoiding the need to evolve explicit placement. However, ES-HyperNEAT is computationally expensive because it must search the entire hypercube, and it was shown only to match the performance of the original HyperNEAT on a simple benchmark problem. Iterated ES-HyperNEAT, introduced in this paper, helps reduce computational costs by focusing the search on a sequence of two-dimensional cross-sections of the hypercube, thereby making it possible to search the hypercube at a finer resolution. A series of experiments and an analysis of the evolved networks show for the first time that iterated ES-HyperNEAT not only matches but outperforms the original HyperNEAT in more complex domains, because ES-HyperNEAT can evolve networks with limited connectivity, elaborate on existing network structure, and compensate for the movement of information within the hypercube. Copyright 2011 ACM.
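Illustrative Sketch (not from the record)
A minimal sketch of the idea described in the abstract: for a fixed source node, the four-dimensional CPPN weight pattern reduces to a two-dimensional cross-section that can be sampled at a chosen resolution. The CPPN stub, grid resolution, and expression threshold below are assumptions for illustration; the paper's actual method searches the cross-section adaptively rather than on a fixed grid.

import math
from typing import Callable, List, Tuple

def cppn_weight(x1: float, y1: float, x2: float, y2: float) -> float:
    # Placeholder for an evolved CPPN producing a weight pattern
    # over the 4-D hypercube (x1, y1, x2, y2).
    return math.sin(3.0 * x1 * x2) * math.cos(3.0 * y1 * y2)

def cross_section_candidates(
    cppn: Callable[[float, float, float, float], float],
    source: Tuple[float, float],
    resolution: int = 32,      # assumed sampling resolution
    threshold: float = 0.2,    # assumed expression threshold
) -> List[Tuple[float, float, float]]:
    """Sample the 2-D cross-section of the hypercube for a fixed
    source node and return (x2, y2, weight) points whose weight
    magnitude is large enough to suggest expressing a connection."""
    x1, y1 = source
    step = 2.0 / (resolution - 1)
    candidates = []
    for i in range(resolution):
        for j in range(resolution):
            x2 = -1.0 + i * step
            y2 = -1.0 + j * step
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                candidates.append((x2, y2, w))
    return candidates

# Example: explore the outgoing cross-section of an input node at (-1.0, 0.5).
points = cross_section_candidates(cppn_weight, (-1.0, 0.5))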
Publication Date
8-24-2011
Publication Title
Genetic and Evolutionary Computation Conference, GECCO'11
Pages
1539-1546
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1145/2001576.2001783
Copyright Status
Unknown
Scopus ID
84860402468 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84860402468
STARS Citation
Risi, Sebastian and Stanley, Kenneth O., "Enhancing ES-HyperNEAT to Evolve More Complex Regular Neural Networks" (2011). Scopus Export 2010-2014. 2704.
https://stars.library.ucf.edu/scopus2010/2704