Abstract
Research in neuroevolution, that is, evolving artificial neural networks (ANNs) through evolutionary algorithms, is inspired by the evolution of biological brains, which can contain trillions of connections. Yet while neuroevolution has produced successful results, the scale of natural brains remains far beyond reach. This article presents a method called hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) that aims to narrow this gap. HyperNEAT employs an indirect encoding called connective compositional pattern-producing networks (CPPNs) that can produce connectivity patterns with symmetries and repeating motifs by interpreting spatial patterns generated within a hypercube as connectivity patterns in a lower-dimensional space. This approach can exploit the geometry of the task by mapping its regularities onto the topology of the network, thereby shifting problem difficulty away from dimensionality to the underlying problem structure. Furthermore, connective CPPNs can represent the same connectivity pattern at any resolution, allowing ANNs to scale to new numbers of inputs and outputs without further evolution. HyperNEAT is demonstrated through visual discrimination and food-gathering tasks, including successful visual discrimination networks containing over eight million connections. The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.
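To make the encoding described in the abstract concrete, the sketch below illustrates the core querying scheme: a CPPN is evaluated at every pair of substrate node coordinates, and the value it returns inside the four-dimensional hypercube becomes the weight of the corresponding connection. This is a minimal illustration, not the paper's implementation; the names `toy_cppn` and `build_substrate`, the fixed Gaussian-and-sine CPPN, the weight threshold, and the square grid layout are all assumptions made here for demonstration (in HyperNEAT the CPPN itself is evolved with NEAT).

```python
import math
import itertools

# Stand-in for an evolved connective CPPN: a fixed composition of a Gaussian
# and a sine, chosen only to illustrate the querying scheme. In HyperNEAT the
# CPPN is evolved; this function is purely a placeholder assumption.
def toy_cppn(x1, y1, x2, y2):
    dx, dy = x2 - x1, y2 - y1
    return math.exp(-(dx * dx + dy * dy)) * math.sin(math.pi * (x1 + y2))

def build_substrate(resolution, cppn, threshold=0.2):
    """Query the CPPN for every pair of substrate coordinates in [-1, 1]^2.

    Each query point (x1, y1, x2, y2) lies inside the 4D hypercube; a CPPN
    output whose magnitude exceeds the threshold becomes the weight of the
    connection from node (x1, y1) to node (x2, y2).
    """
    coords = [-1.0 + 2.0 * i / (resolution - 1) for i in range(resolution)]
    nodes = list(itertools.product(coords, coords))
    connections = {}
    for (x1, y1), (x2, y2) in itertools.product(nodes, nodes):
        w = cppn(x1, y1, x2, y2)
        if abs(w) > threshold:
            connections[((x1, y1), (x2, y2))] = w
    return nodes, connections

# The same CPPN can be re-queried at a finer resolution, yielding a larger
# network that expresses the same connectivity pattern without further
# evolution, which is the scaling property the abstract refers to.
for res in (5, 11):
    nodes, conns = build_substrate(res, toy_cppn)
    print(f"resolution {res}x{res}: {len(nodes)} nodes, {len(conns)} connections")
```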
Journal Title
Artificial Life
Volume
15
Issue/Number
2
Publication Date
1-1-2009
Document Type
Article
First Page
185
Last Page
212
WOS Identifier
ISSN
1064-5462
Recommended Citation
Stanley, Kenneth O.; D'Ambrosio, David B.; and Gauci, Jason, "A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks" (2009). Faculty Bibliography 2000s. 2178.
https://stars.library.ucf.edu/facultybib2000/2178