Title

HyperNEAT: The First Five Years

Abstract

HyperNEAT, which stands for Hypercube-based NeuroEvolution of Augmenting Topologies, is a method for evolving indirectly-encoded artificial neural networks (ANNs) that was first introduced in 2007. By exploiting a unique indirect encoding called Compositional Pattern Producing Networks (CPPNs) that does not require a typical developmental stage, HyperNEAT introduced several novel capabilities to the field of neuroevolution (i.e., evolving artificial neural networks). Among these, (1) large ANNs can be compactly encoded by small genomes, (2) the size and resolution of evolved ANNs can scale up or down even after training is completed, and (3) neural structure can be evolved to exploit problem geometry. Five years after its introduction, researchers have leveraged these capabilities to produce a broad range of successful experiments and extensions that highlight the potential for future research to build further on the ideas introduced by HyperNEAT. This chapter reviews these first five years of research building on this approach, and culminates with thoughts on promising future directions.
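The abstract's first two capabilities can be illustrated with a small sketch. The function `cppn_weight` below is a hand-built stand-in for an evolved CPPN (in HyperNEAT the CPPN's structure and weights are evolved with NEAT, not written by hand): it maps the substrate coordinates of a source and target neuron to a connection weight, so one small genome-like function encodes the connectivity of an arbitrarily large ANN, and the same encoding can be queried at any substrate resolution.

```python
import math

# Illustrative sketch only: a hand-built CPPN-like function, not an
# evolved CPPN as used in actual HyperNEAT. It maps the coordinates of
# a source neuron (x1, y1) and a target neuron (x2, y2) on a 2D
# substrate to a connection weight.
def cppn_weight(x1, y1, x2, y2):
    # Compose simple pattern-producing primitives (sine, Gaussian);
    # real CPPNs evolve such compositions of functions.
    d = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)
    return math.sin(3.0 * x1) * math.exp(-d * d)

def substrate_weights(resolution):
    """Query the CPPN at every pair of grid points in [-1, 1].

    Because weights are a function of geometry rather than stored per
    connection, the same encoding yields a consistent network at any
    resolution -- capability (2) in the abstract.
    """
    coords = [-1.0 + 2.0 * i / (resolution - 1) for i in range(resolution)]
    return [[cppn_weight(x1, 0.0, x2, 0.0) for x2 in coords]
            for x1 in coords]

small = substrate_weights(5)    # 5x5 weight matrix
large = substrate_weights(50)   # same encoding, 50x50 weight matrix
```

Note that the corner weights of `small` and `large` agree exactly, since both are samples of the same underlying pattern: the resolution of the resulting ANN was changed without touching the encoding.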

Publication Date

1-1-2014

Publication Title

Studies in Computational Intelligence

Volume

557

Number of Pages

159-185

Document Type

Article

Personal Identifier

scopus

DOI Link

https://doi.org/10.1007/978-3-642-55337-0_5

Scopus ID

84926687865 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/84926687865
