Title

Generating Large-Scale Neural Networks Through Discovering Geometric Regularities

Keywords

Compositional pattern producing networks; HyperNEAT; Large-scale artificial neural networks; NEAT

Abstract

Connectivity patterns in biological brains exhibit many repeating motifs. This repetition mirrors inherent geometric regularities in the physical world. For example, stimuli that excite adjacent locations on the retina map to neurons that are similarly adjacent in the visual cortex. In this way, neural connectivity can exploit geometric locality in the outside world by employing local connections in the brain. If such regularities could be discovered by methods that evolve artificial neural networks (ANNs), then they could be similarly exploited to solve problems whose dimensionality would otherwise make optimization intractable. This paper introduces such a method, called Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT), which evolves a novel generative encoding called connective Compositional Pattern Producing Networks (connective CPPNs) to discover geometric regularities in the task domain. Connective CPPNs encode connectivity patterns as concepts that are independent of the number of inputs or outputs, allowing functional large-scale neural networks to be evolved. In this paper, the approach is tested on a simple visual task for which it effectively discovers the correct underlying regularity, allowing the solution to both generalize and scale without loss of function to an ANN of over eight million connections. Copyright 2007 ACM.
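
The sketch below is not the authors' code; it is a minimal illustration of the core idea described in the abstract: a connective CPPN is queried with the coordinates of two substrate nodes (a point in the four-dimensional hypercube) and its output is interpreted as the connection weight between them, so the same CPPN defines a connectivity pattern at any substrate resolution. The CPPN here is a fixed, hand-written stand-in for an evolved one, and all function names, the grid layout, and the expression threshold are assumptions for illustration.

```python
import math

def example_cppn(x1, y1, x2, y2):
    """Toy stand-in for an evolved connective CPPN.

    A real CPPN is an evolved network composing functions such as Gaussian,
    sine, and sigmoid; this fixed formula (an assumption for illustration)
    merely mimics that style: locality via a Gaussian of distance, plus a
    repeating pattern along the x-axis via a sine.
    """
    dx, dy = x2 - x1, y2 - y1
    return math.exp(-(dx * dx + dy * dy)) * math.sin(2.0 * math.pi * x1)

def build_substrate_weights(cppn, resolution, threshold=0.2):
    """Query the CPPN across the 4D hypercube of node-pair coordinates.

    Substrate nodes are laid out on a resolution x resolution grid in
    [-1, 1]^2. Connections whose |weight| falls below `threshold` are not
    expressed, so the same CPPN yields a valid network at any resolution.
    """
    coords = [-1.0 + 2.0 * i / (resolution - 1) for i in range(resolution)]
    weights = {}
    for x1 in coords:
        for y1 in coords:
            for x2 in coords:
                for y2 in coords:
                    w = cppn(x1, y1, x2, y2)
                    if abs(w) >= threshold:
                        weights[((x1, y1), (x2, y2))] = w
    return weights

if __name__ == "__main__":
    # Querying the same CPPN at a higher resolution is how a solution can
    # scale to a much larger substrate without further evolution.
    for res in (5, 11):
        w = build_substrate_weights(example_cppn, res)
        print(f"{res}x{res} substrate -> {len(w)} expressed connections")
```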

Publication Date

8-27-2007

Publication Title

Proceedings of GECCO 2007: Genetic and Evolutionary Computation Conference

Pages

997-1004

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1145/1276958.1277158

Scopus ID

34548086552 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/34548086552

This document is currently not available here.
