Title

Genetic Optimization Of ART Neural Network Architectures

Keywords

ARTMAP; Category proliferation; Classification; Genetic algorithms; Genetic operators; Machine learning

Abstract

Adaptive Resonance Theory (ART) neural network architectures, such as Fuzzy ARTMAP (FAM), Ellipsoidal ARTMAP (EAM), and Gaussian ARTMAP (GAM), have successfully solved a variety of classification problems. However, they suffer from an inherent ART problem: creating architectures larger than necessary to solve the problem at hand (referred to as the ART category proliferation problem). This problem is especially pronounced for classification problems with noisy data and/or data belonging to different labels that significantly overlap. A variety of modified ART architectures, referred to as semi-supervised (ss) ART architectures (e.g., ssFAM, ssEAM, ssGAM), collectively referred to as ssART, have addressed the category proliferation problem. In this paper, we propose another approach to solving the ART category proliferation problem: designing genetically engineered ART architectures, such as GFAM, GEAM, and GGAM, collectively referred to as GART. In particular, we explain how to design GART architectures and compare their performance (in terms of accuracy, size, and computational complexity) with that of the ssART architectures. Our results demonstrate that GART is superior to ssART and quite often produces the optimal classifier.
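To make the idea of genetically optimizing ART classifiers concrete, the sketch below shows a generic genetic-algorithm loop over populations of hyperbox-based (Fuzzy-ARTMAP-like) classifiers, with a fitness that trades validation accuracy against category count, mirroring the accuracy-versus-size concern described in the abstract. This is a minimal illustration under assumed representations and operators; the function names, fitness penalty, and genetic operators are illustrative assumptions, not the GFAM/GEAM/GGAM design from the paper.

```python
# Illustrative sketch: GA over hyperbox-category classifiers (assumed design,
# not the paper's GFAM/GEAM/GGAM implementation).
import random
import numpy as np

def hyperbox_match(x, box):
    """Score how well point x fits a hyperbox given by (lower, upper) corners:
    0 inside the box, increasingly negative outside."""
    lo, hi = box
    inside = np.minimum(np.maximum(x, lo), hi)   # project x onto the box
    return -np.linalg.norm(x - inside)

def predict(chromosome, x):
    """A chromosome is a list of (box, label) categories; predict with the
    best-matching (nearest) hyperbox."""
    return max((hyperbox_match(x, box), label) for box, label in chromosome)[1]

def fitness(chromosome, X_val, y_val, size_penalty=0.01):
    """Assumed fitness: validation accuracy penalized by the number of categories."""
    acc = np.mean([predict(chromosome, x) == y for x, y in zip(X_val, y_val)])
    return acc - size_penalty * len(chromosome)

def crossover(parent_a, parent_b):
    """Exchange whole categories between two parents (one-point style)."""
    cut_a = random.randint(1, len(parent_a))
    cut_b = random.randint(1, len(parent_b))
    return parent_a[:cut_a] + parent_b[cut_b:]

def mutate(chromosome, p_delete=0.1, jitter=0.05):
    """Prune categories with small probability and jitter surviving boxes."""
    out = []
    for (lo, hi), label in chromosome:
        if random.random() < p_delete:
            continue                              # prune this category
        noise = np.random.uniform(-jitter, jitter, size=lo.shape)
        out.append(((lo + noise, hi + noise), label))
    return out if out else [random.choice(chromosome)]   # never return an empty classifier

def evolve(population, X_val, y_val, generations=50, elite=2):
    """Plain generational GA: elitism, tournament-style selection, crossover, mutation."""
    for _ in range(generations):
        scored = sorted(population, key=lambda c: fitness(c, X_val, y_val), reverse=True)
        next_gen = scored[:elite]                 # carry over the best classifiers
        while len(next_gen) < len(population):
            a, b = random.sample(scored[: max(2, len(scored) // 2)], 2)
            next_gen.append(mutate(crossover(a, b)))
        population = next_gen
    return max(population, key=lambda c: fitness(c, X_val, y_val))
```

In a setting like the one described here, the initial population would plausibly be seeded with categories extracted from several trained ART networks (e.g., FAM networks trained with different orderings or vigilance values), so that the GA searches over already-reasonable classifiers rather than random ones; that seeding step is also an assumption of this sketch.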

Publication Date

12-1-2007

Publication Title

Proceedings of the 11th IASTED International Conference on Artificial Intelligence and Soft Computing, ASC 2007

Number of Pages

225-230

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

Scopus ID

54949124195 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/54949124195
