Title

GFAM: Evolving Fuzzy ARTMAP Neural Networks

Abstract

Fuzzy ARTMAP (FAM) is one of the best neural network architectures for solving classification problems. One of the limitations of Fuzzy ARTMAP that has been extensively reported in the literature is the category proliferation problem: Fuzzy ARTMAP tends to increase its network size as it is confronted with more and more data, especially if the data are noisy and/or overlapping. To remedy this problem, a number of researchers have designed modifications to the training phase of Fuzzy ARTMAP that have the beneficial effect of reducing this phenomenon. In this paper we propose a new approach to handling the category proliferation problem in Fuzzy ARTMAP by evolving trained FAM architectures; we refer to the resulting FAM architectures as GFAM. We demonstrate through extensive experimentation that an evolved FAM (GFAM) exhibits good generalization and small size, and produces an optimal or a good sub-optimal network with reasonable computational effort. Furthermore, comparisons of GFAM with other approaches proposed in the literature that address the FAM category proliferation problem illustrate that GFAM has a number of advantages: it produces smaller or equal-size architectures, with better or equally good generalization, at reduced computational complexity.
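
The sketch below is a rough illustration of what "evolving trained FAM architectures" can look like: each individual in the population is a set of FAM categories (hyperboxes with class labels), and a genetic algorithm recombines and prunes categories while a fitness function trades off validation accuracy against network size. The representation, the fitness weights, the operator rates, and the function names (predict, fitness, crossover, mutate, evolve) are illustrative assumptions, not the paper's actual implementation; the initial population would be seeded with independently trained FAM networks, a step omitted here for brevity.

```python
# Hypothetical sketch of evolving trained Fuzzy ARTMAP (FAM) networks with a
# genetic algorithm, in the spirit of GFAM. Parameter values and operators are
# illustrative assumptions only.
import random
import numpy as np

# An individual is a list of categories; a category is a pair
# (weight_vector in complement-coded input space, class label).

def choice(x, w, alpha=0.001):
    """Fuzzy ARTMAP choice function T = |min(x, w)| / (alpha + |w|)."""
    return np.minimum(x, w).sum() / (alpha + w.sum())

def predict(network, x):
    """Classify x with the label of the most activated category."""
    best_w, best_label = max(network, key=lambda cat: choice(x, cat[0]))
    return best_label

def fitness(network, X_val, y_val, size_penalty=0.01):
    """Reward validation accuracy, penalize the number of categories."""
    acc = np.mean([predict(network, x) == y for x, y in zip(X_val, y_val)])
    return acc - size_penalty * len(network)

def crossover(parent_a, parent_b):
    """Exchange randomly chosen category segments between two networks."""
    cut_a = random.randint(1, len(parent_a))
    cut_b = random.randint(1, len(parent_b))
    return parent_a[:cut_a] + parent_b[cut_b:], parent_b[:cut_b] + parent_a[cut_a:]

def mutate(network, delete_prob=0.1):
    """Occasionally delete a category (the main size-reduction operator)."""
    if len(network) > 1 and random.random() < delete_prob:
        network = list(network)
        network.pop(random.randrange(len(network)))
    return network

def evolve(population, X_val, y_val, generations=50, elite=2):
    """Generational GA over a population (size >= 4) of trained FAM networks."""
    for _ in range(generations):
        scored = sorted(population, key=lambda n: fitness(n, X_val, y_val),
                        reverse=True)
        next_gen = scored[:elite]                          # elitism
        while len(next_gen) < len(population):
            a, b = random.sample(scored[:len(scored) // 2], 2)  # select from the fitter half
            child_a, child_b = crossover(a, b)
            next_gen += [mutate(child_a), mutate(child_b)]
        population = next_gen[:len(population)]
    return max(population, key=lambda n: fitness(n, X_val, y_val))
```

The size penalty in the fitness function is what drives category reduction: networks that keep extra categories without improving validation accuracy lose fitness and are bred out of the population.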

Publication Date

7-24-2006

Publication Title

FLAIRS 2006 - Proceedings of the Nineteenth International Florida Artificial Intelligence Research Society Conference

Volume

2006

Number of Pages

694-699

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

Scopus ID

33746095787 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/33746095787

