Probabilistic ensemble fuzzy ARTMAP optimization using hierarchical parallel genetic algorithms

Bibliographic Details
Main Authors: Loo, C.K., Liew, W.S., Seera, M., Lim, Einly
Format: Article
Language:English
Published: Springer-Verlag (Germany) 2015
Subjects:
Online Access:http://eprints.um.edu.my/13734/1/Probabilistic_ensemble_Fuzzy_ARTMAP_optimization.pdf
http://eprints.um.edu.my/13734/
http://link.springer.com/article/10.1007/s00521-014-1632-y
Description
Summary: In this study, a comprehensive methodology for overcoming the design problem of the Fuzzy ARTMAP neural network is proposed. The issues addressed are the sequence of training data presented during supervised learning and the optimal tuning of parameters such as baseline vigilance. A genetic algorithm search heuristic was chosen to solve this multi-objective optimization problem. To further augment the ARTMAP's pattern-classification ability, multiple ARTMAPs were optimized via genetic algorithm and assembled into a classifier ensemble. An optimal ensemble was realized through inter-classifier diversity among its constituents, achieved by mitigating convergence in the genetic algorithms with a hierarchical parallel architecture. The best-performing classifiers were then combined into an ensemble using probabilistic voting for decision combination. The study also integrates these disparate methods into a single framework, a novel method for producing an optimal classifier-ensemble configuration with minimal user intervention. The methodology was benchmarked on popular data sets from the UCI Machine Learning Repository.
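
The decision-combination step described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation; it only shows the general idea of probabilistic voting, where each ensemble member's class-probability estimates are averaged (optionally weighted) and the class with the highest combined probability is chosen. The uniform weighting and the hypothetical probability arrays are assumptions for illustration; the paper's genetically optimized Fuzzy ARTMAP classifiers are not reproduced here.

```python
import numpy as np


def probabilistic_vote(member_probs, weights=None):
    """Combine per-member class-probability estimates by (weighted) averaging.

    member_probs : array-like, shape (n_members, n_samples, n_classes)
        Probability estimates produced by each ensemble member.
    weights : optional array-like, shape (n_members,); uniform if omitted.

    Returns (predicted class index per sample, combined probabilities).
    """
    member_probs = np.asarray(member_probs, dtype=float)
    n_members = member_probs.shape[0]
    if weights is None:
        weights = np.full(n_members, 1.0 / n_members)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()

    # Weighted average of the members' probability estimates,
    # collapsing the "members" axis.
    combined = np.tensordot(weights, member_probs, axes=(0, 0))
    return combined.argmax(axis=1), combined


if __name__ == "__main__":
    # Three hypothetical classifiers scoring two samples over three classes.
    probs = [
        [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]],
        [[0.5, 0.4, 0.1], [0.1, 0.7, 0.2]],
        [[0.4, 0.4, 0.2], [0.3, 0.3, 0.4]],
    ]
    labels, combined = probabilistic_vote(probs)
    print(labels)    # predicted class per sample, e.g. [0 1]
    print(combined)  # averaged class probabilities
```

In an ensemble setting such as the one the paper describes, the per-member probabilities would come from the individually optimized classifiers, and any non-uniform weights could reflect each member's validation performance.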