A Divide-and-Distribute Approach to Single-Cycle Learning HGN Network for Pattern Recognition
Main Authors:
Format: Conference or Workshop Item
Published: 2010
Online Access: http://eprints.utp.edu.my/5913/1/icarcv10.pdf http://eprints.utp.edu.my/5913/
Summary: Distributed Hierarchical Graph Neuron (DHGN) is a single-cycle learning distributed pattern recognition algorithm that reduces the computational complexity of existing pattern recognition algorithms by distributing the recognition process across smaller clusters. This paper investigates the effect of dividing and distributing simple pattern recognition processes within a computational network. Our approach extends the single-cycle pattern recognition capability of the Hierarchical Graph Neuron (HGN) for wireless sensor networks into the more generic framework of computational grids. The computational complexity of the hierarchical pattern recognition scheme is significantly reduced and its accuracy is improved. The single-cycle learning capability, which develops within the HGN, shows better noisy-pattern recognition accuracy when the size of the clusters is adapted to the pattern data. The scheme lowers the storage capacity required per node and incurs lower communication complexity while retaining HGN's response-time characteristics. The recall accuracy and scalability of the scheme are tested by storing large numbers of binary character patterns and heterogeneous binary images. The results show that the response time remains insensitive to the number of stored patterns, the accuracy is improved, and the system resource requirements are significantly reduced.
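The divide-and-distribute idea in the summary can be illustrated with a minimal sketch: a pattern is split into subpatterns, each handled by its own cluster in a single learning pass, and recall combines per-cluster votes. This is not the DHGN/HGN algorithm itself; the cluster count, Hamming-distance matching, and majority vote below are simplifying assumptions made for demonstration.

```python
# Illustrative sketch only: divide-and-distribute pattern recognition.
# NOT the actual DHGN/HGN scheme; Hamming-distance recall and majority
# voting are assumptions chosen to keep the example self-contained.

def divide(pattern, clusters):
    """Split a binary pattern string into roughly equal subpatterns."""
    size = -(-len(pattern) // clusters)  # ceiling division
    return [pattern[i:i + size] for i in range(0, len(pattern), size)]

class Cluster:
    """One recognition cluster: stores its subpatterns in a single pass."""
    def __init__(self):
        self.memory = []  # subpatterns seen so far, in learning order

    def store(self, sub):
        if sub not in self.memory:
            self.memory.append(sub)

    def recall(self, sub):
        """Index of the closest stored subpattern (Hamming distance)."""
        dists = [sum(a != b for a, b in zip(sub, m)) for m in self.memory]
        return dists.index(min(dists))

class DividedRecognizer:
    def __init__(self, n_clusters):
        self.clusters = [Cluster() for _ in range(n_clusters)]

    def learn(self, pattern):  # single exposure per pattern ("single cycle")
        for c, sub in zip(self.clusters, divide(pattern, len(self.clusters))):
            c.store(sub)

    def recognize(self, pattern):
        """Majority vote over the per-cluster recalls."""
        votes = [c.recall(sub) for c, sub in
                 zip(self.clusters, divide(pattern, len(self.clusters)))]
        return max(set(votes), key=votes.count)

r = DividedRecognizer(n_clusters=4)
r.learn("11110000")   # pattern 0
r.learn("00001111")   # pattern 1
print(r.recognize("11110001"))  # noisy copy of pattern 0 -> prints 0
```

Each cluster stores only its own short subpatterns, which mirrors the summary's point about lower per-node storage; noise confined to one subpattern is outvoted by the other clusters. (The index-based voting assumes each learned pattern contributes a new subpattern to each cluster, a simplification that the real DHGN bias arrays avoid.)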