Improved spikeprop algorithm for neural network learning

A Spiking Neural Network (SNN) uses individual spikes in the time domain to communicate and to perform computation, much as real neurons do. SNNs remained largely unexplored for many years because they were considered too complex and too difficult to analyse. Since Sander Bohte introduced SpikeProp as a supervised learning model for SNNs in 2002, many previously unclear aspects of SNN behaviour have become better understood. Despite the success of Bohte's pioneering work on SpikeProp, his algorithm is constrained by fixed-time convergence in the iterative search for optimum initial weights and by the lengthy procedure required to carry out the full learning sequence for classification. This thesis therefore proposes improvements to Bohte's algorithm: SpikeProp with the acceleration factors of Particle Swarm Optimization (PSO), denoted Model 1; SpikeProp with an angle-driven learning-rate dependency, Model 2; SpikeProp with Radius Initial Weight, Model 3a; and SpikeProp with Differential Evolution (DE) weight initialization, Model 3b. Hybridizing Model 1 and Model 2 gives Model 4, and hybridizing Model 1, Model 3a and Model 3b gives Model 5. With these new methods, the time error is observed to be reduced accordingly. The training and classification properties of the proposed methods were investigated on datasets from the Machine Learning Benchmark Repository. Performance results of the proposed models, presented as plots of time error against iteration, tables of the number of iterations needed to bring the time error down to saturation level, and bar charts of accuracy at the saturation time error for each dataset, were compared with one another and with standard SpikeProp and Backpropagation (BP). The results indicate that Model 4, Model 5 and Model 1 perform better than Model 2, Model 3a and Model 3b, and that all proposed models outperform standard SpikeProp and BP on every dataset used.
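The abstract's Model 1 combines SpikeProp with the acceleration factors of Particle Swarm Optimization to obtain better initial weights. As a rough illustration of that idea only, the following minimal Python/NumPy sketch runs a standard PSO over a candidate weight vector; the fitness function here is a placeholder quadratic error standing in for the SpikeProp time error (a real implementation would simulate the spiking network and compare actual to desired firing times), and the names pso_init_weights, c1, c2 and w are illustrative assumptions, not taken from the thesis.

import numpy as np

rng = np.random.default_rng(0)

def fitness(weights):
    # Placeholder error standing in for the SpikeProp time error
    # E = 1/2 * sum_j (t_j_actual - t_j_desired)^2; here we simply
    # measure distance to an arbitrary target vector.
    target = np.linspace(0.1, 1.0, weights.size)
    return 0.5 * np.sum((weights - target) ** 2)

def pso_init_weights(dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # c1 and c2 are the PSO acceleration factors (cognitive and social terms),
    # w is the inertia weight.
    pos = rng.uniform(-1.0, 1.0, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update with inertia, cognitive and social components.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        fit = np.array([fitness(p) for p in pos])
        improved = fit < pbest_fit
        pbest[improved] = pos[improved]
        pbest_fit[improved] = fit[improved]
        gbest = pbest[np.argmin(pbest_fit)].copy()

    # The best particle is returned as a candidate initial weight vector
    # that a SpikeProp-style training loop could then refine.
    return gbest

if __name__ == "__main__":
    w0 = pso_init_weights(dim=10)
    print("candidate initial weights:", np.round(w0, 3))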

Bibliographic Details
Main Author: Ahmed, Falah Y. H.
Format: Thesis
Language: English
Published: 2013
Subjects: QA Mathematics
Online Access:http://eprints.utm.my/id/eprint/33796/5/FalahYHAhmedPFSKSM2013.pdf
http://eprints.utm.my/id/eprint/33796/
http://dms.library.utm.my:8080/vital/access/manager/Repository/vital:70130?site_name=Restricted Repository
id my.utm.33796
record_format eprints
spelling Ahmed, Falah Y. H. (2013) Improved spikeprop algorithm for neural network learning. PhD thesis, Universiti Teknologi Malaysia, Faculty of Computing. 2013-05, Thesis, NonPeerReviewed, application/pdf, en. http://eprints.utm.my/id/eprint/33796/5/FalahYHAhmedPFSKSM2013.pdf http://eprints.utm.my/id/eprint/33796/ http://dms.library.utm.my:8080/vital/access/manager/Repository/vital:70130?site_name=Restricted Repository
institution Universiti Teknologi Malaysia
building UTM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Teknologi Malaysia
content_source UTM Institutional Repository
url_provider http://eprints.utm.my/
language English
topic QA Mathematics
format Thesis
author Ahmed, Falah Y. H.
title Improved spikeprop algorithm for neural network learning
publishDate 2013
url http://eprints.utm.my/id/eprint/33796/5/FalahYHAhmedPFSKSM2013.pdf
http://eprints.utm.my/id/eprint/33796/
http://dms.library.utm.my:8080/vital/access/manager/Repository/vital:70130?site_name=Restricted Repository