Hyperparameter Optimization of Evolving Spiking Neural Network for Time-Series Classification
Main Authors:
Format: Article
Published: 2022
Online Access: http://scholars.utp.edu.my/id/eprint/33979/
https://www.scopus.com/inward/record.uri?eid=2-s2.0-85127539295&doi=10.1007%2fs00354-022-00165-3&partnerID=40&md5=1292a91ea1f1dcb8a29a68bbe48db037
Summary: Spiking neural networks (SNNs) are the third generation of artificial neural networks, built on a brain-inspired computational model in which neural information is encoded and processed through precisely timed spike trains. The evolving spiking neural network (eSNN) is an enhanced form of SNN, motivated by the principles of the Evolving Connectionist System (ECoS), and is a relatively new classifier in the neural information processing area. The performance of eSNN is highly sensitive to the values of its key hyperparameters: the modulation factor (mod), threshold factor (c), and similarity factor (sim). Automated tuning of these hyperparameters is more reliable than manual tuning. Therefore, this research presents an optimizer-based eSNN architecture that addresses the selection of optimal eSNN hyperparameter values. The proposed model, eSNN-SSA, integrates the salp swarm algorithm (SSA), a metaheuristic optimization technique, with the eSNN architecture. The integration uses Thorpe's standard eSNN model with population rate encoding. The performance of eSNN-SSA is examined on several benchmark data sets from the UCR/UEA time-series classification repositories. The experimental results show that the salp swarm algorithm is effective in improving the flexibility of the eSNN, and that eSNN-SSA overcomes the difficulty eSNN has in determining the best number of pre-synaptic neurons for time-series classification problems. The classification accuracies obtained by eSNN-SSA on the Spoken Arabic Digits, Articulatory Word Recognition, Character Trajectories, Wafer, and GunPoint data sets were 0.96, 0.97, 0.94, 1.0, and 0.94, respectively. The proposed approach also outperformed the standard eSNN in terms of time complexity. © 2022, Ohmsha, Ltd. and Springer Japan KK, part of Springer Nature.
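The record gives only the abstract, so the following is an illustrative sketch rather than the authors' implementation: a minimal salp swarm loop searching over the three eSNN hyperparameters (mod, c, sim). The function `evaluate_esnn`, the search ranges, and the swarm settings are assumptions standing in for training and validating the Thorpe-model eSNN with population rate encoding; in the actual eSNN-SSA the fitness would be the validation accuracy of the eSNN trained with each candidate setting.

```python
import numpy as np

# Hypothetical stand-in for training and scoring the eSNN with a candidate
# (mod, c, sim); it just rewards proximity to an arbitrary "good" setting so
# the script runs end to end.
def evaluate_esnn(params):
    target = np.array([0.9, 0.7, 0.6])            # placeholder optimum, not from the paper
    return 1.0 - np.linalg.norm(params - target)  # pseudo-accuracy to maximize

def salp_swarm_optimize(fitness, lb, ub, n_salps=20, n_iter=50, seed=None):
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = len(lb)
    salps = rng.uniform(lb, ub, size=(n_salps, dim))
    scores = np.array([fitness(s) for s in salps])
    best = scores.argmax()
    food, food_score = salps[best].copy(), scores[best]   # best solution found so far

    for l in range(1, n_iter + 1):
        c1 = 2.0 * np.exp(-(4.0 * l / n_iter) ** 2)       # shrinks over iterations
        for i in range(n_salps):
            if i < n_salps // 2:
                # Leader salps explore around the food source.
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                salps[i] = np.where(c3 >= 0.5, food + step, food - step)
            else:
                # Follower salps chain behind the preceding salp.
                salps[i] = 0.5 * (salps[i] + salps[i - 1])
            salps[i] = np.clip(salps[i], lb, ub)
            score = fitness(salps[i])
            if score > food_score:
                food, food_score = salps[i].copy(), score
    return food, food_score

# Search ranges for (mod, c, sim) are assumptions, not values from the paper.
best_params, best_fit = salp_swarm_optimize(evaluate_esnn,
                                            lb=[0.0, 0.0, 0.0],
                                            ub=[1.0, 1.0, 1.0])
print("best (mod, c, sim):", best_params, "fitness:", best_fit)
```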