Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms


Bibliographic Details
Main Authors: Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali
Format: Article
Language: English
Published: 2023
Subjects:
Online Access:http://eprints.uthm.edu.my/10524/1/J16174_ee1fefba9e830abb0e36ae31d95d9997.pdf
http://eprints.uthm.edu.my/10524/
http://dx.doi.org/10.32604/csse.2023.038912
Description
Summary: Radial Basis Function Neural Network (RBFNN) ensembles have long suffered from inefficient training, where incorrect parameter settings can be computationally disastrous. This paper examines different evolutionary algorithms for training the Symbolic Radial Basis Function Neural Network (SRBFNN) by integrating the behavior of satisfiability programming. Inspired by evolutionary algorithms, which iteratively search for near-optimal solutions, different Evolutionary Algorithms (EAs) were designed to optimize the output weights of the SRBFNN corresponding to the embedded 2-Satisfiability logic programming representation (SRBFNN-2SAT). The SRBFNN objective function corresponding to Satisfiability logic programming can be minimized by several algorithms, including the Genetic Algorithm (GA), Evolution Strategy (ES), Differential Evolution (DE), and Evolutionary Programming (EP). Each method is presented as a flowchart of steps, allowing straightforward implementation in any programming language. Using SRBFNN-2SAT, a training method based on these algorithms was developed; the algorithms were then implemented in Microsoft Visual C++ and compared using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). Based on the results, the EP algorithm achieved a higher training rate and a simpler structure than the other algorithms. It was confirmed that EP is effective in training and obtaining the best output weights with the smallest iteration error, thereby minimizing the objective function of SRBFNN-2SAT.
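The core idea in the abstract, using an evolutionary algorithm to optimize the output weights of an RBF network against an error objective, can be illustrated with a minimal sketch. This is not the authors' SRBFNN-2SAT implementation (which embeds 2-Satisfiability logic programming and was written in Visual C++); it is a hypothetical Python toy in which Evolutionary Programming (mutation-plus-selection, no crossover) tunes the output weights of a small RBF expansion to minimize RMSE on synthetic data. The centers, width, population size, and mutation scale are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy regression data: approximate sin(x) on [-2, 2]
xs = [i / 10 for i in range(-20, 21)]
ys = [math.sin(x) for x in xs]

# Fixed Gaussian RBF centers and width (assumed for illustration;
# the paper evolves only the output weights of the network)
centers = [-2.0, -1.0, 0.0, 1.0, 2.0]
width = 0.8

def rbf_output(weights, x):
    """Weighted sum of Gaussian basis functions at input x."""
    return sum(w * math.exp(-((x - c) ** 2) / (2 * width ** 2))
               for w, c in zip(weights, centers))

def rmse(weights):
    """Root Mean Square Error of the RBF expansion on the toy data
    (one of the performance metrics listed in the abstract)."""
    return math.sqrt(sum((rbf_output(weights, x) - y) ** 2
                         for x, y in zip(xs, ys)) / len(xs))

# Evolutionary Programming loop: each parent produces one
# Gaussian-mutated child, then the best half of parents + children
# survives to the next generation (selection only, no crossover).
POP_SIZE, GENERATIONS, SIGMA = 20, 300, 0.1
pop = [[random.uniform(-1, 1) for _ in centers] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    children = [[w + random.gauss(0, SIGMA) for w in p] for p in pop]
    pop = sorted(pop + children, key=rmse)[:POP_SIZE]

best = pop[0]
print(f"best RMSE: {rmse(best):.4f}")
```

Swapping the mutation-and-selection step for crossover-based recombination would turn this same skeleton into a GA or DE variant, which is essentially the comparison the paper carries out across GA, ES, DE, and EP.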