Advances of metaheuristic algorithms in training neural networks for industrial applications

In recent decades, research on optimizing the parameters of the artificial neural network (ANN) model has attracted significant attention. Hybridization of superior algorithms helps improve optimization performance and enables complex applications to be solved. When trained with a traditional gra...


Bibliographic Details
Main Authors: Chong, Hue Yee, Yap, Hwa Jen, Tan, Shing Chiang, Yap, Keem Siah, Wong, Shen Yuong
Format: Article
Published: Springer 2021
Subjects:
Online Access:http://eprints.um.edu.my/26826/
id my.um.eprints.26826
record_format eprints
spelling my.um.eprints.268262022-02-24T02:49:12Z http://eprints.um.edu.my/26826/ Advances of metaheuristic algorithms in training neural networks for industrial applications Chong, Hue Yee Yap, Hwa Jen Tan, Shing Chiang Yap, Keem Siah Wong, Shen Yuong QA75 Electronic computers. Computer science In recent decades, research on optimizing the parameters of the artificial neural network (ANN) model has attracted significant attention. Hybridization of superior algorithms helps improve optimization performance and enables complex applications to be solved. When trained with a traditional gradient-based learning algorithm, such as gradient descent (GD) or back-propagation (BP), an ANN suffers from a slow learning rate and is easily trapped in local minima. The randomization and selection of best or near-optimal solutions that characterize metaheuristic algorithms provide an effective and robust alternative; therefore, they are widely used in ANN training to overcome these problems. New metaheuristic algorithms are proposed every year, so a review of the latest developments is essential. This article summarizes the metaheuristic algorithms proposed from 1975 to 2020 across various journals, conferences, technical papers, and books. The popularity of metaheuristic algorithms is compared over two time frames: algorithms proposed in the last 20 years and those proposed earlier. Some of the popular metaheuristic algorithms and their working principles are then reviewed. This article further categorizes the latest metaheuristic search algorithms in the literature to indicate their efficiency in training ANNs for various industrial applications. Increasingly, researchers develop new hybrid optimization tools by combining two or more metaheuristic algorithms to optimize the training parameters of ANNs.
Generally, an algorithm achieves optimal performance by striking a fine balance between its exploration and exploitation characteristics. Hence, this article compares and summarizes the properties of various metaheuristic algorithms in terms of their convergence rate and their ability to avoid local minima. This information is useful for researchers working on algorithm hybridization, as it provides a good understanding of convergence rates and the ability to find a global optimum. Springer 2021-08 Article PeerReviewed Chong, Hue Yee and Yap, Hwa Jen and Tan, Shing Chiang and Yap, Keem Siah and Wong, Shen Yuong (2021) Advances of metaheuristic algorithms in training neural networks for industrial applications. Soft Computing, 25 (16). pp. 11209-11233. ISSN 1432-7643, DOI https://doi.org/10.1007/s00500-021-05886-z <https://doi.org/10.1007/s00500-021-05886-z>. 10.1007/s00500-021-05886-z
institution Universiti Malaya
building UM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Malaya
content_source UM Research Repository
url_provider http://eprints.um.edu.my/
topic QA75 Electronic computers. Computer science
spellingShingle QA75 Electronic computers. Computer science
Chong, Hue Yee
Yap, Hwa Jen
Tan, Shing Chiang
Yap, Keem Siah
Wong, Shen Yuong
Advances of metaheuristic algorithms in training neural networks for industrial applications
description In recent decades, research on optimizing the parameters of the artificial neural network (ANN) model has attracted significant attention. Hybridization of superior algorithms helps improve optimization performance and enables complex applications to be solved. When trained with a traditional gradient-based learning algorithm, such as gradient descent (GD) or back-propagation (BP), an ANN suffers from a slow learning rate and is easily trapped in local minima. The randomization and selection of best or near-optimal solutions that characterize metaheuristic algorithms provide an effective and robust alternative; therefore, they are widely used in ANN training to overcome these problems. New metaheuristic algorithms are proposed every year, so a review of the latest developments is essential. This article summarizes the metaheuristic algorithms proposed from 1975 to 2020 across various journals, conferences, technical papers, and books. The popularity of metaheuristic algorithms is compared over two time frames: algorithms proposed in the last 20 years and those proposed earlier. Some of the popular metaheuristic algorithms and their working principles are then reviewed. This article further categorizes the latest metaheuristic search algorithms in the literature to indicate their efficiency in training ANNs for various industrial applications. Increasingly, researchers develop new hybrid optimization tools by combining two or more metaheuristic algorithms to optimize the training parameters of ANNs. Generally, an algorithm achieves optimal performance by striking a fine balance between its exploration and exploitation characteristics.
Hence, this article compares and summarizes the properties of various metaheuristic algorithms in terms of their convergence rate and their ability to avoid local minima. This information is useful for researchers working on algorithm hybridization, as it provides a good understanding of convergence rates and the ability to find a global optimum.
format Article
author Chong, Hue Yee
Yap, Hwa Jen
Tan, Shing Chiang
Yap, Keem Siah
Wong, Shen Yuong
author_facet Chong, Hue Yee
Yap, Hwa Jen
Tan, Shing Chiang
Yap, Keem Siah
Wong, Shen Yuong
author_sort Chong, Hue Yee
title Advances of metaheuristic algorithms in training neural networks for industrial applications
title_short Advances of metaheuristic algorithms in training neural networks for industrial applications
title_full Advances of metaheuristic algorithms in training neural networks for industrial applications
title_fullStr Advances of metaheuristic algorithms in training neural networks for industrial applications
title_full_unstemmed Advances of metaheuristic algorithms in training neural networks for industrial applications
title_sort advances of metaheuristic algorithms in training neural networks for industrial applications
publisher Springer
publishDate 2021
url http://eprints.um.edu.my/26826/
_version_ 1735409463299407872