Entropy learning and relevance criteria for neural network pruning

In this paper, entropy is introduced as a term in the learning phase of a neural network. As learning progresses, more hidden nodes become saturated, and the early creation of such nodes may impair generalisation. An entropy approach is therefore proposed to dampen the early creation of such nodes, using a new computation called the entropy cycle. Entropy learning also increases the importance of relevant nodes while dampening the less important ones; at the end of learning, the less important nodes can be pruned to reduce the memory requirements of the neural network.
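The abstract describes the method only at a high level, so the following is a minimal sketch of the general idea rather than the paper's algorithm: a sigmoid network trained with an added entropy term that damps hidden-node saturation, followed by pruning of the least relevant hidden nodes. The XOR data, the penalty weight lam, the leave-one-node-out relevance test, and the median pruning threshold are all illustrative assumptions; the entropy cycle and the paper's actual relevance criteria are defined only in the full text.

# Sketch (assumed, not the paper's exact method): entropy-penalised
# training of a sigmoid MLP, then relevance-based pruning of hidden nodes.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: XOR, a common test problem for hidden-node relevance.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden, lr, lam = 8, 0.5, 0.01   # lam weights the entropy penalty (assumed)
W1 = rng.normal(0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

for epoch in range(5000):
    # Forward pass; hidden activations h lie in (0, 1).
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Loss = 0.5 * squared error - lam * H(h), where
    # H(h) = -h log h - (1 - h) log(1 - h) is binary entropy.
    # Maximising H(h) keeps h near 0.5, damping early saturation.
    err = out - y
    d_out = err * out * (1 - out)          # gradient through output sigmoid
    d_h = d_out @ W2.T
    # d(-lam * H)/dh = lam * log(h / (1 - h)): pushes h away from 0 and 1.
    eps = 1e-8
    d_h += lam * np.log((h + eps) / (1 - h + eps))
    d_pre = d_h * h * (1 - h)              # gradient through hidden sigmoid
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_pre; b1 -= lr * d_pre.sum(0)

# Relevance of a hidden node: increase in error when that node is removed
# (one common criterion; the paper defines its own relevance measures).
def mse_without(dead):
    h = sigmoid(X @ W1 + b1)
    if dead is not None:
        h[:, dead] = 0.0
    return float(np.mean((sigmoid(h @ W2 + b2) - y) ** 2))

base = mse_without(None)
relevance = np.array([mse_without(j) - base for j in range(n_hidden)])
keep = relevance >= np.median(relevance)   # prune the least relevant half
W1, b1, W2 = W1[:, keep], b1[keep], W2[keep, :]
print("kept nodes:", keep.sum(), "post-prune MSE:", mse_without(None))

Run as an ordinary Python script; only NumPy is required. Pruning by zeroing a node's activation is equivalent here to deleting its weights, which is why the leave-one-node-out test doubles as the pruning step.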

Bibliographic Details
Main Authors: Geok, See Ng; Abdul Rahman, Abdul Wahab; Shi, Daming
Format: Article
Language: English
Published: World Scientific Publishing Company, October 2003
Published in: International Journal of Neural Systems, 13 (5), pp. 291-305. ISSN 0129-0657 (print), 1793-6462 (online)
Subjects: T Technology (General)
Online Access:http://irep.iium.edu.my/38198/1/Entropy_learning_and_relevance_criteria_for_neural_network_pruning.pdf
http://irep.iium.edu.my/38198/
http://www.worldscientific.com/doi/abs/10.1142/S0129065703001637