An experimental study of the extended NRBF regression model and its enhancement for classification problem


Bibliographic Details
Main Authors: Ma, L.; Abdul Rahman, Abdul Wahab; Ng, Geok See; Erdogan, Sevki
Format: Article
Language: English
Published: Elsevier 2008
Subjects:
Online Access:http://irep.iium.edu.my/38158/1/An_experimental_study_of_the_extended_NRBF_regression_model_and_its_enhancement_for_classification_problem.pdf
http://irep.iium.edu.my/38158/
http://www.sciencedirect.com/science/article/pii/S0925231207003906
Description
Summary: As an extension of the traditional normalized radial basis function (NRBF) model, the extended normalized RBF (ENRBF) model was proposed by Xu [RBF nets, mixture experts, and Bayesian Ying-Yang learning, Neurocomputing 19 (1998) 223–257]. In this paper, we perform a supplementary study of ENRBF with several properly designed experiments and some further theoretical discussion. It is shown that ENRBF can efficiently improve learning accuracy under some circumstances. Moreover, since the ENRBF model was initially proposed for regression and function approximation problems, a further step is taken in this work to modify the ENRBF model to handle classification problems. Both the original ENRBF model and the newly proposed ENRBF classifier (ENRBFC) can be viewed as special cases of the mixture-of-experts (ME) model discussed in Xu et al. [An alternative model for mixtures of experts, in: Advances in Neural Information Processing Systems, MIT Press, Cambridge, MA, 1995]. Experimental results show the potential of ENRBFC compared with some other related classifiers.
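
For orientation, the relationship between the two regression models referenced in the abstract can be summarized as follows. This is a minimal sketch assuming Gaussian basis functions; the notation (w_j, W_j, c_j, m_j, Sigma_j) is introduced here for illustration and is not taken from the paper itself:

    % Sketch only: notation introduced for illustration; Gaussian kernels assumed.
    \[
    f_{\mathrm{NRBF}}(x) = \frac{\sum_{j=1}^{k} w_j\,\varphi_j(x)}{\sum_{j=1}^{k} \varphi_j(x)},
    \qquad
    f_{\mathrm{ENRBF}}(x) = \frac{\sum_{j=1}^{k} \bigl(W_j^{\top} x + c_j\bigr)\,\varphi_j(x)}{\sum_{j=1}^{k} \varphi_j(x)},
    \]
    \[
    \varphi_j(x) = \exp\!\Bigl(-\tfrac{1}{2}\,(x - m_j)^{\top}\Sigma_j^{-1}(x - m_j)\Bigr).
    \]

In this reading, ENRBF replaces each constant output weight w_j of the NRBF model with a local linear map W_j^{\top} x + c_j, which is also the sense in which both models can be seen as special cases of a mixture-of-experts model with normalized Gaussian gating.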