The Impact of Normalization Techniques on the Performance of Backpropagation Networks

Bibliographic Details
Main Author: Norlida, Hassan
Format: Thesis
Language: English
Published: 2004
Subjects:
Online Access:http://etd.uum.edu.my/1394/1/NORLIDA_BT._HASSAN.pdf
http://etd.uum.edu.my/1394/2/1.NORLIDA_BT._HASSAN.pdf
http://etd.uum.edu.my/1394/
Description
Summary: Neural networks (NN) are computational models with the capacity to learn and generalize, and the most widely used are multi-layer perceptrons (MLP). Building successful NN applications depends on several aspects, such as acquiring and modeling the data and selecting the appropriate model. The data needs to be transformed into a form that is acceptable as input to the MLP network, and the transformed data often determines the efficiency, and possibly the accuracy, of the results from the network. This study explored several normalization techniques using backpropagation learning. The normalization techniques used in the experiments are Min-Max, Z-Score, Decimal Scaling, Sigmoidal, and Softmax (Logistic). To explore the impact of normalization techniques on NN performance, medical datasets with a Boolean target were preprocessed, trained, validated and tested using the backpropagation learning algorithm. The criterion for choosing the best model is the highest percentage of correct predictions. Preprocessing was one of the three phases of building the NN models. The results of each normalization technique are presented and compared with a statistical approach. The results reveal that different normalization techniques produce different NN performance. The experiments also indicate that, for all five normalization techniques, logistic regression achieves a lower percentage of correct predictions than the NN. The findings not only contribute towards enhancing the performance of backpropagation networks, but also assist in choosing the normalization technique to apply to a particular dataset.
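The record itself contains no code; as a rough illustration only, the five normalization techniques named in the summary can be sketched in Python/NumPy as follows. The formulas follow their common textbook definitions, and the sample "ages" column is a made-up stand-in for an attribute from a medical dataset, not data from the thesis.

import numpy as np

def min_max(x, new_min=0.0, new_max=1.0):
    # Min-Max: linearly rescale values into [new_min, new_max].
    return (x - x.min()) / (x.max() - x.min()) * (new_max - new_min) + new_min

def z_score(x):
    # Z-Score: centre on the mean and scale by the standard deviation.
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    # Decimal Scaling: divide by a power of ten chosen so the largest
    # absolute value falls into [-1, 1].
    j = int(np.ceil(np.log10(np.abs(x).max())))
    return x / (10 ** j)

def sigmoidal(x):
    # Sigmoidal: squash z-scored values into (-1, 1).
    z = z_score(x)
    return (1 - np.exp(-z)) / (1 + np.exp(-z))

def softmax_logistic(x):
    # Softmax (logistic): squash z-scored values into (0, 1).
    return 1 / (1 + np.exp(-z_score(x)))

# Illustrative attribute column (hypothetical values, not thesis data).
ages = np.array([23.0, 45.0, 67.0, 31.0, 52.0])
for f in (min_max, z_score, decimal_scaling, sigmoidal, softmax_logistic):
    print(f.__name__, f(ages))

Which of these transformations yields the best backpropagation performance is exactly what the thesis investigates empirically, so the sketch only shows the transformations themselves, not a recommendation.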