Irrelevant feature and rule removal for structural associative classification

Bibliographic Details
Main Authors: Mohd Shaharanee, Izwan Nizal, Jamil, Jastini
Format: Article
Language: English
Published: Universiti Utara Malaysia 2015
Subjects:
Online Access:http://repo.uum.edu.my/14313/1/95-110.pdf
http://repo.uum.edu.my/14313/
http://jict.uum.edu.my
Description
Summary: In the classification task, the presence of irrelevant features can significantly degrade the performance of classification algorithms in terms of additional processing time, more complex models, and the likelihood that the models have poor generalization power due to the overfitting problem. Practical applications of association rule mining often suffer from an overwhelming number of generated rules, many of which are not interesting or not useful for the application in question. Removing rules composed of irrelevant features can significantly improve overall performance. In this paper, we explore and compare the use of a feature selection measure to filter out unnecessary and irrelevant features/attributes prior to association rule generation. The experiments are performed on a number of real-world datasets that represent diverse characteristics of data items. Empirical results confirm that by applying feature subset selection prior to association rule generation, a large number of rules with irrelevant features can be eliminated. More importantly, the results reveal that removing rules that contain irrelevant features improves the accuracy rate and the capability to retain the rule coverage rate of structural associative classification.
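
The general workflow the abstract outlines (score each feature's relevance to the class label, discard low-scoring features, then mine association rules only over the retained features) can be illustrated with a minimal sketch. This is not the authors' implementation: the toy dataset, the chi-square relevance score, the k=2 feature budget, and the support/confidence thresholds are all illustrative assumptions, and the sketch relies on pandas, scikit-learn, and mlxtend.

# Minimal sketch: feature subset selection before association rule generation.
# Assumptions: a toy one-hot dataset, a chi-square relevance score, k=2 retained
# features, and illustrative support/confidence thresholds.
import pandas as pd
from sklearn.feature_selection import SelectKBest, chi2
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical one-hot encoded records with a binary class label.
data = pd.DataFrame({
    "feat_a": [1, 0, 1, 1, 0, 1],
    "feat_b": [0, 0, 1, 0, 0, 1],
    "feat_c": [1, 1, 1, 1, 1, 1],   # near-constant, carries no class information
    "class":  [1, 0, 1, 1, 0, 1],
})
X, y = data.drop(columns="class"), data["class"]

# Step 1: keep only the features most relevant to the class label.
selector = SelectKBest(chi2, k=2).fit(X, y)
relevant = X.columns[selector.get_support()]          # feat_c is discarded here

# Step 2: mine association rules over the retained features only,
# so no rule containing an irrelevant feature is ever generated.
itemsets = apriori(X[relevant].astype(bool), min_support=0.3, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.6)
print(rules[["antecedents", "consequents", "support", "confidence"]])

In this sketch the near-constant feat_c receives the lowest chi-square score and is dropped before mining, so rules involving it are never generated, which mirrors the rule-reduction effect the abstract reports.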