Towards a better feature subset selection approach

Bibliographic Details
Main Author: Shiba, Omar A. A.
Format: Conference or Workshop Item
Language: English
Published: 2010
Subjects:
Online Access:http://repo.uum.edu.my/11237/1/PG629_632.pdf
http://repo.uum.edu.my/11237/
http://www.kmice.uum.edu.my
Description
Summary: The selection of an optimal feature subset for classification has become an important issue in the data mining field. We propose a feature selection scheme based on the slicing technique originally developed for programming languages. The proposed approach is called the Case Slicing Technique (CST). Slicing means that we are interested in automatically obtaining the portion of a case's features that is responsible for specific parts of that case's solution. We show that the goal should be to reduce the number of features by removing irrelevant ones. Choosing a subset of the features may increase accuracy and reduce the complexity of the acquired knowledge. Our experimental results indicate that CST performs better as a feature subset selection method than the approaches most commonly used for this task: RELIEF with a base learning algorithm (C4.5), RELIEF with k-Nearest Neighbour (k-NN), RELIEF with the ID3 decision tree induction algorithm, and RELIEF with Naïve Bayes (NB).
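The record does not give CST's algorithmic details, but RELIEF, the baseline named in the abstract, is a standard feature-weighting method and conveys the same goal of ranking and discarding irrelevant features. The sketch below is a minimal single-neighbour RELIEF in Python, assuming numeric features and an L1 distance; the function name `relief_weights` and all parameters are illustrative, not from the paper.

```python
import numpy as np

def relief_weights(X, y, n_iter=200, seed=0):
    """Minimal RELIEF sketch: for randomly drawn samples, reward each
    feature by its distance to the nearest miss (closest sample of a
    different class) and penalise it by its distance to the nearest hit
    (closest sample of the same class). Irrelevant features score near 0."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        xi, yi = X[i], y[i]
        # L1 distances from the drawn sample to all others (self excluded).
        dists = np.abs(X - xi).sum(axis=1)
        dists[i] = np.inf
        same = (y == yi)
        hit = np.argmin(np.where(same, dists, np.inf))
        miss = np.argmin(np.where(~same, dists, np.inf))
        # Per-feature update: miss distance up, hit distance down.
        w += np.abs(xi - X[miss]) - np.abs(xi - X[hit])
    return w / n_iter

# Illustrative use on synthetic data: feature 0 defines the class,
# feature 1 is pure noise, so feature 0 should receive the larger weight.
rng = np.random.default_rng(1)
X = rng.random((200, 2))
y = (X[:, 0] > 0.5).astype(int)
w = relief_weights(X, y)
```

Selecting the top-ranked features (e.g. keeping those with the highest weights) then yields the reduced subset that the abstract argues can improve accuracy while simplifying the acquired knowledge.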