Solving two-class classification problem using AdaBoost

Bibliographic Details
Main Authors: Chan, Lih Heng, Salleh, Hussain
Format: Conference or Workshop Item
Language: English
Published: 2009
Subjects:
Online Access:http://eprints.utm.my/id/eprint/12457/1/LihHengChan2008_SolvingTwoClassClassificationProblem.pdf
http://eprints.utm.my/id/eprint/12457/
Description
Summary: This paper presents a learning algorithm based on AdaBoost for solving a two-class classification problem. The concept of boosting is to combine several weak learners to form a highly accurate strong classifier. AdaBoost is fast and simple because it focuses on finding weak learning algorithms that only need to be better than random guessing, instead of designing an algorithm that learns deliberately over the entire space. The algorithms were evaluated on the Breast Cancer Wisconsin dataset, which consists of 699 patterns with 9 attributes. The task is to assist medical practitioners in breast cancer diagnosis, so the class output is the diagnosis prediction: either benign or malignant. For comparison, a back-propagation neural network (BPNN) was developed and applied to the same database. Experimental results show that AdaBoost outperforms the BPNN under the same experimental conditions.
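
The sketch below illustrates the idea described in the summary; it is not the authors' implementation. It uses scikit-learn's AdaBoostClassifier, whose default weak learner is a depth-1 decision stump (only required to beat random guessing), with an MLPClassifier standing in for the paper's BPNN baseline. Note that scikit-learn ships the diagnostic variant of the Breast Cancer Wisconsin data (569 samples, 30 features), not the original 699-pattern, 9-attribute set used in the paper, so any resulting accuracies are illustrative only.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Binary labels: malignant vs. benign
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# AdaBoost: an ensemble of depth-1 decision stumps (the default weak learner),
# with training patterns reweighted at each round so later stumps focus on
# previously misclassified examples.
boost = AdaBoostClassifier(n_estimators=50)
boost.fit(X_train, y_train)

# Back-propagation neural network baseline (a stand-in for the paper's BPNN;
# the hidden-layer size here is an arbitrary choice, not taken from the paper).
bpnn = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
bpnn.fit(X_train, y_train)

print("AdaBoost test accuracy:", boost.score(X_test, y_test))
print("BPNN test accuracy:", bpnn.score(X_test, y_test))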