Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance

Gradient Boosting Decision Tree (GBDT) algorithms have proven to be among the best algorithms in machine learning. XGBoost, the most popular GBDT algorithm, has won many competitions on websites such as Kaggle. However, XGBoost is not the only GBDT algorithm with state-of-the-art performance: others, such as LightGBM and CatBoost, offer advantages over XGBoost and are sometimes even more potent. This paper compares the performance of the CPU implementations of the top three gradient boosting algorithms. We begin by explaining how the three algorithms work and the hyperparameter similarities between them. We then evaluate them against four performance criteria: accuracy, speed, reliability, and ease of use, testing all three algorithms on five classification and regression problems. Our findings show that LightGBM performs best overall, with a balanced combination of accuracy, speed, reliability, and ease of use; XGBoost with the histogram method comes second; and CatBoost comes last, with slow and inconsistent performance.
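The comparison protocol the abstract describes (measure accuracy and training speed for each library on shared train/test splits) can be sketched as a small benchmarking harness. This is an illustrative sketch, not the authors' code: the `benchmark` helper and the `MajorityClass` stand-in are invented here so the snippet runs without any GBDT library installed; a real run would pass `xgboost.XGBClassifier`, `lightgbm.LGBMClassifier`, and `catboost.CatBoostClassifier` instances instead, since all three expose the same `fit`/`predict` interface.

```python
import time

def benchmark(models, X_train, y_train, X_test, y_test):
    """Time each model's training and report its test accuracy."""
    results = {}
    for name, model in models.items():
        start = time.perf_counter()
        model.fit(X_train, y_train)
        train_time = time.perf_counter() - start
        preds = model.predict(X_test)
        acc = sum(p == y for p, y in zip(preds, y_test)) / len(y_test)
        results[name] = {"accuracy": acc, "train_time_s": train_time}
    return results

# Trivial majority-class stand-in so the harness runs without
# xgboost/lightgbm/catboost; those would be the real entries.
class MajorityClass:
    def fit(self, X, y):
        self.label = max(set(y), key=y.count)
    def predict(self, X):
        return [self.label] * len(X)

X_train, y_train = [[0], [1], [2], [3]], [0, 0, 1, 0]
X_test, y_test = [[4], [5]], [0, 1]
res = benchmark({"baseline": MajorityClass()},
                X_train, y_train, X_test, y_test)
print(res["baseline"]["accuracy"])  # 0.5
```

In practice the paper's four criteria would also require repeated runs (reliability) and a qualitative review of each library's defaults and documentation (ease of use), which a timing harness alone cannot capture.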


Bibliographic Details
Main Authors: Haithm Haithm, ALSHARI, Abdulrazak Yahya, Saleh, Alper, ODABAŞ
Format: Article
Language: English
Published: Igdir University 2021
Subjects:
Online Access:http://ir.unimas.my/id/eprint/35224/2/abstract%20comparison.pdf
http://ir.unimas.my/id/eprint/35224/
https://dergipark.org.tr/tr/pub/erciyesfen/issue/62093/880315
id my.unimas.ir.35224
record_format eprints
spelling my.unimas.ir.352242023-08-22T02:10:24Z http://ir.unimas.my/id/eprint/35224/ Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance Haithm Haithm, ALSHARI Abdulrazak Yahya, Saleh Alper, ODABAŞ QA75 Electronic computers. Computer science Gradient Boosting Decision Tree (GBDT) algorithms have proven to be among the best algorithms in machine learning. XGBoost, the most popular GBDT algorithm, has won many competitions on websites such as Kaggle. However, XGBoost is not the only GBDT algorithm with state-of-the-art performance: others, such as LightGBM and CatBoost, offer advantages over XGBoost and are sometimes even more potent. This paper compares the performance of the CPU implementations of the top three gradient boosting algorithms. We begin by explaining how the three algorithms work and the hyperparameter similarities between them. We then evaluate them against four performance criteria: accuracy, speed, reliability, and ease of use, testing all three algorithms on five classification and regression problems. Our findings show that LightGBM performs best overall, with a balanced combination of accuracy, speed, reliability, and ease of use; XGBoost with the histogram method comes second; and CatBoost comes last, with slow and inconsistent performance. Igdir University 2021-04-28 Article PeerReviewed text en http://ir.unimas.my/id/eprint/35224/2/abstract%20comparison.pdf Haithm Haithm, ALSHARI and Abdulrazak Yahya, Saleh and Alper, ODABAŞ (2021) Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance. Journal of the Institute of Science and Technology, 37 (1). pp. 157-168. ISSN 2467-9240 https://dergipark.org.tr/tr/pub/erciyesfen/issue/62093/880315
institution Universiti Malaysia Sarawak
building Centre for Academic Information Services (CAIS)
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Malaysia Sarawak
content_source UNIMAS Institutional Repository
url_provider http://ir.unimas.my/
language English
topic QA75 Electronic computers. Computer science
spellingShingle QA75 Electronic computers. Computer science
Haithm Haithm, ALSHARI
Abdulrazak Yahya, Saleh
Alper, ODABAŞ
Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance
description Gradient Boosting Decision Tree (GBDT) algorithms have proven to be among the best algorithms in machine learning. XGBoost, the most popular GBDT algorithm, has won many competitions on websites such as Kaggle. However, XGBoost is not the only GBDT algorithm with state-of-the-art performance: others, such as LightGBM and CatBoost, offer advantages over XGBoost and are sometimes even more potent. This paper compares the performance of the CPU implementations of the top three gradient boosting algorithms. We begin by explaining how the three algorithms work and the hyperparameter similarities between them. We then evaluate them against four performance criteria: accuracy, speed, reliability, and ease of use, testing all three algorithms on five classification and regression problems. Our findings show that LightGBM performs best overall, with a balanced combination of accuracy, speed, reliability, and ease of use; XGBoost with the histogram method comes second; and CatBoost comes last, with slow and inconsistent performance.
format Article
author Haithm Haithm, ALSHARI
Abdulrazak Yahya, Saleh
Alper, ODABAŞ
author_facet Haithm Haithm, ALSHARI
Abdulrazak Yahya, Saleh
Alper, ODABAŞ
author_sort Haithm Haithm, ALSHARI
title Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance
title_short Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance
title_full Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance
title_fullStr Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance
title_full_unstemmed Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance
title_sort comparison of gradient boosting decision tree algorithms for cpu performance
publisher Igdir University
publishDate 2021
url http://ir.unimas.my/id/eprint/35224/2/abstract%20comparison.pdf
http://ir.unimas.my/id/eprint/35224/
https://dergipark.org.tr/tr/pub/erciyesfen/issue/62093/880315