Copyright © 2025. Authors retain the copyright of this article. This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
@article{167156,
  author   = {D. Malarvizhi and Kiruthika S},
  title    = {A Comparative Study of Boosting Algorithms: Concepts, Algorithms, Applications, and Prospects},
  journal  = {International Journal of Innovative Research in Technology},
  year     = {2025},
  volume   = {11},
  number   = {3},
  pages    = {1816--1825},
  issn     = {2349-6002},
  url      = {https://ijirt.org/article?manuscript=167156},
  abstract = {Recently, a number of intriguing ideas that prioritize accuracy and speed---such as XGBoost, LightGBM, and CatBoost---have been added to the family of gradient boosting algorithms. XGBoost, a scalable ensemble method, has proven to be a dependable and effective solution for machine learning problems. LightGBM is an accurate model that uses selective sampling of high-gradient instances to deliver very fast training. CatBoost alters the gradient computation to prevent prediction shift and thereby increase the model's accuracy. This paper offers a practical examination of the performance of these gradient boosting variants in terms of training speed, generalization, and hyper-parameter configuration. Furthermore, a thorough comparison of gradient boosting, random forests, LightGBM, XGBoost, and CatBoost has been carried out using both the models' default settings and carefully tuned configurations. Despite a few discrepancies, the findings show that CatBoost outperforms the other algorithms in both AUC and generalization accuracy across the datasets under study. LightGBM is the fastest of the methods, though not the most accurate. XGBoost comes in second in both training speed and accuracy. Finally, two new techniques are proposed and used to thoroughly examine the impact of hyper-parameter tuning in XGBoost, LightGBM, and CatBoost.},
  keywords = {XGBoost, LightGBM, CatBoost, AdaBoost, Gradient boosting},
  month    = {January},
}
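The abstract surveys gradient boosting, the additive technique underlying XGBoost, LightGBM, and CatBoost: each round fits a weak learner to the current residuals and adds its scaled predictions to the ensemble. A minimal pure-Python sketch using decision stumps as the weak learner (illustrative only; this is not the paper's code, and the function names are ours):

```python
def fit_stump(X, residuals):
    """Find the single-feature threshold split minimizing squared error."""
    best = None
    n_features = len(X[0])
    for j in range(n_features):
        for t in sorted(set(row[j] for row in X)):
            left = [r for row, r in zip(X, residuals) if row[j] <= t]
            right = [r for row, r in zip(X, residuals) if row[j] > t]
            if not left or not right:
                continue  # degenerate split, skip
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lm) ** 2 for r in left)
                   + sum((r - rm) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    return best[1:]  # (feature, threshold, left_value, right_value)

def boost(X, y, n_rounds=20, lr=0.5):
    """Each round fits a stump to the residuals of the ensemble so far."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        j, t, lv, rv = fit_stump(X, residuals)
        stumps.append((j, t, lv, rv))
        pred = [p + lr * (lv if row[j] <= t else rv)
                for row, p in zip(X, pred)]
    return stumps

def predict(stumps, row, lr=0.5):
    """Sum the scaled contributions of all stumps."""
    return sum(lr * (lv if row[j] <= t else rv) for j, t, lv, rv in stumps)
```

The libraries compared in the paper refine this basic loop: XGBoost adds regularized second-order splits, LightGBM samples high-gradient instances to speed up the stump search, and CatBoost changes how the residuals (gradients) are computed to avoid prediction shift.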