Title
Ensemble of fast learning stochastic gradient boosting
Document Type
Article
Publication Date
1-1-2022
Abstract
Boosting is one of the most popular and powerful learning algorithms. However, due to the sequential nature of its model fitting, the computational time of boosting algorithms can be prohibitive for big data analysis. In this paper, we propose a parallel framework for the boosting algorithm, called Ensemble of Fast Learning Stochastic Gradient Boosting (EFLSGB). The proposed EFLSGB is well suited for parallel execution and can therefore substantially reduce the computational time. Analysis of simulated and real datasets demonstrates that EFLSGB achieves highly competitive prediction accuracy in comparison with gradient tree boosting.
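The abstract does not spell out the algorithm, but its core idea, fitting several fast stochastic gradient boosting learners independently (hence in parallel) and combining them into an ensemble, can be sketched as follows. This is a minimal illustration under assumptions, not the paper's implementation: the member count, tree count, learning rate, and subsample fraction are illustrative choices, and scikit-learn's GradientBoostingRegressor stands in for a generic stochastic gradient boosting fitter.

# Minimal sketch of the EFLSGB idea from the abstract: fit several fast
# stochastic gradient boosting members in parallel and average their
# predictions. All hyperparameter values below are illustrative assumptions.
import numpy as np
from joblib import Parallel, delayed
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

def fit_member(X, y, seed):
    # Each member is a stochastic gradient boosting model: subsample < 1
    # injects row-sampling randomness, and a short boosting chain keeps each
    # member fast to fit.
    model = GradientBoostingRegressor(
        n_estimators=100, learning_rate=0.1, subsample=0.5, random_state=seed
    )
    return model.fit(X, y)

X, y = make_friedman1(n_samples=2000, noise=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n_members = 8
# Members share no state, so they can be fit in parallel across workers.
members = Parallel(n_jobs=-1)(
    delayed(fit_member)(X_tr, y_tr, seed) for seed in range(n_members)
)

# The ensemble prediction averages the member predictions.
y_hat = np.mean([m.predict(X_te) for m in members], axis=0)
print("test MSE:", np.mean((y_te - y_hat) ** 2))

Because the members are independent, wall-clock fitting time scales down with the number of workers, which is the computational advantage the abstract claims over fitting one long boosting chain sequentially.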
Publication Source (Journal or Book title)
Communications in Statistics: Simulation and Computation
First Page
40
Last Page
52
Recommended Citation
Li, B., Yu, Q., & Peng, L. (2022). Ensemble of fast learning stochastic gradient boosting. Communications in Statistics: Simulation and Computation, 51 (1), 40-52. https://doi.org/10.1080/03610918.2019.1645170