Gradient Boosted Decision Trees


A while ago I posted about how “A Bunch of Weak Classifiers Working Together Can Outperform a Single Strong Classifier.” Such a collection of classifiers/regressors is called an ensemble. As mentioned in that post, bagging and boosting are two techniques used to create an ensemble of classifiers/regressors. In this post, I will explain how to build an ensemble of decision trees using gradient boosting. Before going into the details, however, let us first understand boosting, gradient boosting, and why decision trees, both classification and regression trees, are the most suitable candidates for building an ensemble.
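As a quick preview, here is a minimal sketch of a gradient boosted ensemble of shallow decision trees, using scikit-learn's `GradientBoostingClassifier` on a synthetic dataset (this illustration is my own assumption here; the full post linked below may use a different library or a from-scratch implementation):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# A small synthetic binary classification problem for illustration.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 100 trees is a weak learner (max_depth=2); boosting fits
# every new tree to the gradient of the loss of the ensemble built so far,
# and learning_rate scales each tree's contribution.
model = GradientBoostingClassifier(
    n_estimators=100, max_depth=2, learning_rate=0.1, random_state=0
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

Individually, depth-2 trees are weak classifiers, yet the boosted ensemble of them typically performs well on this task.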

Please continue reading at

https://www.iksinc.tech/2024/02/gradient-boosted-regression-and.html