BOTH BEING TREE-BASED ALGORITHMS, HOW IS RANDOM FOREST DIFFERENT FROM THE GRADIENT BOOSTING ALGORITHM (GBM)?

MockInterview Staff answered 5 years ago

Answer from Analytics Vidhya:
The fundamental difference is that random forest uses the bagging technique to make predictions, whereas GBM uses the boosting technique.
In the bagging technique, a data set is divided into n samples using randomized (bootstrap) sampling. Then, a separate model is built on each sample using the same learning algorithm. Finally, the resultant predictions are combined using voting or averaging. Bagging is done in parallel. In boosting, after the first round of predictions, the algorithm weighs misclassified predictions higher, so that they can be corrected in the succeeding round. This sequential process of giving higher weights to misclassified predictions continues until a stopping criterion is reached.
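As a minimal sketch of this contrast (using scikit-learn with illustrative parameters and a synthetic dataset, not part of the original answer): the random forest grows its trees independently on bootstrap samples, so they can be trained in parallel and their votes averaged, while the GBM grows its trees one after another, each new tree correcting the errors left by the ensemble so far.

```python
# Sketch: bagging (Random Forest) vs. boosting (GBM) with scikit-learn.
# Dataset and hyperparameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bagging: each tree is grown independently on a bootstrap sample,
# so the trees can be built in parallel (n_jobs=-1) and their votes combined.
rf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=42)
rf.fit(X_train, y_train)

# Boosting: trees are grown sequentially; each new tree focuses on the
# errors left by the trees built before it.
gbm = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1, random_state=42)
gbm.fit(X_train, y_train)

print("Random Forest test accuracy:", rf.score(X_test, y_test))
print("GBM test accuracy:          ", gbm.score(X_test, y_test))
```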
Random forest improves model accuracy mainly by reducing variance. The trees grown are decorrelated (via bootstrap sampling and random feature selection at each split) to maximize the decrease in variance. On the other hand, GBM improves accuracy by reducing both bias and variance in a model.
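One way to see this point in practice (a hedged sketch with assumed, illustrative settings): averaging many deep trees in a random forest tames the variance of a single deep tree, while boosting many shallow trees in a GBM reduces the bias of a single shallow tree.

```python
# Sketch: variance reduction (forest of deep trees) vs. bias reduction
# (boosted shallow trees). All parameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

models = {
    "single deep tree (high variance)": DecisionTreeClassifier(random_state=0),
    "random forest of deep trees":      RandomForestClassifier(n_estimators=300, random_state=0),
    "single shallow tree (high bias)":  DecisionTreeClassifier(max_depth=2, random_state=0),
    "GBM of shallow trees":             GradientBoostingClassifier(n_estimators=300, max_depth=2, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:35s} mean CV accuracy = {scores.mean():.3f}")
```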
Know more: Tree-based modeling
