Step 1: Define Bias and Variance and talk about the “trade-off”
- Bias is error introduced by erroneous assumptions in the learning algorithm. High bias can cause the algorithm to miss the relevant relations between features and target outputs (underfitting).
- Variance is error from sensitivity to small fluctuations in the training set. High variance can cause the algorithm to model the random noise in the training data rather than the intended outputs (overfitting).
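One way to back up Step 1 is the standard decomposition of expected squared error at a point $x$ (with $f$ the true function, $\hat{f}$ the learned model, and $\sigma^2$ the irreducible noise):

$$
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
= \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}\Big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\Big]}_{\text{variance}}
+ \sigma^2
$$

The trade-off is that making a model more flexible usually shrinks the first term while growing the second.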
Step 2: Talk about Bagging and Boosting
- Bagging trains many models independently on bootstrap samples of the training data and averages (or votes on) their predictions.
- Boosting trains models sequentially, with each new model focusing on the examples the previous ones got wrong, and combines them into a weighted ensemble.
Step 3: Link them together like below:
Boosting keeps fitting new models to the remaining errors, so it mainly drives down bias; pushed too far it starts fitting the noise, which is why boosting can increase variance and over-fit. Bagging is the mirror image: by averaging many models trained on bootstrap samples, it mainly reduces variance, as the sketch below illustrates.
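To make Step 3 concrete, here is a minimal sketch (assuming scikit-learn is available; the dataset and hyperparameters are arbitrary choices for illustration) comparing a single deep tree, a bagged ensemble of trees, and a boosted ensemble of shallow trees by cross-validated error:

```python
# Minimal sketch: compare a high-variance model, a bagged ensemble, and a
# boosted ensemble on synthetic regression data (assumes scikit-learn).
from sklearn.datasets import make_friedman1
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)

models = {
    # A fully grown tree: low bias, high variance (prone to overfitting).
    "single deep tree": DecisionTreeRegressor(random_state=0),
    # Bagging averages many trees fit on bootstrap samples -> variance goes down.
    # (BaggingRegressor uses a decision tree as its base estimator by default.)
    "bagged trees": BaggingRegressor(n_estimators=100, random_state=0),
    # Boosting adds shallow trees sequentially to fix residual errors -> bias goes down.
    "boosted shallow trees": GradientBoostingRegressor(
        n_estimators=100, max_depth=2, random_state=0
    ),
}

for name, model in models.items():
    scores = -cross_val_score(
        model, X, y, cv=5, scoring="neg_mean_squared_error"
    )
    print(f"{name:>21}: CV MSE = {scores.mean():.2f} (+/- {scores.std():.2f})")
```

On a run like this you would typically see the bagged trees beat the single tree mostly through variance reduction, while the boosted shallow trees do well by reducing the bias of each weak learner.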