Bagging and Bootstrap in Data Mining, Machine Learning


  • Bootstrap Aggregation, famously known as bagging, is a simple and powerful ensemble method.
  • An ensemble method is a technique that combines the predictions of multiple machine learning models to make predictions that are more reliable and accurate than those of any individual model. In other words, the predictions produced by bagging tend to be more robust than those of a single model.
  • Why do we use bagging?
    • The main purpose of using the bagging technique is to improve classification accuracy.

How does Bagging work?
For example, suppose we have 1,000 observations and 200 variables. In bagging, we create several models, each trained on a random subset of the observations (and, if desired, a random subset of the variables). For instance, we might grow 300 trees, each built from its own random sample of observations and variables. We then average (or take a majority vote over) the predictions of all 300 trees (models) to obtain the final prediction.
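The following is a minimal sketch of this idea in Python, assuming scikit-learn and NumPy are available; the function names, the 300-tree setting, and the use of decision trees as the base model are illustrative choices, not a fixed recipe.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_trees=300, seed=0):
    """Train n_trees decision trees, each on a bootstrap sample of (X, y)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)             # sample observations with replacement
        trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return trees

def bagging_predict(trees, X):
    """Combine the trees by majority vote (assumes integer class labels)."""
    votes = np.stack([t.predict(X) for t in trees])  # shape: (n_trees, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)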

Model’s Derivation/Estimation (In General)



Figure: Bagging in data mining


  • Accuracy Estimation
  • Sampling with replacement
  • Some observations may not be used at all, while others may be used more than once (see the sketch below)
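A tiny sketch of sampling with replacement, assuming NumPy; the ten observations are illustrative. The observations that are never drawn (the "out-of-bag" observations) can be used for accuracy estimation.

import numpy as np

rng = np.random.default_rng(0)
data = np.arange(10)                                     # 10 original observations
sample = rng.choice(data, size=data.size, replace=True)  # bootstrap sample of the same size
out_of_bag = np.setdiff1d(data, sample)                  # observations that were never drawn
print(sample)        # some observations repeat, others are missing
print(out_of_bag)    # these can be used to estimate accuracy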

Bootstrap – An Abstract View

Figure: Bootstrap in data mining

Bootstrapping – in Detail

Figure: Bootstrapping in detail
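In detail, bootstrapping repeatedly resamples the original data with replacement and recomputes a statistic on each resample to estimate how much that statistic varies. Below is a minimal sketch of that procedure, assuming NumPy; the sample values, the choice of the mean as the statistic, and the 1,000 resamples are illustrative.

import numpy as np

rng = np.random.default_rng(42)
sample = np.array([5.1, 4.9, 6.2, 5.8, 5.5, 6.0, 4.7, 5.3])

boot_means = []
for _ in range(1000):                                              # number of bootstrap resamples
    resample = rng.choice(sample, size=sample.size, replace=True)  # resample with replacement
    boot_means.append(resample.mean())                             # statistic computed on each resample

print(np.mean(boot_means))   # bootstrap estimate of the mean
print(np.std(boot_means))    # bootstrap estimate of its standard error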


Advantages of Bagging
Figure: Bagging

Benefits of Bagging

  • Can also improve the accuracy of models that predict continuous labels (regression)
  • Ideal for parallel processing environments, since each model can be trained independently (see the usage sketch after this list)
  • Often gives significantly greater accuracy than a single classifier
  • Reduces the variance of the individual models
  • Works best when the individual classifiers are diverse
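
A hedged usage sketch with scikit-learn follows; the library, the Iris dataset, and the parameter values are assumptions made for illustration only.

from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
# n_jobs=-1 trains the base models in parallel; n_estimators is the number of bagged models.
bag = BaggingClassifier(n_estimators=100, n_jobs=-1, random_state=0)
print(cross_val_score(bag, X, y, cv=5).mean())   # cross-validated accuracy of the ensemble
# For continuous labels, sklearn.ensemble.BaggingRegressor is used in the same way.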

Differences and similarities between boosting and bagging

Differences between boosting and bagging

  • Only Boosting determines weights for the data, to tip the scales in favor of the most difficult cases.
  • Only Boosting tries to reduce bias.
  • Boosting can increase the over-fitting problem, whereas Bagging may help solve it.
  • Boosting uses a weighted average of the learners, while Bagging uses an equally weighted average.
  • Boosting's focus is to add new models that do well where previous models fail.

Similarities of boosting and bagging

  • Both generate several training data sets by random sampling.
  • Both are good at reducing the variance.
  • Both are good at providing higher stability.
  • Both make the final decision by taking the majority vote (or averaging the N learners).
  • Both are ensemble methods that obtain N learners from a single base learner.
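
As a brief illustration of the comparison above, the sketch below trains a bagged ensemble and a boosted ensemble side by side; scikit-learn, the breast-cancer dataset, and the parameter values are assumptions chosen only for illustration.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
bagging = BaggingClassifier(n_estimators=100, random_state=0)    # independent learners, equally weighted vote
boosting = AdaBoostClassifier(n_estimators=100, random_state=0)  # sequential learners, weighted combination
for name, model in [("Bagging", bagging), ("Boosting", boosting)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())       # cross-validated accuracy of each ensemble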