Bootstrap aggregating

In machine learning (ML), bootstrap aggregating, also called bagging (a portmanteau of "bootstrap aggregating"), is an ensemble metaheuristic aimed primarily at reducing variance (as opposed to bias). It can also improve the stability and accuracy of ML classification and regression algorithms, and it can reduce overfitting. Although it is usually applied to decision tree methods, it can be used with any type of method. Bagging is a special case of the ensemble averaging approach.
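A minimal sketch of the idea, assuming scikit-learn-style base learners with fit/predict (a decision tree regressor is used here as the base method; the function names bagged_fit and bagged_predict are illustrative, not part of any standard API): each learner is trained on a bootstrap sample drawn with replacement from the training set, and predictions are aggregated by averaging.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_fit(X, y, n_estimators=50, random_state=0):
    """Train n_estimators base learners, each on a bootstrap sample of (X, y)."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        # Draw n indices with replacement: the bootstrap sample.
        idx = rng.integers(0, n, size=n)
        models.append(DecisionTreeRegressor().fit(X[idx], y[idx]))
    return models

def bagged_predict(models, X):
    """Aggregate by averaging the base learners' predictions (regression)."""
    return np.mean([m.predict(X) for m in models], axis=0)
```

For classification, the aggregation step is typically a majority vote over the base learners' predicted labels rather than an average.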

