An iterative form of bagging in which the bootstrap sampling at each subsequent iteration (see iterative algorithm) is weighted so that observations poorly predicted at the previous iteration are more likely to be included. The observations not included in a given bootstrap sample are referred to as the out-of-bag sample and can be used to obtain an estimate of prediction error without a separate test set. The most widely used boosting algorithm is AdaBoost. See also machine learning.
http://cseweb.ucsd.edu/~yfreund/adaboost/ Applet for AdaBoost.
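As a rough illustration of the resampling scheme described above, the following Python sketch implements an AdaBoost-style booster that draws a weighted bootstrap sample at each round, using decision stumps from scikit-learn as the base learner. The function name boost_by_resampling, the toy two-cluster data, and all parameter values are illustrative assumptions, not part of the entry.

# A minimal sketch of boosting by resampling, assuming labels in {-1, +1};
# names, toy data, and parameter values are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def boost_by_resampling(X, y, n_rounds=20, seed=None):
    """AdaBoost-style boosting: each round draws a weighted bootstrap sample
    that favours observations the earlier rounds predicted poorly."""
    rng = np.random.default_rng(seed)
    n = len(y)
    weights = np.full(n, 1.0 / n)          # start with uniform inclusion probabilities
    stumps, alphas = [], []

    for _ in range(n_rounds):
        # Weighted bootstrap sample: poorly predicted points are more likely to be drawn.
        idx = rng.choice(n, size=n, replace=True, p=weights)
        stump = DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])

        pred = stump.predict(X)
        err = np.sum(weights[pred != y])   # weighted training error on the full sample
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)

        # Increase the weight of misclassified observations for the next round.
        weights *= np.exp(alpha * (pred != y))
        weights /= weights.sum()

        stumps.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        # Final prediction is the sign of the weighted vote of all stumps.
        votes = sum(a * s.predict(X_new) for a, s in zip(alphas, stumps))
        return np.sign(votes)

    return predict


# Toy usage: two noisy Gaussian clusters labelled -1 and +1.
data_rng = np.random.default_rng(0)
X = np.vstack([data_rng.normal(-1, 1, (100, 2)), data_rng.normal(1, 1, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)
predict = boost_by_resampling(X, y, n_rounds=25, seed=1)
print("training accuracy:", np.mean(predict(X) == y))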