{{other}}

'''Boosting''' is a [[machine learning]] [[meta-algorithm]] for performing [[supervised learning]]. Boosting is based on the question posed by Kearns<ref name="Kearns88">Michael Kearns. Thoughts on hypothesis boosting. Unpublished manuscript, 1988.</ref>: can a set of '''weak learners''' create a single '''strong learner'''? A weak learner is defined as a classifier that is only slightly correlated with the true classification (it can label examples better than random guessing). In contrast, a strong learner is a classifier that is arbitrarily well-correlated with the true classification. The affirmative answer to Kearns' question has significant ramifications in [[machine learning]] and [[statistics]].

== Boosting algorithms ==

While boosting is not algorithmically constrained, most boosting algorithms consist of iteratively learning weak classifiers with respect to a distribution and adding them to a final strong classifier. When a weak learner is added, it is typically weighted in a way that is related to its accuracy. After a weak learner is added, the data are reweighted: examples that are misclassified gain weight and examples that are classified correctly lose weight (some boosting algorithms actually decrease the weight of repeatedly misclassified examples, e.g., [[boost by majority]] and [[BrownBoost]]). Thus, future weak learners focus more on the examples that previous weak learners misclassified; a concrete sketch of this loop is given below.

There are many boosting algorithms. The original ones, proposed by [[Robert Schapire]] (a recursive majority gate formulation<ref name="Schapire90">Rob Schapire. The Strength of Weak Learnability. Machine Learning, 5(2):197–227, 1990.</ref>) and [[Yoav Freund]] (boost by majority<ref name="Freund90">Yoav Freund. Boosting a weak learning algorithm by majority. Proceedings of the Third Annual Workshop on Computational Learning Theory, 1990.</ref>), were not adaptive and could not take full advantage of the weak learners. Only algorithms that are provable boosting algorithms in the [[probably approximately correct learning]] formulation can accurately be called boosting algorithms. Other algorithms that are similar in spirit to boosting algorithms are sometimes called "leveraging algorithms", although they are also sometimes incorrectly called boosting algorithms.<ref name="Krause04">Nir Krause and Yoram Singer. Leveraging the margin more carefully. In Proceedings of the International Conference on Machine Learning (ICML), 2004.</ref>

== Examples of boosting algorithms ==

The main variation between many boosting algorithms is their method of weighting training data points and hypotheses. [[AdaBoost]] is very popular and perhaps the most significant historically, as it was the first algorithm that could adapt to the weak learners. However, there are many more recent algorithms such as [[LPBoost]], [[TotalBoost]], [[BrownBoost]], [[MadaBoost]], [[LogitBoost]], and others. Many boosting algorithms fit into the [[AnyBoost]] framework,<ref name="Mason00">Llew Mason, Jonathan Baxter, Peter Bartlett, and Marcus Frean. Boosting algorithms as gradient descent. In S.A. Solla, T.K. Leen, and K.-R. Müller, editors, Advances in Neural Information Processing Systems 12, pages 512–518. MIT Press, 2000.</ref> which shows that boosting performs [[gradient descent]] in [[function space]] using a [[convex]] cost function.
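As an illustrative sketch (not taken from any particular reference), the following Python code implements the reweighting loop described above in the style of [[AdaBoost]], using one-dimensional decision stumps as the weak learners; the toy dataset and all names here are hypothetical choices for exposition:

<source lang="python">
import math

def learn_stump(xs, ys, weights):
    """Weak learner: exhaustively pick the decision stump (threshold and
    polarity) with the lowest weighted error on one-dimensional data."""
    best = None
    for threshold in xs:
        for polarity in (+1, -1):
            error = sum(w for x, y, w in zip(xs, ys, weights)
                        if (polarity if x >= threshold else -polarity) != y)
            if best is None or error < best[0]:
                best = (error, threshold, polarity)
    return best

def boost(xs, ys, rounds=10):
    """AdaBoost-style loop: after each round, misclassified examples gain
    weight and correctly classified examples lose weight, so later stumps
    focus on the hard examples."""
    n = len(xs)
    weights = [1.0 / n] * n               # start from a uniform distribution
    ensemble = []                         # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        error, threshold, polarity = learn_stump(xs, ys, weights)
        if error == 0.0 or error >= 0.5:  # stop: stump is perfect, or it is
            break                         # no better than random guessing
        alpha = 0.5 * math.log((1.0 - error) / error)  # this stump's vote
        ensemble.append((alpha, threshold, polarity))
        for i in range(n):                # reweight the training examples
            pred = polarity if xs[i] >= threshold else -polarity
            weights[i] *= math.exp(-alpha * ys[i] * pred)
        total = sum(weights)
        weights = [w / total for w in weights]  # renormalise to a distribution
    return ensemble

def predict(ensemble, x):
    """Strong classifier: the sign of the weighted vote of all stumps."""
    vote = sum(alpha * (polarity if x >= threshold else -polarity)
               for alpha, threshold, polarity in ensemble)
    return 1 if vote >= 0 else -1

# Toy dataset that no single stump can classify perfectly.
xs = [0, 1, 2, 3, 4, 5]
ys = [1, 1, -1, -1, 1, 1]
ensemble = boost(xs, ys)
print([predict(ensemble, x) for x in xs])   # [1, 1, -1, -1, 1, 1]
</source>

On this toy dataset no single stump separates the classes, but after three rounds the weighted vote classifies every point correctly, illustrating how weak learners can combine into a strong one.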
== See also ==

{|
|
* [[AdaBoost]]
* [[Alternating decision tree]]
* [[Bootstrap aggregating]]
* [[BrownBoost]]
* [[CoBoosting]]
|
* [[LPBoost]]
* [[logistic regression]]
* [[Principle of maximum entropy|maximum entropy methods]]
* [[neural network]]s
* [[support vector machine]]s
* [[margin classifier]]s
|}

== References ==

===Footnotes===
<div class="references-small">
<references />
</div>

===General references===
* Yoav Freund and Robert E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119–139, 1997. http://www.cse.ucsd.edu/~yfreund/papers/adaboost.pdf
* Robert E. Schapire and Yoram Singer. Improved Boosting Algorithms Using Confidence-Rated Predictions. Machine Learning, 37(3):297–336, 1999. http://citeseer.ist.psu.edu/schapire99improved.html

== External links ==
* [http://citeseer.ist.psu.edu/489339.html The boosting approach to machine learning: An overview]
* [http://citeseer.ist.psu.edu/schapire90strength.html The strength of weak learnability]
* [http://www.cs.princeton.edu/~schapire/boost.html An up-to-date collection of papers on boosting]

[[Category:Classification algorithms]]
[[Category:Ensemble learning]]

[[de:Boosting]]
[[fr:Boosting]]
[[ja:ブースティング]]