Cornell University - Department of Statistical Science
Classification is a fundamental task in statistics and machine learning. Developed at Stanford Statistics, both LogitBoost and MART are highly influential boosting algorithms for classification. This talk is about ABC-Boost, a very recent work on multi-class classification, where "ABC" stands for "Adaptive Base Class." I will talk about two implementations of ABC-Boost, named ABC-MART and ABC-LogitBoost. A numerically stable version of LogitBoost, named "Robust LogitBoost," will also be presented.
Deep Learning is currently one of the hottest topics in machine learning research. We have conducted extensive experiments on many datasets (including the gold-standard datasets used by the Deep Learning community). Our results demonstrate that: (1) ABC-MART and ABC-LogitBoost considerably improve MART and (Robust) LogitBoost, respectively; and (2) compared with SVM and the best Deep Learning algorithms, MART/LogitBoost/ABC-Boost are very competitive, especially on difficult tasks.