Seminar Details

Apr 5

3:30 pm

Adaptive Base Class Boost for Multi-Class Classification and Comparisons with Deep Learning

Ping Li


Cornell University - Department of Statistical Science

Classification is a fundamental task in statistics and machine learning. LogitBoost and MART, both developed at Stanford Statistics, are highly influential boosting algorithms for classification. This talk presents ABC-Boost, recent work on multi-class classification in which “ABC” stands for “Adaptive Base Class.” I will describe two implementations of ABC-Boost, named ABC-MART and ABC-LogitBoost. A numerically stable version of LogitBoost, “Robust LogitBoost,” will also be presented.
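The base-class idea can be illustrated with the standard sum-to-zero parameterization of multi-class logistic boosting: with K classes, the class scores are constrained to sum to zero, so fixing one class as the “base” determines its score from the other K-1. The sketch below (a simplified illustration, not the speaker's implementation; all function names are hypothetical) shows that parameterization in NumPy.

```python
import numpy as np

# Hypothetical sketch of the sum-to-zero parameterization used in
# multi-class logistic boosting. With a designated base class b, only
# K-1 scores are learned; the base score is F_b = -sum_{k != b} F_k.

def class_probabilities(F):
    """Multi-class logistic (softmax) probabilities from scores F of shape (n, K)."""
    Z = np.exp(F - F.max(axis=1, keepdims=True))  # subtract row max for stability
    return Z / Z.sum(axis=1, keepdims=True)

def with_base_class(F_rest, base=0):
    """Reconstruct the full (n, K) score matrix from the K-1 learned columns."""
    F_base = -F_rest.sum(axis=1)                  # sum-to-zero constraint
    return np.insert(F_rest, base, F_base, axis=1)

# Example: 2 samples, 3 classes; scores learned for classes 1 and 2 only,
# class 0 serves as the base class.
F_rest = np.array([[1.0, -0.5],
                   [0.2,  0.3]])
F = with_base_class(F_rest, base=0)   # rows of F sum to zero
P = class_probabilities(F)            # rows of P sum to one
```

In the adaptive variant, the base class is not fixed in advance but chosen at each boosting step, which is where the “adaptive” in ABC-Boost comes from.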

Deep Learning is currently one of the hottest topics in machine learning research. We have conducted extensive experiments on many datasets, including the gold-standard datasets used by the Deep Learning community. Our results demonstrate that (1) ABC-MART and ABC-LogitBoost considerably improve MART and (Robust) LogitBoost, respectively, and (2) compared with SVM and the best Deep Learning algorithms, MART, LogitBoost, and ABC-Boost are very competitive, especially on difficult tasks.