Faculty Host: Carlos Guestrin
Stat Liaison: Emily Fox
Society is witnessing remarkable technological and scientific advances as numerous disciplines adopt more sophisticated statistical and computational methodologies. Along with this progress comes an increasing need for scalable algorithms with solid theoretical foundations; the hope is that algorithms that are efficient from both the statistical and the computational perspective can further facilitate breakthroughs. This talk will highlight some recent progress and some future challenges at the increasingly important intersection of computer science and statistics.
We will examine a central question: how should we trade off computational runtime against statistical accuracy? This issue is often referred to as the tradeoffs in large-scale learning. We will also examine one of the core estimation challenges we face in unsupervised learning (e.g., how do we cluster points in a space, or find topics in documents?). Here, the underlying challenge is that statistical estimation fundamentally involves non-convex optimization. Finally, I hope to highlight a few of the future challenges we face that are inspired by the impressive successes of deep learning.
Sham is a principal research scientist at Microsoft Research, New England, a lab in Cambridge, MA. Previously, he was an associate professor in the Department of Statistics at the Wharton School, University of Pennsylvania (from 2010 to 2012), and before that an assistant professor at the Toyota Technological Institute at Chicago. Earlier, he did a postdoc in the Computer and Information Science department at the University of Pennsylvania under the supervision of Michael Kearns. He completed his PhD at the Gatsby Unit, where his advisor was Peter Dayan. Before Gatsby, he was an undergraduate at Caltech, where he earned his BS in physics.