Seminar Details

May 6

3:30 pm

Determinantal Point Processes in Machine Learning

Ben Taskar


University of Washington - Computer Science and Engineering

Many real-world problems involve negative interactions: we might want search results to be diverse, sentences in a summary to cover distinct aspects of the subject, or objects in an image to occupy different regions of space. However, traditional structured probabilistic models tend to deal poorly with these kinds of situations; Markov random fields, for example, become intractable even to approximate. Determinantal point processes (DPPs), which arise in random matrix theory and quantum physics, behave in a complementary fashion: while they cannot encode positive interactions, they define expressive models of negative correlations that come with surprising and exact algorithms for many types of inference, including conditioning, marginalization, and sampling. I'll present our recent work on a novel factorization and dual representation of DPPs that enables efficient and exact inference for exponentially sized structured sets. We also develop an exact inference algorithm for DPPs conditioned on subset size and derive efficient parameter estimation for DPPs from several types of observations, as well as approximation algorithms for large-scale non-linear DPPs. I'll illustrate the advantages of DPPs on several natural language and computer vision tasks.
Joint work with Alex Kulesza, Jennifer Gillenwater, Raja Affandi and Emily Fox
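The negative-correlation behavior the abstract describes can be sketched with a toy L-ensemble DPP. In an L-ensemble, the probability of selecting exactly the set A is det(L_A) / det(L + I), where L_A is the submatrix of the kernel L indexed by A. The kernel below is a hypothetical three-item example (not from the talk): items 0 and 1 are nearly identical, so the model assigns low probability to selecting both together.

```python
import numpy as np

# Hypothetical 3-item similarity kernel L (illustrative values only).
# Off-diagonal entries encode similarity; similar items are unlikely
# to co-occur, which is the negative correlation DPPs capture.
L = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Normalization constant: det(L + I) sums det(L_A) over all subsets A.
Z = np.linalg.det(L + np.eye(3))

def prob(A):
    """Probability that the DPP selects exactly the index set A."""
    if not A:
        return 1.0 / Z  # determinant of the empty submatrix is 1
    sub = L[np.ix_(A, A)]
    return np.linalg.det(sub) / Z

# Items 0 and 1 are highly similar (0.9), so {0, 1} is far less
# likely than the diverse set {0, 2}:
print(prob([0, 1]))  # det([[1, 0.9], [0.9, 1]]) / Z = 0.19 / Z
print(prob([0, 2]))  # det([[1, 0], [0, 1]]) / Z = 1.0 / Z
```

Because the probability of a pair shrinks with the determinant of its similarity submatrix, near-duplicate items effectively repel each other, which is exactly the diversity property motivating the search, summarization, and vision applications above.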