Center for Nonlinear Studies
Wednesday, July 30, 2008
2:00 PM - 3:00 PM
CNLS Conference Room (TA-3, Bldg 1690)


Learning Permutations with Exponential Weights

David Helmbold
UC Santa Cruz

We consider learning permutations in the on-line setting, where the algorithm's goal is to have small regret, i.e., the cumulative loss of the algorithm on any sequence of trials should not be much greater than the loss of the best permutation in hindsight. We present a new algorithm that maintains a doubly stochastic weight matrix representing its uncertainty about the best permutation and makes predictions by decomposing this weight matrix into a convex combination of permutations. After receiving feedback, the algorithm updates its weight matrix for the next trial. A new insight allows us to prove an optimal (up to small constant factors) bound on the regret of our algorithm, despite the fact that there is no closed form for the re-balanced weight matrix. This regret bound is significantly better than the bounds for either Kalai and Vempala's more computationally efficient "Follow the Perturbed Leader" algorithm or the straightforward but expensive method that explicitly tracks each permutation.
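The update-and-predict loop described in the abstract can be sketched in code. This is a hedged illustration, not the speaker's implementation: the Sinkhorn-style row/column re-balancing, the greedy Birkhoff-von Neumann decomposition, and all function names (`sinkhorn`, `birkhoff_decompose`, `exp_update`) are assumptions about how such an algorithm could be realized, assuming the per-trial loss is given entry-wise as an n-by-n matrix.

```python
import numpy as np

def sinkhorn(W, iters=500):
    """Alternately normalize rows and columns until W is (approximately)
    doubly stochastic. As the abstract notes, the re-balanced matrix has
    no closed form, so an iterative procedure is used."""
    W = W.copy()
    for _ in range(iters):
        W /= W.sum(axis=1, keepdims=True)
        W /= W.sum(axis=0, keepdims=True)
    return W

def _perfect_matching(mask):
    """Kuhn's augmenting-path algorithm: find a permutation (row -> column)
    using only entries where mask is True."""
    n = mask.shape[0]
    match_col = [-1] * n  # match_col[j] = row currently matched to column j
    def try_row(i, seen):
        for j in range(n):
            if mask[i, j] and not seen[j]:
                seen[j] = True
                if match_col[j] == -1 or try_row(match_col[j], seen):
                    match_col[j] = i
                    return True
        return False
    for i in range(n):
        try_row(i, [False] * n)
    perm = [0] * n
    for j in range(n):
        perm[match_col[j]] = j
    return perm

def birkhoff_decompose(W, tol=1e-9):
    """Greedy Birkhoff-von Neumann decomposition: express the doubly
    stochastic W as a convex combination of permutations, each returned
    as (coefficient, permutation) with coefficients summing to ~1."""
    n = W.shape[0]
    W = W.copy()
    parts = []
    while W.max() > tol:
        perm = _perfect_matching(W > tol)          # permutation in the support
        alpha = min(W[i, perm[i]] for i in range(n))
        parts.append((alpha, perm))
        for i in range(n):                         # peel it off and repeat
            W[i, perm[i]] -= alpha
    return parts

def exp_update(W, loss, eta=0.5):
    """One trial's update: multiply each weight by an exponential factor of
    its entry-wise loss, then re-balance back to doubly stochastic."""
    return sinkhorn(W * np.exp(-eta * loss))
```

On each trial the learner would decompose its current weight matrix, predict by sampling a permutation with probability equal to its coefficient, observe the loss, and call `exp_update` to obtain the next trial's weights.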

Host: James Theiler