Center for Nonlinear Studies
Thursday, February 05, 2015
1:00 PM - 2:00 PM
CNLS Conference Room (TA-3, Bldg 1690)


A flexible framework for regularized low-rank matrix estimation

Julie Josse
Agrocampus Ouest

Low-rank matrix estimation plays a key role in many scientific and engineering tasks, including collaborative filtering and image denoising. Low-rank procedures are often motivated by a statistical model in which we observe a noisy matrix drawn from some distribution whose expectation is assumed to have a low-rank representation. The statistical goal is to recover the signal from the noisy data. Classical approaches are centered around singular-value decomposition algorithms. Although the truncated singular value decomposition has been extensively used and studied, the resulting estimator is noisy, and its performance can be improved by regularization. Methods based on singular-value shrinkage have achieved considerable empirical success and also have provable optimality properties in the Gaussian noise model (Gavish & Donoho, 2014). In this presentation, we propose a new framework for regularized low-rank estimation that does not start from the singular-value shrinkage point of view. Our approach is motivated by a simple parametric bootstrap idea. In the simplest case of isotropic Gaussian noise, we end up with a new singular-value shrinkage estimator, whereas for non-isotropic noise models, our procedure yields new estimators that perform well in experiments.
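The contrast drawn in the abstract between the truncated SVD and singular-value shrinkage can be sketched numerically. The following is a minimal illustration, not the speaker's proposed method: it denoises a low-rank matrix corrupted by isotropic Gaussian noise, once by truncating the SVD at the true rank and once by soft-thresholding the singular values. The matrix sizes, rank, noise level, and the threshold `lam` are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 50, 40, 3    # matrix dimensions and true rank (assumed)
sigma = 0.5            # noise standard deviation (assumed known)

# Low-rank signal plus isotropic Gaussian noise
signal = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))
X = signal + sigma * rng.standard_normal((n, p))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Truncated SVD: keep the r leading singular values unchanged
tsvd = (U[:, :r] * s[:r]) @ Vt[:r, :]

# Soft-thresholding shrinkage: shrink every singular value toward zero
# (lam is one common heuristic threshold, not the talk's estimator)
lam = sigma * (np.sqrt(n) + np.sqrt(p))
s_shrunk = np.maximum(s - lam, 0.0)
shrunk = (U * s_shrunk) @ Vt

err_tsvd = np.linalg.norm(tsvd - signal)
err_shrunk = np.linalg.norm(shrunk - signal)
print(f"truncated SVD error: {err_tsvd:.2f}")
print(f"shrinkage error:     {err_shrunk:.2f}")
```

Both estimators share the same singular vectors and differ only in how the singular values are treated, which is why shrinkage rules can be compared on equal footing with truncation.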

Host: Nick Hengartner