Center for Nonlinear Studies
Thursday, February 05, 2015
1:00 PM - 2:00 PM
CNLS Conference Room (TA-3, Bldg 1690)

Seminar

A flexible framework for regularized low-rank matrix estimation

Julie Josse
Agrocampus Ouest

Low-rank matrix estimation plays a key role in many scientific and engineering tasks, including collaborative filtering and image denoising. Low-rank procedures are often motivated by the statistical model in which we observe a noisy matrix drawn from some distribution whose expectation is assumed to have a low-rank representation. The statistical goal is to recover the signal from the noisy data. Classical approaches are centered around singular-value decomposition algorithms. Although the truncated singular value decomposition has been extensively used and studied, the resulting estimator is noisy and its performance can be improved by regularization. Methods based on singular-value shrinkage have achieved considerable empirical success and also have provable optimality properties in the Gaussian noise model (Gavish & Donoho, 2014). In this presentation, we propose a new framework for regularized low-rank estimation that does not start from the singular-value shrinkage point of view. Our approach is motivated by a simple parametric bootstrap idea. In the simplest case of isotropic Gaussian noise, we obtain a new singular-value shrinkage estimator, whereas for non-isotropic noise models our procedure yields new estimators that perform well in experiments.
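The setting described above can be illustrated with a small sketch: simulate a noisy low-rank matrix, then compare the truncated SVD estimator with a basic singular-value soft-thresholding (shrinkage) estimator. All sizes, the noise level, and the threshold rule are illustrative assumptions; this is not the bootstrap-based estimator proposed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model from the abstract: observe X = signal + noise, where the
# signal has low rank (rank 3 here) and the noise is isotropic
# Gaussian.  Dimensions and noise level are arbitrary choices.
n, p, r, sigma = 50, 40, 3, 0.5
signal = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))
X = signal + sigma * rng.standard_normal((n, p))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Truncated SVD: keep the r largest singular values unchanged.
X_trunc = (U[:, :r] * s[:r]) @ Vt[:r]

# A simple regularized alternative: soft-threshold the singular
# values.  The threshold below is a common heuristic for isotropic
# Gaussian noise, not the estimator developed in the talk.
tau = sigma * (np.sqrt(n) + np.sqrt(p))
s_shrunk = np.maximum(s - tau, 0.0)
X_shrunk = (U * s_shrunk) @ Vt

# Compare recovery errors against the true low-rank signal.
err_trunc = np.linalg.norm(X_trunc - signal)
err_shrunk = np.linalg.norm(X_shrunk - signal)
print(err_trunc, err_shrunk)
```

Soft-thresholding shrinks all singular values toward zero, trading some bias on the signal components for suppression of noise directions; the talk's point is that shrinkage rules can be derived from principles other than this thresholding view.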

Host: Nick Hengartner