Center for Nonlinear Studies
Tuesday, August 18, 2009
1:00 PM - 2:00 PM
CNLS Conference Room (TA-3, Bldg 1690)

Seminar

Escaping the Curse of Dimensionality with a Tree-Based Regressor

Samory Kpotufe
University of California at San Diego

We consider the problem of constructing a scheme for nonparametric regression that automatically adapts to the intrinsic dimension of the input domain. It is known that, if the regression function is Lipschitz-continuous in R^D, the minimax rate of convergence for the MSE is n^(-2/(2+D)), where n is the number of samples. This rate is quite slow when D is large. Building on recent work by Dasgupta and Freund on random space partitioning, we present the first tree-based regressor whose convergence rate depends only on the intrinsic dimension of the data, namely its Assouad dimension. This notion is general enough to capture many situations where the data are expected to be much less complex than indicated by the ambient dimension D.
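The estimator discussed in the talk builds on the random-projection trees of Dasgupta and Freund; as a toy illustration of the general idea of tree-based piecewise-constant regression, here is a minimal sketch using simple axis-aligned median splits instead (the function names `build_tree` and `predict` are ours, not from the paper):

```python
import statistics

def build_tree(X, y, min_leaf=5):
    """Toy piecewise-constant regression tree.

    Recursively splits the data at the median of the coordinate with the
    largest spread, and predicts the mean response in each leaf. This is
    only an illustration; the talk's estimator uses random-projection
    trees, whose splits adapt to the intrinsic (Assouad) dimension.
    """
    if len(y) <= min_leaf:
        return ("leaf", statistics.fmean(y))
    D = len(X[0])
    # Pick the coordinate with the widest range, split at its median.
    d = max(range(D), key=lambda j: max(x[j] for x in X) - min(x[j] for x in X))
    t = statistics.median(x[d] for x in X)
    left = [(x, v) for x, v in zip(X, y) if x[d] <= t]
    right = [(x, v) for x, v in zip(X, y) if x[d] > t]
    if not left or not right:  # degenerate split (e.g. ties): stop here
        return ("leaf", statistics.fmean(y))
    XL, yL = zip(*left)
    XR, yR = zip(*right)
    return ("node", d, t,
            build_tree(list(XL), list(yL), min_leaf),
            build_tree(list(XR), list(yR), min_leaf))

def predict(node, x):
    """Route a query point down the tree and return the leaf mean."""
    while node[0] == "node":
        _, d, t, lo, hi = node
        node = lo if x[d] <= t else hi
    return node[1]
```

For a Lipschitz target, shrinking the leaves as n grows drives the MSE down; the point of the talk is that with the right partitioning scheme the rate is governed by the intrinsic dimension rather than the ambient D.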

(Part of ongoing work with Sanjoy Dasgupta; the first part appeared at COLT 2009.)

Host: Ingo Steinwart