We consider the problem of constructing a scheme for nonparametric regression that automatically adapts to the intrinsic dimension of the input domain. It is known that, if the regression function is Lipschitz-continuous in R^D, the minimax rate of convergence for the MSE is n^(-2/(2+D)), where n is the number of samples. This rate is quite slow when D is large. Building on recent work by Dasgupta and Freund on random space partitioning, we present the first tree-based regressor whose convergence rate depends only on the intrinsic dimension of the data, namely its Assouad dimension. This notion is general enough to capture many situations in which the data are expected to be much less complex than the ambient dimension D suggests.
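The random-partitioning idea can be sketched as follows. This is an illustrative toy implementation only, not the algorithm analyzed in the paper (which uses refinements such as diameter-dependent splitting rules): recursively split the data along a random direction at the median projection, and predict with the mean response of the leaf containing the query. Because the splits depend only on where the data actually lie, the tree's cells shrink at a rate governed by the data's intrinsic dimension rather than the ambient dimension D.

```python
import numpy as np

class RPTreeRegressor:
    """Toy random-projection-tree regressor (illustrative sketch):
    split along a random direction at the median until leaves are
    small, then predict the mean response in the query's leaf."""

    def __init__(self, max_leaf_size=10, rng=None):
        self.max_leaf_size = max_leaf_size
        self.rng = rng if rng is not None else np.random.default_rng(0)

    def fit(self, X, y):
        self.tree_ = self._build(np.asarray(X, float), np.asarray(y, float))
        return self

    def _build(self, X, y):
        if len(y) <= self.max_leaf_size:
            return {"leaf": True, "value": y.mean()}
        # Draw a random unit direction and split at the median projection.
        d = self.rng.standard_normal(X.shape[1])
        d /= np.linalg.norm(d)
        proj = X @ d
        thresh = np.median(proj)
        left = proj <= thresh
        if left.all() or (~left).all():  # degenerate split: stop early
            return {"leaf": True, "value": y.mean()}
        return {"leaf": False, "dir": d, "thresh": thresh,
                "L": self._build(X[left], y[left]),
                "R": self._build(X[~left], y[~left])}

    def predict(self, X):
        return np.array([self._predict_one(x, self.tree_)
                         for x in np.asarray(X, float)])

    def _predict_one(self, x, node):
        while not node["leaf"]:
            node = node["L"] if x @ node["dir"] <= node["thresh"] else node["R"]
        return node["value"]
```

As a quick sanity check of the intrinsic-dimension intuition, one can embed a one-dimensional data set in R^20 (e.g. points on a line through the origin) and observe that the tree still localizes queries well, even though the ambient dimension is large.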
(Part of ongoing work with Sanjoy Dasgupta; the first part appeared at COLT 2009.)