On bandwidth selection in empirical risk minimization
Mon., March 31, 2014
3:00 - 4:00 PM
521 Cory Hall (Hogan Room)
I will start with a short summary of bandwidth selection methods, such as cross-validation and Lepski-type procedures, explaining their strengths and weaknesses. Our contributions, joint work with J. Lederer and S. Loustau, consist of a data-driven selection of the bandwidth for empirical risk minimization. One typically faces this issue in pointwise estimation under heavy-tailed noise with local M-estimators. To this end, we apply Lepski's method to these estimators to obtain optimal (minimax) results in pointwise estimation. We will also explain how the robustness of the M-estimators can be chosen.

In learning theory, many authors have recently investigated supervised and unsupervised learning with errors in variables. As a rule, such problems (viewed as inverse problems) require plugging deconvolution kernels into the empirical risk (and then selecting a bandwidth), as in Hall and Lahiri [2008] for quantile and moment estimation, Loustau and Marteau [2013] for noisy discriminant analysis, Loustau [2013] for noisy learning, Chichignoud and Loustau [2013] for noisy clustering, and Dattner, Reiß and Trabs [2013] for quantile estimation. We will in particular present the result of Chichignoud and Loustau [2013] on noisy clustering, where a new version of Lepski's method is provided to obtain excess risk bounds.
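To give a flavor of the Lepski-type procedures mentioned above, here is a minimal, illustrative sketch of the classical balancing rule applied to pointwise kernel (Nadaraya-Watson) estimation. This is not the speaker's procedure for M-estimators; the kernel, the noise-level `sigma`, the tuning constant `kappa`, and the threshold form `sigma * sqrt(log n / (n h))` are all simplifying assumptions chosen for illustration.

```python
import numpy as np

def nw_estimate(x0, X, Y, h):
    """Nadaraya-Watson kernel estimate of E[Y | X = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

def lepski_bandwidth(x0, X, Y, bandwidths, sigma=1.0, kappa=2.0):
    """Lepski's balancing rule (illustrative version): pick the largest bandwidth
    whose estimate stays within the stochastic error of every estimate built
    with a smaller bandwidth."""
    n = len(X)
    hs = np.sort(np.asarray(bandwidths))[::-1]  # scan from large to small h
    est = {h: nw_estimate(x0, X, Y, h) for h in hs}
    # Assumed threshold: of the order of the stochastic error of the h-estimator.
    thr = {h: kappa * sigma * np.sqrt(np.log(n) / (n * h)) for h in hs}
    for i, h in enumerate(hs):
        if all(abs(est[h] - est[hp]) <= thr[hp] for hp in hs[i + 1:]):
            return h, est[h]
    return hs[-1], est[hs[-1]]  # fall back to the smallest bandwidth

# Toy usage on simulated regression data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 500)
Y = np.sin(3 * X) + 0.3 * rng.standard_normal(500)
h_star, f_hat = lepski_bandwidth(0.0, X, Y, np.geomspace(0.02, 1.0, 15), sigma=0.3)
```

The adaptive flavor of the rule is visible here: no smoothness parameter is supplied; the bandwidth is chosen purely by comparing estimators across the grid against data-driven thresholds.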
UC Berkeley Networking
Varun Jog and Ka Kit Lam. Last modified: Sunday, January 26, 2014