Conference Agenda
Overview and details of the sessions of this conference. Please select a date or location to show only sessions held on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).
Agenda Overview

Session:
New developments in nonparametric classification and estimation based on the nearest neighbor method
Presentations:
Chatterjee's graph correlation
University of Washington, United States of America
This talk will survey recent advances in understanding Chatterjee's nearest neighbor graph-based correlation coefficient. I will introduce, for the first time, a comprehensive theoretical framework for statistical inference based on this coefficient. The framework includes results on asymptotic normality, bias correction, and the (in)consistency of bootstrap methods.

Nearest Neighbor Estimates for Dependent Data
University of Manitoba, Canada
This paper considers the nonparametric estimation problem for a class of nonlinear time series.

Nearest Neighbor matching: from Average Treatment Effects to Transfer Learning
ENSAI-CREST, France
Estimating mathematical expectations from partially observed data, and in particular from missing outcomes, is a central problem encountered in numerous fields such as transfer learning, counterfactual analysis, and causal inference. Matching estimators, i.e., estimators based on k-nearest neighbors, are widely used in this context. Under suitable regularity conditions, one can show that the variance of such estimators converges to zero at a parametric rate. However, their bias can have a slower rate when the dimension of the covariates is larger than 2, which makes the analysis of this bias particularly important. In this paper, we provide higher-order properties of the bias. In contrast to the existing literature on this topic, we do not assume that the support of the target distribution of the covariates is strictly included in that of the source, and we discuss two geometric conditions on the support that prevent boundary bias issues. We show that these conditions are much more general than the usual convex-support assumption, leading to an improvement of existing results.
Furthermore, we show that the matching estimator studied by Abadie and Imbens (2006) for the average treatment effect can be asymptotically efficient when the dimension of the covariates is less than 4, a result previously known only in dimension 1.

Multivariate Root-N-Consistent Smoothing Parameter Free Matching Estimators and Estimators of Inverse Density Weighted Expectations
Universität Rostock, Germany; Philipps-Universität Marburg, Germany
Expected values weighted by the inverse of a multivariate density, or, equivalently, Lebesgue integrals of regression functions with multivariate regressors, occur in various areas of application, including the estimation of average treatment effects, nonparametric estimators in random coefficient regression models, and deconvolution estimators in Berkson errors-in-variables models. The frequently used nearest-neighbor and matching estimators suffer from bias problems in multiple dimensions. By using polynomial least squares fits on each cell of the Kth-order Voronoi tessellation for sufficiently large K, we develop novel modifications of nearest-neighbor and matching estimators which again converge at the parametric root-n rate under mild smoothness assumptions on the unknown regression function, and without any smoothness conditions on the unknown density of the covariates. We stress that, in contrast to competing methods for correcting the bias of matching estimators, our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters. We complement the upper bounds with appropriate lower bounds derived from information-theoretic arguments, which show that some smoothness of the regression function is indeed required to achieve the parametric rate. Simulations illustrate the practical feasibility of the proposed methods.
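For readers unfamiliar with the coefficient surveyed in the first talk, the following is a minimal sketch of Chatterjee's rank correlation in its simplest univariate, no-ties form; the nearest neighbor graph-based coefficient discussed in the talk generalizes this construction. The function name `chatterjee_xi` is our own, and ties are assumed absent.

```python
import numpy as np

def chatterjee_xi(x, y):
    # Chatterjee's rank correlation (no ties assumed):
    # sort the pairs by x, let r_i be the rank of y among the y-values
    # in that x-order, and set  xi = 1 - 3 * sum_i |r_{i+1} - r_i| / (n^2 - 1).
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    y_in_x_order = y[np.argsort(x)]
    r = np.argsort(np.argsort(y_in_x_order)) + 1  # ranks 1..n
    return 1.0 - 3.0 * np.abs(np.diff(r)).sum() / (n**2 - 1)
```

The coefficient tends to 1 when y is a noiseless function of x (regardless of direction, so `chatterjee_xi(x, -x)` equals `chatterjee_xi(x, x)`) and to 0 under independence, which is why inference results such as asymptotic normality and bootstrap (in)consistency are of interest.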
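The matching estimator of the average treatment effect that the last two abstracts build on can be sketched in a few lines. This is a plain 1-to-k nearest neighbor matcher in the spirit of Abadie and Imbens (2006), not the bias-corrected or Voronoi-based refinements described above; the function names are illustrative, and the brute-force distance computation is only suitable for small samples.

```python
import numpy as np

def matching_ate(X, y, t, k=1):
    # k-nearest-neighbor matching estimator of the average treatment effect:
    # impute each unit's missing counterfactual outcome by the mean outcome
    # of its k nearest covariate matches in the opposite treatment group.
    X, y = np.asarray(X, float), np.asarray(y, float)
    t = np.asarray(t, bool)
    if X.ndim == 1:
        X = X[:, None]
    X1, y1, X0, y0 = X[t], y[t], X[~t], y[~t]

    def knn_mean(Xq, Xr, yr):
        # mean outcome of the k nearest reference points for each query point
        d = np.linalg.norm(Xq[:, None, :] - Xr[None, :, :], axis=2)
        return yr[np.argsort(d, axis=1)[:, :k]].mean(axis=1)

    tau = np.empty(len(y))
    tau[t] = y1 - knn_mean(X1, X0, y0)   # treated: observed Y(1) - matched Y(0)
    tau[~t] = knn_mean(X0, X1, y1) - y0  # control: matched Y(1) - observed Y(0)
    return tau.mean()
```

The bias discussed in the abstracts arises in `knn_mean`: the matched covariates differ slightly from the query point, and this discrepancy shrinks too slowly when the covariate dimension exceeds 2, which motivates both the boundary-condition analysis and the Voronoi-cell polynomial corrections presented in this session.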

