Conference Agenda
Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).
Session: High-dimensional statistics and learning

Presentations:
Supervised classification for Ornstein-Uhlenbeck diffusions with separation condition
Humboldt University of Berlin, Germany

We study binary supervised classification based on repeated independent observations of continuous sample paths. Our focus is a diffusion classification model in which the features follow an Ornstein-Uhlenbeck process with class-dependent drifts. We consider plug-in classifiers constructed from drift estimators and analyze their performance via the excess risk. Under a separation condition on the drift parameters, we establish upper bounds on the excess risk that are explicitly parametrized by the separation distance quantifying the difficulty of the problem. Specifically, when the drift distance is bounded away from zero, the plug-in classifiers achieve a fast convergence rate of order $n^{-1}$ (up to logarithmic factors) in the constant-drift scenario. Furthermore, we discuss extensions of this framework to time-inhomogeneous drift functions. The theoretical approach uses the Wiener chaos representation and spectral theory to characterize the log-likelihood ratio as a quadratic form in Gaussian random variables, enabling a precise analysis of margin properties and concentration results. This extends the fast-rate results from classification problems with linear and Gaussian white noise models to dynamical diffusion systems with Gaussian structure under separation conditions.

Asymptotic Bounds and Online Algorithms for Average-Case Matrix Discrepancy
Johns Hopkins University, USA; FAU Erlangen-Nürnberg, Germany; Yale University, USA
We study the matrix discrepancy problem in the average-case setting. Given a sequence of $m \times m$ symmetric matrices $A_1, \ldots, A_n$, its discrepancy is defined as the minimal spectral norm over all signed sums $\sum_{i=1}^n x_i A_i$ with $x_1, \ldots, x_n \in \{\pm 1\}$. Our contributions are twofold. First, we study the asymptotic discrepancy of random matrices. When the matrices belong to the Gaussian orthogonal ensemble, we provide a sharp characterization of the asymptotic discrepancy and show that the limiting distribution is concentrated around $\Theta(\sqrt{nm}\,4^{-(1 + o(1))n/m^2})$, under the assumption $m^2 \ll n/\log n$. We observe that the trivial bound $O(\sqrt{nm})$ cannot be improved when $n \ll m^2$ and show that this phenomenon occurs for a broad class of random matrices. In the case $n = \Omega(m^2)$, we provide a matching upper bound. Second, we analyse the matrix hyperbolic cosine algorithm, an online algorithm for matrix discrepancy minimization due to Zouzias (2011), in the average-case setting. We show that the algorithm achieves with high probability a discrepancy of $O(m \log m)$ for a broad class of random matrices, including Wigner matrices with entries satisfying a hypercontractive inequality and Gaussian Wishart matrices.
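The discrepancy defined above can be computed exactly for small instances by enumerating all $2^n$ sign vectors, and compared against a greedy online rule that picks each sign to keep a trace-of-cosh potential small. This is a minimal sketch in the spirit of the matrix hyperbolic cosine algorithm, not the exact algorithm of Zouzias (2011); the potential scaling `gamma` and the GOE-style normalization are illustrative assumptions:

```python
import itertools
import numpy as np

def spectral_norm(M):
    # Largest singular value; for symmetric M this equals max |eigenvalue|.
    return np.linalg.norm(M, 2)

def discrepancy_bruteforce(mats):
    """Exact discrepancy: min over all sign vectors x in {+-1}^n
    of the spectral norm of sum_i x_i A_i (feasible only for small n)."""
    best = np.inf
    for signs in itertools.product((1, -1), repeat=len(mats)):
        S = sum(x * A for x, A in zip(signs, mats))
        best = min(best, spectral_norm(S))
    return best

def greedy_cosh_signs(mats, gamma=0.5):
    """Online greedy rule: choose each sign to minimize the potential
    tr(cosh(gamma * S)) of the running signed sum S (a simplified
    stand-in for the matrix hyperbolic cosine algorithm's update)."""
    def potential(M):
        # tr(cosh(.)) via the eigenvalues of the symmetric matrix M
        return np.sum(np.cosh(gamma * np.linalg.eigvalsh(M)))
    S = np.zeros_like(mats[0])
    for A in mats:
        x = 1 if potential(S + A) <= potential(S - A) else -1
        S = S + x * A
    return spectral_norm(S)

rng = np.random.default_rng(0)
m, n = 3, 10
# GOE-like symmetric random matrices (normalization is illustrative)
mats = []
for _ in range(n):
    G = rng.standard_normal((m, m))
    mats.append((G + G.T) / np.sqrt(2 * m))

opt = discrepancy_bruteforce(mats)     # exact optimum over all 2^n signings
online = greedy_cosh_signs(mats)       # one-pass online signing
```

By construction the online value can never beat the exhaustive optimum, which gives a quick sanity check when experimenting with the potential's scaling.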
Asymptotic confidence bands for centered purely random forests
Karlsruhe Institute of Technology, Germany

In this talk we study asymptotic uniform confidence bands for centered purely random forests in a multivariate nonparametric regression setting. The most popular example in this class of random forests, the uniformly centered purely random forest, is well known to suffer from suboptimal rates. We therefore propose a new type of purely random forest, the Ehrenfest centered purely random forest, which achieves minimax optimal rates. Our main confidence band theorem applies to both random forests. The proof is based on an interpretation of random forests as generalized U-statistics, together with a Gaussian approximation of the supremum of empirical processes.
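To make the estimator class concrete, here is a minimal sketch of a uniformly centered purely random forest for regression on $[0,1]^d$: each tree picks a split coordinate uniformly at random at every level, halves every current cell at its midpoint along that coordinate, and the forest averages leaf means over independent trees. All function names and parameters are illustrative, and this is the uniformly centered variant mentioned in the abstract, not the proposed Ehrenfest variant:

```python
import numpy as np

def centered_tree_labels(X, coords):
    """Leaf label of each point in [0,1]^d for one centered purely random
    tree whose level-wise split coordinates are given by `coords`: every
    cell is split at its midpoint along that level's coordinate."""
    n, d = X.shape
    lo = np.zeros((n, d))   # per-point lower corner of its current cell
    hi = np.ones((n, d))    # per-point upper corner of its current cell
    labels = np.zeros(n, dtype=np.int64)
    for j in coords:
        mid = (lo[:, j] + hi[:, j]) / 2.0   # cell midpoints, coordinate j
        right = X[:, j] >= mid
        labels = 2 * labels + right          # extend the binary leaf code
        lo[right, j] = mid[right]
        hi[~right, j] = mid[~right]
    return labels

def forest_predict(X_tr, y_tr, X_te, n_trees=50, depth=4, seed=0):
    """Forest estimate: average over trees of the mean response in the
    leaf containing each test point (empty leaves fall back to y-bar)."""
    rng = np.random.default_rng(seed)
    d = X_tr.shape[1]
    overall = y_tr.mean()
    preds = np.zeros(len(X_te))
    for _ in range(n_trees):
        coords = rng.integers(d, size=depth)   # random split coordinates
        lab_tr = centered_tree_labels(X_tr, coords)
        lab_te = centered_tree_labels(X_te, coords)
        buckets = {}
        for lab, y in zip(lab_tr, y_tr):
            buckets.setdefault(lab, []).append(y)
        means = {k: np.mean(v) for k, v in buckets.items()}
        preds += np.array([means.get(l, overall) for l in lab_te])
    return preds / n_trees

# Toy regression: y = sin(2*pi*x1) + noise, with one irrelevant coordinate
rng = np.random.default_rng(1)
X = rng.random((500, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(500)
X0 = rng.random((100, 2))
yhat = forest_predict(X, y, X0, n_trees=100, depth=5)
```

Because splits depend only on the cell geometry (never on the responses), the partition is "purely random", which is what makes the U-statistic interpretation of the forest available.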

