Conference Agenda
Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).
Agenda Overview

Session: High-dimensional statistics and learning

Presentations
Self-regularized learning methods
Max Schölpple, Ingo Steinwart (Institut für Stochastik und Anwendungen, Universität Stuttgart, Pfaffenwaldring 57, Germany)
We introduce a new framework for the theoretical analysis of learning algorithms, called self-regularization. In a nutshell, self-regularized learning algorithms implicitly guarantee that they produce sufficiently regular prediction functions. Central examples of self-regularized learning algorithms include gradient descent and regularized empirical risk minimization. We establish a general theory for the statistical analysis of self-regularized algorithms which in many cases yields minimax-optimal learning rates.

Concentration and moment inequalities for heavy-tailed random matrices
Moritz Jirak, Stanislav Minsker, Yiqiu Shen, Martin Wahl (Universität Wien, Austria)
Fuk-Nagaev and Rosenthal-type inequalities are proven for sums of independent random matrices, focusing on the situation where the norms of the matrices possess finite moments of only low order. The bounds depend on intrinsic dimensional characteristics, such as the effective rank, rather than on the dimension of the ambient space. The advantages of such results are illustrated in several applications, including new moment inequalities for sample covariance matrices and the corresponding eigenvectors of heavy-tailed random vectors.

Laplacian eigenmaps for bounded manifolds and the Neumann Laplacian
Universität Bielefeld, Germany
The spectrum of the Laplace-Beltrami operator encodes essential geometric information about a smooth manifold. In practice, the manifold is unknown but supports a finite sample of random points, and it is then standard to approximate its spectrum by the spectrum of the resulting graph Laplacian. When the manifold is bounded, it is known that the spectrum of the graph Laplacian converges to that of the Neumann Laplacian. However, finite-sample results, such as convergence rates, are still lacking, and they are at the center of this talk.
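The self-regularization abstract names gradient descent as a central example of an algorithm that implicitly keeps its predictor regular. A minimal numerical sketch of that phenomenon (my illustration, not the authors' framework): on an overparameterized least-squares problem, the norm of the gradient-descent iterate started at zero grows monotonically and never exceeds the norm of the minimum-norm interpolant, so the iteration count acts like an implicit norm constraint.

```python
import numpy as np

# Sketch (not the authors' code): gradient descent on overparameterized
# least squares behaves like an implicitly norm-regularized method.
rng = np.random.default_rng(1)
n, d = 50, 200                          # more parameters than samples
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

w = np.zeros(d)
step = 1e-3                             # small enough for monotone dynamics
norms = []
for _ in range(500):
    w -= step * X.T @ (X @ w - y) / n   # gradient of (1/2n)||Xw - y||^2
    norms.append(np.linalg.norm(w))

w_min = np.linalg.pinv(X) @ y           # minimum-norm interpolant
print(norms[0], norms[-1], np.linalg.norm(w_min))
```

Along each singular direction of X the iterate's coefficient increases monotonically toward its limit, so the norm path is increasing and capped by the minimum-norm solution.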
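The effective rank mentioned in the heavy-tailed random matrices abstract is the standard intrinsic-dimension parameter r(Σ) = tr(Σ)/‖Σ‖. A small sketch (hypothetical spectrum, not numbers from the talk) showing how it can sit far below the ambient dimension:

```python
import numpy as np

def effective_rank(sigma: np.ndarray) -> float:
    """Effective rank r(Sigma) = tr(Sigma) / ||Sigma||_op (standard definition)."""
    eigvals = np.linalg.eigvalsh(sigma)
    return eigvals.sum() / eigvals.max()

# Hypothetical example: covariance with a fast-decaying spectrum in
# ambient dimension 1000.
d = 1000
spectrum = 1.0 / (1.0 + np.arange(d)) ** 2   # eigenvalues 1, 1/4, 1/9, ...
sigma = np.diag(spectrum)
r = effective_rank(sigma)
print(d, round(r, 2))   # -> 1000 1.64: effective rank ~ pi^2/6, far below d
```

Bounds stated in terms of r(Σ) therefore remain meaningful even when the ambient dimension d is huge.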
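For the Laplacian eigenmaps abstract, a toy sketch (my own, not the speaker's code) of the approximation it describes, using the interval [0, 1] as the bounded "manifold" and an assumed Gaussian-kernel bandwidth; the Neumann Laplacian on [0, 1] has eigenvalues (kπ)², k = 0, 1, 2, …

```python
import numpy as np

# Toy sketch: approximate the Neumann spectrum on [0, 1] by the spectrum
# of a graph Laplacian built from uniform random samples.
rng = np.random.default_rng(0)
n, eps = 1000, 0.05                     # sample size and bandwidth (assumed)
x = rng.uniform(0.0, 1.0, n)

diff = x[:, None] - x[None, :]
w = np.exp(-diff**2 / (2 * eps**2))     # Gaussian kernel weights
lap = np.diag(w.sum(axis=1)) - w        # unnormalized graph Laplacian

# Rescale so the spectrum is comparable with the continuum operator; the
# kernel's second-moment constant in one dimension is sqrt(2*pi).
lap *= 2.0 / (n * eps**3 * np.sqrt(2.0 * np.pi))

eigvals = np.sort(np.linalg.eigvalsh(lap))
print(eigvals[:4])                      # smallest eigenvalue ~ 0; the next
                                        # ones roughly track pi^2, (2*pi)^2, ...
```

The first eigenvalue is (numerically) zero for the constant eigenvector; the subsequent ones approximate the Neumann eigenvalues up to discretization and boundary error, and quantifying that error at finite sample size is exactly the question the talk addresses.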

