Online seminar
On the interactions between Statistics and Geometry
This monthly online seminar invites researchers working in Geometry or Mathematical Statistics to present a paper, result or idea, with the aim of promoting communication between the two fields. We encourage speakers to give accessible, survey-style talks of about one hour, aimed at a broad mathematical audience. Topics of interest include (but are not limited to): Statistics and Learning in metric spaces, Non-Euclidean Optimization and Sampling, Optimal Transport, Information Geometry, Manifold Learning, and Functional Inequalities.
The seminar is co-organised by Victor-Emmanuel Brunel, Austin Stromme, Alexey Kroshnin and Quentin Paris.
Next talk
December 13, 2024
- Speaker: Flavien Léger (INRIA Paris)
- Title: TBA
- Abstract: TBA
- Time: TBA
- Zoom link: TBA
Upcoming scheduled talks
TBA
Past talks
November 8, 2024
- Speaker: Dario Trevisan (Università degli Studi di Pisa)
- Title: Asymptotics for Random Quadratic Transportation Costs
- Abstract: We establish the validity of asymptotic limits for the general transportation problem between random i.i.d. points and their common distribution, with respect to the squared Euclidean distance cost, in any dimension larger than three. Previous results were essentially limited to the two (or one) dimensional case, or to distributions whose absolutely continuous part is uniform. The proof relies upon recent advances in the stability theory of optimal transportation, combined with functional analytic techniques and some ideas from quantitative stochastic homogenization. The key tool we develop is a quantitative upper bound for the usual quadratic optimal transportation problem in terms of its boundary variant, where points can be freely transported along the boundary. The methods we use are applicable to more general random measures, including occupation measures of Brownian paths, and may open the door to further progress on challenging problems at the interface of analysis, probability, and discrete mathematics. Based on joint work with M. Huesmann and M. Goldman (arXiv:2409.08612).
- See Dario’s blog post: Link
- Recording: Link
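As a toy companion to this abstract (our illustration, not the speakers' method): in one dimension the optimal coupling for the quadratic cost simply matches sorted samples, so the empirical transportation cost between two i.i.d. samples can be computed exactly. The talk's results concern the much harder setting of squared Euclidean cost in higher dimensions, where no such shortcut exists; the function name below is ours.

```python
import numpy as np

# Toy 1-D illustration: for the quadratic cost on the line, the optimal
# matching between two samples of equal size pairs sorted points.
def quadratic_matching_cost(x, y):
    x, y = np.sort(x), np.sort(y)
    return float(np.mean((x - y) ** 2))

rng = np.random.default_rng(0)
n = 10_000
# Average squared displacement between two i.i.d. uniform samples;
# this empirical cost shrinks as the sample size grows.
cost = quadratic_matching_cost(rng.uniform(size=n), rng.uniform(size=n))
```

Studying how such costs scale with n, and proving the existence of the limiting constants, is the type of question the talk addresses in dimensions where sorting is unavailable.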
May 31, 2024
- Speaker: Giuseppe Savaré (Bocconi University)
- Title: The construction of Dirichlet forms and Sobolev spaces on the Wasserstein space
- Abstract: The talk will concern the construction of a class of Dirichlet forms and corresponding Sobolev spaces induced by a finite reference measure on the L^2-Kantorovich-Wasserstein space of probability measures on R^d (or a Riemannian manifold). Such forms can be characterised by at least two different approaches. The first is based on the closure of the canonical energy form on smooth cylinder functions that arise from Otto calculus. The second relies on the Cheeger energy, which is defined in terms of the underlying Wasserstein metric by integrating the squared asymptotic Lipschitz constant of Lipschitz functions. The equivalence of these two approaches is a consequence of general approximation results for Sobolev spaces in metric-measure spaces. (In collaboration with Massimo Fornasier and Giacomo Sodini)
- Recording: Link
May 3, 2024
- Speaker: Govind Menon (Brown University)
- Title: The Riemannian Langevin equation: models and sampling schemes
- Abstract: The rigorous foundations of Brownian motion on Riemannian manifolds were developed in the 1970s. However, our understanding of this problem, in particular the interplay between the underlying metric and the Brownian motion, has been considerably enriched by recent applications. In several recent works, we have used this theory to design Riemannian Langevin equations, all of which correspond to stochastic gradient descent of entropy. We will describe two such examples in this talk:
- (a) A low-regularity construction of Brownian motion (with Dominik Inauen)
- (b) Gibbs sampling with Riemannian Langevin Monte Carlo schemes (with Jianfeng Lu, Tianmin Yu, Xiangxiong Zhang and Shixin Zheng)
- Recording: Link
April 5, 2024
- Speaker: Sinho Chewi (IAS, Princeton)
- Title: Variational inference via Wasserstein gradient flows
- Abstract: Variational inference (VI), which seeks to approximate the Bayesian posterior by a more tractable distribution within a variational family, has been widely advocated as a scalable alternative to MCMC. However, obtaining non-asymptotic convergence guarantees has been a longstanding challenge. In this talk, I will argue that viewing this problem as optimization over the Wasserstein space of probability measures equipped with the optimal transport metric leads to the design of principled algorithms which exhibit strong practical performance and are backed by rigorous theory. In particular, we address Gaussian VI, as well as (non-parametric) mean-field VI.
- Recording: Link
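A minimal sketch of the Gaussian VI setting mentioned in the abstract (our toy, not the speaker's algorithm): fit a one-dimensional Gaussian N(m, s²) to a Gaussian target N(μ, σ²) by gradient descent on the KL divergence. In this special case the variational family contains the target, so the iteration converges to it exactly; all names and step sizes below are ours.

```python
# Closed-form gradients of KL(N(m, s^2) || N(mu, sigma^2)):
#   KL = log(sigma/s) + (s^2 + (m - mu)^2) / (2 sigma^2) - 1/2
def gaussian_vi(mu, sigma, m=0.0, s=1.0, step=0.1, n_iter=500):
    for _ in range(n_iter):
        grad_m = (m - mu) / sigma**2       # d KL / d m
        grad_s = -1.0 / s + s / sigma**2   # d KL / d s
        m -= step * grad_m
        s -= step * grad_s
    return m, s

# The iteration recovers the target's mean and standard deviation.
m_hat, s_hat = gaussian_vi(mu=2.0, sigma=0.5)
```

The talk's viewpoint replaces this naive parameter-space descent with a gradient flow in the Wasserstein geometry, which is what yields non-asymptotic guarantees beyond this trivially solvable case.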
March 8, 2024
- Speaker: Lénaïc Chizat (EPFL, Institute of Mathematics)
- Title: Doubly Regularized Entropic Wasserstein Barycenters
- Abstract: Wasserstein barycenters are natural objects to summarize a family of probability distributions, but they suffer from the curse of dimensionality, both statistically and computationally. In this talk, I will propose a new look at the entropic regularization of Wasserstein barycenters. I will show that, via a double entropic regularization of the problem, one obtains a notion of barycenter with none of these drawbacks. In addition, and perhaps counter-intuitively, with well-chosen regularization strengths, this double regularization approximates the true Wasserstein barycenter better than with a single regularization. In this talk, the barycenter problem serves as a common thread to present recent results in the theory of entropic optimal transport from a statistical, approximation and computational viewpoint, which are relevant in more general contexts. References: Ref 1, Ref 2
- Recording: Link
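For orientation, here is a sketch (ours) of the singly regularized Sinkhorn-type fixed-point iteration for an entropic barycenter of discrete measures on a shared 1-D grid, in the style of iterative Bregman projections; the talk's doubly regularized barycenter modifies this scheme, and all names and parameter values below are our choices.

```python
import numpy as np

# Entropic barycenter of histograms on a common grid via alternating
# Sinkhorn-style updates (equal weights for all input measures).
def sinkhorn_barycenter(hists, grid, eps=0.05, n_iter=300):
    C = (grid[:, None] - grid[None, :]) ** 2   # quadratic ground cost
    K = np.exp(-C / eps)                       # Gibbs kernel (symmetric here)
    v = [np.ones_like(grid) for _ in hists]
    for _ in range(n_iter):
        u = [h / (K @ vi) for h, vi in zip(hists, v)]
        # geometric mean of the current second marginals -> barycenter
        log_b = np.mean([np.log(vi * (K @ ui)) for ui, vi in zip(u, v)], axis=0)
        b = np.exp(log_b)
        v = [b / (K @ ui) for ui in u]
    return b / b.sum()

grid = np.linspace(0.0, 1.0, 101)
h1 = np.exp(-((grid - 0.25) ** 2) / 0.005); h1 /= h1.sum()
h2 = np.exp(-((grid - 0.75) ** 2) / 0.005); h2 /= h2.sum()
bary = sinkhorn_barycenter([h1, h2], grid)   # mass concentrates near 0.5
```

With a single regularization the output is a blurred barycenter whose bias grows with eps; the doubly regularized formulation in the talk is designed to remove exactly this drawback.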
February 9, 2024
- Speaker: Santosh Vempala (Georgia Tech)
- Title: High-dimensional Sampling: From Euclid to Riemann
- Abstract: Sampling high-dimensional densities is a basic algorithmic problem that has led to mathematically interesting tools and techniques. Many sampling algorithms can be viewed as discretizations of suitable continuous stochastic processes, raising the questions: Which stochastic process to use? And how to discretize it? In this talk, we discuss the use of Riemannian metrics to guide sampling algorithms and how, perhaps surprisingly, they can lead to improvements even for Euclidean sampling. We will focus on two methods, Riemannian Langevin and Riemannian Hamiltonian Monte Carlo, and highlight some open questions.
- Recording: Link
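The simplest Euclidean instance of the discretization question raised in the abstract is the unadjusted Langevin algorithm (ULA), the Euler–Maruyama discretization of the Langevin diffusion dX = ∇log π(X) dt + √2 dB. A minimal sketch (ours, with a standard Gaussian target so that ∇log π(x) = −x; step size and run length are arbitrary choices):

```python
import numpy as np

# Unadjusted Langevin algorithm: one Gaussian-perturbed gradient step per
# iteration; the chain's stationary law approximates the target pi.
def ula(grad_log_pi, x0, step=0.01, n_steps=50_000, seed=0):
    rng = np.random.default_rng(seed)
    x, xs = x0, np.empty(n_steps)
    for k in range(n_steps):
        x = x + step * grad_log_pi(x) + np.sqrt(2 * step) * rng.standard_normal()
        xs[k] = x
    return xs

# Standard Gaussian target: after burn-in, samples have mean ~0, variance ~1.
samples = ula(lambda x: -x, x0=3.0)
```

The Riemannian methods in the talk replace the Euclidean gradient and noise by their counterparts under a chosen metric, which is where the design freedom, and the surprising Euclidean speedups, come from.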
January 19, 2024
- Speaker: Eugene Stepanov (PDMI RAS, Università di Pisa, HSE University)
- Title: Eigenvalues and eigenvectors of squared distance matrices and geometry of metric measure spaces
- Abstract: We will discuss what the spectral data of matrices of squared distances between points from very large subsets (covering densely the space in the limit) of a metric measure space say about the geometry of the latter. In particular, we will discuss how the metric measure space can be reconstructed from such data.
- Recording: Link
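The finite-dimensional prototype of the reconstruction question in this abstract is classical multidimensional scaling: from the matrix of squared pairwise distances of a Euclidean point cloud, double centering recovers a Gram matrix whose spectral decomposition yields the points up to rigid motion. A sketch of this classical construction (our illustration, not the talk's general metric-measure result):

```python
import numpy as np

# Classical MDS: squared-distance matrix -> centered Gram matrix -> points.
def classical_mds(D2, dim):
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    G = -0.5 * J @ D2 @ J                 # Gram matrix of centered points
    w, V = np.linalg.eigh(G)
    top = np.argsort(w)[::-1][:dim]       # leading eigenpairs carry the geometry
    return V[:, top] * np.sqrt(np.maximum(w[top], 0.0))

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Y = classical_mds(D2, 3)                  # same shape as X, up to rigid motion
D2_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
```

The talk asks what survives of this spectral picture when the finite point cloud densely fills a general metric measure space, and how much of the space's geometry the limiting spectral data determine.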