Online seminar

On the interactions between Statistics and Geometry

[Photo: A. D. Alexandrov following the gradient]

Starting from January 2024, this monthly online seminar invites researchers working in Geometry or Mathematical Statistics to present a paper, result, or idea. The aim is to promote communication between the two fields. We encourage speakers to give pedagogical, survey-style talks of about one hour, aimed at a broad mathematical audience. Topics of interest include (but are not limited to):

  • Statistics and Learning in metric spaces,
  • Non-Euclidean Optimization and Sampling,
  • Optimal Transport,
  • Information Geometry,
  • Manifold Learning,
  • Functional Inequalities.

The seminar is co-organised by Victor-Emmanuel Brunel, Austin Stromme, Alexey Kroshnin and Quentin Paris.

Next talk

May 3, 2024

  • Speaker: Govind Menon (Brown University)
  • Title: The Riemannian Langevin equation: models and sampling schemes
  • Abstract: The rigorous foundations of Brownian motion on Riemannian manifolds were developed in the 1970s. However, our understanding of this problem, in particular of the interplay between the underlying metric and the Brownian motion, has been considerably enriched by recent applications. In several recent works, we have used this theory to design Riemannian Langevin equations, all of which correspond to stochastic gradient descent of entropy. We will describe two such examples in this talk (an illustrative toy sketch of a Riemannian Langevin step follows this entry):
    • (a) A low-regularity construction of Brownian motion (with Dominik Inauen)
    • (b) Gibbs sampling with Riemannian Langevin Monte Carlo schemes (with Jianfeng Lu, Tianmin Yu, Xiangxiong Zhang and Shixin Zheng)
  • Time: 3pm CEST, 9am EDT, 4pm MSK, 10pm JST
  • Zoom: Link
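
Neither of the constructions above is reproduced here. As a generic, minimal sketch of what a discretized Riemannian Langevin scheme can look like, the following projects an Euler-Maruyama step onto the tangent space of the unit sphere and retracts by normalization; the von Mises-Fisher-type potential, step size, and choice of retraction are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def sphere_langevin_step(x, grad_U, h, rng):
    """One projected Euler-Maruyama step of Langevin dynamics on the
    unit sphere, targeting (approximately) the density exp(-U).
    Drift and noise are projected onto the tangent space at x, and the
    iterate is retracted to the sphere by normalization."""
    d = x.shape[0]
    P = np.eye(d) - np.outer(x, x)       # orthogonal projector onto the tangent space at x
    drift = -P @ grad_U(x)               # Riemannian gradient of -U
    noise = P @ rng.standard_normal(d)   # tangential Gaussian increment
    y = x + h * drift + np.sqrt(2.0 * h) * noise
    return y / np.linalg.norm(y)         # retraction back to the sphere

# Illustrative target: U(x) = -<mu, x>, a von Mises-Fisher-type potential (an assumption).
rng = np.random.default_rng(0)
mu = np.array([3.0, 0.0, 0.0])
grad_U = lambda x: -mu
x = np.array([0.0, 0.0, 1.0])
samples = []
for _ in range(5000):
    x = sphere_langevin_step(x, grad_U, h=0.01, rng=rng)
    samples.append(x)
print("empirical mean direction:", np.mean(samples, axis=0))
```

Normalization is only one possible retraction; geodesic steps or exact constructions on the sphere behave differently, especially at larger step sizes.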

Upcoming scheduled talks

May 31, 2024

  • Speaker: Giuseppe Savaré (Bocconi University)
  • Title: TBA
  • Abstract: TBA
  • Time: 11am CET, 5am EST, 7pm JST, 1pm MSK

Past talks

April 5, 2024

  • Speaker: Sinho Chewi (IAS, Princeton)
  • Title: Variational inference via Wasserstein gradient flows
  • Abstract: Variational inference (VI), which seeks to approximate the Bayesian posterior by a more tractable distribution within a variational family, has been widely advocated as a scalable alternative to MCMC. However, obtaining non-asymptotic convergence guarantees has been a longstanding challenge. In this talk, I will argue that viewing this problem as optimization over the Wasserstein space of probability measures equipped with the optimal transport metric leads to the design of principled algorithms which exhibit strong practical performance and are backed by rigorous theory. In particular, we address Gaussian VI, as well as (non-parametric) mean-field VI. (A toy sketch of the Gaussian case follows this entry.)
  • Recording: Link
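
A minimal sketch of the Gaussian case, assuming a Bures-Wasserstein gradient descent update for KL(N(m, Σ) || π) with Monte Carlo estimates of the expected gradient and Hessian of the potential. The Gaussian target, step size, and sample size below are illustrative, and this is one plausible form of such an update rather than the talk's exact algorithm.

```python
import numpy as np

def bw_gradient_step(m, Sigma, grad_V, hess_V, h, n_mc, rng):
    """One Monte Carlo Bures-Wasserstein gradient step for Gaussian VI.

    Descends KL(N(m, Sigma) || pi), with pi proportional to exp(-V), over
    Gaussians equipped with the 2-Wasserstein geometry. Expectations of
    grad V and hess V under the current Gaussian are estimated by sampling."""
    d = m.shape[0]
    X = rng.multivariate_normal(m, Sigma, size=n_mc)
    m_new = m - h * np.mean([grad_V(x) for x in X], axis=0)
    M = np.mean([hess_V(x) for x in X], axis=0) - np.linalg.inv(Sigma)
    A = np.eye(d) - h * M
    Sigma_new = A @ Sigma @ A.T   # congruence keeps Sigma symmetric positive definite
    return m_new, Sigma_new

# Illustrative Gaussian target pi = N(mu, S): V(x) = (x-mu)^T S^{-1} (x-mu) / 2.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
S = np.array([[2.0, 0.5], [0.5, 1.0]])
S_inv = np.linalg.inv(S)
grad_V = lambda x: S_inv @ (x - mu)
hess_V = lambda x: S_inv

m, Sigma = np.zeros(2), np.eye(2)
for _ in range(200):
    m, Sigma = bw_gradient_step(m, Sigma, grad_V, hess_V, h=0.1, n_mc=64, rng=rng)
print("fitted mean:", m)       # should approach mu, up to Monte Carlo error
print("fitted cov:\n", Sigma)  # should approach S, up to Monte Carlo error
```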

March 8, 2024

  • Speaker: Lénaïc Chizat (EPFL, Institute of Mathematics)
  • Title: Doubly Regularized Entropic Wasserstein Barycenters
  • Abstract: Wasserstein barycenters are natural objects to summarize a family of probability distributions, but they suffer from the curse of dimensionality, both statistically and computationally. In this talk, I will propose a new look at the entropic regularization of Wasserstein barycenters. I will show that, via a double entropic regularization of the problem, one obtains a notion of barycenter with none of these drawbacks. In addition, and perhaps counter-intuitively, with well-chosen regularization strengths, this double regularization approximates the true Wasserstein barycenter better than with a single regularization. In this talk, the barycenter problem serves as a common thread to present recent results in the theory of entropic optimal transport from a statistical, approximation and computational viewpoint, which are relevant in more general contexts. (A toy implementation of the singly regularized baseline follows this entry.) References: Ref 1, Ref 2
  • Recording: Link
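
The doubly regularized barycenter itself is not reconstructed here. As the baseline it modifies, the sketch below computes a singly regularized entropic barycenter of histograms on a fixed grid via iterative Bregman projections (Benamou et al., 2015); the grid, regularization strength, and iteration count are illustrative choices.

```python
import numpy as np

def entropic_barycenter(P, C, w, eps=0.05, n_iter=1000):
    """Entropic Wasserstein barycenter of histograms on a common grid,
    via the iterative Bregman projections of Benamou et al. (2015).

    P : (N, n) input histograms (rows sum to 1)
    C : (n, n) ground cost matrix on the grid
    w : (N,) barycentric weights summing to 1"""
    N, n = P.shape
    K = np.exp(-C / eps)          # Gibbs kernel
    V = np.ones((N, n))
    for _ in range(n_iter):
        U = P / (V @ K.T)         # u_k = p_k / (K v_k), stored as rows
        b = np.exp(w @ np.log(U @ K))   # geometric mean of the K^T u_k
        V = b[None, :] / (U @ K)  # v_k = b / (K^T u_k)
    return b

# Toy example: barycenter of two 1-D Gaussians on a grid (illustrative sizes).
x = np.linspace(0.0, 1.0, 200)
C = (x[:, None] - x[None, :]) ** 2
g = lambda m, s: np.exp(-(x - m) ** 2 / (2 * s ** 2))
P = np.stack([g(0.25, 0.05), g(0.75, 0.05)])
P /= P.sum(axis=1, keepdims=True)
b = entropic_barycenter(P, C, w=np.array([0.5, 0.5]))
print("barycenter mass:", b.sum(), "mode at x =", x[np.argmax(b)])
```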

February 9, 2024

  • Speaker: Santosh Vempala (Georgia Tech)
  • Title: High-dimensional Sampling: From Euclid to Riemann
  • Abstract: Sampling high-dimensional densities is a basic algorithmic problem that has led to mathematically interesting tools and techniques. Many sampling algorithms can be viewed as discretizations of suitable continuous stochastic processes, raising the questions: Which stochastic process to use? And how to discretize it? In this talk, we discuss the use of Riemannian metrics to guide sampling algorithms and how, perhaps surprisingly, they can lead to improvements even for Euclidean sampling. We will focus on two methods, Riemannian Langevin and Riemannian Hamiltonian Monte Carlo, and highlight some open questions. (A toy comparison of plain and metric-preconditioned Langevin follows this entry.)
  • Recording: Link
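
As a toy illustration of a metric guiding a Euclidean sampler, the sketch below runs the unadjusted Langevin algorithm with and without a constant preconditioning metric on an ill-conditioned Gaussian. The constant metric, target, and step sizes are illustrative assumptions; the methods in the talk use position-dependent metrics, which require correction terms omitted here.

```python
import numpy as np

def ula(grad_f, x0, h, n_steps, rng, M=None):
    """Unadjusted Langevin algorithm targeting exp(-f), optionally
    preconditioned by a constant metric M (identity if None):
        x <- x - h * M grad_f(x) + sqrt(2h) * M^{1/2} xi.
    A constant M leaves the stationary density exp(-f) unchanged while
    reshaping the dynamics; position-dependent metrics need extra terms."""
    d = x0.shape[0]
    M = np.eye(d) if M is None else M
    L = np.linalg.cholesky(M)     # M^{1/2} factor for the noise
    x, traj = x0.copy(), []
    for _ in range(n_steps):
        x = x - h * M @ grad_f(x) + np.sqrt(2.0 * h) * L @ rng.standard_normal(d)
        traj.append(x.copy())
    return np.array(traj)

# Ill-conditioned Gaussian target: f(x) = x^T A x / 2, condition number 100.
A = np.diag([100.0, 1.0])
grad_f = lambda x: A @ x
rng = np.random.default_rng(0)
x0 = np.array([1.0, 1.0])
plain = ula(grad_f, x0, h=0.002, n_steps=20000, rng=rng)   # step limited by the stiff direction
precond = ula(grad_f, x0, h=0.1, n_steps=20000, rng=rng, M=np.linalg.inv(A))
# Both should approach cov = diag(0.01, 1), up to discretization and Monte Carlo error.
print("plain cov:\n", np.cov(plain.T))
print("precond cov:\n", np.cov(precond.T))
```

With the preconditioner, all directions mix at the same rate, so a much larger step size is usable; the plain chain must resolve the stiffest direction.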

January 19, 2024

  • Speaker: Eugene Stepanov (PDMI RAS, Università di Pisa, HSE University)
  • Title: Eigenvalues and eigenvectors of squared distance matrices and geometry of metric measure spaces
  • Abstract: We will discuss what the spectral data of matrices of squared distances between points of very large subsets of a metric measure space (densely covering the space in the limit) say about the geometry of that space. In particular, we will discuss how the metric measure space can be reconstructed from such data. (A toy illustration of the Euclidean case via classical multidimensional scaling follows this entry.)
  • Recording: Link
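
In the Euclidean special case, reconstructing points from a matrix of squared distances is classical multidimensional scaling, sketched below as a finite-dimensional illustration of the reconstruction question (general metric measure spaces, the talk's actual setting, are beyond this toy example).

```python
import numpy as np

def classical_mds(D2, k):
    """Recover a k-dimensional Euclidean embedding from squared pairwise
    distances via classical multidimensional scaling: double-center
    -0.5 * D2 to obtain the Gram matrix of centered points, then take the
    top eigenpairs. Exact, up to rigid motion, for points in R^k."""
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    G = -0.5 * J @ D2 @ J                 # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(G)
    idx = np.argsort(vals)[::-1][:k]      # top-k eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Illustrative check: sample points in R^2, rebuild them from squared distances.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2))
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Y = classical_mds(D2, k=2)
# The embedding is only unique up to rigid motion, so compare distance matrices.
D2_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
print("max squared-distance error:", np.abs(D2 - D2_rec).max())
```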