
Bio: I am a joint NIST-IMA Postdoctoral Fellow in Analysis of Machine Learning at the Institute for Mathematics and its Applications (IMA) at the University of Minnesota (UMN). I completed my Ph.D. in mathematics at the University of California, Los Angeles in 2022 under the guidance of Professor Stan Osher. Throughout my Ph.D., I developed optimal transport-based algorithms to solve nonlinear partial differential equations (PDEs) such as Darcy's law, tumor growth models, and mean field games. As a postdoc, I will be working with Professors Jeff Calder, Gilad Lerman, and Li Wang at UMN to develop PDE-based algorithms for high-dimensional machine learning problems and to analyze the theoretical properties of these algorithms.

Talk Title: The Back-And-Forth Method For Wasserstein Gradient Flows

Talk Abstract: We present a method to efficiently compute Wasserstein gradient flows. Our approach is based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and Leger to solve optimal transport problems. We evolve the gradient flow by solving the dual problem to the JKO scheme. In general, the dual problem is much better behaved than the primal problem. This allows us to efficiently run large-scale gradient flow simulations for a broad class of internal energies, including singular and non-convex energies. Joint work with Matt Jacobs (Purdue University) and Flavien Leger (INRIA Paris).
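For readers unfamiliar with the JKO scheme mentioned in the abstract, a standard formulation (not taken from this talk, but from the Jordan-Kinderlehrer-Otto literature) writes one time step of the gradient flow of an energy $E$ with step size $\tau$ as a minimization problem over probability densities:

$$
\rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \; \frac{1}{2\tau}\, W_2^2(\rho, \rho^k) + E(\rho),
$$

where $W_2$ denotes the 2-Wasserstein distance. The method described in the talk works with the dual of this variational problem rather than solving it in this primal form directly.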
