Most commonly used f-divergences between measures, e.g., the Kullback-Leibler divergence, are subject to limitations regarding the supports of the measures involved; the Kullback-Leibler divergence, for instance, is infinite unless the first measure is absolutely continuous with respect to the second. A remedy is to regularize the f-divergence by the squared maximum mean discrepancy (MMD) associated with a characteristic kernel K. In this paper, we use the kernel mean embedding to show that this regularization can be rewritten as the Moreau envelope of a suitable function in the reproducing kernel Hilbert space (RKHS) associated with K. We then exploit well-known results on Moreau envelopes in Hilbert spaces to establish properties of the MMD-regularized f-divergences and, in particular, of their gradients. Subsequently, we use these findings to analyze Wasserstein gradient flows of MMD-regularized f-divergences. Finally, we consider Wasserstein gradient flows starting from empirical measures and provide proof-of-concept numerical examples for f-divergences with both infinite and finite recession constants.
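For orientation, the central objects can be sketched as follows, in notation that may deviate from the paper's. For a characteristic kernel K with RKHS \mathcal{H}_K, the kernel mean embedding and the MMD read

    m_\mu := \int K(x, \cdot) \, d\mu(x) \in \mathcal{H}_K,
    \qquad
    \mathrm{MMD}_K(\mu, \nu) := \| m_\mu - m_\nu \|_{\mathcal{H}_K},

and, for a regularization parameter \lambda > 0, the regularized divergence takes the form

    D_{f,\lambda}(\mu \mid \nu) := \min_{\sigma} \Big( D_f(\sigma \mid \nu) + \tfrac{1}{2\lambda} \, \mathrm{MMD}_K^2(\sigma, \mu) \Big),

where the minimum runs over an appropriate set of (nonnegative) measures. Via the embedding, this coincides with the Moreau envelope

    F^\lambda(m_\mu), \qquad F^\lambda(h) := \min_{g \in \mathcal{H}_K} \Big( \tfrac{1}{2\lambda} \, \| h - g \|_{\mathcal{H}_K}^2 + F(g) \Big),

of a suitable function F on \mathcal{H}_K, so that standard Hilbert-space results become available, e.g. \nabla F^\lambda = \lambda^{-1} (\mathrm{Id} - \mathrm{prox}_{\lambda F}) for proper, convex, lower semicontinuous F.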
Joint work with Sebastian Neumayer, Gabriele Steidl, and Nicolaj Rux; see https://arxiv.org/abs/2402.04613.
You can view this talk here: https://www.youtube.com/watch?v=iuaQ1w4U-q8.
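As a toy illustration of particle flows started from empirical measures, here is a minimal NumPy sketch of an explicit Euler discretization of the plain (unregularized) MMD flow, i.e., gradient descent of 0.5 * MMD^2 in the particle positions. This is deliberately simpler than the regularized f-divergence flow from the paper, which additionally involves a proximal subproblem; the kernel bandwidth, step size, and all names below are illustrative choices, not the paper's method.

    import numpy as np

    def gaussian_kernel(X, Y, sigma=1.0):
        # Pairwise Gaussian kernel matrix: K[i, j] = exp(-|x_i - y_j|^2 / (2 sigma^2)).
        sq = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq / (2.0 * sigma ** 2))

    def mmd_grad(X, Y, sigma=1.0):
        # Gradient of 0.5 * MMD_K^2(mu_X, nu_Y) with respect to the positions X,
        # where mu_X, nu_Y are the empirical measures of the rows of X and Y.
        n, m = len(X), len(Y)
        Kxx = gaussian_kernel(X, X, sigma)      # (n, n)
        Kxy = gaussian_kernel(X, Y, sigma)      # (n, m)
        dxx = X[:, None, :] - X[None, :, :]     # (n, n, d): x_i - x_j
        dxy = X[:, None, :] - Y[None, :, :]     # (n, m, d): x_i - y_j
        # Uses grad_x K(x, z) = -(x - z) K(x, z) / sigma^2 for the Gaussian kernel.
        term_xx = -np.einsum('ij,ijd->id', Kxx, dxx) / (n * n)
        term_xy = -np.einsum('ij,ijd->id', Kxy, dxy) / (n * m)
        return (term_xx - term_xy) / sigma ** 2

    rng = np.random.default_rng(0)
    X = rng.normal(loc=-2.0, size=(50, 2))      # flowing particles (empirical measure)
    Y = rng.normal(loc=2.0, size=(50, 2))       # samples from the target measure
    tau = 0.5                                   # step size of the explicit Euler scheme
    for _ in range(500):
        X -= tau * mmd_grad(X, Y)               # particles drift toward the target samples

In the paper's setting, the descent direction would instead come from the gradient of the Moreau envelope, which is computed via the proximal mapping in the RKHS.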