Non-asymptotic Convergence Bounds for Modified Tamed Unadjusted Langevin Algorithm in Non-Convex Setting

Journal of Mathematical Analysis and Applications (2024)

Abstract
We consider the problem of sampling from a high-dimensional target distribution π_β on R^d with density proportional to θ ↦ e^{−βU(θ)} using explicit numerical schemes based on discretising the Langevin stochastic differential equation (SDE). In recent literature, taming has been proposed and studied as a method for ensuring stability of Langevin-based numerical schemes in the case of super-linearly growing drift coefficients for the Langevin SDE. In particular, the Tamed Unadjusted Langevin Algorithm (TULA) was proposed in [Bro+19] to sample from such target distributions with the gradient of the potential U being super-linearly growing. However, theoretical guarantees in Wasserstein distances for Langevin-based algorithms have traditionally been derived assuming strong convexity of the potential U. In this paper, we propose a novel taming factor and derive, under a setting with possibly non-convex potential U and super-linearly growing gradient of U, non-asymptotic theoretical bounds in Wasserstein-1 and Wasserstein-2 distances between the law of our algorithm, which we name the modified Tamed Unadjusted Langevin Algorithm (mTULA), and the target distribution π_β. We obtain respective rates of convergence O(λ) and O(λ^{1/2}) in Wasserstein-1 and Wasserstein-2 distances for the discretisation error of mTULA in step size λ. High-dimensional numerical simulations which support our theoretical findings are presented to showcase the applicability of our algorithm.
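To illustrate the taming idea the abstract describes, the sketch below implements a tamed unadjusted Langevin step in the style of [Bro+19], where the drift ∇U(θ) is replaced by ∇U(θ)/(1 + λ‖∇U(θ)‖) so that a single update stays bounded even when ∇U grows super-linearly. Note this is an illustrative stand-in: the modified taming factor proposed in the paper for mTULA is not given in the abstract, and the double-well potential used here is our own example of a non-convex U with super-linearly (cubically) growing gradient.

```python
import numpy as np

def tamed_langevin_step(theta, grad_U, lam, beta, rng):
    """One tamed unadjusted Langevin step ([Bro+19]-style taming).

    The raw Euler step theta - lam * grad_U(theta) + sqrt(2*lam/beta) * xi
    can explode when grad_U grows super-linearly; dividing the drift by
    (1 + lam * ||grad_U(theta)||) bounds the drift increment by 1/lam
    per unit step and restores stability.
    """
    g = grad_U(theta)
    tamed_drift = g / (1.0 + lam * np.linalg.norm(g))
    noise = rng.standard_normal(theta.shape)
    return theta - lam * tamed_drift + np.sqrt(2.0 * lam / beta) * noise

# Example: non-convex double-well potential
#   U(theta) = ||theta||^4 / 4 - ||theta||^2 / 2,
# whose gradient (||theta||^2 - 1) * theta grows cubically.
def grad_U(theta):
    return (np.dot(theta, theta) - 1.0) * theta

rng = np.random.default_rng(0)
theta = np.full(10, 5.0)        # d = 10, initialised far from the wells
lam, beta = 0.01, 1.0
for _ in range(5000):
    theta = tamed_langevin_step(theta, grad_U, lam, beta, rng)
# iterates remain bounded, whereas the untamed Euler scheme diverges
# from this initial condition at the same step size
print(np.linalg.norm(theta))
```

At this starting point the untamed drift has norm of order 10^3, so the plain Euler–Maruyama scheme with λ = 0.01 would overshoot and diverge; the taming factor shrinks exactly those large-gradient steps while leaving small-gradient steps (λ‖∇U‖ ≪ 1) essentially unchanged.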
Keywords
Modified Tamed Unadjusted Langevin Algorithm, Langevin SDE, Super-linearly growing diffusion coefficients, High-dimensional sampling, Non-asymptotic convergence bounds