An Adaptive Riemannian Gradient Method Without Function Evaluations

J. Optim. Theory Appl. (2023)

Abstract
In this paper, we present an adaptive gradient method for the minimization of differentiable functions on Riemannian manifolds. The method is designed to minimize functions with Lipschitz continuous gradient field, but it does not require knowledge of the Lipschitz constant. In contrast to line-search schemes, the dynamic adjustment of the stepsizes is done without any function evaluations. We prove worst-case complexity bounds for the number of gradient evaluations that the proposed method needs to find an approximate stationary point. Preliminary numerical results are also presented and illustrate the potential advantages of different versions of our method in comparison with a Riemannian gradient method with Armijo line search.
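The abstract's central idea, adapting the stepsize using gradient information only, can be illustrated with a minimal Euclidean sketch. This is not the authors' Riemannian algorithm (no manifold, retraction, or vector transport appears here); it is an assumed simplification in which a local Lipschitz estimate L_k = ||g_k − g_{k−1}|| / ||x_k − x_{k−1}|| drives a stepsize 1/L_k, so the objective function itself is never evaluated:

```python
import numpy as np

def grad(x):
    # Gradient of the test objective f(x) = 0.5 * x^T A x,
    # with A a fixed symmetric positive-definite matrix.
    A = np.diag([1.0, 10.0])
    return A @ x

def adaptive_gradient_descent(x0, iters=200, L0=1.0):
    """Gradient descent whose stepsize 1/L adapts via the local
    Lipschitz estimate L = ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||.
    Only gradients are evaluated; the objective value is never used."""
    x_prev, g_prev = x0, grad(x0)
    L = L0
    x = x_prev - g_prev / L  # initial step with the guessed L0
    for _ in range(iters):
        g = grad(x)
        dx = x - x_prev
        dg = g - g_prev
        if np.linalg.norm(dx) > 0:
            # Secant-style estimate of the local Lipschitz constant,
            # floored to keep the stepsize finite.
            L = max(np.linalg.norm(dg) / np.linalg.norm(dx), 1e-8)
        x_prev, g_prev = x, g
        x = x - g / L
    return x

x_star = adaptive_gradient_descent(np.array([5.0, -3.0]))
print(np.linalg.norm(x_star))  # small: the iterate approaches the minimizer 0
```

On this quadratic the estimate L_k is a Rayleigh-quotient-like value between the extreme eigenvalues of A, so the stepsize self-corrects even though L0 = 1 badly underestimates the true Lipschitz constant 10. The paper's method replaces the straight-line step with a step along the manifold and comes with worst-case complexity guarantees; this sketch only conveys the function-evaluation-free flavor.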
Keywords
Riemannian optimization, Gradient method, Adaptive methods, Worst-case complexity bounds