Finite Difference Gradient Approximation: To Randomize or Not?

INFORMS Journal on Computing (2022)

Abstract
We discuss two classes of methods for approximating gradients of noisy black-box functions—the classical finite difference method and the recently popular randomized finite difference methods. Despite the popularity of the latter, we argue that it is unclear whether the randomized schemes have an advantage over the traditional methods when employed inside an optimization method. We point to theoretical and practical evidence showing that the opposite is true, at least in a general optimization setting. We then pose the question of whether a particular setting exists in which the advantage of the new methods can be clearly shown, at least numerically. The larger underlying challenge is the development of black-box optimization methods that scale well with the problem dimension.
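As a rough illustration of the two classes of estimators the abstract contrasts, the following sketch (not taken from the paper; the test function, smoothing parameter h, and sample count m are illustrative choices) compares a classical coordinate-wise forward-difference gradient with a randomized estimate that averages directional differences along random Gaussian directions.

```python
import numpy as np

def forward_difference_gradient(f, x, h=1e-5):
    """Classical finite differences: one extra function evaluation per coordinate."""
    n = x.size
    fx = f(x)
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def randomized_difference_gradient(f, x, h=1e-5, m=10, rng=None):
    """Randomized finite differences: average of m random directional estimates."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    fx = f(x)
    g = np.zeros(n)
    for _ in range(m):
        u = rng.standard_normal(n)            # random Gaussian direction
        g += ((f(x + h * u) - fx) / h) * u    # directional difference times direction
    return g / m

if __name__ == "__main__":
    f = lambda x: np.sum(x ** 2)              # smooth test function with gradient 2x
    x = np.ones(5)
    print(forward_difference_gradient(f, x))             # ~ [2, 2, 2, 2, 2]
    print(randomized_difference_gradient(f, x, m=50))    # noisier estimate of the same
```

The classical estimator uses n + 1 function evaluations for an n-dimensional point, while the randomized estimator uses m + 1 evaluations but returns a noisy estimate; the trade-off between these costs and accuracies is the question the paper examines.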
Keywords
finite difference approximation, gradient descent, randomized