Globally Convergent Distributed Sequential Quadratic Programming with Overlapping Decomposition and Exact Augmented Lagrangian Merit Function
arXiv (2024)
Abstract
In this paper, we address the problem of solving large-scale graph-structured
nonlinear programs (gsNLPs) in a scalable manner. GsNLPs are problems in which
the objective and constraint functions are each associated with a node of a
graph and depend only on the variables of that node and its adjacent nodes. This graph-structured
formulation encompasses various specific instances, such as dynamic
optimization, PDE-constrained optimization, multi-stage stochastic
optimization, and optimization over networks. We propose a globally convergent
overlapping graph decomposition method for solving large-scale gsNLPs under the
standard regularity assumptions and mild conditions on the graph topology. At
each iteration, we use an overlapping graph decomposition to compute an
approximate Newton direction via parallel computation. We then select a
suitable step size and update the primal-dual iterates by performing
backtracking line search with the exact augmented Lagrangian merit function. By
exploiting the exponential decay of sensitivity of gsNLPs, we show that the
approximate Newton direction is a descent direction of the augmented Lagrangian
merit function, which leads to global convergence and fast local convergence.
In particular, global convergence is achieved for sufficiently large overlaps,
and the local linear convergence rate improves exponentially in terms of the
overlap size. This result matches existing results for dynamic programs. We
validate our theory with an elliptic PDE-constrained problem.
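To make the iteration described above concrete, the following is a minimal sketch of an SQP-style loop with backtracking line search on an augmented Lagrangian merit function. It uses a hypothetical toy equality-constrained problem (not the paper's gsNLP setting), solves the full KKT system exactly rather than via the paper's overlapping decomposition, and all function names and the penalty parameter `mu` are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy problem (for illustration only, not from the paper):
#   min f(x) = 0.5 * ||x||^2   s.t.   c(x) = x[0] + x[1] - 1 = 0
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
c = lambda x: np.array([x[0] + x[1] - 1.0])
jac_c = lambda x: np.array([[1.0, 1.0]])

def merit(x, lam, mu):
    # A basic augmented Lagrangian merit function:
    #   M(x, lam) = f(x) + lam^T c(x) + (mu/2) ||c(x)||^2
    return f(x) + lam @ c(x) + 0.5 * mu * c(x) @ c(x)

def merit_grad_x(x, lam, mu):
    # Gradient of the merit function with respect to x
    return grad_f(x) + jac_c(x).T @ (lam + mu * c(x))

def backtracking(x, lam, dx, mu, alpha=1.0, beta=0.5, sigma=1e-4):
    # Armijo backtracking along direction dx on the merit function
    m0 = merit(x, lam, mu)
    slope = merit_grad_x(x, lam, mu) @ dx  # directional derivative
    while merit(x + alpha * dx, lam, mu) > m0 + sigma * alpha * slope:
        alpha *= beta
    return alpha

x, lam, mu = np.array([2.0, -1.0]), np.zeros(1), 10.0
for _ in range(20):
    # Newton-type step on the KKT system:
    #   [H  A^T; A  0] [dx; dlam] = -[grad_f + A^T lam; c]
    H = np.eye(2)          # Hessian of the Lagrangian (identity here)
    A = jac_c(x)
    kkt = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    rhs = -np.concatenate([grad_f(x) + A.T @ lam, c(x)])
    step = np.linalg.solve(kkt, rhs)
    dx, dlam = step[:2], step[2:]
    alpha = backtracking(x, lam, dx, mu)  # step size from line search
    x = x + alpha * dx
    lam = lam + alpha * dlam

# Converges to x = [0.5, 0.5], lam = [-0.5]
```

In the paper's method, the exact KKT solve above is replaced by an approximate direction assembled from overlapping subgraph solves computed in parallel; the merit-function line search is what guarantees this inexact direction still yields global convergence.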