Stable and efficient differentiation of tensor network algorithms

arXiv (2023)

Abstract
Gradient-based optimization methods are the established state-of-the-art paradigm for studying strongly entangled quantum systems in two dimensions with Projected Entangled Pair States. However, the key ingredient, the gradient itself, has proven challenging to calculate accurately and reliably in the case of a corner transfer matrix (CTM)-based approach. Automatic differentiation (AD), which is the best-known tool for calculating the gradient, still suffers from some crucial shortcomings. Some of these are known, such as the problem of excessive memory usage and the divergences that may arise when differentiating a singular value decomposition (SVD). Importantly, we also find that there is a fundamental inaccuracy in the currently used backpropagation of the SVD that had not been noted before. In this paper, we describe all these problems and provide compact and easy-to-implement solutions for them. We analyse the impact of these changes and find that the last problem -- the use of the correct gradient -- is by far the dominant one and thus should be considered a crucial patch to any AD application that makes use of an SVD for truncation.
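The SVD divergence mentioned in the abstract arises in the standard backward pass, which involves factors of the form 1/(s_j² − s_i²) that blow up when two singular values are (nearly) degenerate. As a minimal illustrative sketch (not the paper's specific solution), the commonly used workaround is a Lorentzian broadening of this inverse; the helper name and the `eps` value below are assumptions for illustration:

```python
import numpy as np

def svd_backward_F(s, eps=1e-12):
    """Build the F matrix appearing in the standard SVD backward pass.

    The naive F_ij = 1 / (s_j^2 - s_i^2) diverges for (nearly)
    degenerate singular values. A common regularization is the
    Lorentzian broadening
        F_ij = (s_j^2 - s_i^2) / ((s_j^2 - s_i^2)^2 + eps),
    which stays finite everywhere. (This helper and eps are
    illustrative assumptions, not taken from the paper.)
    """
    s2 = s ** 2
    diff = s2[None, :] - s2[:, None]   # diff[i, j] = s_j^2 - s_i^2
    F = diff / (diff ** 2 + eps)       # broadened inverse, finite for diff -> 0
    np.fill_diagonal(F, 0.0)           # diagonal entries are never used
    return F

# Nearly degenerate spectrum: the naive 1/diff would be ~5e8 here,
# while the broadened version remains well-behaved.
s = np.array([1.0, 1.0 + 1e-9, 0.5])
F = svd_backward_F(s)
assert np.all(np.isfinite(F))
```

This only addresses the known degeneracy divergence; the paper's central point is a separate, previously unnoticed inaccuracy in the truncated-SVD backward pass itself.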