Solving High Frequency and Multi-Scale PDEs with Gaussian Processes
ICLR 2024
Abstract
Machine learning based solvers have garnered much attention in physical
simulation and scientific computing, with a prominent example being
physics-informed neural networks (PINNs). However, PINNs often struggle to
solve high-frequency and multi-scale PDEs, which can be attributed to the
spectral bias of neural network training. To address this problem, we resort
to the Gaussian process (GP) framework. To flexibly capture the dominant
frequencies, we model the power spectrum of the PDE solution with a Student's
t mixture or a Gaussian mixture. We then apply the inverse Fourier transform
to obtain the covariance function (according to the Wiener-Khinchin theorem).
The covariance derived from the Gaussian mixture spectrum corresponds to the
known spectral mixture kernel; we are the first to establish its rationale and
effectiveness for PDE solving. Second, we estimate the mixture weights in the
log domain, which we show is equivalent to placing a Jeffreys prior. This
automatically induces sparsity, prunes excessive frequency components, and
adjusts the remaining ones toward the ground truth. Third, to enable efficient
and scalable computation on the massive collocation points that are critical
for capturing high frequencies, we place the collocation points on a grid and
take the product of our covariance function over the input dimensions. We use
the GP conditional mean to predict the solution and its derivatives, so as to
fit the boundary condition and the equation itself. As a result, the
covariance matrix exhibits a Kronecker product structure, and we exploit
Kronecker product properties and multilinear algebra to greatly improve
computational efficiency and scalability, without any low-rank approximations.
We demonstrate the advantages of our method in systematic experiments.
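The claimed equivalence between log-domain weight estimation and a Jeffreys prior follows from a one-line change of variables; a sketch of the argument:

```latex
\theta_q = \log w_q,\qquad p(\theta_q) \propto 1
\;\Longrightarrow\;
p(w_q) = p(\theta_q)\left|\frac{d\theta_q}{dw_q}\right| \propto \frac{1}{w_q},
```

i.e. a flat (improper) prior on the log-weight induces the Jeffreys prior for a scale parameter on the weight itself; its mass concentrates near zero, which shrinks redundant mixture weights toward zero and prunes the corresponding frequency components.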
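The covariance implied by a Gaussian mixture power spectrum has a closed form: by the Wiener-Khinchin theorem, each (symmetrized) Gaussian spectral component with frequency mean μ and variance v contributes a cosine damped by a Gaussian envelope, which is exactly the spectral mixture kernel of Wilson & Adams. A minimal 1-D sketch (illustrative, not the authors' code; `weights`, `means`, and `variances` are hypothetical mixture parameters):

```python
import numpy as np

def spectral_mixture_kernel_1d(tau, weights, means, variances):
    """1-D spectral mixture kernel k(tau): the inverse Fourier transform
    of a symmetrized Gaussian mixture power spectrum (Wiener-Khinchin)."""
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        # Gaussian spectral component (mean mu, variance v) ->
        # damped cosine in the input domain
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
    return k
```

At τ = 0 the kernel evaluates to the sum of the weights, i.e. the total spectral power, and components with large frequency variance v decay quickly with distance.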
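On a grid with a product kernel, the full covariance is a Kronecker product of small per-dimension Gram matrices, so matrix-vector products never require materializing the full matrix. A minimal 2-D sketch (illustrative, with a hypothetical helper name), using the identity (K1 ⊗ K2) vec(X) = vec(K1 X K2ᵀ) for row-major vectorization:

```python
import numpy as np

def kron_mvp(K1, K2, vec):
    """Multiply (K1 kron K2) by a vector without forming the
    (n1*n2) x (n1*n2) Kronecker product, via
    (K1 kron K2) vec(X) = vec(K1 @ X @ K2.T) with row-major vec."""
    n1, n2 = K1.shape[0], K2.shape[0]
    X = vec.reshape(n1, n2)          # unflatten the vector onto the 2-D grid
    return (K1 @ X @ K2.T).reshape(-1)
```

The cost per product drops from O((n1·n2)²) for the dense matrix to O(n1·n2·(n1+n2)), which is what makes massive grids of collocation points tractable without low-rank approximations.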
Keywords
ML PDE solver, Gaussian process, PINN