Neural networks can be FLOP-efficient integrators of 1D oscillatory integrands
arXiv (2024)
Abstract
We demonstrate that neural networks can be FLOP-efficient integrators of
one-dimensional oscillatory integrands. We train a feed-forward neural network
to compute integrals of highly oscillatory 1D functions. The training set is a
parametric combination of functions of varying character and degree of
oscillatory behavior. Numerical examples show that these networks are
FLOP-efficient for sufficiently oscillatory integrands, with an average gain of
1000 FLOPs. The network computes oscillatory integrals more accurately than
traditional quadrature methods under the same computational budget, that is,
the same number of floating-point operations. We find that feed-forward
networks with 5 hidden layers suffice for a relative accuracy of 0.001. The
computational burden of neural network inference is relatively small, even
compared to quadrature rules with an inner-product pattern. We postulate that
this result follows from the network learning latent patterns in the
oscillatory integrands that are otherwise opaque to traditional numerical
integrators.
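To make the setup concrete, the following is a minimal sketch of the kind of training loop the abstract describes: a 5-hidden-layer feed-forward network mapping integrand parameters to integral values. The integrand family f(x; a, w, p) = a sin(wx + p) on [0, 1], the hidden width of 64, and all hyperparameters are illustrative assumptions, not the paper's actual training set, which is a richer parametric combination of functions.

```python
# Sketch only, NOT the authors' code. Assumed (hypothetical) integrand family:
# f(x; a, w, p) = a * sin(w*x + p) on [0, 1], whose closed-form integral
# serves as the training label. Hidden width 64 is a guess; the abstract
# specifies only the 5 hidden layers.
import torch
import torch.nn as nn

def target_integral(a, w, p):
    # Closed-form integral of a*sin(w*x + p) over [0, 1].
    return a * (torch.cos(p) - torch.cos(w + p)) / w

class IntegralNet(nn.Module):
    def __init__(self, width=64):
        super().__init__()
        layers, d = [], 3  # inputs: (a, w, p)
        for _ in range(5):  # 5 hidden layers, per the abstract
            layers += [nn.Linear(d, width), nn.Tanh()]
            d = width
        layers.append(nn.Linear(d, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, params):
        return self.net(params).squeeze(-1)

model = IntegralNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    a = torch.rand(256) * 2 - 1         # amplitude in [-1, 1]
    w = torch.rand(256) * 190 + 10      # frequency in [10, 200]: oscillatory regime
    p = torch.rand(256) * 2 * torch.pi  # phase
    params = torch.stack([a, w, p], dim=-1)
    loss = nn.functional.mse_loss(model(params), target_integral(a, w, p))
    opt.zero_grad(); loss.backward(); opt.step()
```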
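The FLOP-matched comparison the abstract alludes to can also be sketched with standard cost models. The figures below are assumptions for illustration, not the paper's accounting: a dense layer from m to n units costs roughly 2mn FLOPs, and an N-point inner-product quadrature rule, a sum of w_i * f(x_i), costs roughly 2N FLOPs plus N integrand evaluations.

```python
# Back-of-the-envelope FLOP budget matching; cost models are standard
# estimates, not figures from the paper.
def mlp_inference_flops(layer_widths):
    # e.g. [3, 64, 64, 64, 64, 64, 1] for 5 hidden layers of width 64
    return sum(2 * m * n for m, n in zip(layer_widths, layer_widths[1:]))

def quadrature_flops(n_points, flops_per_eval=10):
    # flops_per_eval is a hypothetical cost of one integrand evaluation
    return 2 * n_points + n_points * flops_per_eval

net_cost = mlp_inference_flops([3, 64, 64, 64, 64, 64, 1])
# Quadrature points affordable under the same FLOP budget:
matched_points = net_cost // (2 + 10)
print(net_cost, matched_points)  # ~33,280 FLOPs -> ~2,773 points
```

Under this model, one network inference buys only a few thousand quadrature points; once the integrand oscillates faster than that many points can resolve, the quadrature error stalls, which is plausibly the regime the abstract calls "sufficiently oscillatory."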