Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model
arXiv (2024)
Abstract
Human brains imagine complicated scenes when reading a novel. Replicating this
imagination is one of the ultimate goals of AI-Generated Content (AIGC).
However, current AIGC methods, such as score-based diffusion, are still
deficient in terms of rapidity and efficiency. This deficiency is rooted in the
difference between the brain and digital computers. Digital computers have
physically separated storage and processing units, resulting in frequent data
transfers during iterative calculations, incurring large time and energy
overheads. This issue is further intensified by the conversion of inherently
continuous and analog generation dynamics, which can be formulated by neural
differential equations, into discrete and digital operations. Inspired by the
brain, we propose a time-continuous and analog in-memory neural differential
equation solver for score-based diffusion, employing emerging resistive memory.
The integration of storage and computation within resistive memory synapses
surmounts the von Neumann bottleneck, improving generative speed and energy
efficiency. The closed-loop feedback integrator is time-continuous, analog, and
compact, physically implementing an infinite-depth neural network. Moreover,
the software-hardware co-design is intrinsically robust to analog noise. We
experimentally validate our solution with 180 nm resistive memory in-memory
computing macros. Demonstrating equivalent generative quality to the software
baseline, our system achieved remarkable enhancements in generative speed for
both unconditional and conditional generation tasks, by factors of 64.8 and
156.5, respectively. Moreover, it accomplished reductions in energy consumption
by factors of 5.2 and 4.1. Our approach heralds a new horizon for hardware
solutions in edge computing for generative AI applications.
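The abstract notes that score-based generation can be formulated as a neural differential equation, which the analog integrator then solves in continuous time. As a purely illustrative sketch (not the paper's hardware or model), the snippet below integrates the reverse-time probability-flow ODE of a variance-exploding diffusion with Euler steps, using the analytic score of a toy Gaussian data distribution so the result can be checked against the known answer; the function names and the choice of sigma(t) = t are assumptions for the example.

```python
import numpy as np

def analytic_score(x, t, data_std=1.0):
    # Score of p_t = N(0, data_std^2 + t^2) under a variance-exploding
    # forward SDE with sigma(t) = t (toy stand-in for a score network).
    return -x / (data_std**2 + t**2)

def solve_probability_flow_ode(x_T, T=5.0, n_steps=500, data_std=1.0):
    # Euler integration of the reverse probability-flow ODE from t = T to 0:
    #   dx/dt = -(1/2) d(sigma^2)/dt * score(x, t) = -t * score(x, t)
    x = x_T.copy()
    dt = T / n_steps
    t = T
    for _ in range(n_steps):
        x = x - dt * (-t * analytic_score(x, t, data_std))
        t -= dt
    return x

rng = np.random.default_rng(0)
data_std, T = 1.0, 5.0
# Start from the wide prior N(0, data_std^2 + T^2) and integrate backward.
x_T = rng.normal(0.0, np.sqrt(data_std**2 + T**2), size=20000)
x_0 = solve_probability_flow_ode(x_T, T=T, data_std=data_std)
print(float(x_0.std()))  # sample std should approach data_std = 1.0
```

Each Euler step here corresponds to one discrete, digital update; the paper's closed-loop analog integrator instead performs this integration continuously in physics, which is where the speed and energy gains come from.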