EINNET: Optimizing Tensor Programs with Derivation-Based Transformations

OSDI 2023

Abstract
Boosting the execution performance of deep neural networks (DNNs) is critical due to their wide adoption in real-world applications. However, existing approaches to optimizing the tensor computation of DNNs only consider transformations representable by a fixed set of predefined tensor operators, resulting in a highly restricted optimization space. To address this issue, we propose EINNET, a derivation-based tensor program optimizer. EINNET optimizes tensor programs by leveraging transformations between general tensor algebra expressions and automatically creating new operators desired by transformations, enabling a significantly larger search space that includes those supported by prior works as special cases. Evaluation on seven DNNs shows that EINNET outperforms existing tensor program optimizers by up to 2.72x (1.52x on average) on NVIDIA A100 and up to 2.68x (1.55x on average) on NVIDIA V100. EINNET is publicly available at https://github.com/InfiniTensor/InfiniTensor.
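To make the idea of derivation between general tensor algebra expressions concrete, here is a minimal sketch (not EINNET's actual API or algorithm) showing two algebraically equivalent expressions for the same computation whose costs differ greatly; a derivation-based optimizer searches over such rewrites rather than over a fixed operator set:

```python
import numpy as np

# Toy illustration of expression-level equivalence (hypothetical example,
# not EINNET code): the matrix chain A·B·C can be derived into two
# equivalent expressions with different intermediate sizes and FLOP counts.
rng = np.random.default_rng(0)
A = rng.standard_normal((32, 64))
B = rng.standard_normal((64, 128))
C = rng.standard_normal((128, 16))

# Expression 1: (A @ B) @ C — intermediate tensor of shape (32, 128)
out1 = (A @ B) @ C

# Expression 2: A @ (B @ C) — intermediate tensor of shape (64, 16),
# a much cheaper plan; both expressions denote the same tensor program.
out2 = A @ (B @ C)

# The rewrite is valid because the results agree (up to float rounding).
assert np.allclose(out1, out2)
print(out1.shape)
```

A real derivation-based optimizer generalizes this idea to arbitrary tensor algebra expressions (e.g., convolutions written as summations) and can materialize new operators for the rewritten expressions.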