NNL: a domain-specific language for neural networks

High Technology Letters (2020)

Abstract
In recent years, neural networks (NNs) have received increasing attention from both academia and industry. The significant diversity among existing NNs, as well as among their hardware platforms, makes NN programming a daunting task. In this paper, a domain-specific language (DSL) for NNs, the neural network language (NNL), is proposed to deliver productivity of NN programming and portable performance of NN execution on different hardware platforms. The productivity and flexibility of NN programming are enabled by abstracting NNs as a directed graph of blocks. The language is used to describe 4 representative and widely used NNs and to run them on 3 different hardware platforms (CPU, GPU and an NN accelerator). Experimental results show that NNs written in the proposed language perform, on average, 14.5% better than the baseline implementations across these 3 platforms. Moreover, compared with the Caffe framework, which specifically targets the GPU platform, the code achieves similar performance.
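
The abstract does not show NNL's concrete syntax; the following is a minimal, hypothetical Python sketch of the block-graph abstraction it describes, in which a network is a directed graph whose nodes are computational blocks. All class names and operation labels below are illustrative assumptions, not part of NNL.

    # Hypothetical sketch (not actual NNL syntax): a neural network
    # modeled as a directed graph of computational blocks.

    class Block:
        """A computational block (e.g., convolution, pooling, fully connected)."""
        def __init__(self, name, op):
            self.name = name
            self.op = op            # operation label; placeholder only
            self.successors = []    # outgoing edges in the directed graph

        def connect(self, other):
            """Add a directed edge from this block to `other` and return `other`."""
            self.successors.append(other)
            return other

    # Build a small chain of blocks resembling part of a convolutional NN.
    conv1 = Block("conv1", "conv2d")
    pool1 = Block("pool1", "max_pool")
    fc1   = Block("fc1", "fully_connected")
    conv1.connect(pool1).connect(fc1)

    # Walk the (here, linear) graph and print the pipeline of blocks.
    node = conv1
    while node is not None:
        print(node.name, node.op)
        node = node.successors[0] if node.successors else None

Describing the network as an explicit graph, rather than as platform-specific code, is what lets the same program be mapped onto different back ends (CPU, GPU, NN accelerator), which is the portability the abstract claims.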
Keywords
artificial neural network (NN), domain-specific language (DSL), neural network (NN) accelerator