EnergyNN: Energy Estimation for Neural Network Inference Tasks on DPU

2021 31st International Conference on Field-Programmable Logic and Applications (FPL)

Cited by 2 | Views 17

Abstract
Convolutional Neural Networks (CNNs) are increasingly popular in embedded and energy-limited mobile applications. Hardware designers have proposed various accelerators to speed up the execution of CNNs on embedded platforms. The Deep Learning Processor Unit (DPU) is one such generic CNN accelerator for Xilinx platforms; it can execute any CNN on one or more DPU instances configured on an FPGA. In a p...
Keywords
Energy consumption, Neural networks, Estimation, Predictive models, Time measurement, Hardware, Mobile applications
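
The keywords point to a measurement-driven predictive model of inference energy. As a rough illustration only (the abstract shown here is truncated before the method is described), the sketch below fits a per-layer linear energy model from toy measurements; the feature set, the numbers, and the `layer_features` helper are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch (not the paper's method): a per-layer linear energy
# predictor of the kind suggested by the keywords "Estimation" and
# "Predictive models". All features, measurements, and coefficients are
# invented for illustration.
import numpy as np

def layer_features(macs, input_bytes, output_bytes, weight_bytes):
    """Simple feature vector for one CNN layer running on an accelerator."""
    return np.array([macs, input_bytes + output_bytes, weight_bytes, 1.0])

# Toy "measurements": (features, measured energy in millijoules) per layer.
# In practice these would come from power and time measurements on the DPU.
layers = [
    (layer_features(115e6, 3.2e6, 1.6e6, 0.9e6), 41.0),
    (layer_features(231e6, 1.6e6, 1.6e6, 3.7e6), 78.5),
    (layer_features(462e6, 1.6e6, 0.8e6, 14.8e6), 150.2),
    (layer_features(25e6, 0.8e6, 0.4e6, 4.1e6), 12.3),
]

X = np.stack([f for f, _ in layers])
y = np.array([e for _, e in layers])

# Fit energy ≈ X @ coeffs by least squares; the coefficients act as
# per-MAC and per-byte energy costs plus a constant per-layer overhead.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict the energy of an unseen layer and of the whole toy network.
new_layer = layer_features(180e6, 1.6e6, 1.6e6, 2.3e6)
print("predicted layer energy (mJ):", float(new_layer @ coeffs))
print("predicted network energy (mJ):", float(X.sum(axis=0) @ coeffs))
```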