Low-Overhead Early-Stopping Policies for Efficient Random Forests Inference on Microcontrollers

VLSI-SoC: Technology Advancement on SoC Design (VLSI-SoC 2021), 2022

Abstract
Random Forests (RFs) are popular Machine Learning models for edge computing, due to their lightweight nature and high accuracy on several common tasks. Large RFs, however, still have significant energy costs, a serious concern for battery-operated ultra-low-power devices. Following the adaptive (or dynamic) inference paradigm, we introduce a hardware-friendly early-stopping policy for RF-based classifiers, which halts execution as soon as a sufficient prediction confidence is reached. We benchmark our approach on three state-of-the-art datasets covering different embedded classification tasks, and deploy our models on a single-core RISC-V microcontroller. We achieve an energy reduction ranging from 18% to more than 91%, with an accuracy drop lower than 0.5%. Additionally, we compare our approach with other early-stopping policies, showing that it outperforms them.
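The abstract does not detail the stopping criterion itself, but the general idea of confidence-based early stopping in an RF can be illustrated with a minimal C sketch. The snippet below assumes a hypothetical score-margin policy (stop once the gap between the two highest accumulated class scores exceeds a threshold), a fixed-point score accumulator, and a placeholder tree_predict() routine; the paper's actual policy, thresholds, and data layout may differ.

```c
#include <stdint.h>
#include <stddef.h>

#define N_CLASSES 4   /* hypothetical number of classes */
#define N_TREES   64  /* hypothetical forest size       */

/* Placeholder: evaluates one decision tree on the input feature
 * vector and accumulates its (quantized) class scores into `scores`. */
extern void tree_predict(size_t tree_idx, const int16_t *features,
                         int32_t scores[N_CLASSES]);

/* Early-stopping RF inference: trees are evaluated one at a time, and
 * execution halts as soon as the margin between the two highest
 * accumulated class scores reaches `threshold`. */
int rf_predict_early_stop(const int16_t *features, int32_t threshold,
                          size_t *trees_used)
{
    int32_t scores[N_CLASSES] = {0};

    for (size_t t = 0; t < N_TREES; t++) {
        tree_predict(t, features, scores);

        /* Find the best and second-best accumulated class scores. */
        int best = 0;
        int32_t top1 = scores[0], top2 = INT32_MIN;
        for (int c = 1; c < N_CLASSES; c++) {
            if (scores[c] > top1) {
                top2 = top1;
                top1 = scores[c];
                best = c;
            } else if (scores[c] > top2) {
                top2 = scores[c];
            }
        }

        /* Stop as soon as the confidence margin is large enough. */
        if (top1 - top2 >= threshold) {
            if (trees_used) *trees_used = t + 1;
            return best;
        }
    }

    /* Margin never reached: fall back to the full-forest argmax. */
    if (trees_used) *trees_used = N_TREES;
    int best = 0;
    for (int c = 1; c < N_CLASSES; c++)
        if (scores[c] > scores[best]) best = c;
    return best;
}
```

A smaller threshold stops earlier and saves more energy at the cost of accuracy; a larger one approaches full-forest inference, which is the trade-off the reported 18% to 91% energy-reduction range reflects.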
Keywords
Machine learning, TinyML, Adaptive inference, Dynamic inference, Energy-efficiency, Random forests, Microcontrollers