Robot Locomotion through Tunable Bursting Rhythms using Efficient Bio-mimetic Neural Networks on Loihi and Arduino Platforms.

ICONS (2023)

Abstract
Rhythmic tasks that biological organisms perform, such as breathing, walking, and swimming, rely on specialized neural networks called central pattern generators (CPGs). Spiking CPGs have already been implemented to control robot locomotion. This paper takes the concept further by designing and implementing a tunable bursting central pattern generator to control quadruped robots, for the first time to the best of our knowledge. Bursting CPGs allow more granular control over the motion and speed of operation while retaining the low memory usage and low latency of spiking CPGs. A bio-mimetic neuron model is chosen for this implementation and is highly optimized to run in real time on standard (Arduino microcontroller) and specialized (Intel Loihi) hardware. The Petoi Bittle is chosen as the model hardware setup to showcase the efficiency of the proposed CPGs even on serial processing architectures. The CPG network is also realized on a fully asynchronous Loihi architecture to illustrate its versatility. The fully connected CPG network takes around 10 kilobytes of memory (33% of the Arduino's capacity) to execute different modes of locomotion: walk, jump, trot, gallop, and crawl. Benchmarking results show that the bio-mimetic neurons take around 600 bytes (around 2%) more memory than Izhikevich neurons while being 0.02 ms (around 14%) faster in isolated neuron testing.
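
The paper's bio-mimetic neuron model is not reproduced here, but the sketch below illustrates the kind of bursting dynamics a tunable CPG can be built from, using the standard Izhikevich model (the baseline the abstract benchmarks against) in a chattering/bursting parameter regime. The parameter values, drive current, and integration step are illustrative assumptions, not the paper's implementation.

```c
/*
 * Minimal sketch: a single Izhikevich neuron in a bursting (chattering)
 * regime, integrated with forward Euler. This is NOT the paper's
 * bio-mimetic neuron model; it only shows the rhythmic bursting behavior
 * a CPG oscillator would be built around.
 */
#include <stdio.h>

int main(void) {
    /* Chattering/bursting parameters from Izhikevich (2003). */
    const float a = 0.02f, b = 0.2f, c = -50.0f, d = 2.0f;
    const float I  = 10.0f;   /* constant drive current (assumed)   */
    const float dt = 0.5f;    /* Euler step in ms (assumed)         */

    float v = -65.0f;         /* membrane potential (mV)            */
    float u = b * v;          /* recovery variable                  */

    for (int step = 0; step < 1000; ++step) {          /* ~500 ms   */
        /* v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u)     */
        v += dt * (0.04f * v * v + 5.0f * v + 140.0f - u + I);
        u += dt * (a * (b * v - u));

        if (v >= 30.0f) {     /* spike: reset membrane and recovery */
            printf("spike at t = %.1f ms\n", step * dt);
            v = c;
            u += d;
        }
    }
    return 0;
}
```

In a locomotion CPG, several such bursting units would typically be coupled with mutually inhibitory synapses so that their bursts settle into fixed phase offsets, and each unit's burst envelope would drive one leg's servo commands; tuning the burst period and duty cycle then changes gait and speed, which is the kind of granular control the abstract attributes to bursting CPGs.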