Compression of Deep Convolutional Neural Networks for Fast and Low Power Mobile Applications

Taelim Choi
Lu Yang
Dongjun Shin

International Conference on Learning Representations (ICLR), 2016.

Abstract:

Although the latest high-end smartphones have powerful CPUs and GPUs, running deep convolutional neural networks (CNNs) for complex tasks such as ImageNet classification on mobile devices remains challenging. To deploy deep CNNs on mobile devices, we present a simple and effective scheme to compress the entire CNN, which we call one-shot whole network compression.
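
The compression scheme in this paper is built on a Tucker-2 decomposition of each convolution kernel (with ranks selected by variational Bayesian matrix factorization, followed by fine-tuning). The snippet below is not the authors' released code; it is a minimal, hypothetical PyTorch sketch of what a compressed convolution layer looks like structurally, where a KxK convolution is replaced by a 1x1 / KxK / 1x1 sequence with reduced channel ranks. The helper name tucker2_compress_conv and the rank values are illustrative, and in practice the weights would come from decomposing the trained kernel rather than random initialization.

```python
import torch
import torch.nn as nn

def tucker2_compress_conv(conv: nn.Conv2d, rank_in: int, rank_out: int) -> nn.Sequential:
    """Replace a KxK convolution with a 1x1 -> KxK -> 1x1 sequence whose
    intermediate channel counts are reduced to (rank_in, rank_out).

    Weights are left at their random initialization here; in the paper's
    scheme they would be obtained from a Tucker-2 decomposition of
    conv.weight and then fine-tuned to recover accuracy.
    """
    # 1x1 conv: project input channels down to rank_in
    reduce = nn.Conv2d(conv.in_channels, rank_in, kernel_size=1, bias=False)
    # KxK "core" conv operating on the reduced channel dimensions
    core = nn.Conv2d(rank_in, rank_out,
                     kernel_size=conv.kernel_size,
                     stride=conv.stride,
                     padding=conv.padding,
                     bias=False)
    # 1x1 conv: project back up to the original output channels
    expand = nn.Conv2d(rank_out, conv.out_channels, kernel_size=1,
                       bias=conv.bias is not None)
    return nn.Sequential(reduce, core, expand)

# Example: compress a 256->256 3x3 convolution down to ranks (64, 64).
original = nn.Conv2d(256, 256, kernel_size=3, padding=1)
compressed = tucker2_compress_conv(original, rank_in=64, rank_out=64)

count = lambda m: sum(p.numel() for p in m.parameters())
print(f"original: {count(original):,} params, compressed: {count(compressed):,} params")
```

With the illustrative ranks above, the parameter count of the 3x3 layer drops by roughly a factor of 5, which is the kind of reduction that makes the layer cheaper in both memory and multiply-accumulate operations on a mobile device.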
