Article ID: 15.20180212
This paper proposes an energy-efficient reconfigurable architecture for deep neural networks (EERA-DNN) with hybrid bit-width and a logarithmic multiplier. To speed up computation and achieve high energy efficiency, we first propose an efficient network compression method with a hybrid bit-width weight scheme, which reduces the memory storage of the LeNet, AlexNet, and EESEN networks by 7x-8x with negligible accuracy loss. We then propose an approximate unfolded logarithmic multiplier to process multiplication operations efficiently. Compared with the state-of-the-art architectures EIE and Thinker, this work achieves over 1.8x and 2.7x higher energy efficiency, respectively.
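The abstract does not detail the "unfolded" multiplier's internals, but logarithmic multipliers of this kind are typically built on Mitchell's approximation, which replaces a multiplication with an addition in the log domain: writing n = 2^k(1 + x) with x in [0, 1), it takes log2(n) ≈ k + x, so a·b ≈ antilog(k1 + x1 + k2 + x2). The sketch below illustrates that baseline technique only; the function name `mitchell_mul` is hypothetical and is not taken from the paper.

```python
def mitchell_mul(a: int, b: int) -> int:
    """Approximate a*b using Mitchell's logarithmic multiplication.

    log2(n) is approximated as k + x, where k is the position of the
    leading one bit and x = n / 2^k - 1 is the fractional remainder.
    The product is recovered with the matching antilog approximation.
    """
    if a == 0 or b == 0:
        return 0
    k1, k2 = a.bit_length() - 1, b.bit_length() - 1  # leading-one positions
    x1 = a / (1 << k1) - 1.0                         # fractional parts in [0, 1)
    x2 = b / (1 << k2) - 1.0
    s = x1 + x2
    # Antilog: 2^(k1+k2) * (1 + s) if s < 1, otherwise carry into the exponent.
    if s < 1.0:
        return round((1 << (k1 + k2)) * (1.0 + s))
    return round((1 << (k1 + k2 + 1)) * s)


# Example: exact product is 20000; Mitchell's approximation gives 18432,
# an error of about 7.8% (the scheme's worst case is roughly 11.1%).
print(mitchell_mul(100, 200))
```

Because the log-domain addition replaces a full multiplier array with an adder, such units trade a bounded relative error for substantially lower energy per operation, which is the property the proposed architecture exploits.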