Title |
Truncation-Aware Binary Neural Networks for Efficient Processing-In-Memory Implementation |
Authors |
이채은(Chae-Eun Lee); 이수정(Su-Jung Lee); 김태환(Tae-Hwan Kim)
DOI |
https://doi.org/10.5573/ieie.2021.58.12.27 |
Keywords |
Binary neural network; Processing-In-Memory; Inference; Deep learning; Back propagation |
Abstract |
A new binary neural network model is proposed for efficient implementation based on processing-in-memory (PIM) technology. A PIM implementation requires embedded analog-to-digital converters (ADCs), and each ADC must be designed with a resolution high enough to avoid truncation. The proposed model is instead designed with the truncation inside the ADC taken into account. To enable back-propagation through the truncation, its transfer function is modeled by a polynomial of a learnable degree. The proposed model achieves higher accuracy than a conventional model that does not consider the truncation, by up to 13.97% and 30.41% for the SVHN and CIFAR-10 classification tasks, respectively. In addition, the proposed model requires no additional circuitry, minimizing the memory-density overhead incurred by the PIM technology.
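The sketch below illustrates the idea described in the abstract: the ADC truncation is applied in the forward pass, while the backward pass uses the derivative of a polynomial surrogate whose degree is a learnable parameter. It is a minimal PyTorch sketch under assumptions; the class names, bit widths, and the specific surrogate x^d are illustrative, not the exact formulation of the paper.

```python
# Minimal sketch (PyTorch) of truncation-aware training: the forward pass
# applies ADC truncation, and the backward pass uses the derivative of a
# polynomial surrogate x**d with a learnable degree d.  The bit widths,
# names, and the surrogate itself are illustrative assumptions, not the
# paper's exact formulation.
import torch
import torch.nn as nn


class TruncatedADC(torch.autograd.Function):
    """Forward: quantize an activation in [0, 1] to `resolution` bits and
    drop `trunc_bits` low-order bits (the in-ADC truncation).
    Backward: gradient of the surrogate x**degree instead of the
    almost-everywhere-zero gradient of the hard truncation."""

    @staticmethod
    def forward(ctx, x, degree, resolution=4, trunc_bits=2):
        ctx.save_for_backward(x, degree)
        levels = 2 ** resolution
        q = torch.clamp(torch.round(x * (levels - 1)), 0, levels - 1)
        q = torch.floor(q / 2 ** trunc_bits) * 2 ** trunc_bits  # truncate LSBs
        return q / (levels - 1)

    @staticmethod
    def backward(ctx, grad_out):
        x, degree = ctx.saved_tensors
        x = x.clamp(0.0, 1.0)
        d = degree.clamp(min=1.0)
        grad_x = grad_out * d * x.pow(d - 1.0)                       # d/dx of x**d
        grad_d = (grad_out * x.pow(d) * torch.log(x + 1e-8)).sum()   # d/dd of x**d
        return grad_x, grad_d, None, None


class TruncationAwareActivation(nn.Module):
    """Drop-in activation that models the ADC truncation during training."""

    def __init__(self, resolution=4, trunc_bits=2):
        super().__init__()
        self.degree = nn.Parameter(torch.tensor(2.0))  # learnable degree
        self.resolution = resolution
        self.trunc_bits = trunc_bits

    def forward(self, x):
        return TruncatedADC.apply(x, self.degree,
                                  self.resolution, self.trunc_bits)
```

In such a sketch, the `TruncationAwareActivation` module would follow each binarized layer whose partial sums pass through an ADC, so training already sees the truncated values while the learnable degree shapes the surrogate gradient used for back-propagation.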