
In-memory batch-normalization for resistive memory based binary neural network hardware

  • Pohang University of Science and Technology

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Binary Neural Networks (BNNs) have great potential for implementation on Resistive memory Crossbar Array (RCA)-based hardware accelerators because they require only 1-bit precision for weights and activations. While general structures for implementing convolution or fully-connected layers in RCA-based BNN hardware were actively studied in previous works, the Batch-Normalization (BN) layer, another key layer of BNNs, has not yet been discussed in depth. In this work, we propose in-memory batch-normalization schemes that integrate BN layers on the RCA so that the area/energy-efficiency of BNN accelerators can be maximized. In addition, we show that sense-amplifier errors due to device mismatch can be suppressed using the proposed in-memory BN design.
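The paper itself is not reproduced here, but the abstract's premise rests on a well-known property of BNNs: a batch-normalization layer followed by a sign activation collapses into a single per-neuron threshold comparison, which is what makes BN amenable to in-memory integration on a crossbar. The sketch below verifies that identity numerically; the function names and parameter values are illustrative, not taken from the paper's circuit design.

```python
import numpy as np

# For gamma > 0 (and sigma > 0), batch-norm followed by a sign activation
#   sign(gamma * (x - mu) / sigma + beta)
# equals a plain threshold comparison
#   sign(x - theta),  with  theta = mu - beta * sigma / gamma,
# because dividing out the positive factor gamma/sigma cannot change the sign.

def bn_sign(x, gamma, beta, mu, sigma):
    """Reference path: batch-normalization, then binary (sign) activation."""
    return np.sign(gamma * (x - mu) / sigma + beta)

def folded_threshold(gamma, beta, mu, sigma):
    """Fold the BN parameters into one per-neuron threshold (gamma > 0)."""
    return mu - beta * sigma / gamma

# Illustrative pre-activation values and BN parameters.
x = np.array([-2.0, -0.5, 0.3, 1.7])
gamma, beta, mu, sigma = 1.5, 0.4, 0.2, 0.9

theta = folded_threshold(gamma, beta, mu, sigma)
assert np.array_equal(bn_sign(x, gamma, beta, mu, sigma), np.sign(x - theta))
```

In crossbar terms, this is why BN need not be computed explicitly: the folded threshold can be absorbed into the sense amplifier's reference level (or an extra bias column), which is the kind of integration the abstract refers to.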

Original language: English
Title of host publication: ASP-DAC 2019 - 24th Asia and South Pacific Design Automation Conference
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 645-650
Number of pages: 6
ISBN (Electronic): 9781450360074
DOIs
State: Published - 21 Jan 2019
Externally published: Yes
Event: 24th Asia and South Pacific Design Automation Conference, ASPDAC 2019 - Tokyo, Japan
Duration: 21 Jan 2019 → 24 Jan 2019

Publication series

Name: Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC

Conference

Conference: 24th Asia and South Pacific Design Automation Conference, ASPDAC 2019
Country/Territory: Japan
City: Tokyo
Period: 21/01/19 → 24/01/19
