Large-scale neural networks implemented with non-volatile memory as the synaptic weight element: impact of conductance response

We assess the impact of the conductance response of Non-Volatile Memory (NVM) devices employed as the synaptic weight element for on-chip acceleration of the training of large-scale artificial neural networks (ANNs). We briefly review our previous work towards achieving competitive performance (classification accuracies) for such ANNs with both Phase-Change Memory (PCM) [1], [2] and non-filamentary ReRAM based on PrCaMnO (PCMO) [3], and towards assessing the potential advantages for machine-learning (ML) training over GPU-based hardware in terms of speed (up to 25x faster) and power (120-2850x lower) [4]. We then discuss the "jump-table" concept, previously introduced to model real-world NVM such as PCM [1] or PCMO, which describes the full cumulative distribution function (CDF) of conductance change at each device conductance value, for both potentiation (SET) and depression (RESET). Using several types of artificially constructed jump-tables, we assess the relative importance of deviations from an ideal NVM with a perfectly linear conductance response.
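The jump-table concept can be sketched in a few lines of NumPy. The idea, as the abstract describes it, is that each conductance state has its own CDF of conductance change per pulse; sampling a uniform random number and inverting that CDF reproduces the stochastic, state-dependent device response. The table values below (state count, jump range, a saturating mean jump) are purely illustrative assumptions, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 64                      # quantized conductance levels (assumed)
DG = np.linspace(0.0, 0.1, 41)     # candidate conductance jumps, normalized units

def build_set_table():
    """Artificial SET jump-table: one CDF row per conductance state.
    Mean jump shrinks as conductance grows, i.e. a non-linear,
    saturating response (an assumption for illustration)."""
    table = np.empty((N_STATES, DG.size))
    for s in range(N_STATES):
        mean = 0.05 * (1.0 - s / N_STATES)          # smaller jumps near saturation
        pdf = np.exp(-0.5 * ((DG - mean) / 0.01) ** 2)
        table[s] = np.cumsum(pdf) / pdf.sum()       # row = CDF over DG
    return table

def set_pulse(g, table):
    """Apply one potentiation pulse: look up g's state, invert its CDF."""
    s = min(int(g * N_STATES), N_STATES - 1)
    dg = DG[np.searchsorted(table[s], rng.random())]
    return min(g + dg, 1.0)                         # clamp at max conductance

table = build_set_table()
g = 0.0
for _ in range(200):
    g = set_pulse(g, table)
```

A depression (RESET) table would be built the same way with negative jumps; a perfectly linear device would correspond to every row of the table having the same CDF.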

Published in:
Proceedings of the 2016 European Solid-State Device Research Conference (ESSDERC 2016)
Presented at:
European Solid-State Device Research Conference (ESSDERC) 2016, Lausanne, Switzerland, September 12-16, 2016
New York: IEEE

 Record created 2016-08-11, last modified 2018-03-17
