Experimental Demonstration and Tolerancing of a Large-Scale Neural Network (165 000 Synapses) Using Phase-Change Memory as the Synaptic Weight Element

Using two phase-change memory devices per synapse, a three-layer perceptron network with 164 885 synapses is trained on a subset (5000 examples) of the MNIST database of handwritten digits using a backpropagation variant suitable for nonvolatile memory (NVM) + selector crossbar arrays, obtaining a training (generalization) accuracy of 82.2% (82.9%). Using a neural network simulator matched to the experimental demonstrator, extensive tolerancing is performed with respect to NVM variability, yield, and the stochasticity, linearity, and asymmetry of the NVM-conductance response. We show that a bidirectional NVM with a symmetric, linear conductance response of high dynamic range is capable of delivering the same high classification accuracies on this problem as a conventional, software-based implementation of this same network.
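
The two-devices-per-synapse encoding mentioned in the abstract can be illustrated with a short sketch: each weight is stored as the difference of two conductances (W ~ G+ − G−), the crossbar read is a multiply-accumulate over those conductance differences, and weight changes are applied as small unidirectional conductance increments. The Python/NumPy sketch below assumes a normalized conductance range, a fixed per-pulse increment, and a simple saturation-refresh heuristic; these names, values, and the update rule are illustrative assumptions, not the paper's exact hardware parameters or training procedure.

```python
import numpy as np

# Minimal sketch of the differential (two-device-per-synapse) weight scheme
# described in the abstract: each weight is the difference of two PCM
# conductances, W ~ G_plus - G_minus.  The read model, the sign-based update,
# and the refresh heuristic below are illustrative assumptions, not the
# paper's exact training procedure.

G_MIN, G_MAX = 0.0, 1.0   # normalized conductance range (assumed)
DELTA_G = 0.01            # conductance increment per programming pulse (assumed)

class PCMSynapseArray:
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Start both devices mid-range so the effective weight is near zero.
        self.Gp = 0.5 + 0.01 * rng.standard_normal((n_in, n_out))
        self.Gm = 0.5 + 0.01 * rng.standard_normal((n_in, n_out))

    def weights(self):
        return self.Gp - self.Gm

    def forward(self, x):
        # Crossbar read: multiply-accumulate of inputs against G_plus - G_minus.
        return x @ self.weights()

    def update(self, x, delta):
        # Potentiation-only update: PCM conductance can be raised gradually
        # (partial-SET pulses) but not lowered gradually, so a positive weight
        # change is applied to G_plus and a negative one to G_minus.
        dw = np.outer(x, delta)
        self.Gp = np.clip(self.Gp + DELTA_G * (dw > 0), G_MIN, G_MAX)
        self.Gm = np.clip(self.Gm + DELTA_G * (dw < 0), G_MIN, G_MAX)

    def refresh(self):
        # When either device of a pair nears saturation, re-program the pair
        # to the same effective weight with headroom (illustrative heuristic).
        w = self.weights()
        saturated = (self.Gp > 0.9 * G_MAX) | (self.Gm > 0.9 * G_MAX)
        self.Gp = np.where(saturated, np.clip(0.5 + w / 2, G_MIN, G_MAX), self.Gp)
        self.Gm = np.where(saturated, np.clip(0.5 - w / 2, G_MIN, G_MAX), self.Gm)
```

In this toy model the asymmetry discussed in the abstract shows up directly: because only increments are available, repeated updates drive both conductances toward G_MAX, which is why an occasional refresh (and, in the paper's tolerancing study, the linearity and symmetry of the conductance response) matters for accuracy.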


Published in:
IEEE Transactions on Electron Devices, vol. 62, no. 11, pp. 3498–3507
Year:
2015
Publisher:
Piscataway, NJ: IEEE (Institute of Electrical and Electronics Engineers)
ISSN:
0018-9383

Record created 2015-12-02, last modified 2018-09-13

