Spontaneous sparse learning for PCM-based memristor neural networks

Authors
Dong-Hyeok Lim, Shuang Wu, Rong Zhao, Jung-Hoon Lee, Hongsik Jeong, Luping Shi

Neural networks trained by backpropagation have achieved tremendous success on numerous intelligent tasks. However, naïve gradient-based training and weight-update methods on memristors are hindered by the devices' intrinsic material properties. Here, we built a 39 nm 1 Gb phase change memory (PCM) memristor array and quantified its unique resistance drift effect. On this basis, we developed a spontaneous sparse learning (SSL) scheme that leverages the resistance drift to improve the training of PCM-based memristor networks. During training, SSL treats the drift effect as a spontaneous consistency-based distillation process that continuously reinforces array weights in the high-resistance state unless the gradient-based update switches them to the low-resistance state. Experiments on handwritten digit classification show that SSL helps the network converge with better performance and controllable sparsity, without additional computation. This work advances learning algorithms that exploit the intrinsic properties of memristor devices, opening a new direction for the development of neuromorphic computing chips.
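The interplay described above between gradient updates and spontaneous drift can be illustrated with a minimal Python sketch. Everything here is an assumption for illustration: the constants (NU, G_RESET, LR), the helpers apply_drift and ssl_step, and the toy training loop are not code or values from the paper; only the empirical PCM drift law R(t) = R0 (t/t0)^ν is standard.

```python
import numpy as np

# Illustrative constants only -- not values reported in the paper.
NU = 0.1        # drift exponent of high-resistance (amorphous) PCM cells
G_RESET = 0.05  # conductance below which a cell counts as high-resistance
LR = 0.01       # learning rate for the gradient step

def apply_drift(g, t_prev, t_now, nu=NU):
    """Conductance decay from resistance drift R(t) = R0 * (t/t0)**nu.

    Only high-resistance (low-conductance) cells drift appreciably;
    low-resistance cells are treated as stable here.
    """
    factor = (t_now / t_prev) ** (-nu)
    return np.where(g < G_RESET, g * factor, g)

def ssl_step(w, grad, t_prev, t_now):
    """One SSL-style update: an ordinary gradient step plus spontaneous drift.

    Drift keeps pulling near-zero (high-resistance) weights toward zero,
    so they stay sparse unless the gradient is large enough to switch them
    back to a low-resistance (non-zero) state.
    """
    w = w - LR * grad                                        # backprop step
    w = np.sign(w) * apply_drift(np.abs(w), t_prev, t_now)   # drift acts on magnitude
    return w

# Toy run: small weights sparsify over time, large weights survive.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.2, size=8)
for step in range(1, 101):
    grad = rng.normal(scale=0.05, size=8)  # stand-in for real gradients
    w = ssl_step(w, grad, t_prev=step, t_now=step + 1)
print("fraction of near-zero weights:", np.mean(np.abs(w) < G_RESET))
```

In this toy model, the drift term plays the role of an implicit, computation-free sparsity regularizer: no extra penalty term is evaluated, yet weights that the gradient does not actively keep in the low-resistance state decay toward zero on their own.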
