Fast and Accurate Sparse Coding of Visual Stimuli with a Simple, Ultra-Low-Energy Spiking Architecture.

Authors
Walt Woods, Christof Teuscher

Memristive crossbars have become a popular means for realizing unsupervised and supervised learning techniques. In previous neuromorphic architectures with leaky integrate-and-fire neurons, the crossbar itself has been separated from the neuron capacitors to preserve mathematical rigor. In this work, we sought to simplify the design, creating a fast circuit that consumed significantly lower power at a minimal cost of accuracy. We also showed that connecting the neurons directly to the crossbar resulted in a more efficient sparse coding architecture, and alleviated the need to pre-normalize receptive fields. This work provides derivations for the design of such a network, named the Simple Spiking Locally Competitive Algorithm, or SSLCA, as well as CMOS designs and results on the CIFAR and MNIST datasets. Compared to a non-spiking model which scored 33% on CIFAR-10 with a single-layer classifier, this hardware scored 32% accuracy. When used with a state-of-the-art deep learning classifier, the non-spiking model achieved 82% and our simplified, spiking model achieved 80%, while compressing the input data by 92%. Compared to a previously proposed spiking model, our proposed hardware consumed 99% less energy to do the same work at 21x the throughput. Accuracy held up with online learning to a write variance of 3%, suitable for the often-reported 4-bit resolution required for neuromorphic algorithms; with offline learning to a write variance of 27%; and with read variance to 40%. The proposed architecture's excellent accuracy, throughput, and significantly lower energy usage demonstrate the utility of our innovations. This work provides a means for extremely low-energy sparse coding in mobile devices, such as cellular phones, or for very sparse coding as is needed by self-driving cars or robotics that must integrate data from multiple, high-resolution sensors.
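For readers unfamiliar with sparse coding via locally competitive dynamics, the sketch below shows a minimal, non-spiking software version of the standard Locally Competitive Algorithm (LCA), the family of networks the SSLCA simplifies in hardware. It is an illustration only, not the paper's circuit or code: the dictionary shape, threshold lam, time constant tau, and step count are placeholder values chosen for the example. Note that this classic formulation assumes unit-norm receptive fields, which is exactly the pre-normalization requirement the paper's crossbar-connected design relaxes.

    import numpy as np

    def lca_sparse_code(x, Phi, lam=0.1, tau=10.0, dt=1.0, n_steps=200):
        """Sparse-code input x against dictionary Phi with (non-spiking) LCA dynamics.

        x   : (d,)   input vector, e.g. a flattened image patch
        Phi : (d, n) dictionary whose columns are receptive fields (~unit norm)
        lam : sparsity threshold
        Returns the sparse coefficient vector a of shape (n,).
        """
        n = Phi.shape[1]
        G = Phi.T @ Phi - np.eye(n)       # lateral inhibition (competition) matrix
        b = Phi.T @ x                     # feedforward drive
        u = np.zeros(n)                   # membrane potentials

        def threshold(u):
            # Soft threshold: neurons only contribute once their potential exceeds lam.
            return np.where(np.abs(u) > lam, u - lam * np.sign(u), 0.0)

        for _ in range(n_steps):
            a = threshold(u)
            u += (dt / tau) * (b - u - G @ a)   # leaky integration with competition
        return threshold(u)

    # Example: code a random 64-D patch with a random 128-atom dictionary.
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((64, 128))
    Phi /= np.linalg.norm(Phi, axis=0)    # unit-norm receptive fields (classic LCA assumption)
    a = lca_sparse_code(rng.standard_normal(64), Phi)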
