
Energy efficient stochastic-based deep spiking neural networks for sparse datasets

Publication Type: Journal
Journal Name: IEEE Xplore

With large deep neural networks (DNNs) needed to solve complex, data-intensive problems, energy efficiency is a key bottleneck for deploying deep learning effectively in the real world. Deep spiking neural networks (SNNs) have recently gained much research attention, driven by interest in building biologically inspired neural networks and by the availability of neuromorphic platforms, which can be orders of magnitude more energy efficient than CPUs and GPUs. Although SNNs have proven to be an efficient technique for solving many machine learning and computer vision problems, to the best of our knowledge this is the first attempt to adapt SNNs to sparse datasets. In this paper, we study the behaviour of SNNs in handling NLP datasets and the sparsity of their data representation. We then propose a novel SNN framework based on the concept of stochastic computing. Specifically, instead of generating a separate spike train for each feature, with a firing rate proportional to that feature's intensity, the whole feature set is treated as a distribution function and a single stochastic spike train that follows this distribution is generated. This framework reduces the connectivity between network layers from O(N) to O(log N). It also encodes the input data differently, making it suitable for handling sparse datasets. Finally, the framework achieves high energy efficiency because it uses the same Integrate-and-Fire neurons as conventional SNNs. The results show that our proposed stochastic-based SNN achieves nearly the same accuracy as the original DNN on the MNIST dataset and outperforms state-of-the-art SNNs. The stochastic-based SNN is also energy efficient: the fully connected DNN, the conventional SNN, and the data-normalized SNN consume 38.24, 1.83, and 1.85 times more energy, respectively, than the stochastic-based SNN.
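The contrast between per-feature rate coding and distribution-based encoding can be illustrated with a minimal sketch. This is an interpretation of the idea described in the abstract, not the paper's implementation: the function names, the shared random generator, and the detail of transmitting a sampled feature index (which needs only log2(N) binary lines, hence O(log N) connectivity) are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_coded_spikes(features, n_steps):
    """Conventional rate coding: each feature drives its own spike train,
    firing with probability proportional to its intensity.
    Requires one input line per feature, i.e. O(N) connectivity."""
    p = features / features.max()
    return rng.random((n_steps, features.size)) < p  # shape (n_steps, N)

def stochastic_spikes(features, n_steps):
    """Distribution-based coding (sketch): the whole feature vector is
    normalized into a probability distribution, and each time step samples
    one feature index from it.  The index can be sent as a log2(N)-bit
    code, so connectivity grows as O(log N).  Zero-valued features are
    simply never sampled, which suits sparse inputs."""
    p = features / features.sum()
    return rng.choice(features.size, size=n_steps, p=p)

features = np.array([0.0, 0.0, 3.0, 1.0])   # a sparse feature vector
rate = rate_coded_spikes(features, 1000)     # 1000 x 4 binary matrix
stoch = stochastic_spikes(features, 1000)    # 1000 sampled indices
```

Over many time steps the empirical frequency of each sampled index converges to the normalized feature value (here, index 2 appears about 75% of the time), so the spike train as a whole carries the same information that rate coding spreads across N separate lines.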
For sparse datasets, including the IMDb and an in-house clinical dataset, the stochastic-based SNN achieves performance comparable to that of the conventional DNN, whereas the conventional SNN suffers a significant decline in classification accuracy.
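The abstract attributes the energy savings to the use of Integrate-and-Fire neurons. A minimal sketch of such a neuron is shown below; it is a generic textbook model for illustration, not the paper's exact neuron, and the threshold value and reset-to-zero behaviour are assumptions.

```python
import numpy as np

def integrate_and_fire(input_spikes, weights, threshold=1.0):
    """Minimal Integrate-and-Fire neuron: accumulate weighted input spikes
    into a membrane potential; emit an output spike and reset whenever the
    threshold is reached.  Because inputs are binary spikes, each event
    costs only additions (no multiply-accumulate per input as in a DNN),
    which is the usual source of SNN energy savings."""
    v = 0.0
    out = []
    for t in range(input_spikes.shape[0]):
        v += weights @ input_spikes[t]  # integrate weighted incoming spikes
        if v >= threshold:
            out.append(1)               # fire
            v = 0.0                     # reset membrane potential
        else:
            out.append(0)
    return np.array(out)

# Two input lines firing [1, 0] every step with weights 0.5 each:
spikes = np.tile(np.array([[1.0, 0.0]]), (4, 1))
weights = np.array([0.5, 0.5])
out = integrate_and_fire(spikes, weights)  # fires on every second step
```

On this deterministic input the potential grows by 0.5 per step, so the neuron fires on steps 2 and 4, giving the output pattern [0, 1, 0, 1].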