Training Spiking Neural Networks with Synaptic Plasticity under Integer Representation

by Shruti R Kulkarni, Maryam Parsa, John P Mitchell, Catherine D Schuman
Publication Type: Conference Paper
Book Title: International Conference on Neuromorphic Systems 2021
Page Numbers: 1 to 7
Publisher Location: New York, United States of America
Conference Name: International Conference on Neuromorphic Systems (ICONS) 2021
Conference Location: Oak Ridge, Tennessee, United States of America

Neuromorphic computing is emerging as a promising Beyond Moore computing paradigm that employs event-triggered computation and non-von Neumann hardware. Spike Timing Dependent Plasticity (STDP) is a well-known bio-inspired learning rule that relies on the activities of locally connected neurons to adjust the weights of their respective synapses. In this work, we analyze a basic STDP rule and its sensitivity to different hyperparameters when training spiking neural networks (SNNs) with supervision, customized for a neuromorphic hardware implementation with integer weights. We compare classification performance on four UCI datasets (iris, wine, breast cancer, and digits) of varying complexity. We search for the optimal set of hyperparameters using both grid search and Bayesian optimization. Through the use of Bayesian optimization, we show general trends in hyperparameter sensitivity for SNN classification problems. With the best sets of hyperparameters, we achieve accuracies comparable to some of the best-performing SNNs on these four datasets. With a highly optimized supervised STDP rule, we show that these accuracies can be achieved with just 20 epochs of training.
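The abstract's core idea, a pair-based STDP weight update constrained to integer weights for hardware, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual rule: the parameter names, increment values, time constant, and weight bounds are all assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical STDP hyperparameters; names and values are illustrative
# assumptions, not taken from the paper.
A_PLUS = 2             # integer potentiation increment
A_MINUS = 1            # integer depression decrement
TAU = 10.0             # plasticity time constant, in timesteps
W_MIN, W_MAX = 0, 127  # integer weight bounds for a hardware-style range

def stdp_update(w, t_pre, t_post):
    """Return the updated integer weight for one pre/post spike pairing.

    Potentiate when the presynaptic spike precedes the postsynaptic
    spike (causal pairing), depress otherwise; round the exponentially
    decayed change to the nearest integer and clip to the allowed range.
    """
    dt = t_post - t_pre
    if dt >= 0:  # pre fires before post: strengthen the synapse
        dw = A_PLUS * np.exp(-dt / TAU)
    else:        # post fires before pre: weaken the synapse
        dw = -A_MINUS * np.exp(dt / TAU)
    return int(np.clip(w + round(dw), W_MIN, W_MAX))

# Causal pairing raises the weight; anti-causal pairing lowers it.
print(stdp_update(50, t_pre=3, t_post=5))  # slight potentiation
print(stdp_update(50, t_pre=5, t_post=3))  # slight depression
```

Rounding the exponentially decayed change and clipping to a fixed integer range is one simple way to keep every weight representable on integer-only neuromorphic hardware; the magnitudes of the increments relative to the weight range then become hyperparameters of the kind the paper's grid search and Bayesian optimization would explore.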