Stochasticity and robustness in spiking neural networks...

by Wilkie Ammentorp, Karsten Beckmann, Catherine D. Schuman, James Plank, Nathaniel Cady
Pages: 23–36

Despite drawing inspiration from biological systems, which are inherently noisy and variable, artificial neural networks have been shown to require precise weights to carry out the tasks they are trained to accomplish. This creates a challenge when adapting these artificial networks to specialized execution platforms that may encode weights in a way that restricts their accuracy and/or precision.
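As a rough illustration of the kind of weight restriction described above (not the paper's actual hardware model), the sketch below quantizes a weight matrix to a small number of discrete levels and applies multiplicative Gaussian noise, loosely mimicking the limited precision and device-to-device variation of an analog memory. The level count and noise scale are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained weight matrix for a small layer (illustrative only).
W = rng.normal(0.0, 0.5, size=(4, 8))

def encode_weights(W, n_levels=16, sigma=0.05, rng=rng):
    """Simulate storing weights on a device with limited precision.

    Weights are quantized to `n_levels` evenly spaced levels across
    their range, then perturbed by multiplicative Gaussian noise of
    scale `sigma`. Both parameters are assumptions for illustration.
    """
    w_min, w_max = W.min(), W.max()
    step = (w_max - w_min) / (n_levels - 1)
    W_q = np.round((W - w_min) / step) * step + w_min
    return W_q * (1.0 + sigma * rng.standard_normal(W.shape))

W_dev = encode_weights(W)
# The encoding error below is the kind of perturbation the text refers to.
print("mean |W - W_dev|:", np.mean(np.abs(W - W_dev)))
```

With many levels and no noise the encoding error vanishes, which is one way to see that the perturbation comes entirely from the restricted representation.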

Reflecting on the non-idealities observed in biological systems, we investigated the effect these properties have on the robustness of spiking neural networks under perturbations to weights. First, we examined existing techniques in conventional neural networks that resemble noisy processes, and postulated that they may produce similar beneficial effects in spiking neural networks. Second, we evolved a set of spiking neural networks that exploit biological non-idealities to solve a pole-balancing task, and estimated their robustness. We showed that robustness is higher in networks using noisy neurons, and demonstrated that one of these networks performs well under the variance expected when a hafnium-oxide-based resistive memory is used to encode synaptic weights. Lastly, we trained a series of networks using a surrogate gradient method on the MNIST classification task, and confirmed that these networks exhibit robustness trends similar to those of the evolved networks. We discuss these results and argue that they provide empirical evidence supporting the role of noise as a regularizer that can increase network robustness.
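The robustness estimates described above can be sketched in miniature: perturb trained weights at increasing noise levels and record the mean task performance at each level. The toy linear classifier below stands in for the paper's spiking networks and benchmarks; the task, noise model, and trial count are all assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linearly separable task standing in for the paper's benchmarks.
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = (X @ w_true > 0).astype(int)

# "Trained" weights: the generating weights serve as a stand-in.
w = w_true.copy()

def accuracy(w_vec):
    """Fraction of examples classified correctly by weight vector w_vec."""
    return np.mean((X @ w_vec > 0).astype(int) == y)

def robustness(w_vec, sigmas, trials=50):
    """Mean accuracy under multiplicative Gaussian weight noise, one
    value per noise level -- a simple proxy for a robustness curve."""
    return [np.mean([accuracy(w_vec * (1 + s * rng.standard_normal(w_vec.shape)))
                     for _ in range(trials)])
            for s in sigmas]

for s, acc in zip([0.0, 0.1, 0.3], robustness(w, [0.0, 0.1, 0.3])):
    print(f"sigma={s:.1f}  mean accuracy={acc:.3f}")
```

A flatter curve across noise levels corresponds to a more robust network, which is the comparison the abstract draws between noisy and noise-free neurons.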