SuperNeuro: A Fast and Scalable Simulator for Neuromorphic Computing

Publication Type
Conference Paper
Book Title
Proceedings of the 2023 International Conference on Neuromorphic Systems
Publication Date
2023
Page Numbers
1 to 4
Publisher Location
New York, New York, United States of America
Conference Name
International Conference on Neuromorphic Systems 2023 (ICONS)
Conference Location
Santa Fe, New Mexico, United States of America
Conference Sponsor
Association for Computing Machinery (ACM)
Conference Date

In many neuromorphic workflows, simulators play a vital role in important tasks such as training spiking neural networks, running neuroscience simulations, and designing, implementing, and testing neuromorphic algorithms. Currently available simulators cater either to neuroscience workflows (e.g., NEST and Brian2) or to deep learning workflows (e.g., BindsNET). Problematically, the neuroscience-based simulators are slow and not very scalable, while the deep learning-based simulators do not support certain functionalities typical of neuromorphic workloads (e.g., synaptic delay). In this paper, we address this gap in the literature and present SuperNeuro, a fast and scalable simulator for neuromorphic computing capable of both homogeneous and heterogeneous simulations as well as GPU acceleration. We also present preliminary results that compare SuperNeuro to widely used neuromorphic simulators such as NEST, Brian2, and BindsNET in terms of computation time. We demonstrate that SuperNeuro can be approximately 10×–300× faster than some of the other simulators for small sparse networks. On large sparse and large dense networks, SuperNeuro can be approximately 2.2× and 3.4× faster than the other simulators, respectively.
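To make the synaptic-delay point concrete: in a spiking network, a spike emitted on a synapse at step t is delivered at step t + d, where d is that synapse's delay. The sketch below is a minimal, illustrative leaky integrate-and-fire simulation with per-synapse integer delays implemented via a circular buffer; it is not SuperNeuro's actual API, and the function name, parameters, and defaults are assumptions made for illustration only.

```python
import numpy as np

def simulate_lif_with_delays(weights, delays, external, steps,
                             v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Illustrative LIF network with per-synapse integer delays.

    NOTE: hypothetical sketch, not SuperNeuro's interface.
    weights[i, j]  : synaptic weight from neuron i to neuron j
    delays[i, j]   : delivery delay in time steps (assumed >= 1 where weight != 0)
    external[t, j] : external input current to neuron j at step t
    Returns a (steps, n) boolean array of spikes.
    """
    n = weights.shape[0]
    horizon = int(delays.max())          # circular buffer length
    pending = np.zeros((horizon, n))     # pending[t % horizon] arrives at step t
    v = np.zeros(n)
    spikes = np.zeros((steps, n), dtype=bool)
    for t in range(steps):
        slot = t % horizon
        # leak the membrane potential, then integrate delayed synaptic
        # input and external input for this step
        v = leak * v + pending[slot] + external[t]
        pending[slot] = 0.0              # slot is reused for step t + horizon
        fired = v >= v_thresh
        v[fired] = v_reset
        spikes[t] = fired
        # a spike at step t on synapse (i, j) is delivered at step t + delays[i, j]
        for i in np.nonzero(fired)[0]:
            for j in np.nonzero(weights[i])[0]:
                pending[(t + int(delays[i, j])) % horizon, j] += weights[i, j]
    return spikes
```

For example, with two neurons connected 0 → 1 at delay 3, an external pulse that makes neuron 0 fire at step 0 causes neuron 1 to fire at step 3. Deep learning-oriented simulators that deliver all spikes at the next step cannot express this per-synapse timing, which is the gap the abstract refers to.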