Preprint 96/2019

Training Delays in Spiking Neural Networks

Laura State

Submission date: 21. Oct. 2019
Pages: 64
Published in: Artificial Neural Networks and Machine Learning - ICANN 2019: Theoretical Neural Computation. 28th International Conference on Artificial Neural Networks, Munich, Germany, September 17-19, 2019, Proceedings, Part I. I. V. Tetko (ed.).
Cham: Springer, 2019, pp. 713-717.
(Lecture Notes in Computer Science; 11727)
DOI number (of the published article): 10.1007/978-3-030-30487-4_54
MSC-Numbers: 15-04, 42-04, 94-04, 92-0
Keywords and phrases: Spiking Neural Networks, Machine Learning, Complex Domain

Abstract:
Artificial Neural Networks (ANNs) are a state-of-the-art technique in machine learning, showing high performance on many different tasks. However, their demand for computational resources is high, both during training and testing. An alternative framework is provided by Spiking Neural Networks (SNNs), a model that is closely inspired by biological networks. SNNs consume little energy; however, their performance still lags behind that of ANNs. The main reason for this gap is that SNNs are much harder to train. In this thesis, we propose a new supervised framework for training SNNs. Inspired by research in theoretical neuroscience that highlights the importance of temporal codes, we introduce a delay parameter. We propose two different training approaches: a transformation to the complex domain combined with a linear regression, and standard gradient descent. We evaluate our training framework on two different classification tasks, based on a synthetic dataset and the MNIST dataset of handwritten digits. A single-layer network trained by either approach is able to perform the given classification tasks. Our supervised framework provides a new approach for training SNNs and can be used to optimize the training of neuromorphic chips.
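The complex-domain approach mentioned in the abstract can be illustrated with a minimal sketch. This is not the thesis's implementation; it only shows the general idea under stated assumptions: a single spike per input neuron is encoded as a phasor exp(i*2*pi*t/T), so that a synaptic delay corresponds to a phase rotation, and a complex-valued linear regression then fits weights whose phases can be read back as delays. The dataset, window length T, and targets are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 100.0                     # encoding window (hypothetical, e.g. ms)
n_inputs, n_samples = 20, 200

# Hypothetical synthetic task: each of two classes is a fixed spike-time
# prototype plus Gaussian jitter; one spike per input neuron.
prototypes = rng.uniform(0, T, size=(2, n_inputs))
labels = rng.integers(0, 2, size=n_samples)
spike_times = prototypes[labels] + rng.normal(0.0, 1.0, size=(n_samples, n_inputs))

# Transformation to the complex domain: a spike at time t becomes
# exp(i*2*pi*t/T), so delaying a spike by d multiplies its phasor
# by exp(i*2*pi*d/T) -- delays act as phase rotations.
Z = np.exp(2j * np.pi * spike_times / T)

# Complex linear regression against +/-1 class targets (least squares).
targets = (2.0 * labels - 1.0).astype(complex)
w, *_ = np.linalg.lstsq(Z, targets, rcond=None)

# Read per-synapse delays (and magnitudes as weights) off the solution.
delays = (-np.angle(w) % (2 * np.pi)) * T / (2 * np.pi)
weights = np.abs(w)

# Classify by the sign of the real part of the projected activity.
pred = (Z @ w).real > 0
accuracy = (pred == labels).mean()
print(f"train accuracy: {accuracy:.2f}")
```

On this toy task the regression separates the two jittered prototypes easily; the point of the sketch is only how a delay parameter becomes a linear (phase) degree of freedom once spike times are mapped to the complex plane.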
