Preprint 96/2019

Training Delays in Spiking Neural Networks

Laura State

Submission date: 21 Oct 2019
Pages: 64
published as:
State, L.: Training delays in spiking neural networks
   Dissertation, Universität Tübingen, 2019
MSC-Numbers: 15-04, 42-04, 94-04, 92-04
Keywords and phrases: Spiking Neural Networks, Machine Learning, Complex Domain
Download full preprint: PDF (1870 kB)

Abstract:
Artificial Neural Networks (ANNs) are a state-of-the-art technique in machine learning, achieving high performance on many different tasks. However, their demand for computational resources is high, both during training and testing. An alternative framework is provided by Spiking Neural Networks (SNNs), a model class closely inspired by biological networks. The energy consumption of SNNs is low; however, their performance lies below that of ANNs. The main reason for this gap is that SNNs are much harder to train. In this thesis, we propose a new supervised framework for training SNNs. Inspired by research in theoretical neuroscience that highlights the importance of temporal codes, we introduce a delay parameter. We propose two different training approaches: a transformation to the complex domain combined with linear regression, and standard gradient descent. We evaluate our training framework on two different classification tasks, based on a synthetic dataset and the MNIST dataset of handwritten digits. A single-layer network trained with either approach is able to perform the given classification tasks. Our supervised framework provides a new approach for training SNNs and can be used to optimize the training of neuromorphic chips.
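To illustrate the flavor of the complex-domain approach, the sketch below encodes spike times as unit phasors and fits complex weights by ordinary least squares, so that each learned weight factors into a gain and a delay. This is only a minimal sketch under assumed conventions: the phasor encoding exp(i*omega*t), the least-squares formulation, and all names (omega, T, Z, W, etc.) are illustrative assumptions, not the thesis's actual formulation.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n_samples spike-time vectors from n_in presynaptic neurons.
n_samples, n_in, n_out = 200, 10, 2
T = 50e-3                        # assumed coding window (s)
omega = 2 * np.pi / T            # carrier frequency of the phase code
t = rng.uniform(0, T, size=(n_in, n_samples))      # spike times in [0, T)
Y = (rng.standard_normal((n_out, n_samples))
     + 1j * rng.standard_normal((n_out, n_samples)))  # complex targets

# Transformation to the complex domain: each spike time becomes a unit phasor.
Z = np.exp(1j * omega * t)       # shape (n_in, n_samples)

# Linear regression in the complex domain: find W with W @ Z ~ Y.
W = Y @ np.linalg.pinv(Z)        # shape (n_out, n_in)

# A complex weight w = g * exp(-1j * omega * d) factors into gain g and delay d.
gains = np.abs(W)
delays = np.mod(-np.angle(W), 2 * np.pi) / omega   # delays in [0, T)

print("reconstruction error:", np.linalg.norm(W @ Z - Y))

Under this encoding, adding a delay d to a spike multiplies its phasor by exp(1j * omega * d), which is what lets a single complex-valued linear fit recover both a gain and a delay per connection.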
