Preserving temporal information allows a better representation of dynamic features, such as sounds, and enables fast responses to events that may occur at any moment. Based on this biological insight, project Ihmehimmeli explores how artificial spiking neural networks can exploit temporal dynamics using various architectures and learning settings. The name, a tongue-in-cheek Finnish word for a complicated contraption whose purpose is not immediately obvious, captures our aim to build complex recurrent neural network architectures with temporal encoding of information.
We use artificial spiking networks with a temporal coding scheme, in which more interesting or surprising information, such as louder sounds or brighter colours, causes earlier neuronal spikes. Along the information-processing hierarchy, the winning neurons are those that spike first. Such an encoding can naturally implement a classification scheme: input features are encoded in the spike times of their corresponding input neurons, while the output class is encoded by the output neuron that spikes earliest.

The Ihmehimmeli project team holding a himmeli, a symbol of the aim to build recurrent neural network architectures with temporal encoding of information.
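This winner-take-first scheme can be sketched in a few lines. The snippet below is a toy illustration, not the project's code; the `encode` and `classify` names and the linear value-to-time mapping are assumptions made for clarity.

```python
import numpy as np

def encode(features, t_max=1.0):
    """Map normalized feature values in [0, 1] to spike times:
    stronger inputs (values closer to 1) spike earlier."""
    return t_max * (1.0 - np.asarray(features, dtype=float))

def classify(output_spike_times):
    """The output neuron that spikes first determines the class."""
    return int(np.argmin(output_spike_times))

# A bright pixel (0.9) spikes early, at t = 0.1; a dim one (0.2) late, at t = 0.8.
spikes = encode([0.9, 0.2])
print(classify(spikes))  # class 0, since neuron 0 spikes earliest
```

Note that under this coding, "reading out" an answer does not require waiting for all neurons to fire: the first output spike already decides the class.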
We recently published and open-sourced a model in which we demonstrated the computational capabilities of fully connected spiking networks that operate using temporal coding. Our model uses a biologically inspired synaptic transfer function, where the electric potential on the membrane of a neuron rises and gradually decays over time in response to an incoming signal, until there is a spike. The strength of the associated change is controlled by the "weight" of the connection, which represents the synapse efficiency.
Crucially, this formulation allows exact derivatives of postsynaptic spike times with respect to presynaptic spike times and weights.
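To see why exact derivatives are available, consider a deliberately simplified, hypothetical model (not the published one, whose alpha-shaped synaptic kernel leads to spike times involving the Lambert W function): a non-leaky neuron whose potential after all inputs arrive grows as V(t) = Σᵢ wᵢ(t − tᵢ). Setting V(t*) equal to a threshold θ gives a closed-form spike time, and differentiating it gives exact gradients with respect to both presynaptic spike times and weights.

```python
import numpy as np

def spike_time(weights, input_times, theta=1.0):
    """Closed-form spike time for a non-leaky neuron with potential
    V(t) = sum_i w_i * (t - t_i); spikes when V reaches theta.
    Assumes positive total weight and t* later than every input."""
    w = np.asarray(weights, dtype=float)
    t = np.asarray(input_times, dtype=float)
    return (theta + np.dot(w, t)) / np.sum(w)

def spike_time_grads(weights, input_times, theta=1.0):
    """Exact derivatives of the spike time t* with respect to
    presynaptic spike times (d_t) and weights (d_w)."""
    w = np.asarray(weights, dtype=float)
    t = np.asarray(input_times, dtype=float)
    W = np.sum(w)
    t_star = (theta + np.dot(w, t)) / W
    d_t = w / W              # d t* / d t_i
    d_w = (t - t_star) / W   # d t* / d w_i
    return d_t, d_w
```

The same idea carries over to the alpha-function model: because the spike time is an explicit, differentiable function of its inputs, no surrogate gradient is needed.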
The process of training the network consists of adjusting the weights between neurons, which in turn adjusts spike times across the network. Much like in conventional artificial neural networks, this was done using backpropagation. We used synchronization pulses, whose timing is also learned with backpropagation, to provide a temporal reference to the network. We trained the network on classic machine learning benchmarks, with features encoded in time. The spiking network successfully learned to solve noisy Boolean logic problems and achieved competitive test accuracy on these benchmarks. Unlike conventional networks, however, our spiking network uses an encoding that is in general more biologically plausible and, for a small trade-off in accuracy, can compute the result in a highly energy-efficient manner.
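The training idea, adjusting weights so that spike times move toward desired values, can be sketched with a toy gradient-descent loop. This is a hypothetical illustration under a simplified non-leaky neuron model with a closed-form spike time, not the project's implementation (which uses alpha synapses and learned synchronization pulses); the loss here is squared error on the output spike time.

```python
import numpy as np

def spike_time(w, t_in, theta=1.0):
    # Non-leaky toy model: V(t) = sum_i w_i * (t - t_i); spike when V = theta.
    return (theta + np.dot(w, t_in)) / np.sum(w)

def train_step(w, t_in, t_target, lr=0.5, theta=1.0):
    """One gradient-descent step pushing the output spike time
    toward a target time, using the exact spike-time derivative."""
    t_star = spike_time(w, t_in, theta)
    d_w = (t_in - t_star) / np.sum(w)       # exact d t* / d w_i
    grad = 2.0 * (t_star - t_target) * d_w  # chain rule through squared error
    return w - lr * grad

w = np.array([1.0, 1.0])
t_in = np.array([0.0, 0.2])
for _ in range(1000):
    w = train_step(w, t_in, t_target=0.4)
print(spike_time(w, t_in))  # converges toward the 0.4 target
```

In the full model the same gradients are chained through multiple layers, so earlier or later spiking anywhere in the network is shaped end to end by backpropagation.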