Adapting spiking recurrent neural networks with synaptic plasticity for temporal pattern recognition.
Chrol-Cannon, J. (2016) Adapting spiking recurrent neural networks with synaptic plasticity for temporal pattern recognition. Doctoral thesis, University of Surrey.
JCC_Thesis.pdf - Version of Record
Available under License Creative Commons Attribution Non-commercial Share Alike.
Self-organization in biological nervous systems during the lifetime is known to occur largely through a process of plasticity that depends on the spike-timing activity of connected neurons. This thesis investigates how synaptic plasticity can be applied to recurrently connected spiking neural networks in order to improve time-series pattern recognition accuracy by learning the temporal structure of the input signal within the synaptic weights. Reservoir computing is a recurrent neural network paradigm that can naturally process temporal signals in real time. However, learning in the model is limited to linear regression on a simple perceptron readout, not the recurrent part of the network, preventing the temporal structure of the input from being learned by the model. It is the intention of this work to allow the learning of temporal structure in the recurrent connections through the application of synaptic plasticity models derived from biological observation. By allowing all parameters to adapt to the input signal, it is expected that a trained output will achieve higher accuracy in a pattern recognition task. However, it is found that adapting the recurrent connections with a variety of plasticity models does not necessarily improve accuracy: the observed differences are largely negligible and, when significant, mostly detrimental. To establish why this is, several metrics used to quantify separability and information content within recurrent networks are employed to determine the computational effect that synaptic plasticity is having on the network structure. Surprisingly, it is found that plasticity can improve or degrade the metrics unpredictably depending on the plasticity model, initial connectivity and input data used. Furthermore, the metrics themselves are demonstrated to have only weak correlation with the pattern recognition accuracy of a trained readout, suggesting that these measures are not as indicative of performance as often claimed.
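The reservoir computing setup described above — a fixed random recurrent network in which only a linear readout is trained — can be sketched as follows. This is a minimal rate-based analogue in NumPy, not the thesis's spiking model; the dimensions, spectral-radius scaling, ridge penalty and delayed-copy task are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 3 input channels, 100 reservoir units, 200 time steps.
n_in, n_res, T = 3, 100, 200

# Fixed random input and recurrent weights (the "reservoir") -- never trained.
W_in = rng.normal(0.0, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius < 1

u = rng.normal(size=(T, n_in))          # toy input time series
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)    # recurrent state update
    states[t] = x

# Toy target: a one-step-delayed copy of the first input channel.
y = np.roll(u[:, 0], 1)

# Only the linear readout is trained, by ridge regression on the states;
# the recurrent weights W stay fixed, so the temporal structure of the
# input is never written into the recurrent connections.
ridge = 1e-4
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y)
pred = states @ W_out
```

The key design point is that all temporal memory lives in the fixed recurrent dynamics; training touches only `W_out`, which is exactly the limitation the thesis seeks to lift by adapting the recurrent weights with plasticity.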
A more fundamental analysis of synaptic weight change is undertaken that compares the geometric distance between the synaptic changes caused by input patterns of different classes. These experiments show that plasticity leads to remarkable class-specificity in the synaptic changes on data sets that include auditory speech samples and visual human motion behaviors. The limitation on input-specific learning is found to originate in interference within the synaptic parameters: the incremental nature of weight updates continuously overwrites previously learned parameters. These insights are then applied to propose a plasticity-sensitive readout neuron for temporal pattern recognition that uses the change in weight, rather than the neural activity, to directly form an output. This method exploits the input sensitivity of the plasticity models while avoiding the limitations of structural adaptation that are curbed by interference. In this framework, the role of structural adaptation is to reach a critical state in which each input drives the parameters away from the sample mean in a unique direction. The challenges and limitations faced by current spiking and reservoir computing models are explored. These include interference and forgetting in the synaptic parameters when applying plasticity to the reservoir computing framework, and the limitations regarding the use of advanced spike-time neural coding in recurrent networks with continuous activity. Further work is discussed regarding solutions to the aforementioned limitations, as well as possible new directions for more advanced applications of synaptic plasticity based on developmental genetic principles that could enhance the self-organizing ability of neural networks.
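The idea of a readout that uses the weight change itself, rather than the neural activity, can be illustrated with a standard pair-based STDP rule. Everything here is a hedged toy sketch: the STDP parameters, the two synthetic spike-timing classes and the nearest-class-mean readout are illustrative assumptions, not the thesis's exact models or data:

```python
import numpy as np

rng = np.random.default_rng(1)

def stdp_delta(pre, post, a_plus=0.01, a_minus=0.012, tau=20.0):
    # Pair-based STDP window: potentiation when the pre spike precedes
    # the post spike, depression otherwise. Parameters are illustrative.
    dt = post[:, None] - pre[None, :]   # ms; dt > 0 means pre fired first
    dw = np.where(dt > 0,
                  a_plus * np.exp(-dt / tau),
                  -a_minus * np.exp(dt / tau))
    return dw.sum()

def make_sample(cls, n_syn=10):
    # Toy temporal patterns: class 0 has presynaptic spikes leading the
    # post spike (potentiating), class 1 has them lagging (depressing).
    post = np.array([50.0])
    lead = -5.0 if cls == 0 else 5.0
    pres = [post + lead + rng.normal(0.0, 1.0, 3) for _ in range(n_syn)]
    return pres, post

def signature(pres, post):
    # The weight-change vector itself is the feature used by the readout,
    # rather than the resulting neural activity.
    return np.array([stdp_delta(p, post) for p in pres])

# Nearest-class-mean readout on the weight-change signatures: each input
# drives the weight changes in a class-specific direction, which is read
# out directly instead of being accumulated into the structure.
train = [(signature(*make_sample(c)), c) for c in (0, 1) for _ in range(20)]
means = [np.mean([s for s, c in train if c == k], axis=0) for k in (0, 1)]

def classify(pres, post):
    s = signature(pres, post)
    return int(np.linalg.norm(s - means[1]) < np.linalg.norm(s - means[0]))
```

Because the signature is computed afresh for every input and the weights are never actually updated, this sketch sidesteps the overwriting interference described above: class information is taken from the instantaneous direction of the would-be weight change.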
Item Type: Thesis (Doctoral)
Subjects: Computational Neuroscience
Date: 31 May 2016
Depositing User: Joseph Chrol-Cannon
Date Deposited: 17 Jun 2016 10:55
Last Modified: 17 Jun 2016 10:55