University of Surrey


On the choice of parameters of the cost function in nested modular RNN's

Mandic, DP and Chambers, JA (2000) On the choice of parameters of the cost function in nested modular RNN's. IEEE Transactions on Neural Networks, 11 (2). pp. 315-322.

Full text not available from this repository.


We address the choice of the coefficients in the cost function of a modular nested recurrent neural network (RNN) architecture, known as the pipelined recurrent neural network (PRNN). Such a network can cope with the vanishing gradient problem experienced in prediction with RNNs. Constraints on the coefficients of the cost function, in the form of a vector norm, are considered. Unlike the previous cost function for the PRNN, which included a forgetting factor motivated by the recursive least squares (RLS) strategy, the proposed forms of cost function provide 'forgetting' of the outputs of adjacent modules based upon the network architecture. Such an approach takes into account the number of modules in the PRNN, through the unit norm constraint on the coefficients of the cost function of the PRNN. This is shown to be particularly suitable since, due to the inherent nesting in the PRNN, every module gives its full contribution to the learning process, whereas the unit norm constrained cost function introduces a sense of forgetting into the memory management of the PRNN. The PRNN based upon a modified cost function outperforms existing PRNN schemes in the time series prediction simulations presented.
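The contrast drawn in the abstract can be sketched as follows. In the classical PRNN cost, each of the M module errors is weighted by an RLS-style forgetting factor, w_i = λ^(i-1); the proposed variant additionally rescales these weights so that the coefficient vector has unit norm, tying the effective forgetting to the number of modules M. The function names and the choice of the 1-norm below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def prnn_cost(errors, weights):
    """Weighted sum of squared prediction errors over the PRNN modules."""
    errors = np.asarray(errors, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return float(np.sum(weights * errors ** 2))

def forgetting_weights(M, lam=0.9):
    """Classical RLS-motivated coefficients: w_i = lam**(i-1), i = 1..M."""
    return lam ** np.arange(M)

def unit_norm_weights(M, lam=0.9, p=1):
    """Coefficients rescaled so ||w||_p = 1; the scaling now depends on M,
    so the degree of forgetting reflects the nested network architecture."""
    w = lam ** np.arange(M)
    return w / np.linalg.norm(w, ord=p)
```

With the 1-norm constraint the coefficients sum to one, so adding modules automatically reduces each module's individual weight while preserving the geometric decay across adjacent modules.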

Item Type: Article
Divisions : Surrey research (other units)
Authors :
Mandic, DP
Chambers, JA
Date : 3 December 2000
DOI : 10.1109/72.839003
Depositing User : Symplectic Elements
Date Deposited : 17 May 2017 13:27
Last Modified : 24 Jan 2020 23:57




