University of Surrey


Optimal and adaptive radial basis function neural networks.

Shahsavand, Akbar. (2000) Optimal and adaptive radial basis function neural networks. Doctoral thesis, University of Surrey (United Kingdom).

Available under License Creative Commons Attribution Non-commercial Share Alike.



The optimisation and adaptation of single hidden layer feed-forward neural networks employing radial basis activation functions (RBFNs) were investigated. Previous work on RBFNs has mainly focused on problems with large data sets. The training algorithms developed with large data sets prove unreliable for problems with a small number of observations, a situation frequently encountered in process engineering. The primary objective of this study was the development of efficient and reliable learning algorithms for the training of RBFNs with small and noisy data sets. It was demonstrated that regularisation is essential in order to filter out the noise and prevent over-fitting. The selection of the appropriate level of regularisation, lambda*, with small data sets presents a major challenge. The leave-one-out cross validation technique was considered as a potential means for automatic selection of lambda*. The computational burden of selecting lambda* was significantly reduced by a novel application of the generalised singular value decomposition. The exact solution of the multivariate linear regularisation problem can be represented as a single hidden layer neural network, the Regularisation Network, with one neurone for each distinct exemplar. A new formula was developed for automatic selection of the regularisation level for a Regularisation Network with given non-linearities. It was shown that the performance of a Regularisation Network is critically dependent on the non-linear parameters of the activation function employed; a point which has received surprisingly little attention. It was demonstrated that a measure of the effective degrees of freedom df(lambda*,alpha) of a Regularisation Network can be used to select the appropriate width of the local radial basis functions, alpha, based on the data alone.
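The selection procedure described above can be illustrated with a minimal sketch in Python. This is not the thesis's algorithm (which accelerates the search with the generalised singular value decomposition); it instead uses the standard closed-form leave-one-out identity for a linear smoother, and the Gaussian width, data set, and grid of candidate lambda values are illustrative assumptions. The trace of the hat matrix plays the role of the effective degrees of freedom df(lambda, alpha).

```python
import numpy as np

def gaussian_kernel(X, C, alpha):
    # Pairwise Gaussian radial basis functions of width alpha.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * alpha ** 2))

def loocv_score(K, y, lam):
    # Regularisation Network smoother: f = H y with
    # H = K (K + lam I)^-1 (K symmetric, one neurone per exemplar).
    # Closed-form leave-one-out residuals: e_i = (y_i - f_i)/(1 - H_ii).
    n = len(y)
    H = np.linalg.solve(K + lam * np.eye(n), K).T
    e = (y - H @ y) / (1.0 - np.diag(H))
    return np.mean(e ** 2), np.trace(H)   # LOO score, effective dof

# Small noisy data set (illustrative).
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, 25))[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(25)

K = gaussian_kernel(X, X, alpha=0.8)      # assumed width alpha
lams = np.logspace(-6, 1, 40)             # candidate regularisation levels
scores = [loocv_score(K, y, lam)[0] for lam in lams]
lam_star = lams[int(np.argmin(scores))]
df_star = loocv_score(K, y, lam_star)[1]
```

Repeating the search over a grid of widths alpha, and comparing the resulting df(lambda*, alpha) values, mirrors the data-driven width selection the abstract describes.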
The one-to-one correspondence between the number of exemplars and the number of hidden neurones of a Regularisation Network may prove computationally prohibitive. The remedy is to use a network with a smaller number of neurones, the Generalised Radial Basis Function Network (GRBFN). The training of a GRBFN ultimately reduces to a large-scale non-linear optimisation problem. A novel sequential back-fit algorithm was developed for training the GRBFNs, which enabled the optimisation to proceed one neurone at a time. The new algorithm was tested with very promising results and its application to a simple chemical engineering process was demonstrated. In some applications the overall response is composed of sharp localised features superimposed on a gently varying global background. Existing multivariate regression techniques as well as conventional neural networks are aimed at filtering the noise and recovering the overall response. An initial attempt was made at developing an Adaptive GRBFN to separate the local and global features. An efficient algorithm was developed simply by insisting that all the activation functions which are responsible for capturing the global trend should lie in the null space of the differential operator generating the activation function of the kernel based neurones. It was demonstrated that the proposed algorithm performs extremely well in the absence of strong global input interactions.
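The back-fitting idea of optimising one neurone at a time can be sketched as follows. This is a simplified illustration, not the thesis's algorithm: each neurone's centre and width are chosen from a hypothetical grid by exhaustive search rather than by continuous optimisation, and the Gaussian activation, network size, and data are all assumptions.

```python
import numpy as np

def fit_one_neurone(x, r, centres, widths):
    # Choose the (centre, width, weight) of a single Gaussian neurone
    # that best fits the residual r, by grid search over the candidates.
    best = None
    for c in centres:
        for a in widths:
            phi = np.exp(-(x - c) ** 2 / (2.0 * a ** 2))
            w = (phi @ r) / (phi @ phi)          # least-squares weight
            sse = ((r - w * phi) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, c, a, w)
    return best[1:]

def backfit_grbfn(x, y, m=4, sweeps=5):
    # Sequential back-fit: cycle through the m neurones, refitting each
    # one against the residual left by the others.
    params = [(0.0, 1.0, 0.0)] * m               # (centre, width, weight)
    centres = np.linspace(x.min(), x.max(), 15)
    widths = np.linspace(0.2, 2.0, 10)
    phi = lambda c, a: np.exp(-(x - c) ** 2 / (2.0 * a ** 2))
    for _ in range(sweeps):
        for j in range(m):
            others = sum(w * phi(c, a)
                         for k, (c, a, w) in enumerate(params) if k != j)
            params[j] = fit_one_neurone(x, y - others, centres, widths)
    fit = sum(w * phi(c, a) for c, a, w in params)
    return params, fit

rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 60)
y = np.sin(x) + 0.05 * rng.standard_normal(60)
params, fit = backfit_grbfn(x, y)
```

Because only one neurone's parameters change at each step, every inner fit is a small, well-conditioned problem, which is the practical appeal of proceeding one neurone at a time.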

Item Type: Thesis (Doctoral)
Divisions : Theses
Authors :
Shahsavand, Akbar.
Date : 2000
Contributors :
Depositing User : EPrints Services
Date Deposited : 09 Nov 2017 12:18
Last Modified : 20 Jun 2018 11:45




