CIRG - Research - Neural Networks



The focus of the neural networks group is to investigate aspects of the training and optimization of neural networks, and to apply neural networks to solve real-world problems. The activities of this focus area are mainly centered around architecture selection, active learning, and the development of new and efficient training algorithms. Some work is also done on self-organizing maps.

Current applications are directed towards data mining, spam detection, user authentication, fraud detection, gesture recognition, and trading on financial markets. Self-organizing maps are applied to exploratory data analysis, data mining, and species identification.
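As an informal illustration of the self-organizing map work mentioned above, the sketch below implements the classic online SOM training rule (best-matching unit plus Gaussian neighborhood update) on toy data. It is not code from the group; the grid size, decay schedules, and toy data set are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid=(10, 10), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Train a small 2-D self-organizing map with the classic online rule."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, data.shape[1]))
    # grid coordinates of every map unit, used by the neighborhood function
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # best-matching unit: the map neuron whose weight vector is closest to the sample
        dists = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # exponentially decaying learning rate and neighborhood radius
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        # Gaussian neighborhood centered on the BMU, measured on the map grid
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=2)
        h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))
        # pull the BMU and its neighbors toward the sample
        weights += lr * h[..., None] * (x - weights)
    return weights

# toy usage: map 3-D points onto a 10x10 grid for exploratory analysis
data = np.random.default_rng(1).random((500, 3))
som = train_som(data)
print(som.shape)  # (10, 10, 3)
```

The trained weight grid can then be inspected, for example as a U-matrix or component planes, which is the kind of exploratory data analysis referred to above.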


Current members actively doing research in this focus area:

S van der Stockt - M.Sc Completed in 2008
E Dean - M.Sc Started in 2002
J Pun - M.Sc Completed
E Clements - Hons-B.Sc Completed in 2003
U Paquet - M.Sc Completed in 2003
R van den Hoven - Hons-B.Sc Completed in 2003
A Ismail - PhD Started in 2005; M.Sc Completed in 2001
A Adejumo - M.Sc Completed in 1999





Adiel Ismail

Research areas: Swarm Intelligence, Neural Networks

Degree-specific information: PhD


Non-Parametric PSO


Abstract not available.

Supervisor / Co-Supervisor: AP Engelbrecht


 Not available for download yet.


Degree-specific information: M.Sc


 Training and Optimization of Product Unit Neural Networks


Product units in the hidden layer of multilayer neural networks provide a powerful mechanism for neural networks to efficiently learn higher-order combinations of inputs. Training product unit neural networks using local optimization algorithms is difficult due to an increased number of local minima and increased chances of network paralysis. This research investigates the problems encountered when using local optimization, especially gradient descent, to train product unit neural networks, and shows that particle swarm optimization, genetic algorithms and leapfrog optimization are efficient alternatives for successfully training product unit neural networks. Architecture selection, i.e. pruning, of product unit neural networks is also studied and a pruning algorithm is developed.

Supervisor / Co-Supervisor: AP Engelbrecht
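For context on the M.Sc abstract above, the sketch below shows what a single-hidden-layer product unit network looks like and how its exponents can be trained with a basic particle swarm optimizer instead of gradient descent. This is not the thesis implementation; the weight layout, toy task, and all PSO parameters are illustrative assumptions.

```python
import numpy as np

def punn_forward(weights, X):
    """Single hidden layer of product units followed by a linear output.

    Assumed weight layout: n_hidden * n_in product-unit exponents,
    then n_hidden output weights and one output bias.
    """
    n_in = X.shape[1]
    n_hidden = (len(weights) - 1) // (n_in + 1)
    W = weights[:n_hidden * n_in].reshape(n_hidden, n_in)
    v = weights[n_hidden * n_in:-1]
    b = weights[-1]
    # product unit: prod_i x_i ** w_ji, computed in log space (inputs > 0)
    hidden = np.exp(np.log(X) @ W.T)
    return hidden @ v + b

def mse(weights, X, y):
    return np.mean((punn_forward(weights, X) - y) ** 2)

def pso_train(X, y, dim, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49):
    """Basic global-best PSO over the flattened network weights."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([mse(p, X, y) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        f = np.array([mse(p, X, y) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest

# toy task: y = x1^2 * x2, representable exactly by a single product unit
X = np.random.default_rng(1).uniform(0.5, 2.0, (200, 2))
y = X[:, 0] ** 2 * X[:, 1]
n_in, n_hidden = 2, 2
best = pso_train(X, y, dim=n_hidden * n_in + n_hidden + 1)
print("final MSE:", mse(best, X, y))
```

Because each hidden unit computes a product of inputs raised to learned powers, the toy target can be represented by one product unit; this is the kind of higher-order combination of inputs the abstract refers to, and the PSO search avoids the gradient-based difficulties it mentions.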




Computational Intelligence Research Group
University of Pretoria
Copyright © 2018