Neural Network System Techniques And Applications

By : Cornelius T. Leondes



Preface
Inspired by the structure of the human brain, artificial neural networks have been widely applied to fields such as pattern recognition, optimization, coding, and control because of their ability to solve cumbersome or intractable problems by learning directly from data. An artificial neural network usually consists of a large number of simple, mutually interconnected processing units, i.e., neurons. It learns to solve problems by appropriately adjusting the strengths of the interconnections according to input data. Moreover, a neural network adapts easily to new environments by learning, and it can deal with information that is noisy, inconsistent, vague, or probabilistic. These features have motivated extensive research and development in artificial neural networks. This volume is probably the first broadly comprehensive treatment devoted to the areas of algorithms and architectures for the realization of neural network systems. Techniques and diverse methods in numerous areas of this broad subject are presented. In addition, various major neural network structures for achieving effective systems are presented and, in all cases, illustrated by examples. Numerous other techniques and subjects related to this broadly significant area are also treated.


The remarkable breadth and depth of the advances in neural network systems, with their many substantive applications both realized and yet to be realized, make it quite evident that adequate treatment of this broad area requires a number of distinctly titled but well integrated volumes. This is the first of seven volumes on the subject of neural network systems, and it is entitled Algorithms and Architectures. The entire set of seven volumes comprises:

Volume 1: Algorithms and Architectures
Volume 2: Optimization Techniques
Volume 3: Implementation Techniques
Volume 4: Industrial and Manufacturing Systems
Volume 5: Image Processing and Pattern Recognition
Volume 6: Fuzzy Logic and Expert Systems Applications
Volume 7: Control and Dynamic Systems

The first contribution to Volume 1 is "Statistical Theories of Learning in Radial Basis Function Networks," by Jason A. S. Freeman, Mark J. L. Orr, and David Saad. Many heuristic techniques are described in the neural network literature for performing various tasks within the supervised learning paradigm, such as optimizing training, selecting an appropriately sized network, and predicting how much data will be required to achieve a particular generalization performance. This contribution explores these issues in a theoretically well-founded manner for the radial basis function network. It treats issues such as using cross-validation to select network size, growing networks, regularization, and the determination of the average- and worst-case generalization performance. Numerous illustrative examples are included which clearly demonstrate the substantive effectiveness of the techniques presented here.
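The idea of selecting a radial basis function network's size on held-out data can be sketched briefly. The following is a minimal illustration, not the chapter's method: it assumes Gaussian basis functions, centers placed on an evenly spaced subset of the training inputs, and ridge-regularized least squares for the output weights, then compares candidate network sizes by validation error.

```python
import numpy as np

def rbf_design(X, centers, width):
    # Gaussian basis functions: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, n_centers, width=1.0, ridge=1e-8):
    # Centers on an evenly spaced subset of the training inputs
    # (a simple heuristic; more principled placements exist).
    idx = np.linspace(0, len(X) - 1, n_centers).astype(int)
    centers = X[idx]
    Phi = rbf_design(X, centers, width)
    # Regularized least squares for the linear output weights
    w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_centers), Phi.T @ y)
    return centers, w

def rbf_predict(X, centers, w, width=1.0):
    return rbf_design(X, centers, width) @ w

# Select the network size by validation error on held-out points
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 60)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
train, val = np.arange(0, 60, 2), np.arange(1, 60, 2)

best_size, best_err = None, np.inf
for n in (2, 4, 8, 16):
    centers, w = fit_rbf(X[train], y[train], n)
    err = np.mean((rbf_predict(X[val], centers, w) - y[val]) ** 2)
    if err < best_err:
        best_size, best_err = n, err
```

Too few centers underfit the target and show up as high validation error; the selection loop plays the role that cross-validation plays in the chapter's more rigorous treatment.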

The next contribution is "The Synthesis of Three-Layer Threshold Networks," by Jung Hwan Kim, Sung-Kwon Park, Hyunseo Oh, and Youngnam Han. In 1969, Minsky and Papert (reference listed in the contribution) demonstrated that two-layer perceptron networks were inadequate for many real-world problems, such as the exclusive-OR function and the parity functions, which are not linearly separable.
Although Minsky and Papert recognized that three-layer threshold networks can in principle solve many such problems, they considered it unlikely that a training method could be developed to find these networks. This contribution presents a learning algorithm, called expand-and-truncate learning, that synthesizes a three-layer threshold network with guaranteed convergence for an arbitrary switching function. Evidently, no algorithm for synthesizing a threshold network for an arbitrary switching function had been found to date. The most significant aspect of this contribution is a synthesis algorithm for three-layer threshold networks that guarantees convergence for any switching function, including linearly inseparable functions, and that automatically determines the required number of threshold elements in the hidden layer. A number of illustrative examples demonstrate the effectiveness of the techniques.
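The linear-separability issue above can be made concrete: no single threshold unit computes exclusive-OR, but a three-layer threshold network with two hidden units does. The sketch below hand-wires such a network; the weights are hypothetical choices for illustration only and are not produced by the expand-and-truncate learning algorithm described in the chapter.

```python
def step(z):
    # Hard-limiting threshold unit: fires when the weighted sum reaches 0
    return 1 if z >= 0 else 0

def xor_net(x1, x2):
    # Hidden layer: two threshold units with parallel decision hyperplanes
    h1 = step(x1 + x2 - 0.5)   # OR: at least one input is 1
    h2 = step(x1 + x2 - 1.5)   # AND: both inputs are 1
    # Output unit computes h1 AND NOT h2, i.e., exclusive-OR
    return step(h1 - h2 - 0.5)

truth_table = {(a, b): xor_net(a, b) for a in (0, 1) for b in (0, 1)}
```

The two hidden units partition the input plane with parallel hyperplanes, and the output unit combines the resulting half-spaces; decompositions of this kind are what make linearly inseparable functions reachable with a single hidden layer.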

