BACKPROPAGATION TRAINING ALGORITHM WITH ADAPTIVE PARAMETERS TO SOLVE DIGITAL PROBLEMS

Abstract
An efficient technique, backpropagation training with adaptive parameters based on Lyapunov stability theory, is proposed for training a single-hidden-layer feedforward network. A three-layered feedforward neural network architecture is used to solve the selected problems, and the network is trained in sequential mode. Lyapunov stability theory is employed to ensure fast, steady-state error convergence and to construct an energy surface with a single global minimum point through adaptive adjustment of the weights and the adaptive parameter β. To avoid entrapment in local minima, an adaptive backpropagation algorithm based on Lyapunov stability theory is used, which gives the algorithm the ability to attain a single global minimum point. The adaptive learning parameter in the algorithm is chosen to yield faster error convergence, and the resulting error converges asymptotically to zero in accordance with Lyapunov stability theory. The performance of the adaptive backpropagation algorithm is evaluated on the parity, half adder and full adder problems.
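To make the setup concrete, the following is a minimal sketch of a single-hidden-layer feedforward network trained with sequential (pattern-by-pattern) backpropagation on the 2-bit parity (XOR) problem. The adaptive step `beta` below is a simple stand-in heuristic; the paper's actual Lyapunov-derived adaptation law is not stated in the abstract, so the adaptation rule, hidden-layer size, and stopping threshold here are all illustrative assumptions.

```python
# Sketch only: sequential backpropagation on 2-bit parity (XOR) with an
# adaptive learning parameter "beta". The adaptation rule below is a
# hypothetical heuristic, NOT the paper's Lyapunov-based law.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# 2-bit parity (XOR) patterns and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1                      # assumed layer sizes
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))    # input -> hidden weights
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))   # hidden -> output weights
b2 = np.zeros(n_out)

beta = 0.5            # adaptive learning parameter (heuristic adaptation below)
prev_sse = np.inf

for epoch in range(5000):
    sse = 0.0
    for x, t in zip(X, T):                        # sequential training mode
        # forward pass
        h = sigmoid(x @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        e = t - y
        sse += float(e @ e)

        # backward pass (standard delta rule for sigmoid units)
        delta_out = e * y * (1 - y)
        delta_hid = (delta_out @ W2.T) * h * (1 - h)

        # weight updates scaled by the adaptive parameter beta
        W2 += beta * np.outer(h, delta_out)
        b2 += beta * delta_out
        W1 += beta * np.outer(x, delta_hid)
        b1 += beta * delta_hid

    # hypothetical adaptation: grow beta while the error keeps falling,
    # shrink it when the error rises (stands in for the Lyapunov-based rule)
    beta = min(beta * 1.01, 2.0) if sse < prev_sse else max(beta * 0.5, 0.01)
    prev_sse = sse
    if sse < 1e-4:
        break

print(f"epochs: {epoch + 1}, final SSE: {sse:.6f}, beta: {beta:.3f}")
```

The half adder and full adder problems mentioned in the abstract can be handled with the same loop by replacing `X` and `T` with the corresponding truth tables and widening the output layer to two outputs (sum and carry).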

Authors
R. Saraswathi
The American College, Tamil Nadu, India

Keywords
Single Hidden Layer, Lyapunov Stability Theory, Adaptive Learning Parameter
Published By: ICTACT
Published In: ICTACT Journal on Soft Computing (Volume: 1, Issue: 3)
Date of Publication: January 2011

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.