AN ADAPTIVE LEAST SQUARES ALGORITHM FOR THE EFFICIENT TRAINING OF ARTIFICIAL NEURAL NETWORKS
Abstract
Recently, a number of publications have proposed alternative methods to be applied in least mean square (LMS) algorithms in order to improve the convergence rate. It has also been shown that variable step size methods can provide better convergence speed than fixed step size ones. This paper introduces a new algorithm for the ongoing calculation of the step size and investigates its applicability to the training of multilayer neural networks. The proposed method appears to be efficient, at least in the case of low-level additive input noise.
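The abstract does not spell out the paper's specific step-size adaptation law, so the sketch below is only a generic illustration of the underlying idea: an LMS filter whose step size is recomputed at every iteration from the instantaneous error. The recursion used here is a Kwong-Johnston style rule with hypothetical constants (alpha, gamma, mu limits), not the algorithm proposed in the paper, and it assumes NumPy.

```python
import numpy as np

def vss_lms(x, d, num_taps=4, mu_init=0.01, mu_min=1e-4, mu_max=0.1,
            alpha=0.97, gamma=4.8e-4):
    """Variable step size (VSS) LMS sketch; constants are illustrative only."""
    w = np.zeros(num_taps)                 # adaptive filter weights
    mu = mu_init                           # step size, adapted on-line
    errors = np.zeros(len(d))
    for n in range(num_taps - 1, len(d)):
        u = x[n - num_taps + 1:n + 1][::-1]  # x[n], x[n-1], ..., x[n-num_taps+1]
        y = w @ u                            # filter output
        e = d[n] - y                         # instantaneous error
        w = w + mu * e * u                   # standard LMS weight update
        # Step-size recursion (Kwong-Johnston style): grows while the error is
        # large, shrinks as the error dies out; clipped to keep the update stable.
        mu = float(np.clip(alpha * mu + gamma * e**2, mu_min, mu_max))
        errors[n] = e
    return w, errors

# Example use: identify a hypothetical unknown FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.7, -0.3, 0.2, 0.1])                 # unknown system
x = rng.standard_normal(2000)                        # input signal
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, e = vss_lms(x, d, num_taps=4)                 # w_hat should approach h
```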
Keywords:
adaptive step size, training of neural networks
How to Cite
Várkonyi-Kóczy, A. R. “AN ADAPTIVE LEAST SQUARES ALGORITHM FOR THE EFFICIENT TRAINING OF ARTIFICIAL NEURAL NETWORKS”, Periodica Polytechnica Electrical Engineering, 37(2), pp. 119–129, 1993.