TRAINING ALGORITHMS FOR THE SINGLE LAYER PERCEPTRON

Authors

  • Noury Elhadi
  • Klára Cséfalvay

Abstract

The perceptron is essentially an adaptive linear combiner whose output is quantized to one of two possible states; it is the basic building block of multilayer feedforward neural networks. This paper describes learning algorithms for the perceptron. Each algorithm is viewed as a steepest descent method in which an instantaneous performance function is iteratively minimized. A new performance function is introduced, and a new algorithm that increases the learning speed is developed. The advantages of the new algorithm are demonstrated in computer experiments.
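For orientation, the following is a minimal sketch in Python of the classic error-correction (perceptron) rule, illustrating the structure the paper assumes: an adaptive linear combiner followed by a hard quantizer, trained by corrective weight updates. The learning rate, epoch limit, and the AND-gate example are illustrative assumptions, and this sketch is not the new algorithm developed in the paper.

import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=50):
    """Classic error-correction rule for a single-layer perceptron.

    X : (n_samples, n_features) input patterns
    y : (n_samples,) desired outputs in {-1, +1}
    Returns the learned weight vector (bias folded in as the last weight).
    """
    # Fold the bias into the weight vector by appending a constant input of 1.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])

    for _ in range(epochs):
        errors = 0
        for x_i, d_i in zip(Xb, y):
            # Adaptive linear combiner followed by a hard quantizer (sign).
            s = np.dot(w, x_i)
            out = 1.0 if s >= 0 else -1.0
            # Error-correction update: adjust the weights only when the
            # quantized output disagrees with the desired response.
            if out != d_i:
                w += lr * d_i * x_i
                errors += 1
        if errors == 0:  # linearly separable data: training has converged
            break
    return w

if __name__ == "__main__":
    # Toy linearly separable problem: logical AND mapped to {-1, +1}.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, -1, -1, 1], dtype=float)
    w = train_perceptron(X, y)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    print("weights:", w)
    print("outputs:", np.where(Xb @ w >= 0, 1, -1))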

Keywords

neural networks, adaptive linear element, error correction rules, steepest descent rules

How to Cite

Elhadi, N., Cséfalvay, K. “TRAINING ALGORITHMS FOR THE SINGLE LAYER PERCEPTRON”, Periodica Polytechnica Electrical Engineering, 37(3), pp. 237–250, 1993.

Issue

Vol. 37 No. 3 (1993)
Section

Articles