Globally optimal learning rates in multilayer neural networks

Saad, David and Rattray, Magnus (1998). Globally optimal learning rates in multilayer neural networks. Philosophical Magazine Part B, 77 (5), pp. 1523-1530.

Abstract

A method for calculating the globally optimal learning rate in on-line gradient-descent training of multilayer neural networks is presented. The method is based on a variational approach which maximizes the decrease in generalization error over a given time frame. We demonstrate the method by computing optimal learning rates in typical learning scenarios. The method can also be employed when different learning rates are allowed for different parameter vectors, and to assess the relevance of related training algorithms based on modifications to the basic gradient-descent rule.
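The setting the abstract describes — on-line gradient descent with a time-dependent learning rate, evaluated by the decrease in generalization error — can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's method: it uses a linear student tracking a fixed teacher, and a simple annealed schedule in place of the variationally optimized learning rate computed by Saad and Rattray.

```python
import random

# Illustrative sketch only (assumed setup, not the paper's formalism):
# a linear "student" learns a fixed "teacher" vector from random inputs
# by on-line gradient descent. The 1/t-style annealing schedule below
# stands in for the globally optimal schedule derived in the paper.

random.seed(0)
N = 20                                            # input dimension
teacher = [1.0 if i % 2 == 0 else -1.0 for i in range(N)]
student = [0.0] * N

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gen_error(w):
    # squared weight-space distance to the teacher, as an
    # error proxy (the paper works with the generalization error)
    return sum((wi - ti) ** 2 for wi, ti in zip(w, teacher)) / N

errors = [gen_error(student)]
for t in range(1, 2001):
    x = [random.gauss(0.0, 1.0) for _ in range(N)]
    delta = dot(teacher, x) - dot(student, x)     # teacher/student mismatch
    eta = 0.5 / (1.0 + t / 500.0)                 # assumed annealed learning rate
    scale = eta * delta / N
    student = [w + scale * xi for w, xi in zip(student, x)]
    errors.append(gen_error(student))

print(errors[0], errors[-1])
```

Running the sketch shows the error decreasing from its initial value as training proceeds; the point of the paper is that the schedule `eta(t)` can instead be chosen variationally to maximize that decrease over a fixed time window.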

Publication DOI: https://doi.org/10.1080/13642819808205044
Divisions: Engineering & Applied Sciences > Mathematics
Engineering & Applied Sciences > Systems analytics research institute (SARI)
Additional Information: Proceedings of the MINERVA workshop on Mesoscopics, Fractals and Neural Networks, 25-27 March 1997, Eilat (IL). This is an electronic version of an article published in Saad, David and Rattray, Magnus (1998). Globally optimal learning rates in multilayer neural networks. Philosophical Magazine Part B, 77 (5), pp. 1523-1530. Philosophical Magazine Part B is available online at: http://www.informaworld.com/openurl?genre=article&issn=1364-2812&volume=77&issue=5&spage=1523
Uncontrolled Keywords: optimal learning rate, gradient descent, multilayer neural networks, variational approach, generalization error
Related URLs: http://www.scop ... tnerID=8YFLogxK (Scopus URL)
http://www.info ... ue=5&spage=1523 (Publisher URL)
Published Date: 1998-05
Authors: Saad, David (0000-0001-9821-2623)
Rattray, Magnus

Version: Accepted Version

