Learning with noise and regularizers in multilayer neural networks


We study the effect of two types of noise, data noise and model noise, in an on-line gradient-descent learning scenario for a general two-layer student network with an arbitrary number of hidden units. Training examples are randomly drawn input vectors labeled by a two-layer teacher network with an arbitrary number of hidden units. The training data are then corrupted by Gaussian noise affecting either the teacher output or the model itself. We examine the effect of both types of noise on the evolution of the order parameters and the generalization error in the various phases of the learning process.
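The student-teacher scenario described in the abstract can be sketched in a few lines of simulation code. The toy implementation below is an illustrative assumption, not the paper's method: it uses a tanh soft-committee machine (the paper's analysis uses erf activations and the large-N limit), and all sizes, rates, and noise levels are chosen arbitrarily. It shows output ("data") noise corrupting the teacher label before each on-line gradient step.

```python
import numpy as np

# Hypothetical sketch of on-line gradient-descent learning for a
# two-layer student trained on examples labeled by a two-layer teacher,
# with Gaussian noise added to the teacher output (data noise).
# All names and values are illustrative, not from the paper.

rng = np.random.default_rng(0)

N = 100        # input dimension
M = 3          # teacher hidden units
K = 3          # student hidden units
eta = 0.1      # learning rate
sigma = 0.05   # std of the Gaussian output noise

g = np.tanh    # hidden-unit activation (stand-in for the paper's erf)

B = rng.standard_normal((M, N))               # fixed teacher weights
J = rng.standard_normal((K, N)) / np.sqrt(N)  # student weights

def output(W, x):
    """Soft-committee output: sum of hidden-unit activations."""
    return g(W @ x / np.sqrt(N)).sum()

for step in range(10_000):
    x = rng.standard_normal(N)                        # random input example
    y = output(B, x) + sigma * rng.standard_normal()  # noisy teacher label
    h = J @ x / np.sqrt(N)                            # student local fields
    err = output(J, x) - y
    # on-line gradient step on the squared error (1/2) * err**2;
    # d(tanh)/dh = 1 - tanh(h)**2
    J -= (eta / np.sqrt(N)) * err * (1 - g(h) ** 2)[:, None] * x[None, :]
```

Model noise, by contrast, would perturb the weights (or activations) of the network itself rather than the label; the same loop applies with the noise term moved inside the student's forward pass.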

Divisions: College of Engineering & Physical Sciences > Systems analytics research institute (SARI)
Additional Information: Copyright of Massachusetts Institute of Technology Press (MIT Press)
Uncontrolled Keywords: noise, data noise, model noise, gradient-descent learning, vectors, Gaussian noise, error
Publication ISSN: 1049-5258
Last Modified: 27 Jun 2024 07:35
Date Deposited: 08 Jul 2009 11:18
Related URLs: http://www.scop ... tnerID=8YFLogxK (Scopus URL)
http://mitpress ... type=2&tid=3990 (Publisher URL)
PURE Output Type: Article
Published Date: 1996
Authors: Saad, David (ORCID Profile 0000-0001-9821-2623)
Solla, Sara A.



Version: Published Version


