Automatic Learning Rate Maximization by On-Line Estimation of the Hessian's Eigenvectors
LeCun, Yann; Simard, Patrice Y.; Pearlmutter, Barak A.
We propose a very simple and well-principled way of computing the optimal step size in gradient descent algorithms. The on-line version is computationally very efficient and is applicable to large backpropagation networks trained on large data sets. The main ingredient is a technique for estimating the principal eigenvalue(s) and eigenvector(s) of the objective function's second derivative matrix (Hessian), which does not require computing the Hessian itself. Several other applications of this technique are proposed for speeding up learning or for eliminating useless parameters.
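A minimal sketch (not the authors' code) of the core idea described in the abstract: estimate the Hessian's principal eigenvalue by power iteration, using finite-difference Hessian-vector products so the Hessian is never formed explicitly, then set the step size inversely to that eigenvalue. The toy quadratic objective, the epsilon, and the iteration count below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hessian_vector_product(grad_fn, w, v, eps=1e-4):
    """Approximate H @ v as (grad(w + eps*v) - grad(w)) / eps (Hessian never formed)."""
    return (grad_fn(w + eps * v) - grad_fn(w)) / eps

def principal_eigenvalue(grad_fn, w, n_iters=50, eps=1e-4, rng=None):
    """Power iteration on the Hessian using only gradient evaluations."""
    rng = np.random.default_rng() if rng is None else rng
    v = rng.standard_normal(w.shape)
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(n_iters):
        hv = hessian_vector_product(grad_fn, w, v, eps)
        lam = float(v @ hv)                      # Rayleigh quotient estimate of the top eigenvalue
        v = hv / (np.linalg.norm(hv) + 1e-12)    # re-normalize for the next iteration
    return lam, v

# Toy quadratic f(w) = 0.5 * w^T A w, whose largest Hessian eigenvalue is known (10).
A = np.diag([10.0, 3.0, 1.0])
grad_fn = lambda w: A @ w
w = np.array([1.0, -2.0, 0.5])

lam_max, _ = principal_eigenvalue(grad_fn, w)
print(f"estimated largest eigenvalue: {lam_max:.3f}")       # approximately 10
print(f"suggested step size ~ 1/lambda_max: {1.0 / lam_max:.3f}")
```

In a network setting, `grad_fn` would be the backpropagated gradient of the training objective; the paper's on-line variant updates this estimate as training proceeds rather than running a separate power-iteration loop.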
Keyword(s): Automatic Learning Rate Maximization; On-Line Estimation; Hessian's Eigenvectors
Publication Date: 1993
Type: Book chapter
Peer-Reviewed: Yes
Institution: Maynooth University
Citation(s): LeCun, Yann and Simard, Patrice Y. and Pearlmutter, Barak A. (1993) Automatic Learning Rate Maximization by On-Line Estimation of the Hessian's Eigenvectors. In: Advances in Neural Information Processing Systems 6: Proceedings of the annual Conference on Advances in Neural Information Processing Systems 1993. Neural Information Processing Systems (NIPS). ISBN 9781558603226
Publisher(s): Neural Information Processing Systems (NIPS)
File Format(s): other
Related Link(s): http://mural.maynoothuniversity.ie/8137/1/BP-Automatic-1993.pdf
First Indexed: 2020-04-02 06:32:21 Last Updated: 2020-04-02 06:32:21