
Lazy training

Computer Sciences, Brigham Young University Faculty Publications, 2002


Full-Text Articles in Physical Sciences and Mathematics

Improving Speech Recognition Learning Through Lazy Training, Tony R. Martinez, Michael E. Rimer, D. Randall Wilson, May 2002


Multi-layer backpropagation, like most learning algorithms that can create complex decision surfaces, is prone to overfitting. We present a novel approach, called lazy training, for reducing overfitting in multi-layer networks. Lazy training consistently reduces the generalization error of optimized neural networks by more than half on a large OCR dataset and on several real-world problems from the UCI machine learning repository. Here, lazy training is shown to be effective in a multi-layer adaptive learning system, reducing the error of an optimized backpropagation network in a speech recognition system by 50.0% on the TIDIGITS corpus.
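
The abstract does not spell out the lazy training update rule. As a rough illustration only, the sketch below shows one common reading of the idea: a small multi-layer network trained with stochastic backpropagation in which weights are updated only on patterns the network currently misclassifies, rather than pushing every output toward an extreme target. The data, architecture, and hyperparameters are hypothetical and are not taken from the paper.

# Hypothetical sketch of a lazy-training-style rule: backpropagate error
# only for misclassified patterns. This reading is an assumption; the
# abstract above does not define the algorithm.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 2-class problem: two noisy Gaussian blobs (hypothetical data).
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),
               rng.normal(+1.0, 1.0, (100, 2))])
y = np.hstack([np.zeros(100), np.ones(100)])

# One-hidden-layer MLP with randomly initialized weights.
W1 = rng.normal(0, 0.5, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1))
b2 = np.zeros(1)
lr = 0.1

for epoch in range(200):
    for i in rng.permutation(len(X)):
        x, t = X[i], y[i]
        h = sigmoid(x @ W1 + b1)      # hidden activations
        o = sigmoid(h @ W2 + b2)[0]   # network output in (0, 1)

        # Lazy step: skip the update when the pattern is already
        # classified correctly, instead of driving the output toward
        # an extreme 0/1 target.
        if (o >= 0.5) == bool(t):
            continue

        # Standard backprop delta-rule update for the misclassified pattern.
        delta_o = (o - t) * o * (1 - o)
        delta_h = (W2[:, 0] * delta_o) * h * (1 - h)
        W2 -= lr * np.outer(h, [delta_o])
        b2 -= lr * delta_o
        W1 -= lr * np.outer(x, delta_h)
        b1 -= lr * delta_h

preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)[:, 0] >= 0.5
print("training accuracy:", (preds == y.astype(bool)).mean())

The only change from standard stochastic backpropagation is the skip test before the weight update; every other line is the usual delta-rule update, which is why the approach leaves correctly classified patterns alone and thereby avoids fitting them more tightly than the classification decision requires.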