Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Discipline: Computer Sciences
Institution: Brigham Young University
Publication Year: 2002
Keyword: Generalization


Full-Text Articles in Entire DC Network

Improving Speech Recognition Learning Through Lazy Training, Tony R. Martinez, Michael E. Rimer, D. Randall Wilson, May 2002

Series: Faculty Publications

Multi-layer backpropagation, like most learning algorithms that can create complex decision surfaces, is prone to overfitting. We present a novel approach, called lazy training, for reducing overfitting in multi-layer networks. Lazy training consistently reduces the generalization error of optimized neural networks by more than half on a large OCR dataset and on several real-world problems from the UCI Machine Learning Repository. Here, lazy training is shown to be effective in a multi-layered adaptive learning system, reducing the error of an optimized backpropagation network in a speech recognition system by 50.0% on the TIDIGITS corpus.
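The abstract does not spell out the mechanics of lazy training, so the sketch below is a hedged illustration of one common reading of the idea: backpropagation updates are applied only to patterns the network currently misclassifies, so patterns that are already handled correctly contribute nothing further and the decision surface is kept from over-fitting them. The LazyMLP class, layer sizes, learning rate, and skip-if-correct rule are all illustrative assumptions, not details taken from the paper itself.

```python
import numpy as np

# Minimal sketch of a lazy-training-style update rule (assumed reading):
# standard backprop is applied only when the network misclassifies a
# pattern; correctly classified patterns are skipped entirely.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LazyMLP:
    def __init__(self, n_in, n_hidden, n_out, lr=0.1):
        # Small random weights for a single-hidden-layer network.
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)   # hidden activations
        self.o = sigmoid(self.h @ self.W2)  # output activations
        return self.o

    def train_pattern(self, x, target):
        o = self.forward(x)
        # Lazy step (assumption): if this pattern is already classified
        # correctly, skip backpropagation for it entirely.
        if np.argmax(o) == np.argmax(target):
            return False
        # Ordinary backprop update, applied only to misclassified patterns.
        delta_o = (o - target) * o * (1 - o)
        delta_h = (delta_o @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * np.outer(self.h, delta_o)
        self.W1 -= self.lr * np.outer(x, delta_h)
        return True

# Toy usage: a two-class problem with one-hot targets. Training stops
# once no pattern in the set triggers an update.
net = LazyMLP(n_in=4, n_hidden=8, n_out=2)
X = rng.normal(size=(20, 4))
y = np.eye(2)[(X[:, 0] > 0).astype(int)]  # label derived from first feature
for epoch in range(200):
    updates = sum(net.train_pattern(x, t) for x, t in zip(X, y))
    if updates == 0:  # every pattern classified correctly
        break
```

In this reading, the network stops refining regions of the input space it already classifies correctly, which would plausibly account for the reduced generalization error the abstract reports; the exact criterion the authors use may differ from the simple skip-if-correct rule shown here.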