Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons


Articles 1 - 3 of 3

Full-Text Articles in Physical Sciences and Mathematics

Data Mining With Newton's Method, James Dale Cloyd Dec 2002


Electronic Theses and Dissertations

Capable, well-organized data mining algorithms are fundamental to successful knowledge discovery in databases. We discuss several data mining algorithms, including genetic algorithms (GAs). In addition, we propose a modified multivariate Newton's method (NM) approach to mining technical data. Several strategies are employed to stabilize Newton's method against pathological function behavior. NM is compared to GAs and to the simplex evolutionary operation algorithm (EVOP). We find that GAs, NM, and EVOP all perform efficiently for well-behaved global optimization functions, with NM providing an exponential improvement in convergence rate. For local optimization problems, we …
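The abstract mentions stabilizing a multivariate Newton's method against pathological function behavior. The thesis's actual stabilization strategies are not spelled out here; as a minimal hedged sketch of one common approach, the code below adds a Levenberg-style ridge term so the Newton system stays solvable when the Hessian is singular or indefinite. The function, damping value, and gradient-descent fallback are all illustrative assumptions, not the author's method.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50, damping=1e-4):
    """Damped multivariate Newton's method for minimization (illustrative).

    Stabilizes against ill-conditioned Hessians with a ridge term and
    falls back to the gradient direction if the solve still fails.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # Ridge regularization: keeps the Newton system solvable even
        # when H is singular or indefinite (pathological behavior).
        H_reg = H + damping * np.eye(len(x))
        try:
            step = np.linalg.solve(H_reg, g)
        except np.linalg.LinAlgError:
            step = g  # fall back to plain gradient descent
        x = x - step
    return x

# Hypothetical test function: f(x, y) = (x - 3)^2 + 2*(y + 1)^2
grad = lambda v: np.array([2.0 * (v[0] - 3.0), 4.0 * (v[1] + 1.0)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 4.0]])
x_star = newton_minimize(grad, hess, np.array([0.0, 0.0]))
```

For this quadratic the damped iteration converges to the minimizer near (3, -1) in a handful of steps; the exponential (quadratic) convergence rate the abstract attributes to NM is what makes it attractive against GAs and EVOP on well-behaved functions.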


Improving Speech Recognition Learning Through Lazy Training, Tony R. Martinez, Michael E. Rimer, D. Randall Wilson May 2002


Faculty Publications

Multi-layer backpropagation, like most learning algorithms that can create complex decision surfaces, is prone to overfitting. We present a novel approach, called lazy training, for reducing overfitting in multi-layer networks. Lazy training consistently reduces the generalization error of optimized neural networks by more than half on a large OCR dataset and on several real-world problems from the UCI machine learning repository. Here, lazy training is shown to be effective in a multi-layered adaptive learning system, reducing the error of an optimized backpropagation network in a speech recognition system by 50.0% on the TIDIGITS corpus.
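The exact lazy-training criterion belongs to the cited paper; as a rough illustration of the general idea — applying an error-driven update only to patterns the model currently misclassifies, rather than to every pattern — here is a toy single logistic unit on synthetic linearly separable data. The dataset, learning rate, and stopping criterion are all assumptions for the sketch, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: label = 1 iff x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(20):  # epochs
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + np.exp(-(xi @ w + b)))  # sigmoid output
        # "Lazy" criterion (illustrative): update only when the
        # thresholded output is wrong, instead of on every pattern.
        if (p >= 0.5) != bool(yi):
            grad = p - yi              # dLoss/dz for logistic loss
            w -= lr * grad * xi
            b -= lr * grad

acc = np.mean(((X @ w + b) >= 0) == (y == 1))
```

By skipping updates on already-correct patterns, the model stops chasing small residual errors on easy examples — the intuition behind lazy training's reduced overfitting, though the published method applies this idea inside full multi-layer backpropagation.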


The Effect Of Model Formulation On The Comparative Performance Of Artificial Neural Networks And Regression, Michael F. Cochrane Apr 2002


Engineering Management & Systems Engineering Theses & Dissertations

Multiple linear regression techniques have traditionally been used to construct predictive statistical models relating one or more independent variables (inputs) to a dependent variable (output). Artificial neural networks can also be constructed and trained to learn these complex relationships and have been shown to perform at least as well as linear regression on the same data sets. Research on the use of neural network models as alternatives to multivariate linear regression has focused predominantly on the effects of sample size, noise, and input vector size on the comparative performance of these two modeling techniques. However, research has also shown that …
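As a hedged sketch of the kind of head-to-head comparison the abstract describes (not the dissertation's data, models, or formulation), the code below fits multiple linear regression via least squares and a small one-hidden-layer tanh network to the same synthetic data, then reports each model's mean squared error. Every data-generating choice and hyperparameter here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: linear terms plus a mild interaction and noise.
X = rng.uniform(-1, 1, size=(300, 2))
y = (1.5 * X[:, 0] - 0.8 * X[:, 1]
     + 0.5 * X[:, 0] * X[:, 1]
     + 0.05 * rng.normal(size=300))

# Multiple linear regression via least squares (intercept + 2 inputs).
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
mse_lr = np.mean((A @ beta - y) ** 2)

# One-hidden-layer network (8 tanh units) trained by batch gradient descent.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0
lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    pred = H @ W2 + b2
    err = pred - y                     # gradient of MSE w.r.t. pred (scaled)
    gW2 = H.T @ err / len(X); gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H ** 2)   # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
mse_nn = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
```

The regression model cannot represent the interaction term, while the network can approximate it; how the gap between `mse_lr` and `mse_nn` behaves as the data-generating formulation changes is exactly the kind of question the dissertation investigates.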