Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 6 of 6

Full-Text Articles in Entire DC Network

Head Related Transfer Function Approximation Using Neural Networks, John K. Millhouse Dec 1994

Theses and Dissertations

This thesis determines whether an artificial neural network (ANN) can approximate the Armstrong Aerospace Medical Research Laboratory (AAMRL) head-related transfer function (HRTF) data obtained from research at AAMRL during the fall of 1988. The first test determines whether the HRTF lends any support to sound localization when compared to no HRTF (interaural time delay only). There is a statistically significant interaction between the location of the sound and whether or not the HRTF is used. When this interaction is removed using the alternate F-value, the statistics indicate equal means for the filters and azimuths. This means …
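
The approximation task at the heart of this work can be sketched as fitting a small feedforward net to a direction- and frequency-dependent filter magnitude. The sketch below substitutes a synthetic response for the AAMRL measurements; the inputs, network size, and training settings are illustrative assumptions, not details from the thesis.

```python
import numpy as np

# Fit a one-hidden-layer net to a synthetic HRTF-like magnitude surface.
# The target function below is a stand-in for the AAMRL data (assumption).
rng = np.random.default_rng(2)

az = rng.uniform(-np.pi, np.pi, 500)           # source azimuth (radians)
fr = rng.uniform(0.0, 1.0, 500)                # normalized frequency
X = np.column_stack([az, fr, np.ones(500)])    # inputs plus a bias column
y = (np.cos(az) * np.exp(-3.0 * fr))[:, None]  # stand-in magnitude response

W1 = rng.normal(0, 0.5, (3, 16))               # input-to-hidden weights
W2 = rng.normal(0, 0.5, (16, 1))               # hidden-to-output weights
lr = 0.05
for _ in range(3000):                          # plain batch gradient descent
    h = np.tanh(X @ W1)
    pred = h @ W2
    g = (pred - y) / len(y)                    # MSE gradient (up to a factor of 2)
    W2 -= lr * h.T @ g
    W1 -= lr * X.T @ ((g @ W2.T) * (1.0 - h ** 2))
print("final mean squared error:", float(np.mean((pred - y) ** 2)))
```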


Embedology And Neural Estimation For Time Series Prediction, Robert E. Garza Dec 1994

Theses and Dissertations

Time series prediction has widespread application, ranging from predicting the stock market to predicting the future locations of Scud missiles. Recent work by Sauer and Casdagli has developed into the embedology theorem, which sets forth the procedures for state-space manipulation and reconstruction for time series prediction. This includes embedding the time series into a higher-dimensional space in order to form an attractor, a structure defined by the embedded vectors. Embedology is combined with neural technologies in an effort to create a more accurate prediction algorithm. These algorithms consist of embedology, neural networks, Euclidean-space nearest neighbors, and …
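
The embedding-plus-neighbors combination the abstract describes is easy to make concrete: a scalar series is mapped to delay vectors, and the Euclidean nearest neighbor in the reconstructed space supplies the prediction. The sketch below is a minimal version of that idea; the delay, embedding dimension, and test signal are arbitrary choices, not values from the thesis.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Map a scalar series to dim-dimensional delay vectors with delay tau."""
    n = len(series) - (dim - 1) * tau
    return np.array([series[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def nn_predict(series, dim=3, tau=1, horizon=1):
    """Predict series[t + horizon] from the nearest embedded neighbor."""
    vectors = delay_embed(series, dim, tau)
    query = vectors[-1]                        # the most recent state
    candidates = vectors[:-horizon]            # states whose future is known
    dists = np.linalg.norm(candidates - query, axis=1)
    k = int(np.argmin(dists))                  # Euclidean nearest neighbor
    # The neighbor's observed future is the prediction for the query.
    return series[k + (dim - 1) * tau + horizon]

# Example: one-step prediction on a noisy sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.01 * np.random.default_rng(0).normal(size=len(t))
print(nn_predict(x, dim=3, tau=10, horizon=1))
```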


Feature And Model Selection In Feedforward Neural Networks, Jean M. Steppe Jun 1994

Theses and Dissertations

This research advances feature and model selection for feedforward neural networks. Feature selection involves determining a good feature subset given a set of candidate features. Model selection involves determining an appropriate architecture (number of middle nodes) for the neural network. Specific advances are made in neural network feature saliency metrics used for evaluating or ranking features, statistical identification of irrelevant (noisy) features, and statistical investigation of reduced neural network architectures and reduced feature subsets. New feature saliency metrics are presented which provide a more succinct quantitative measure of a feature's importance than other similar metrics. A catalogue of feature saliency …
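
As one concrete example of a feature saliency metric (a simple weight-based score from this general literature, not necessarily one of the new metrics the thesis introduces), each input can be ranked by the summed magnitude of its first-layer weights; an irrelevant noisy input tends to score low after training.

```python
import numpy as np

def weight_saliency(W1):
    """W1: hidden-by-input weight matrix (bias column excluded).
    Returns one non-negative saliency score per input feature."""
    return np.abs(W1).sum(axis=0)

# Two hidden nodes, three candidate features; the third column mimics the
# near-zero weights an irrelevant noisy feature acquires during training.
W1 = np.array([[2.1, -1.8, 0.05],
               [-1.6, 2.2, -0.03]])
print(weight_saliency(W1))  # third feature scores near zero
```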


Subgrouped Real Time Recurrent Learning Neural Networks, Jeffrey S. Dean May 1994

Theses and Dissertations

A subgrouped Real Time Recurrent Learning (RTRL) network was evaluated. The one-layer net successfully learns the XOR problem and can be trained to perform time-dependent functions. The net was tested as a predictor of a signal's behavior based on its past behavior. While the net was not able to predict the signal's future behavior, it tracked the signal closely. The net was also tested as a classifier for time-varying phenomena: the differentiation of five classes of vehicle images based on features extracted from the visual information. The net achieved a 99.2% accuracy in recognizing the …
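
RTRL trains a fully recurrent net online by carrying a sensitivity tensor dy_k/dW_ij forward alongside the activations; subgrouping reduces the cost of that tensor by updating it block by block. The sketch below implements plain (non-subgrouped) RTRL on a temporal XOR stream, a time-dependent function of the kind the abstract mentions; the network size, task, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_u, n_in, lr = 6, 2, 0.1                 # units, inputs (signal + bias), step size
n_z = n_in + n_u                          # concatenated input-plus-state size
W = rng.normal(0, 0.3, (n_u, n_z))        # the single recurrent weight layer
y = np.zeros(n_u)                         # unit activations
p = np.zeros((n_u, n_u, n_z))             # p[k, i, j] = dy_k / dW[i, j]

u = rng.integers(0, 2, 20000).astype(float)
hits = 0
for t in range(2, len(u)):
    z = np.concatenate(([u[t], 1.0], y))  # current input, bias, previous state
    y_new = np.tanh(W @ z)
    # RTRL sensitivity recursion: recurrent term plus delta_{ki} * z_j.
    recur = np.einsum('kl,lij->kij', W[:, n_in:], p)
    recur[np.arange(n_u), np.arange(n_u), :] += z
    p = (1.0 - y_new ** 2)[:, None, None] * recur
    # Unit 0 is the output; train online against target = u(t-1) XOR u(t-2).
    target = float(int(u[t - 1]) ^ int(u[t - 2]))
    err = target - (y_new[0] + 1.0) / 2.0  # map tanh output onto [0, 1]
    W += lr * err * 0.5 * p[0]             # gradient step through the scaling
    y = y_new
    if t >= len(u) - 1000:                 # score the final 1000 steps
        hits += int((y_new[0] > 0.0) == bool(target))
print("accuracy on last 1000 steps:", hits / 1000)
```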


A Fortran Based Learning System Using Multilayer Back-Propagation Neural Network Techniques, Gregory L. Reinhart Mar 1994

Theses and Dissertations

An interactive computer system that allows the researcher to quickly build an optimal neural network structure is developed and validated. This system assumes a single-hidden-layer perceptron structure and uses the back-propagation training technique. The software enables the researcher to quickly define a neural network structure, train the neural network, interrupt training at any point to analyze the status of the current network, restart training at the interrupted point if desired, and analyze the final network using two-dimensional graphs, three-dimensional graphs, confusion matrices, and saliency metrics. A technique for training, testing, and validating various network structures and …
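
The training core of such a system is a standard single-hidden-layer back-propagation loop, and because all state lives in the weight matrices, training can be interrupted for analysis and resumed at will. The Python sketch below is a stand-in for the thesis's Fortran implementation; the class name, sizes, and learning rate are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BackpropNet:
    """Single-hidden-layer perceptron trained by back-propagation."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.W1 = rng.normal(0, 0.5, (n_hidden, n_in + 1))    # +1 bias
        self.W2 = rng.normal(0, 0.5, (n_out, n_hidden + 1))
        self.lr = lr

    def forward(self, x):
        self.x = np.append(x, 1.0)
        self.h = np.append(sigmoid(self.W1 @ self.x), 1.0)
        self.o = sigmoid(self.W2 @ self.h)
        return self.o

    def backward(self, target):
        # Output and hidden deltas for the squared-error loss.
        d_o = (self.o - target) * self.o * (1.0 - self.o)
        d_h = (self.W2[:, :-1].T @ d_o) * self.h[:-1] * (1.0 - self.h[:-1])
        self.W2 -= self.lr * np.outer(d_o, self.h)
        self.W1 -= self.lr * np.outer(d_h, self.x)

# Training runs in bursts: stop after any epoch, inspect the weights,
# then continue from the same state. XOR serves as a tiny demo problem.
net = BackpropNet(2, 4, 1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([[0], [1], [1], [0]], float)
for epoch in range(5000):                 # may need more epochs on some seeds
    for x, tgt in zip(X, Y):
        net.forward(x)
        net.backward(tgt)
print([round(float(net.forward(x)[0]), 2) for x in X])
```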


An Analysis Of Stopping Criteria In Artificial Neural Networks, Bruce Kostal Mar 1994

Theses and Dissertations

The goal of this study was to decide when to terminate training of an artificial neural network (ANN). In pursuit of this goal, several characteristics of the ANN were monitored throughout ANN training: the classification error rate (of the training set, testing set, or a weighted average of the two); the moving-average classification error rate; measurements of the difference between ANN output and desired output (error sum of squares, total absolute error, or largest absolute error); or ANN weight changes (absolute weight change, squared weight change, or relative weight change). Throughout this research, the learning rate was held constant at 0.35. …
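
One of the monitored quantities, the moving-average classification error rate, translates directly into a stopping rule: halt when the windowed average stops improving. The window, patience, and tolerance below are illustrative assumptions, not values from the study.

```python
import numpy as np

def should_stop(error_history, window=10, patience=5, tol=1e-4):
    """Stop when the moving-average error has not improved by more than
    tol for patience consecutive epochs (illustrative parameter choices)."""
    if len(error_history) < window + patience:
        return False
    ma = np.convolve(error_history, np.ones(window) / window, mode='valid')
    return ma[-patience:].min() > ma[:-patience].min() - tol

# Demo on a synthetic error curve that decays and then plateaus.
errors = list(0.5 * np.exp(-np.arange(200) / 20.0) + 0.05)
for epoch in range(len(errors)):
    if should_stop(errors[:epoch + 1]):
        print("stop at epoch", epoch)
        break
```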