Open Access. Powered by Scholars. Published by Universities.®

Engineering Commons

Operations Research, Systems Engineering and Industrial Engineering

Theses and Dissertations

Neural networks (Computer science)

Articles 1 - 11 of 11

Full-Text Articles in Engineering

Methods To Address Extreme Class Imbalance In Machine Learning Based Network Intrusion Detection Systems, Russell W. Walter Mar 2016

Despite the considerable academic interest in using machine learning methods to detect cyber attacks and malicious network traffic, there is little evidence that modern organizations employ such systems. Due to the targeted nature of attacks and cybercriminals’ constantly changing behavior, valid observations of attack traffic suitable for training a classifier are extremely rare. Rare positive cases combined with the fact that the overwhelming majority of network traffic is benign create an extreme class imbalance problem. Using publicly available datasets, this research examines the class imbalance problem by using small samples of the attack observations to create multiple training sets that …
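The abstract is truncated, so the thesis's exact resampling scheme isn't shown. As a point of reference, a minimal sketch of the general idea it describes — building multiple training sets, each pairing the rare attack observations with an equal-size random sample of the abundant benign traffic (all data shapes and function names below are illustrative, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical imbalanced data: 1000 benign flows (class 0), 10 attacks (class 1).
X = rng.normal(size=(1010, 4))
y = np.array([0] * 1000 + [1] * 10)

def make_balanced_training_sets(X, y, n_sets=5, seed=0):
    """Build several balanced training sets: every rare positive case is kept,
    paired with a fresh equal-size random sample of the majority class."""
    rng = np.random.default_rng(seed)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    sets = []
    for _ in range(n_sets):
        neg_sample = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, neg_sample])
        rng.shuffle(idx)
        sets.append((X[idx], y[idx]))
    return sets

training_sets = make_balanced_training_sets(X, y)
```

A classifier trained on each balanced set (and the results combined, e.g. by voting) is one common way such multiple training sets are used.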


Combat Identification Modeling Using Neural Network Techniques, Changwook Lim Mar 2009

The purposes of this research were (1) to validate Kim’s (2007) simulation method by applying analytic methods and (2) to compare two different Robust Parameter Design methods using three measures of performance (label accuracy for enemy, friendly, and clutter). Considering the features of CID, the input variables were defined as two controllable factors (the threshold combination of detector and classifier) and three uncontrollable factors (map size, number of enemy units, and number of friendly units). The first set of experiments examines Kim’s method using analytical methods. To create response variables, Kim’s method uses Monte Carlo simulation. The output results showed no difference between simulation and the analytic …


Metamodeling Techniques To Aid In The Aggregation Process Of Large Hierarchical Simulation Models, June F.D. Rodriguez Aug 2008

This research investigates how aggregation is currently conducted for simulation of large systems and how suitable aggregation can be achieved. More specifically, it examines how to accurately aggregate hierarchical lower-level (higher-resolution) models into the next higher level in order to reduce the complexity of the overall simulation model, and it explores different techniques for performing this aggregation. We develop aggregation procedures between two simulation levels (e.g., aggregation of engagement-level models into a mission-level model) to address how much …


An Investigation Of The Effects Of Correlation, Autocorrelation, And Sample Size In Classifier Fusion, Nathan J. Leap Mar 2004

This thesis extends the research found in Storm, Bauer, and Oxley (2003). Data correlation effects and sample size effects on three classifier fusion techniques and one data fusion technique were investigated. Identification System Operating Characteristic Fusion (Haspert, 2000), the Receiver Operating Characteristic Within Fusion method (Oxley and Bauer, 2002), and a Probabilistic Neural Network were the three classifier fusion techniques; a Generalized Regression Neural Network was the data fusion technique. Correlation was injected into the data set both within a feature set (autocorrelation) and across feature sets for a variety of classification problems, and sample size was varied throughout. Total …


An Investigation Of The Effects Of Correlation In Sensor Fusion, Susan A. Storm Mar 2003

This thesis takes the first step towards the creation of a synthetic classifier fusion-testing environment. The effects of data correlation on three classifier fusion techniques were examined. The three fusion methods tested were the ISOC fusion method (Haspert, 2000), the ROC "Within" Fusion method (Oxley and Bauer, 2002) and the simple use of a Probabilistic Neural Network (PNN) as a fusion tool. Test situations were developed to allow the examination of various levels of correlation both between and within feature streams. The effects of training a fusion ensemble on a common dataset versus an independent data set were also contrasted. …
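For context on one of the fusion tools named above: a Probabilistic Neural Network scores each class by a Parzen-window (Gaussian kernel) density over that class's training exemplars and picks the highest-scoring class. A minimal sketch on toy data — the kernel width and data here are illustrative, not the thesis's configuration:

```python
import numpy as np

def pnn_classify(x, X_train, y_train, sigma=0.5):
    """Probabilistic Neural Network: score each class by the mean Gaussian
    kernel density of x around that class's exemplars; return the argmax."""
    scores = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)          # squared distances to exemplars
        scores[c] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy two-class problem: clusters near (0, 0) and (3, 3).
X_train = np.array([[0.0, 0.0], [0.2, -0.1], [3.0, 3.0], [2.9, 3.1]])
y_train = np.array([0, 0, 1, 1])
```

Used as a fusion tool, the inputs `x` would be the outputs of the individual classifiers rather than raw features.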


An Integrated Architecture And Feature Selection Algorithm For Radial Basis Neural Networks, Timothy D. Flietstra Mar 2002

There are two basic ways to control an Unmanned Combat Aerial Vehicle (UCAV) as it searches for targets: allow the UCAV to act autonomously or employ man-in-the-loop control. There are also two target sets of interest: fixed or mobile targets. This research focuses on UCAV-based targeting of mobile targets using man-in-the-loop control. In particular, the interest is in how levels of satellite signal latency or signal degradation affect the ability to accurately track, target, and attack mobile targets. This research establishes a weapon effectiveness model assessing targeting inaccuracies as a function of signal latency and/or signal degradation. The research involved …


Feature Saliency In Artificial Neural Networks With Application To Modeling Workload, Kelly A. Greene Dec 1998

This dissertation research extends the current knowledge of feature saliency in artificial neural networks (ANN). Feature saliency measures allow for the user to rank order the features based upon the saliency, or relative importance, of the features. Selecting a parsimonious set of salient input features is crucial to the success of any ANN model. In this research, several methodologies were developed using the Signal to Noise Ratio (SNR) Feature Screening Method and its associated SNR Saliency Measure for selecting a parsimonious set of salient features to classify pilot workload in addition to air traffic controller workload. Candidate features were derived …


Experiments In Aggregating Air Ordnance Effectiveness Data For The Tacwar Model, James E. Parker Feb 1997

An interactive MS Access™-based application that aggregates the output of the SABSEL model for input into the TACWAR model is developed. The application was developed following efforts to create a functional approximation of the SABSEL data using neural networks, statistical networks, and traditional statistical techniques. These approximations were compared to a look-up table methodology on the basis of accuracy (RMSE …


An Investigation Of Preliminary Feature Screening Using Signal-To-Noise Ratios, David B. Sumrell Mar 1996

A new saliency metric and a new saliency screening method are developed. This new metric, the SN saliency metric, is based upon signal-to-noise ratios, where the signal is provided by a sum of squared weights associated with a given feature, and the noise is based upon a sum of squared weights associated with a reference noise feature which is injected into the data. The resultant metric allows for a direct comparison of the feature of interest with a reference noise feature which is known to be nonsalient. The SN saliency screening method, which uses the SN saliency metric, offers the …
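The description above is concrete enough to sketch: the signal is the sum of squared network weights attached to a candidate feature, and the noise is the same sum for an injected reference noise feature. A hedged sketch, assuming the weights in question are the first-layer weights of a trained network (rows indexing input features, columns indexing hidden units) — the thesis's exact formulation may differ:

```python
import numpy as np

def sn_saliency(W, feature_idx, noise_idx):
    """SN saliency metric: ratio of the summed squared first-layer weights of a
    candidate input feature to those of an injected noise feature. Values near
    1.0 suggest the candidate is no more salient than known noise."""
    signal = np.sum(W[feature_idx, :] ** 2)
    noise = np.sum(W[noise_idx, :] ** 2)
    return signal / noise
```

Screening would then drop features whose metric stays close to that of the injected noise feature.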


A Fortran Based Learning System Using Multilayer Back-Propagation Neural Network Techniques, Gregory L. Reinhart Mar 1994

An interactive computer system which allows the researcher to build an optimal neural network structure quickly is developed and validated. This system assumes a single-hidden-layer perceptron structure and uses the back-propagation training technique. The software enables the researcher to quickly define a neural network structure, train the neural network, interrupt training at any point to analyze the status of the current network, re-start training at the interrupted point if desired, and analyze the final network using two-dimensional graphs, three-dimensional graphs, confusion matrices and saliency metrics. A technique for training, testing, and validating various network structures and …


An Analysis Of Stopping Criteria In Artificial Neural Networks, Bruce Kostal Mar 1994

The goal of this study was to decide when to terminate training of an artificial neural network (ANN). In pursuit of this goal, several characteristics of the ANN were monitored throughout training: the classification error rate (of the training set, the testing set, or a weighted average of the two); a moving-average classification error rate; measurements of the difference between ANN output and desired output (error sum of squares, total absolute error, or largest absolute error); and ANN weight changes (absolute weight change, squared weight change, or relative weight change). Throughout this research, the learning rate was held constant at 0.35. …
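One of the monitored quantities, a moving-average error rate, lends itself to a generic stopping rule: stop when the average error over the most recent window of epochs has essentially stopped improving. A minimal sketch — the window size and tolerance are illustrative choices, not values from the thesis:

```python
def should_stop(error_history, window=5, tol=1e-4):
    """Stop when the moving-average error over the last `window` epochs
    improves by less than `tol` relative to the preceding window."""
    if len(error_history) < 2 * window:
        return False                      # not enough history to compare windows
    recent = sum(error_history[-window:]) / window
    previous = sum(error_history[-2 * window : -window]) / window
    return (previous - recent) < tol      # True once improvement has stalled
```

The same windowed-comparison template applies to the other monitored quantities (weight changes, error sums), only with a different `error_history` series.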