Open Access. Powered by Scholars. Published by Universities.®

Digital Commons Network

Articles 1 - 30 of 81

Full-Text Articles in Entire DC Network

Power, Performance, And Perception (P3): Integrating Usability Metrics And Technology Acceptance Determinants To Validate A New Model For Predicting System Usage, Alan P. Fiorello Dec 1999

Theses and Dissertations

Currently, there are two distinct approaches to assist information technology managers in the successful implementation of office automation software. The first approach resides within the field of usability engineering, while the second approach is derived from the discipline of management information systems (MIS). However, neither approach has successfully produced conclusive evidence that explains what characteristics facilitate system use as well as influence user acceptance of the system. This study reports on the validity of a new model, entitled the Power, Performance, Perception (P3) model, that links the constructs of usability engineering to user acceptance. Additionally, speech recognition software (SRS) was …


The Effect Of Divalent Cations On The Prophenoloxidase Enzyme Cascade Activity In The Freshwater Crayfish Cambarus Latimanus, Hans Skailand Eikaas Dec 1999

Theses and Dissertations

The effect of divalent cations such as cadmium, calcium, copper, lead and magnesium upon the prophenoloxidase system (proPO) was studied in hemocytes of the crayfish Cambarus latimanus. It was demonstrated that cadmium, calcium, copper and lead increased proPO activity significantly, whereas magnesium had no statistically significant effect on the system. Also, the molecular weight of the proPO enzyme was estimated using SDS-PAGE and found to be approximately 76 kDa.


Implementation Of Speech Recognition Software For Text Processing: An Exploratory Analysis, Sean P. Abell Dec 1999

Theses and Dissertations

The rationale behind implementing new information technologies is often to gain productivity improvements associated with the substitution of machinery for labor. However, the literature shows little direct evidence of a positive relationship between information technology investment and subsequent productivity benefits. This thesis examines the productivity implications of implementing speech recognition software in a text processing environment. More specifically, research was conducted to compare text processing speeds and error rates using speech recognition software versus the keyboard and mouse. Of interest was the time required to input and proofread text processing tasks as well as the number …


A Numerical Simulation Of A Carbon Black Suspension Cell Via A Time-Reversed, Double Layer Compute Algorithm, Gregg T. Anderson Dec 1999

Theses and Dissertations

A numerical simulation of a carbon black suspension cell is explored which models a laser-induced plasma within a liquid ethanol medium of approximately 1 mm thickness. The simulation model assumes a laser pulse with a pulse width of approximately 9 ns propagating in the left-to-right direction, striking the front surface of the medium and focusing to a spot within the liquid volume. When the energy density within a given irradiated volume is sufficiently high, it ignites the carbon particles and generates a large number of free electrons, i.e., a plasma. The plasma couples with the incoming laser energy on a …


Network Security Versus Network Connectivity: A Framework For Addressing The Issues Facing The Air Force Medical Community, Franklin E. Cunningham Jr. Dec 1999

Theses and Dissertations

The Air Force has instituted Barrier Reef to protect its networks. The Air Force medical community operates network connections that are incompatible with Barrier Reef. To overcome this problem, OASD(HA) directed the Tri-Service Management Program Office (TIMPO) to develop an architecture that protects all military health systems and allows them to link with all three services and outside partners. This research studied the underlying networking issues and formed a framework based on data from network experts at the Air Force's medical centers and their base network organizations. The findings were compared with the TIMPO architecture, and a composite framework was developed that more …


Evaporation Of Jet Fuels, Charles Eric Hack Dec 1999

Theses and Dissertations

Determining the fate and transport of JP-8 jet fuel is a complex and important problem. As part of the startup procedures for jet engines, fuel is passed through aircraft engines before combustion is initiated. Because of the extremely low temperatures at northern tier Air Force bases, the unburned fuel does not evaporate readily and may come into contact with ground crew. To determine the amount and duration of contaminant contact, the evaporation of the emitted fuel must be modeled. The amount and composition of the fuel upon reaching the ground crew may be determined by droplet evaporation models that have …
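The droplet evaporation models mentioned in the abstract are not specified; one classical starting point for such models is the d-squared law, sketched below (the function name and numeric values are illustrative, not taken from the thesis):

```python
def d2_law_diameter(d0, k, t):
    """Classical d-squared law of droplet evaporation:
    D(t)^2 = D0^2 - K*t, where K is the evaporation constant,
    until the droplet has fully evaporated."""
    return max(d0 * d0 - k * t, 0.0) ** 0.5

# Units-free illustration: D0 = 2, K = 1.
d_mid = d2_law_diameter(2.0, 1.0, 3.0)   # part-way through evaporation
d_end = d2_law_diameter(2.0, 1.0, 5.0)   # after complete evaporation
```

Under the d-squared law, droplet lifetime scales with the square of the initial diameter, which is why the size distribution of emitted fuel droplets matters for ground-crew exposure.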


Information System Incidents: The Development Of A Damage Assessment Model, Mark D. Horony Dec 1999

Theses and Dissertations

Information system (IS) incidents are on the rise. With low manning and undertrained information security specialists, it is difficult for organizations to stop IS incidents from occurring. Once an incident has occurred, it is the IS manager's responsibility to ensure that a full and accurate damage assessment has been accomplished. However, most IS managers lack the necessary tools to assess the damage from an incident. This exploratory thesis developed an IS incident damage assessment model (DAM) that can be part of the IS manager's tool kit. During the development of the model, it became apparent that the model was supported …


Application Of The Interaction Picture To Reactive Scattering In One Dimension, Michael J. Maclachlan Dec 1999

Theses and Dissertations

The interaction picture is used together with the channel-packet method in a new time-dependent approach to compute reactive scattering matrix elements. The channel-packet method enables the use of the interaction picture for computing reactive S-matrix elements by splitting the computational effort into two parts. First, asymptotic reactant and product wavepackets are individually propagated into the interaction region of the potential to form Møller states. The interaction picture, in contrast to the usual Schrödinger picture of quantum mechanics, is so constructed that a wavefunction that experiences no change in potential (that is, a free-particle wavefunction) remains fixed, with no translation …
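For readers unfamiliar with the formalism, the standard definitions behind this construction (textbook scattering theory, not taken from the thesis itself) are:

```latex
% Interaction picture: the free evolution is factored out of the state,
% so a free-particle wavepacket is stationary in this picture.
\lvert \psi_I(t) \rangle = e^{iH_0 t/\hbar}\,\lvert \psi_S(t) \rangle ,
\qquad
\Omega_\pm = \lim_{t \to \mp\infty} e^{iHt/\hbar}\, e^{-iH_0 t/\hbar},
% and the Moller operators map asymptotic (free) states to the
% corresponding full scattering states:
\qquad
\lvert \Psi_\pm \rangle = \Omega_\pm \lvert \psi_{\mathrm{in/out}} \rangle .
```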


Towards A Game Theory Model Of Information Warfare, David A. Burke Dec 1999

Theses and Dissertations

The repeated game of incomplete information model, a subclass of game theory models, was modified to include aspects of information warfare. The repeated game of incomplete information model was first developed to analyze nuclear weapons disarmament negotiations. The central role of information in this model suggested its applicability to IW, which focuses on the defense and acquisition of information. A randomized experimental design was utilized to determine how people behave in a laboratory IW setting and to test the IW game model's basic predictions. The impact of experience and learning on IW performance was also assessed during the experiment. IW …


Efficient Simulation Via Validation And Application Of An External Analytical Model, Thomas H. Irish Sep 1999

Theses and Dissertations

This research makes significant contributions towards improving the efficiency of simulation studies using an external analytical model. The foundation for this research is the analytical control variate (ACV) method. The ACV method can produce significant variance reduction, but the resulting point estimate may exhibit bias. A Monte Carlo sampling method for resolving the bias problem is developed and demonstrated through a queueing network example. The method requires knowledge of the parameters and approximate distributions of the random variables used to produce the ACV. Often, some of these parameters or distributions are not known. Both parametric and non-parametric alternatives to the …
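The analytical-control-variate idea can be illustrated with a generic control-variate estimator (a minimal sketch, not the thesis's ACV construction; the toy integrand and control are assumptions for illustration):

```python
import random

def control_variate_estimate(ys, cs, c_mean):
    """Estimate E[Y] using samples of Y and of a control C whose true
    mean c_mean is known analytically -- the role played by the external
    analytical model in a control-variate scheme."""
    n = len(ys)
    y_bar = sum(ys) / n
    c_bar = sum(cs) / n
    # Estimated optimal coefficient b* = Cov(Y, C) / Var(C).
    cov = sum((y - y_bar) * (c - c_bar) for y, c in zip(ys, cs)) / (n - 1)
    var_c = sum((c - c_bar) ** 2 for c in cs) / (n - 1)
    b = cov / var_c
    return y_bar - b * (c_bar - c_mean)

# Toy example: estimate E[U^2] = 1/3 with control C = U, E[U] = 1/2.
random.seed(0)
us = [random.random() for _ in range(10000)]
est = control_variate_estimate([u * u for u in us], us, 0.5)
```

Because b is itself estimated from the same samples, the point estimate can pick up a small bias, which is precisely the problem the abstract's Monte Carlo resampling method addresses.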


Multiple Comparison Pruning Of Neural Networks, Donald E. Duckro Sep 1999

Theses and Dissertations

Reducing a neural network's complexity improves the network's ability to generalize to future examples. Like an overfitted regression function, neural networks may miss their target because of the excessive degrees of freedom stored up in unnecessary parameters. Over the past decade, the subject of pruning networks has produced non-statistical algorithms like Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon as methods to remove connections with the least salience. There are conflicting views as to whether more than one parameter can be removed at a time. The methods proposed in this research use statistical multiple comparison procedures to …
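Of the cited algorithms, Optimal Brain Damage is the simplest to sketch: each weight's salience is approximated as s_i = H_ii w_i² / 2 using the diagonal of the error Hessian, and the least-salient connections are removed. A minimal sketch with hypothetical weights and Hessian values (this is the classic method, not the multiple-comparison procedure proposed in the thesis):

```python
def obd_saliences(weights, hessian_diag):
    """Optimal Brain Damage salience s_i = H_ii * w_i**2 / 2, using the
    diagonal approximation of the Hessian of the training error."""
    return [h * w * w / 2.0 for w, h in zip(weights, hessian_diag)]

def prune_least_salient(weights, hessian_diag, k):
    """Zero out the k connections with the smallest salience (removing
    several at once is the step whose validity the thesis questions)."""
    sal = obd_saliences(weights, hessian_diag)
    order = sorted(range(len(weights)), key=lambda i: sal[i])
    pruned = list(weights)
    for i in order[:k]:
        pruned[i] = 0.0
    return pruned

# Hypothetical weights and Hessian diagonal for a tiny network:
w_pruned = prune_least_salient([0.9, -0.05, 0.4, 0.01], [1.0, 1.0, 0.5, 2.0], 2)
```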


Formal Representation And Application Of Software Design Information, Thomas M. Schorsch Sep 1999

Theses and Dissertations

Formal methods for developing software use mathematical frameworks to specify, develop, and verify software systems, especially safety-critical systems where error-free software is a necessity. A transformation system is a formal method that refines a requirement specification into an implementation by successively adding design decisions in the form of precisely verified design information. Current algebraic representations of design information (specifications, morphisms, and interpretations) and methods for applying it (diagram refinement) cannot correctly represent and apply higher-level design information. This investigation develops innovative methods for constructing and refining structured algebraic requirement specifications, as …


Analysis Of N-Tier Architecture Applied To Distributed-Database Systems, Alexandre G. Valente Jun 1999

Theses and Dissertations

N-tier architecture has become a common methodology for developing large database applications. This work evaluates the use of this architecture, instead of the classical client/server architecture, in developing corporate applications based on distributed databases. The comparison between architectures is performed using applications that execute transactions similar to those defined in the Transaction Processing Performance Council Type C benchmark (TPC-C). The environment used for development and testing was the AFIT Bimodal Cluster (ABC), a heterogeneous cluster of PCs running the Microsoft Windows NT 4.0 OS. The comparative experimental analysis demonstrated that the N-tier architecture allows more efficient bandwidth utilization between …


Methodology For Integrating The Scenario Databases Of Simulation Systems, Emilia M. Colonese Jun 1999

Theses and Dissertations

The use of many different simulation systems by the United States Department of Defense has resulted in many different scenario data representations contained in heterogeneous databases. These heterogeneous databases all represent the same data concept, but have different semantics due to intrinsic variations among the data models. In this research, I describe a unified scenario database to allow interoperability and reuse of the scenario data components while avoiding the problems of data redundancy. Using the object oriented approach, the data and schema of the scenario databases, represented in an object oriented model, are integrated into a global database also represented …


Scan-It: A Computer Vision Model Motivated By Human Physiology And Behavior, John G. Keller Jun 1999

Theses and Dissertations

This dissertation details the development of a new computational vision model motivated by physiological and behavioral aspects of the human visual system. Using this model, intensity features within an artificial visual field of view are extracted and transformed into a simulated cortical representation, and a saccadic guidance system scans this field of view over an object within an image to memorize that object. The object representation is thus stored as a sequence of feature matrices describing sub-regions of the object. A new image can then be searched for the object (possibly scaled and rotated), where evidence of its presence is …


Cobol Reengineering Using The Parameter Based Object Identification (Pboi) Methodology, Sonia De Jesus Rodrigues Jun 1999

Theses and Dissertations

This research focuses on how to reengineer Cobol legacy systems into object-oriented systems using Sward's Parameter Based Object Identification (PBOI) methodology. The method relates categories of imperative subprograms to classes in an object-oriented language according to how parameters are handled and shared among them. The input language of PBOI is a canonical form called the generic imperative model (GIM), which is an abstract syntax tree (AST) representation of a simple imperative programming language. The output is another AST, the generic object model (GOM), a generic object-oriented language. Conventional languages must be translated into the …


Parallel Digital Signal Processing On A Network Of Personal Computers Case Study: Space-Time Adaptive Processing, Fernando Silva Jun 1999

Theses and Dissertations

Network-based parallel computing using personal computers is currently a popular choice for concurrent scientific computing. This work evaluates the capabilities and performance of the AFIT Bimodal Cluster (ABC), a heterogeneous cluster of PCs connected by switched fast Ethernet and using MPICH 1.1 for interprocess communication, for parallel digital signal processing, with Space-Time Adaptive Processing (STAP) as the case study. The MITRE RT_STAP Benchmark version 1.1 is ported and executed on the ABC, as well as on a cluster of six Sun SPARC workstations connected by a Myrinet network (the AFIT NOW), and on an IBM SP for …


Multiobjective Evolutionary Algorithms: Classifications, Analyses, And New Innovations, David A. Van Veldhuizen Jun 1999

Theses and Dissertations

This research organizes, presents, and analyzes contemporary Multiobjective Evolutionary Algorithm (MOEA) research and associated Multiobjective Optimization Problems (MOPs). Using a consistent MOEA terminology and notation, each cited MOEA's key factors are presented in tabular form for ease of MOEA identification and selection. A detailed quantitative and qualitative MOEA analysis is presented, providing a basis for conclusions about various MOEA-related issues. The traditional notion of building blocks is extended to the MOP domain in an effort to develop more effective and efficient MOEAs. Additionally, the MOEA community's limited test suites contain various functions whose origins and rationale for use are often …
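The MOP setting rests on Pareto dominance; a minimal sketch of dominance and nondominated filtering for a minimization problem (the objective vectors are illustrative, not from the dissertation):

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): a is no worse in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the Pareto-optimal subset of a list of objective vectors.
    dominates(p, p) is False, so each point is safely compared to itself."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Five candidate solutions in a two-objective minimization problem:
front = nondominated([(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)])
```

An MOEA's goal is to evolve a population whose nondominated set approaches the true Pareto front while staying well spread along it.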


Implementation Of A Two-Dimensional Hydrodynamic Shock Code Based Upon The Weighted Average Flux Method, Mark P. Wittig Jun 1999

Theses and Dissertations

Numerical modeling of shock propagation and reflection is of interest to the Department of Defense (DoD). Proprietary state-of-the-art codes based upon E. F. Toro's weighted average flux (WAF) method are being used to investigate complex shock reflection phenomena. Here we develop, test, and validate a one-dimensional hydrodynamic shock code. We apply WAF to Godunov's first-order upwind method to achieve second-order accuracy. Oscillations, typical of second-order methods, are then removed using adaptive weight limiter functions based upon total variation diminishing (TVD) flux limiters. An adaptive Riemann solver routine is also implemented to improve computational efficiency. This one-dimensional code is then extended …
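Godunov's first-order upwind method, the starting point for the WAF scheme, reduces to a simple flux-difference update in the linear advection case. A minimal sketch (the thesis code solves the nonlinear hydrodynamic equations; this only illustrates the upwind building block):

```python
def upwind_step(u, a, dt, dx):
    """One Godunov first-order upwind step for u_t + a*u_x = 0 (a > 0)
    with periodic boundaries; the interface flux is taken from the
    upwind (left) cell."""
    c = a * dt / dx   # Courant number; stable for 0 <= c <= 1
    n = len(u)
    return [u[i] - c * (u[i] - u[(i - 1) % n]) for i in range(n)]

# With c = 1 the scheme is exact (a pure shift): advecting a step
# profile one full period around the domain returns the initial data.
u0 = [1.0] * 5 + [0.0] * 5
u = u0
for _ in range(10):
    u = upwind_step(u, 1.0, 0.1, 0.1)
```

For c < 1 the same scheme smears discontinuities; the second-order WAF correction sharpens them, and the TVD limiters mentioned above suppress the spurious oscillations that correction introduces.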


Evolving Compact Decision Rule Sets, Robert E. Marmelstein Jun 1999

Theses and Dissertations

While data mining technology holds the promise of automatically extracting useful patterns (such as decision rules) from data, this potential has yet to be realized. One of the major technical impediments is that the current generation of data mining tools produce decision rule sets that are very accurate, but extremely complex and difficult to interpret. As a result, there is a clear need for methods that yield decision rule sets that are both accurate and compact. The development of the Genetic Rule and Classifier Construction Environment (GRaCCE) is proposed as an alternative to existing decision rule induction (DRI) algorithms. GRaCCE …


Estimation And Goodness-Of-Fit In The Case Of Randomly Censored Lifetime Data, David M. Reineke Jun 1999

Theses and Dissertations

A new continuous distribution function estimator for randomly censored data is developed, discussed, and compared to existing estimators. Minimum distance estimation is shown to be effective in estimating Weibull location parameters when random censoring is present. A method of estimating all 3 parameters of the 3-parameter Weibull distribution using a combination of minimum distance and maximum likelihood is also given. Cramer-von Mises and Anderson-Darling goodness-of-fit test statistics are modified to measure the discrepancy between the maximum likelihood estimate and the Kaplan-Meier product-limit estimate of the distribution function of the random variable of interest. These modified test statistics are used to …
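The Kaplan-Meier product-limit estimate used in the modified test statistics can be sketched directly (a generic implementation with toy data, not the thesis code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function S(t).
    times: observed times; events: 1 = observed failure, 0 = censored.
    Returns (time, S(t)) pairs at each distinct failure time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = at_t = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_t += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk   # survive the hazard at t
            curve.append((t, s))
        n_at_risk -= at_t   # failures and censorings both leave the risk set
    return curve

# Toy data: failures at t = 1, 3, 4; one censored observation at t = 2.
km = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
```

The goodness-of-fit statistics in the abstract then measure the discrepancy between this step-function estimate and the fitted parametric distribution function.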


Image Fusion Using Autoassociative-Heteroassociative Neural Networks, Claudia V. Kropas-Hughes May 1999

Theses and Dissertations

Images are easily recognized, classified, and segmented (in short, analyzed) by humans. The human brain and nervous system, a biological computer, perform rapid and accurate image processing. In the current research, the concepts of the biological neural system provide the impetus for developing a computational means of fusing image data. Accomplishing this automatic image processing requires that features be extracted from each image data set and the information content fused. Biologically inspired computational models are examined for extracting features by transformations such as Fourier, Gabor, and wavelets, and for processing and fusing the information from multiple images through evaluation of autoassociative neural networks …


Fan-Beam Multiplexed Compton Scatter Tomography For Single-Sided Noninvasive Inspection, Brian L. Evans Apr 1999

Theses and Dissertations

Multiplexed Compton Scatter Tomography (MCST) is explored as a method of nondestructively generating cross-sectional images of a sample's electron density. MCST is viable when access is available to only one side of the sample because it registers scattered gamma radiation. Multiplexing in scattered photon energy and in detector position allows simultaneous interrogation of many voxels with comparatively wide collimation. Primary components include a radioisotope source, fan beam collimators, and energy-discriminating detectors. The application of MCST to inspecting aluminum airframes for corrosion is considered. This application requires source gammas near 100 keV where the scattered gamma energy is severely broadened by …
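The energy multiplexing relies on the standard Compton relation between scatter angle and scattered-photon energy; for example, a 100 keV photon backscattered through 180 degrees emerges near 72 keV (the formula is textbook physics; the function name is ours):

```python
import math

def compton_scattered_energy(e_kev, theta_rad):
    """Scattered photon energy E' = E / (1 + (E/511)(1 - cos theta)),
    with 511 keV the electron rest energy and theta the scatter angle."""
    return e_kev / (1.0 + (e_kev / 511.0) * (1.0 - math.cos(theta_rad)))

# A 100 keV photon backscattered through 180 degrees:
e_back = compton_scattered_energy(100.0, math.pi)
```

Because each scatter angle maps to a distinct energy, an energy-discriminating detector can attribute counts to voxels along the beam, which is what makes the wide collimation workable.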


An Interactive Tool For Refining Software Specifications From A Formal Domain Model, Gary L. Anderson Mar 1999

Theses and Dissertations

This work examines the process for refining a software specification from a formal object-oriented domain model. This process was implemented with interactive software to demonstrate the feasibility and benefits of automating what has been a tedious and often error-prone manual task. The refinement process operates within the framework of a larger Knowledge-Based Software Engineering system. A generic object-oriented representation is used to store a domain model, which allows the specification tool to access, select, and manipulate the required objects to form a customized specification. The specification is also stored as an object-oriented model, which in turn can be accessed by …


Characterization Of The Double Scatter Spectrum In Multiplexed Compton Scatter Tomography, David W. Gerts Mar 1999

Theses and Dissertations

The Multiplexed Compton Scatter Tomograph (MCST) uses single back-scattered photons to image electron density in aluminum. A source of error in this imaging technique is the presence of multiple scatters. This thesis studies the double scatter spectrum as an approximation of the multiple scatter spectrum. A deterministic code called Monte Carlo Double Scatter (MOCADS) was developed to investigate the double scatter spectrum. The code includes calculations of the Rayleigh scatter, Compton scatter, Doppler broadening effects of the spectrum, and polarization effects following the Compton scatter. The Doppler broadening portion of the code was validated by a deterministic code called Scatgram. …


Improving Cape Canaveral's Next-Day Thunderstorm Forecasting Using A Meso-Eta Model-Based Index, John C. Crane Mar 1999

Theses and Dissertations

Reliable thunderstorm forecasts are essential to safety and resource protection at Cape Canaveral. Current methods of forecasting day-2 thunderstorms provide little improvement over forecasting by persistence alone and are therefore in need of replacement. This research focused on using the mesoscale eta model to develop an index for improved forecasting of day-2 thunderstorms. Logistic regression techniques were used to regress the occurrence of a thunderstorm at Cape Canaveral against day-2 forecast variables output, or derived, from the mesoscale eta model. Accuracy and bias scores were calculated for the forecasts made by the regression equations, and the forecast results were compared …
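A logistic regression of a binary thunderstorm outcome on a single model-derived predictor can be sketched as follows (plain batch gradient descent on the log-loss; the predictor values are hypothetical illustrations, not Cape Canaveral data or the thesis's regression):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=1.0, epochs=5000):
    """Fit P(storm | x) = sigmoid(w*x + b) by batch gradient descent
    on the log-loss, for a single predictor variable."""
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # gradient factor of the log-loss
            gw += err * x
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Hypothetical predictor (e.g. a model-derived instability index):
# thunderstorm days (y = 1) cluster at higher index values.
xs = [0.1, 0.4, 0.5, 1.2, 1.5, 2.0, 2.2, 3.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(xs, ys)
```

The fitted probabilities can then be thresholded into yes/no forecasts, and accuracy and bias scores computed against persistence as the abstract describes.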


A Validation Study Of Cloud Scene Simulation Model Temporal Performance, Glenn Kerr Mar 1999

Theses and Dissertations

Cloud Scene Simulation Model (CSSM) temporal performance was validated by comparing the cloud forcing signatures in observed radiometric time series with those derived from CSSM output for initial conditions similar to those of the observed data. Observed radiometric data were collected by a normal incidence pyrheliometer sensitive to wavelengths in the range 0.3 μm to 3 μm. Simulated radiometric time series data were derived by applying the following process to each case study. CSSM cloud liquid water content (CLWC) grids were converted to grids of slant-path optical depth values by the Fast Map post-processor to the CSSM. A ray tracing …
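Converting an optical depth accumulated along a ray into a radiometric signal follows the Beer-Lambert law; a minimal sketch (uniform segment length and illustrative extinction values are assumed, and the real processing chain involves the Fast Map and ray-tracing steps described above):

```python
import math

def slant_path_transmittance(extinctions, ds):
    """Beer-Lambert law along a ray: optical depth tau is the sum of
    extinction coefficient times path length over the traversed grid
    cells, and direct-beam transmittance is exp(-tau)."""
    tau = sum(k * ds for k in extinctions)
    return math.exp(-tau)

# Two cells with extinction 0.5 per unit length, unit path segments:
t_direct = slant_path_transmittance([0.5, 0.5], 1.0)
```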


Predicting Launch Pad Winds At The Kennedy Space Center With A Neural Network Model, Steven J. Storch Mar 1999

Theses and Dissertations

This thesis uses neural networks to forecast winds at the Kennedy Space Center and the Cape Canaveral Air Station launch pads. Variables are developed from WINDS tower observations, surface and buoy observations, and an upper-air sounding. From these variables, a smaller set of predictive inputs is chosen using a signal-to-noise variable screening method. A neural network is then trained to forecast launch pad winds from the inputs. The network forecasts are compared to persistence, and the peak wind predictions are found to be skillful relative to persistence. An ensemble modeling technique using Toth and Kalnay's breeding of growing modes method is explored with …


Active Multispectral Band Selection And Reflectance Measurement System, Bradley D. Rennich Mar 1999

Theses and Dissertations

Due to system design requirements, an active multispectral laser radar system may be limited in the number of spectral bands that can be integrated into the system. To aid in the selection of these bands, a novel multispectral band selection technique is presented based on the cross-correlation of the material class reflectance spectra over a wavelength range of 1 - 5 microns. The algorithm uses directional hemispherical reflectance data from the Nonconventional Exploitation Factors database to select a number of spectral bands for classification purposes. Because the target material spectral reflectance is so important to the performance of an active …
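One plausible reading of correlation-based band selection is to prefer bands in which the class reflectance spectra are least correlated, i.e., carry the most independent discriminating information. A minimal sketch under that assumption (toy spectra, two classes, two candidate bands; not the NEF data or the thesis algorithm):

```python
def normalized_correlation(a, b):
    """Normalized cross-correlation of two spectra on the same grid."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def select_band(spectrum_a, spectrum_b, band_slices):
    """Choose the band in which the two class spectra are least
    correlated (most separable)."""
    return min(band_slices,
               key=lambda sl: abs(normalized_correlation(spectrum_a[sl],
                                                         spectrum_b[sl])))

# Toy spectra: the classes track each other in the first band and
# diverge in the second.
a = [1, 2, 3, 4, 1, 2, 1, 2]
b = [2, 4, 6, 8, 5, 5, 5, 6]
best = select_band(a, b, [slice(0, 4), slice(4, 8)])
```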


Methodology For Application Design Using Information Dissemination And Active Database Technologies, Robert H. Hartz Mar 1999

Theses and Dissertations

In dynamic data environments, the large volume of transactions requires flexible control structures to effectively balance the flow of information between producers and consumers. Information dissemination-based systems, using both data push and pull delivery mechanisms, provide a possible scalable solution for data-intensive applications. In this research, a methodology is proposed to capture information dissemination design features in the form of active database rules to effectively control dynamic data applications. As part of this design methodology, information distribution properties are analyzed, data dissemination mechanisms are transformed into an active rule framework, and the desired reactive behavior is achieved through rule customization. …