- Institution
- Purdue University (75)
- Syracuse University (36)
- Washington University in St. Louis (32)
- Air Force Institute of Technology (28)
- Dartmouth College (26)
- Singapore Management University (25)
- Portland State University (17)
- Wright State University (16)
- Taylor University (12)
- Old Dominion University (10)
- Missouri University of Science and Technology (9)
- California Polytechnic State University, San Luis Obispo (8)
- New Jersey Institute of Technology (8)
- University of Nebraska - Lincoln (8)
- University of Massachusetts Amherst (7)
- San Jose State University (6)
- Brigham Young University (5)
- Edith Cowan University (5)
- SelectedWorks (4)
- William & Mary (4)
- Loyola University Chicago (3)
- Maurer School of Law: Indiana University (3)
- Nova Southeastern University (3)
- Sacred Heart University (3)
- Western Kentucky University (3)
- Western Michigan University (3)
- Southern University and A&M College (2)
- The University of Southern Mississippi (2)
- University of Central Florida (2)
- University of North Florida (2)
- Keyword
- College of Engineering and Computer Science (10)
- Computer Science (10)
- Engineering (10)
- Newsletters (10)
- Science news (10)
- Technical writing (10)
- Parallel computing (9)
- File system (7)
- Neural networks (Computer science) (5)
- Parallel-io (5)
- Software engineering (5)
- Parallel processing (Electronic computers) (4)
- Virtual reality (4)
- Computer science (3)
- E-Lert (3)
- Electronic Databases (3)
- Expert systems (Computer science) (3)
- Genetic algorithms (3)
- HPCC (3)
- HPF (3)
- Indiana University Electronic Resources Newsletter (3)
- Indiana University Law Library (3)
- Indiana University School of Law (3)
- Legal Research (3)
- Object-oriented programming (Computer science) (3)
- Programming languages (3)
- Robotics (3)
- Yolanda Jones (3)
- Academic – UNF – Computer Science (2)
- Adaptive computing systems (2)
- Publication
- Department of Computer Science Technical Reports (75)
- All Computer Science and Engineering Research (32)
- Theses and Dissertations (28)
- Research Collection School Of Computing and Information Systems (25)
- Computer Science Technical Reports (16)
- ACMS Conference Proceedings 1995 (12)
- College of Engineering and Computer Science - Former Departments, Centers, Institutes and Projects (12)
- Northeast Parallel Architecture Center (12)
- Computer Science Faculty Publications and Presentations (11)
- BITs and PCs Newsletter (10)
- Computer Science and Software Engineering (8)
- Faculty Publications (8)
- Computer Science Department Faculty Publication Series (7)
- Computer Science Faculty Research & Creative Works (7)
- Dartmouth Scholarship (7)
- Electrical Engineering and Computer Science - All Scholarship (6)
- Electrical Engineering and Computer Science - Technical Reports (6)
- Kno.e.sis Publications (6)
- CSE Conference and Workshop Papers (5)
- Computer Science Faculty Publications (5)
- Dissertations (5)
- SWITCH (5)
- Dissertations and Theses (4)
- Dissertations, Theses, and Masters Projects (4)
- CCE Theses and Dissertations (3)
- Computer Science: Faculty Publications and Other Works (3)
- Dartmouth College Undergraduate Theses (3)
- E-lert (3)
- Masters Theses (3)
- School of Computer Science & Engineering Faculty Publications (3)
- Publication Type
Articles 361 - 390 of 390
Full-Text Articles in Entire DC Network
Ensuring The Satisfaction Of A Temporal Specification At Run-Time, Grace Tsai, Matt Insall, Bruce M. Mcmillin
Mathematics and Statistics Faculty Research & Creative Works
A responsive computing system is a hybrid of real-time, distributed, and fault-tolerant systems. In such a system, severe consequences can occur if the run-time behavior does not conform to the expected behavior or specifications. In this paper, we present a formal approach to ensure satisfaction of the specifications in the operational environment as follows. First, we specify the behavior of such systems using Interval Temporal Logic (ITL). Next, we give algorithms for trace checking of programs in such systems. Finally, we present a fully distributed run-time evaluation system which causally orders the events of the system during its execution and checks …
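The run-time checking idea above can be illustrated on a single finite trace. The sketch below is not the paper's ITL machinery or its distributed causal-ordering evaluator — it checks just one common temporal pattern, "every request is eventually followed by a response," and the event names are hypothetical:

```python
def check_response(trace, request="request", response="response"):
    """Check a finite event trace against the temporal pattern
    'every request is eventually followed by a response'.
    Sequential and single-trace only; event names are illustrative."""
    pending = 0  # requests not yet matched by a later response
    for event in trace:
        if event == request:
            pending += 1
        elif event == response and pending > 0:
            pending -= 1
    return pending == 0  # property holds iff nothing is left pending
```

A violation surfaces as soon as the trace ends with an unanswered request, which is the bounded-trace analogue of the specification failing at run time.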
Experimenting With The Finite Element Method In The Calculation Of Radiosity Form Factors, Donna Marie Chesteen
Retrospective Theses and Dissertations
Radiosity has been used to create some of the most photorealistic computer-generated images to date. The problem, however, is that radiosity algorithms are so expensive in computation and memory that few applications can employ them successfully. Form factor calculation is the most costly part of the process. This report describes an algorithm that uses the finite element method to reduce the time spent in the form factor calculation portion of the radiosity algorithm. This technique for form factor calculation significantly reduces the number of projections done at each iteration by using shape functions to determine the distribution …
Analysis And Extension Of Model-Based Software Executives, Keith E. Lewis
Theses and Dissertations
This research developed a comprehensive description of the simulation environment of Architect, a domain-oriented application composition system being developed at the Air Force Institute of Technology to explore new software engineering technologies. The description combines information from several previous research efforts and Architect's source code into a single, comprehensive document. A critical evaluation of the simulation environment was also performed, identifying improvements and modifications that enhance Architect's application execution capabilities by reducing complexity and execution time. The analysis was then taken one step further, presenting extensions to the current simulation environment. The extensions included investigating the feasibility of mixed-mode …
Visage: Improving The Ballistic Vulnerability Modeling And Analysis Process, Brett F. Grimes
Theses and Dissertations
The purpose of this thesis was to improve the process of modeling and analyzing ballistic vulnerability data. This was accomplished by addressing two of the more urgent needs of vulnerability analysts: the ability to display fault tree data and to edit target descriptions. A vulnerability data visualization program called VISAGE was modified to meet these needs. VISAGE was originally created to preview static shotline plots and subsequently grew into a full-featured visualization package for vulnerability target descriptions and analysis data. The next logical step in the program's evolution was to include the needed editing and fault tree display capabilities. The …
Set-Theoretic Reconstructability Of Elementary Cellular Automata, Martin Zwick, Hui Shu
Systems Science Faculty Publications and Presentations
Set-theoretic reconstructability analysis is used to characterize the structures of the mappings of elementary cellular automata. The minimum complexity structure for each ECA mapping, indexed by parameter σ, is more effective than the λ parameter of Langton as a predictor of chaotic dynamics.
Multiple Query Optimization With Depth-First Branch-And-Bound And Dynamic Query Ordering, Ee Peng Lim, Ahmet Cosar, Jaideep Srivastava
Research Collection School Of Computing and Information Systems
In certain database applications, such as deductive databases, batch query processing, and recursive query processing, a single query is usually transformed into a set of closely related database queries. Great benefits can be obtained by executing a group of related queries together in a single unified multi-plan instead of executing each query separately. To achieve this, Multiple Query Optimization (MQO) identifies common tasks (e.g., common subexpressions, joins) among a set of query plans and creates a single unified plan (multi-plan) which can be executed to obtain the required outputs for all queries at once. …
Improved Tau Polarisation Measurement, D. Buskulic, M. Thulasidas
Research Collection School Of Computing and Information Systems
Using 22 pb−1 of data collected at LEP in 1992 on the peak of the Z resonance, the ALEPH collaboration has measured the polarisation of the tau leptons decaying into eνν̄, μνν̄, πν, ρν and a1ν from their individual decay product distributions. The measurement of the tau polarisation as a function of the production polar angle yields the two parameters Aτ and Ae, where, in terms of the axial and vector couplings gAl and gVl, Al = 2gVl gAl/(gVl² + gAl²). This analysis follows to a large extent the methods devised for the 1990 and 1991 data but with improvements which bring a better …
Implementation And Evaluation Of Enhanced Areal Interpolation Using Mapinfo And Mapbasic, Gordon Wragg
Theses : Honours
Many researchers today have a need to analyse data in a spatial context. An inherent problem is the mismatch of boundaries between the geographic regions for which data is collected and those regions for which the data is required. Often the solution is to interpolate data from one set of regions to another. This project examines and implements a method of areal interpolation that enables the user to use extra information in areal interpolation to increase the "intelligence" of the process. This method of Enhanced Areal Interpolation uses a conditional Poisson distribution and the EM algorithm to provide estimated …
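For contrast with the enhanced method described above, the naive baseline it improves on can be sketched as plain area-weighted interpolation. The dictionary-based interface and zone names below are illustrative, not from the thesis:

```python
def areal_interpolate(source_values, overlap_areas):
    """Naive area-weighted areal interpolation: each source zone's
    value is split among target zones in proportion to the area of
    overlap. Enhanced methods (conditional Poisson + EM, as in the
    thesis) refine this split using ancillary information."""
    totals = {}  # total overlap area per source zone
    for (src, _tgt), area in overlap_areas.items():
        totals[src] = totals.get(src, 0.0) + area
    result = {}  # interpolated value per target zone
    for (src, tgt), area in overlap_areas.items():
        result[tgt] = result.get(tgt, 0.0) + source_values[src] * area / totals[src]
    return result
```

The weakness of this baseline — it assumes the variable is spread uniformly within each source zone — is exactly what the extra information in the enhanced approach addresses.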
A Mathematical Model Of Cycle Chemotherapy, J. C. Panetta, J. Adam
Mathematics & Statistics Faculty Publications
A mathematical model is used to discuss the effects of cycle-specific chemotherapy. The model includes a constraint equation which describes the effects of the drugs on sensitive normal tissue such as bone marrow. This model investigates both pulsed and piecewise-continuous chemotherapeutic effects and calculates the parameter regions of acceptable dose and period. It also identifies the optimal period needed for maximal tumor reduction. Examples are included concerning the use of growth factors and how they can enhance the cell kill of the chemotherapeutic drugs.
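The trade-off between dose period and tumor reduction can be illustrated with a toy pulsed-growth model (a simplification, not the paper's constrained model; parameter names are illustrative): cells grow exponentially between pulses and each pulse instantaneously leaves a fixed surviving fraction, so a cycle shrinks the tumor only when survival_frac · e^(r·T) < 1.

```python
import math

def tumor_after_cycles(s0, growth_rate, survival_frac, period, cycles):
    """Tumor size after `cycles` treatment pulses: exponential growth
    at `growth_rate` between pulses, each pulse instantaneously
    leaving `survival_frac` of the cells. Toy pulsed model only."""
    s = s0
    for _ in range(cycles):
        s *= math.exp(growth_rate * period)  # growth over one period
        s *= survival_frac                   # pulsed drug kill
    return s

def max_period_for_reduction(growth_rate, survival_frac):
    """Longest dose period for which each cycle still shrinks the
    tumor: survival_frac * exp(r*T) < 1  <=>  T < ln(1/survival_frac)/r."""
    return math.log(1.0 / survival_frac) / growth_rate
```

For example, with growth rate 0.1 and a pulse that kills half the cells, any period shorter than ln(2)/0.1 ≈ 6.9 time units gives net reduction per cycle.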
Some Issues In The Sliding Mode Control Of Rigid Robotic Manipulators, Sanjay Rao
Theses: Doctorates and Masters
This thesis investigates the problem of robust adaptive sliding mode control for nonlinear rigid robotic manipulators. A number of robustness and convergence results are presented for sliding mode control of robotic manipulators with bounded unknown disturbances, nonlinearities, dynamical couplings and parameter uncertainties. The highlights of the research work are summarized below: • A robust adaptive tracking control for rigid robotic manipulators is proposed. In this scheme, the parameters of the upper bound of system uncertainty are adaptively estimated. The estimates are then used as controller parameters to eliminate the effects of system uncertainty and guarantee asymptotic error convergence. …
Production Of Excited Beauty States In Z Decays, D. Buskulic, Manoj Thulasidas
Research Collection School Of Computing and Information Systems
A data sample of about 3.0 million hadronic Z decays collected by the ALEPH experiment at LEP in the years 1991 through 1994, is used to make an inclusive selection of B hadron events.
First Measurement Of The Quark-To-Photon Fragmentation Function, D. Buskulic, Manoj Thulasidas
Research Collection School Of Computing and Information Systems
Earlier measurements at LEP of isolated hard photons in hadronic Z decays, attributed to radiation from primary quark pairs, have been extended in the ALEPH experiment to include hard photon production inside hadron jets. Events are selected where all particles combine democratically to form hadron jets, one of which contains a photon with a fractional energy z > 0.7. After statistical subtraction of non-prompt photons, the quark-to-photon fragmentation function, D(z), is extracted directly from the measured 2-jet rate. By taking into account the perturbative contributions to D(z) obtained from an O(αs) QCD calculation, the unknown non-perturbative component of D(z) is …
Inclusive Production Of Neutral Vector Mesons In Hadronic Z Decays, D. Buskulic, Manoj Thulasidas
Research Collection School Of Computing and Information Systems
Data on the inclusive production of the neutral vector mesons ρ0(770), ω(782), K*0(892), and φ(1020) in hadronic Z decays recorded with the ALEPH detector at LEP are presented and compared to Monte Carlo model predictions. Bose-Einstein effects are found to be important in extracting a reliable value for the ρ0 production rate. An average ρ0 multiplicity of 1.45±0.21 per event is obtained. The ω is detected via its three-pion decay mode ω→π+π−π0 and has a total rate of 1.07±0.14 per event. The multiplicity of the K*0 is 0.83±0.09, whilst that of the φ is 0.122±0.009, both measured using their …
A Proposal For A Development Platform For Microcontroller-Based Devices, Michael L. Wetton
Theses: Doctorates and Masters
This thesis is concerned with designing, implementing and testing a miniaturised temperature data logging device. Investigations demonstrated that a microcontroller could provide a low-cost single-chip solution to this problem, and after a detailed review of 8-bit microcontrollers, the MC68HC11 was chosen for this task. This document also includes discussion of an environment that was developed for creating and testing MC68HC11 software and the use of Motorola's evaluation boards. To ensure that the device was designed to software engineering standards, an investigation into software engineering analysis techniques took place. This resulted in the Jackson Structured Programming (JSP) methodology being adapted to …
Comparing Traditional Statistical Models With Neural Network Models: The Case Of The Relation Of Human Performance Factors To The Outcomes Of Military Combat, William Oliver Hedgepeth
Engineering Management & Systems Engineering Theses & Dissertations
Statistics and neural networks are analytical methods used to learn about observed experience. Both the statistician and neural network researcher develop and analyze data sets, draw relevant conclusions, and validate the conclusions. They also share in the challenge of creating accurate predictions of future events with noisy data.
Both analytical methods are investigated. This is accomplished by examining the veridicality of both with real system data. The real system used in this project is a database of 400 years of historical military combat. The relationships among the variables represented in this database are recognized as being hypercomplex and nonlinear.
The …
A Comprehensive, Automated Approach To Determining Sea Ice Thickness From Sar Data, Donna Haverkamp, Leen-Kiat Soh, Costas Tsatsoulis
School of Computing: Faculty Publications
This paper documents an approach to sea ice classification through a combination of methods, both algorithmic and heuristic. The resulting system is a comprehensive technique which uses dynamic local thresholding as a classification basis and then supplements that initial classification with heuristic geophysical knowledge organized in expert systems. The dynamic local thresholding method allows separation of the ice into thickness classes based on local intensity distributions. Because it utilizes the data within each image, it can adapt to variations in ice thickness intensities due to regional and seasonal changes and is not subject to limitations caused by using predefined parameters.
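A minimal toy version of local thresholding can convey the idea: each pixel is judged against a statistic of its own neighborhood rather than a fixed global cutoff. This sketch labels pixels only as above or below their local mean; the paper's method bins intensities into multiple ice thickness classes from local distributions, so treat this purely as an illustration:

```python
def local_mean(image, r, c, half):
    """Mean intensity over the (2*half+1)-square window centred at
    (r, c), clipped at the image edges."""
    n, m = len(image), len(image[0])
    rows = range(max(0, r - half), min(n, r + half + 1))
    cols = range(max(0, c - half), min(m, c + half + 1))
    vals = [image[i][j] for i in rows for j in cols]
    return sum(vals) / len(vals)

def threshold_classes(image, half=1):
    """Label each pixel 1 ('brighter than its local mean') or 0 -- a
    toy stand-in for binning pixels into ice thickness classes from
    local intensity distributions."""
    return [[1 if image[r][c] > local_mean(image, r, c, half) else 0
             for c in range(len(image[0]))]
            for r in range(len(image))]
```

Because the threshold is recomputed per neighborhood, the same bright value can land in different classes in different regions of the image, which is what lets the method adapt to regional and seasonal variation.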
Software Reliability Issues: An Experimental Approach, Mary Ann Hoppa
Computer Science Theses & Dissertations
In this thesis, we present methodologies involving a data structure called the debugging graph whereby the predictive performance of software reliability models can be analyzed and improved under laboratory conditions. This procedure substitutes the averages of large sample sets for the single point samples normally used as inputs to these models and thus supports scrutiny of their performances with less random input data.
Initially, we describe the construction of an extensive database of empirical reliability data which we derived by testing each partially debugged version of subject software represented by complete or partial debugging graphs. We demonstrate how these data …
Hardware Assists For High Performance Computing Using A Mathematics Of Arrays, Hardy J. Pottinger, W. Eatherton, J. Kelly, T. Schiefelbein, Lenore Mullin, R. Ziegler
Electrical and Computer Engineering Faculty Research & Creative Works
Work in progress at the University of Missouri-Rolla on hardware assists for high performance computing is presented. This research consists of a novel field programmable gate array (FPGA) based reconfigurable coprocessor board (the Chameleon Coprocessor) being used to evaluate hardware architectures for speedup of array computation algorithms. These algorithms are developed using a Mathematics of Arrays (MOA). They provide a means to generate addresses for data transfers that require less data movement than more traditional algorithms. In this manner, the address generation algorithms are acting as an intelligent data prefetching mechanism or special purpose cache controller. Software implementations have been …
Using Multiple Statistical Prototypes To Classify Continuously Valued Data, Tony R. Martinez, Dan A. Ventura
Faculty Publications
Multiple Statistical Prototypes (MSP) is a modification of a standard minimum distance classification scheme that generates multiple prototypes per class using a modified greedy heuristic. Empirical comparison of MSP with other well-known learning algorithms shows MSP to be a robust algorithm that uses a very simple premise to produce good generalization and achieve parsimonious hypothesis representation.
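As context for MSP, the baseline it modifies — a standard minimum distance classifier with one mean prototype per class — can be sketched as follows. The interface and names are illustrative, and MSP's greedy heuristic for growing multiple prototypes per class is not shown:

```python
import math

def train_prototypes(X, y):
    """One prototype (feature-wise mean) per class -- the standard
    minimum distance baseline that MSP extends with multiple
    prototypes per class via a greedy heuristic (not shown)."""
    sums, counts = {}, {}
    for features, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {c: [s / counts[c] for s in acc] for c, acc in sums.items()}

def classify(prototypes, x):
    """Assign x to the class whose prototype is nearest (Euclidean)."""
    return min(prototypes, key=lambda c: math.dist(prototypes[c], x))
```

A single mean prototype fails when a class occupies several disjoint regions of feature space; allowing multiple prototypes per class, as MSP does, is the natural fix while keeping the same nearest-prototype decision rule.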
A Study Of Voluntary Participation In Computer User Groups, Alan Engels
Electronic Theses & Dissertations
The purpose of this study was to determine ways to increase participation of members in Computer User Groups. The problem addressed was that a small select group, less than ten percent, in the Parsons Apple/Macintosh Users Group was doing ninety-five percent or more of the work. If this scenario does not change soon, the overworked and overburdened select few may suffer burnout and quit. Case in point, Joplin, MO, had a large Computer User Group, but about seven years ago, it vanished when the select few refused to serve anymore. The same process of decay and erosion has happened in …
Scheduling Of Parallel Jobs On Dynamic, Heterogenous Networks, Dan Clark, Jeremy Casas, Steve Otto, Robert Prouty, Jonathan Walpole
Computer Science Faculty Publications and Presentations
In using a shared network of workstations for parallel processing, it is important to consider not only heterogeneity and differences in processing power between the workstations but also the dynamics of the system as a whole. In such a computing environment, where the use of resources varies as other applications consume and release resources, intelligent scheduling of the parallel jobs onto the available resources is essential to maximize resource utilization. Despite this realization, however, there are few systems available that provide an infrastructure for the easy development and testing of these intelligent schedulers. In this paper, an infrastructure is presented …
Adaptive Resonance Associative Map, Ah-Hwee Tan
Research Collection School Of Computing and Information Systems
This article introduces a neural architecture termed Adaptive Resonance Associative Map (ARAM) that extends unsupervised Adaptive Resonance Theory (ART) systems for rapid, yet stable, heteroassociative learning. ARAM can be visualized as two overlapping ART networks sharing a single category field. Although ARAM is simpler in architecture than another class of supervised ART models known as ARTMAP, it produces classification performance equivalent to that of ARTMAP. As ARAM network structure and operations are symmetrical, associative recall can be performed in both directions. With maximal vigilance settings, ARAM encodes pattern pairs explicitly as cognitive chunks and thus guarantees perfect storage and recall …
Feasible Offset And Optimal Offset For General Single-Layer Channel Routing, Ronald I. Greenberg, Jau-Der Shih
Computer Science: Faculty Publications and Other Works
This paper provides an efficient method to find all feasible offsets for a given separation in a very large-scale integration (VLSI) channel-routing problem in one layer. The previous literature considers this task only for problems with no single-sided nets. When single-sided nets are included, the worst-case solution time increases from $\Theta ( n )$ to $\Omega ( n^2 )$, where n is the number of nets. But if the number of columns c is $O( n )$, the problem can be solved in time $O( n^{1.5} \lg n )$, which improves upon a “naive” $O( cn )$ approach. As a …
Finding Connected Components On A Scan Line Array Processor, Ronald I. Greenberg
Computer Science: Faculty Publications and Other Works
This paper provides a new approach to labeling the connected components of an n × n image on a scan line array processor (comprised of n processing elements). Variations of this approach yield an algorithm guaranteed to complete in o(n lg n) time as well as algorithms likely to approach O(n) time for all or most images. The best previous solutions require using a more complicated architecture or require Ω(n lg n) time. We also show that on a restricted version of the architecture, any algorithm requires Ω(n lg n) time in the worst case.
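For reference semantics, here is a plain sequential 4-connected labeling by breadth-first search. This is not the paper's parallel scan line array algorithm — only a simple baseline that computes the same labeling a parallel implementation must reproduce:

```python
from collections import deque

def label_components(image):
    """4-connected component labeling of a binary image (list of
    lists of 0/1) by sequential breadth-first search. Returns the
    label grid and the number of components found."""
    n, m = len(image), len(image[0])
    labels = [[0] * m for _ in range(n)]
    count = 0
    for r in range(n):
        for c in range(m):
            if image[r][c] and not labels[r][c]:
                count += 1                      # start a new component
                labels[r][c] = count
                queue = deque([(r, c)])
                while queue:
                    i, j = queue.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = i + di, j + dj
                        if 0 <= a < n and 0 <= b < m \
                                and image[a][b] and not labels[a][b]:
                            labels[a][b] = count
                            queue.append((a, b))
    return labels, count
```

This sequential version runs in O(n²) time on an n × n image; the interest of the paper lies in matching that work on a one-dimensional array of only n processing elements.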
Large-Scale Client/Server Migration Methodology, A. Steven Krantz
CCE Theses and Dissertations
The purpose of this dissertation is to explain how to migrate a medium-sized or large company to client/server computing. It draws heavily on the recent IBM Boca Raton migration experience. The client/server computing model is introduced and related, by a Business Reengineering Model, to the major trends that are affecting most businesses today, including business process reengineering, empowered teams, and quality management. A recommended information technology strategy is presented. A business case development approach, necessary to justify the large expenditures required for a client/server migration, is discussed. A five-phase migration management methodology is presented to explain how a business can …
A Taxonomy Of Workgroup Computing Applications, Warren Von Worley
CCE Theses and Dissertations
The goal of workgroup computing is to help individuals and groups efficiently perform a wide range of functions on networked computer systems (Ellis, Gibbs, & Rein, 1991). Early workgroup computing tools were designed for limited functionality and group interaction (Craighill, 1992). Current workgroup computing applications do not allow enough control of group processes and they provide little correlation between various workgroup computing application areas (Rodden and Blair, 1991). An integrated common architecture may produce more effective workgroup computing applications. Integrating common support functions into a common framework will avoid duplication of these functions for each workgroup computing application (Pastor & …
Constrained Least-Squares Digital Image Restoration, Rajeeb Hazra
Dissertations, Theses, and Masters Projects
The design of a digital image restoration filter must address four concerns: the completeness of the underlying imaging system model, the validity of the restoration metric used to derive the filter, the computational efficiency of the algorithm for computing the filter values, and the ability to apply the filter in the spatial domain. Consistent with these four concerns, this dissertation presents a constrained least-squares (CLS) restoration filter for digital image restoration. The CLS restoration filter is based on a comprehensive, continuous-input/discrete-processing/continuous-output (c/d/c) imaging system model that accounts for acquisition blur, spatial sampling, additive noise and imperfect image reconstruction. The …
Patentability Of Computer Inventions, Robert C. F. Perez
Dissertations, Theses, and Masters Projects
No abstract provided.
File-System Workload On A Scientific Multiprocessor, David Kotz, Nils Nieuwejaar
Dartmouth Scholarship
No abstract provided.
Encryption/Decryption Dickwads Of Cipherspace, Raleigh Muns