Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 1 - 30 of 50

Full-Text Articles in Physical Sciences and Mathematics

Learning Successful Strategies In Repeated General-Sum Games, Jacob W. Crandall Dec 2005

Theses and Dissertations

Many environments in which an agent can use reinforcement learning techniques to learn profitable strategies are affected by other learning agents. These situations can be modeled as general-sum games. When playing repeated general-sum games with other learning agents, the goal of a self-interested learning agent is to maximize its own payoffs over time. Traditional reinforcement learning algorithms learn myopic strategies in these games. As a result, they learn strategies that produce undesirable results in many games. In this dissertation, we develop and analyze algorithms that learn non-myopic strategies when playing many important infinitely repeated general-sum games. We show that, in …


Trust Broker: A Defense Against Identity Theft From Online Transactions, Michael George Edvalson Dec 2005

Theses and Dissertations

The proliferation of online services over the years has encouraged more and more people to participate in Internet activities. Many web sites request personal and sensitive information needed to deliver the desired service. Unfortunately, it is difficult to distinguish the sites that can be trusted to protect such information from those that cannot. Many attempts to make the Internet easier to use introduce new security and privacy problems. On the other hand, most attempts at creating a safe online environment produce systems that are cryptic and hard to use. The TrustBroker system is based on a specialized online repository that …


Industrial Technology Education Teachers Perceptions Of National Standards For Technological Literacy In The State Of Arizona, Allan R. Mcrae Dec 2005

Theses and Dissertations

Today, it is becoming increasingly clear that there is a growing interest, concern, and need for technological literacy. To this end, the International Technology Education Association (ITEA) through the Technology for All Americans Project, has developed and promulgated the Standards for Technological Literacy: Content for the Study of Technology. This effort is part of the ongoing initiative to develop technology standards on a national level, and to focus on what every student in grades K-12 should know and be able to do in order to achieve technological literacy (ITEA, 2000). The purpose of this study was to investigate the perceived …


Generating Data-Extraction Ontologies By Example, Yuanqiu Zhou Nov 2005

Theses and Dissertations

Ontology-based data-extraction is a resilient web data-extraction approach. A major limitation of this approach is that ontology experts must manually develop and maintain data-extraction ontologies. The limitation prevents ordinary users who have little knowledge of conceptual models from making use of this resilient approach. In this thesis we have designed and implemented a general framework, OntoByE, to generate data-extraction ontologies semi-automatically through a small set of examples collected by users. With the assistance of a limited amount of prior knowledge, experimental evidence shows that OntoByE is capable of interacting with users to generate data-extraction ontologies for domains of interest to …


A Software Development Environment For Building Context-Aware Systems For Family Technology, Jeremiah Kenton Jones Nov 2005

Theses and Dissertations

The purpose of this thesis was to utilize existing technologies to create a development environment suitable for creating context-aware applications and systems specific to home and family living conditions. After outlining the history of context-aware applications and the challenges that face family-centric systems in this field, a development environment was implemented that solves the unique challenges that face application development for family-centric, context-aware applications. In particular, research cited in this document indicates that a browser-based user interface is the most appropriate interface for a family environment. The flexibility of the interface, as well as the familiarity of the application structure …


Generating Medical Logic Modules For Clinical Trial Eligibility, Craig G. Parker Nov 2005

Theses and Dissertations

Clinical trials are important to the advancement of medical science. They provide the experimental and statistical basis needed to determine the benefit of diagnostic and therapeutic agents and procedures. The more patients enrolled in a clinical trial, the more confidence we can have in the trial's results. However, current practices for identifying eligible patients can be expensive and time-consuming. To assist in making identification of eligible patients more cost effective, we have developed a system for translating the eligibility criteria for clinical trials to an executable form. This system takes as input the eligibility criteria for a trial formatted as …


Real-Time Motion Transition By Example, Cameron Quinn Egbert Nov 2005

Theses and Dissertations

Motion transitioning is a common task in real-time applications such as games. While most character motions can be created a priori using motion capture or hand animation, transitions between these motions must be created by an animation system at runtime. Because of this requirement, it is often difficult to create a transition that preserves the feel that the actor or animator has put into the motion. An additional difficulty is that transitions must be created in real-time. This paper provides a method of creating motion transitions that is both computationally feasible for interactive speeds, and preserves the feel of the …


Verification Of Digital Controller Verifications, Xuan Wang Nov 2005

Theses and Dissertations

This thesis presents an analysis framework to verify the stability property of a closed-loop control system with a software controller implementation. The usual approach to verifying stability for software uses experiments, which are costly and can be dangerous. More recently, mathematical models of software have been proposed which can be used to reason about the correctness of controllers. However, these mathematical models ignore computational details that may be important in verification. We propose a method to determine the instability of a closed-loop system with a software controller implementation under l^2 inputs using simulation. This method avoids the cost of experimentation …


A Context-Sensitive Structural Heuristic For Guided Search Model Checking, Eric G. Mercer, Neha Rungta Nov 2005

Faculty Publications

Software verification using model checking often translates programs into corresponding transition systems that model the program behavior. As software systems continue to grow in complexity and size, exhaustively checking a property on a transition graph becomes difficult. The goal of guided search heuristics in model checking is to find a counterexample to the property being verified as quickly as possible in the transition graph. The FSM distance heuristic builds an interprocedural control flow graph of the program to estimate distance to a possible error state. It ignores calling context and underestimates the true distance to the error.
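The distance estimate described above can be sketched as a backward breadth-first search from the error node of a control flow graph; the function name and the toy graph below are illustrative assumptions, not the paper's implementation.

```python
from collections import deque

def fsm_distance(cfg, error_node):
    """Estimate each node's distance to the error node by backward BFS
    over a control flow graph (node -> list of successor nodes).
    Like the FSM distance heuristic, this ignores calling context,
    so it can underestimate the true interprocedural distance."""
    # Build reverse edges so we can search backward from the error state.
    rev = {n: [] for n in cfg}
    for n, succs in cfg.items():
        for s in succs:
            rev[s].append(n)
    dist = {error_node: 0}
    q = deque([error_node])
    while q:
        n = q.popleft()
        for p in rev[n]:
            if p not in dist:
                dist[p] = dist[n] + 1
                q.append(p)
    return dist

# Hypothetical control flow graph with one error state.
cfg = {
    "entry": ["check", "skip"],
    "check": ["error", "skip"],
    "skip": ["exit"],
    "error": [],
    "exit": [],
}
print(fsm_distance(cfg, "error"))  # "entry" is two edges from "error"
```

Nodes from which the error is unreachable (here, "skip" and "exit") get no distance at all; a guided search would rank them last.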


Phylogenetic Analysis Of Large Sequence Data Sets, Hyrum Carroll, Mark J. Clement, Keith Crandall, Quinn O. Snell Oct 2005

Faculty Publications

Phylogenetic analysis is an integral part of biological research. As the number of sequenced genomes increases, available data sets are growing in number and size. Several algorithms have been proposed to handle these larger data sets. A family of algorithms known as disc covering methods (DCMs) has been selected by the NSF-funded CIPRes project to boost the performance of existing phylogenetic algorithms. Recursive Iterative Disc Covering Method 3 (Rec-I-DCM3) recursively decomposes the guide tree into subtrees, executes a phylogenetic search on each subtree, and merges the subtrees, for a set number of iterations. This paper presents a detailed analysis …


Rate-Adaptive Runlength Limited Encoding For High-Speed Infrared Communication, James Cyril Funk Sep 2005

Theses and Dissertations

My thesis will demonstrate that Rate Adaptive Runlength Limited encoding (RA-RLL) achieves high data rates with acceptable error rates over a wide range of signal distortion, attenuation, and background noise. RA-RLL has performance superior to other infrared modulation schemes in terms of bandwidth efficiency, duty cycle control, and synchronization frequency. Rate adaptive techniques allow for quick convergence of RA-RLL parameters to acceptable values. RA-RLL may be feasibly implemented on systems with non-ideal timing and digital synchronization.


Importance Resampling For Global Illumination, Justin F. Talbot Sep 2005

Theses and Dissertations

This thesis develops a generalized form of Monte Carlo integration called Resampled Importance Sampling. It is based on the importance resampling sample generation technique. Resampled Importance Sampling can lead to significant variance reduction over standard Monte Carlo integration for common rendering problems. We show how to select the importance resampling parameters for near optimal variance reduction. We also combine RIS with stratification and with Multiple Importance Sampling for further variance reduction. We demonstrate the robustness of this technique on the direct lighting problem and achieve up to a 33% variance reduction over standard techniques. We also suggest using RIS as …
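The importance resampling step underlying this approach can be sketched as follows, assuming a cheap unnormalized approximation q of the integrand; the function names and toy integrand are illustrative, not the thesis's implementation.

```python
import random

def ris_estimate(g, q, M, rng):
    """One resampled-importance-sampling estimate of the integral of g
    over [0, 1]: draw M candidates from a uniform proposal, weight each
    by the cheap unnormalized target q, resample one candidate with
    probability proportional to its weight, then correct the expensive
    integrand value g(y)/q(y) by the average weight."""
    xs = [rng.random() for _ in range(M)]
    ws = [q(x) for x in xs]              # proposal pdf is 1 on [0, 1]
    y = rng.choices(xs, weights=ws, k=1)[0]
    return (sum(ws) / M) * g(y) / q(y)

rng = random.Random(0)
g = lambda x: x * x                      # integrand; true integral is 1/3
q = lambda x: x * x + 0.1                # crude stand-in used for resampling
est = sum(ris_estimate(g, q, M=8, rng=rng) for _ in range(5000)) / 5000
print(est)
```

The estimator is unbiased for any positive q; the closer q is to the integrand, the lower the variance, which is why the thesis studies how to pick the resampling parameters.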


On-Disk Sequence Cache (Odsc): Using Excess Disk Capacity To Increase Performance, Christopher Ryan Slade Sep 2005

Theses and Dissertations

We present an on-disk sequence cache (ODSC), which improves disk drive performance. An ODSC uses a separate disk partition to store disk data in the order that the operating system requests it. Storing data in this order reduces the amount of seeking that the disk drive must do. As a result, the average disk access time is reduced. Reducing the disk access time improves the performance of the system, especially when booting the operating system, loading applications, and when main memory is limited. Experiments show that our ODSC speeds up application loads by as much as 413%. Our ODSC also …


Linear Equality Constraints And Homomorphous Mappings In Pso, Christopher K. Monson, Kevin Seppi Sep 2005

Faculty Publications

We present a homomorphous mapping that converts problems with linear equality constraints into fully unconstrained and lower-dimensional problems for optimization with PSO. This approach, in contrast with feasibility preservation methods, allows any unconstrained optimization algorithm to be applied to a problem with linear equality constraints, making available tools that are known to be effective and simplifying the process of choosing an optimizer for these kinds of constrained problems. The application of some PSO algorithms to a problem that has undergone the mapping presented here is shown to be more effective and more consistent than other approaches to handling linear equality …
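One standard way to realize such a mapping is a null-space parameterization: every unconstrained point y maps to a feasible x = x0 + Zy, where x0 solves Ax = b and the columns of Z span the null space of A. The sketch below shows this general idea under those assumptions; the paper's specific mapping may differ.

```python
import numpy as np

def unconstrained_mapping(A, b):
    """Map {x : A x = b} to an unconstrained, lower-dimensional space.
    Any y in R^(n - rank(A)) gives a feasible x = x0 + Z y, where x0 is
    a particular solution and Z is an orthonormal null-space basis of A,
    so a swarm can move freely in y without ever leaving the feasible set."""
    x0, *_ = np.linalg.lstsq(A, b, rcond=None)
    _, s, vt = np.linalg.svd(A)
    rank = int((s > 1e-10).sum())
    Z = vt[rank:].T                      # columns span the null space of A
    return (lambda y: x0 + Z @ y), Z.shape[1]

# Constraint x1 + x2 + x3 = 1: a 3-D constrained problem becomes a
# fully unconstrained 2-D one.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
to_x, dim = unconstrained_mapping(A, b)
x = to_x(np.array([0.3, -0.7]))
print(dim, np.allclose(A @ x, b))
```

Any unconstrained optimizer, PSO included, can then search over y and evaluate the objective at to_x(y), with the constraint satisfied exactly by construction.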


Studies In The Dynamics Of Economic Systems, Christophe G. Giraud-Carrier, Kevin Seppi, Nghia Tran, Sean C. Warnick, W. Samuel Weyerman, R. Johnson Aug 2005

Faculty Publications

This paper demonstrates the utility of systems and control theory in the analysis of economic systems. Two applications demonstrate how the analysis of simple dynamic models sheds light on important practical problems. The first problem considers the design of a retail laboratory, where the small gain theorem enables the falsification of pricing policies. The second problem explores industrial organization using the equilibria of profit-maximizing dynamics to quantify the percentage of a firm’s profits due strictly to the cooperative effects among its products. This “Value of Cooperation” suggests an important measure for both organizational and antitrust applications.


Challenging Policies That Do Not Play Fair: A Credential Relevancy Framework Using Trust Negotiation Ontologies, Travis S. Leithead Aug 2005

Theses and Dissertations

This thesis challenges the assumption that policies will "play fair" within trust negotiation. Policies that do not "play fair" contain requirements for authentication that are misleading, irrelevant, and/or incorrect, based on the current transaction context. To detect these unfair policies, trust negotiation ontologies provide the context to determine the relevancy of a given credential set for a particular negotiation. We propose a credential relevancy framework for use in trust negotiation that utilizes ontologies to process the set of all available credentials C and produce a subset of credentials C' relevant to the context of a given negotiation. This credential relevancy …


Dynamic Dead Variable Analysis, Micah S. Lewis Aug 2005

Theses and Dissertations

Dynamic dead variable analysis (DDVA) extends traditional static dead variable analysis (SDVA) in the context of model checking through the use of run-time information. The analysis is run multiple times during the course of model checking to create a more precise set of dead variables. The DDVA is evaluated based on the amount of memory used to complete model checking relative to SDVA while considering the extra overhead required to implement DDVA. On several models with a complex control flow graph, DDVA reduces the amount of memory needed by 38-88MB compared to SDVA with a cost of 36 bytes of …
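The static baseline (SDVA) that DDVA extends is classic backward live-variable analysis: a variable is dead at a point where it is not live, so the model checker need not store its value there. A minimal fixpoint sketch, with a hypothetical three-node program:

```python
def live_variables(nodes, succ, use, defs):
    """Backward dataflow for static dead variable analysis: iterate
    in[n] = use[n] | (out[n] - def[n]) and out[n] = union of in[s]
    over successors s, until a fixpoint is reached."""
    live_in = {n: set() for n in nodes}
    live_out = {n: set() for n in nodes}
    changed = True
    while changed:
        changed = False
        for n in nodes:
            out = set()
            for s in succ[n]:
                out |= live_in[s]
            new_in = use[n] | (out - defs[n])
            if new_in != live_in[n] or out != live_out[n]:
                live_in[n], live_out[n] = new_in, out
                changed = True
    return live_in, live_out

# Straight-line program: 1: a = read();  2: b = a + 1;  3: print(b)
nodes = [1, 2, 3]
succ = {1: [2], 2: [3], 3: []}
use  = {1: set(), 2: {"a"}, 3: {"b"}}
defs = {1: {"a"}, 2: {"b"}, 3: set()}
live_in, live_out = live_variables(nodes, succ, use, defs)
print(live_in)   # 'a' is dead after node 2; 'b' is dead after node 3
```

DDVA reruns this kind of analysis during model checking, using run-time information to prune the sets further than the static fixpoint allows.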


Task Similarity Measures For Transfer In Reinforcement Learning Task Libraries, James Carroll, Kevin Seppi Aug 2005

Faculty Publications

Recent research in task transfer and task clustering has created a need for task similarity measures in reinforcement learning. Determining task similarity is necessary for selective transfer, where only information from relevant tasks and portions of a task is transferred. Which task similarity measure to use is not immediately obvious. It can be shown that no single task similarity measure is uniformly superior. The optimal task similarity measure depends upon the task transfer method being employed. We define similarity in terms of tasks, and propose several possible task similarity measures, dT, dp, dQ, and dR, which are based on …


Edge Inference For Image Interpolation, Bryan S. Morse, Neil Toronto, Dan A. Ventura Aug 2005

Faculty Publications

Image interpolation algorithms try to fit a function to a matrix of samples in a "natural-looking" way. This paper presents edge inference, an algorithm that does this by mixing neural network regression with standard image interpolation techniques. Results on gray level images are presented, and it is demonstrated that edge inference is capable of producing sharp, natural-looking results. A technique for reintroducing noise is given, and it is shown that, with noise added using a bicubic interpolant, edge inference can be regarded as a generalization of bicubic interpolation. Extension into RGB color space and additional applications of the algorithm are …


Detecting Similar Html Documents Using A Sentence-Based Copy Detection Approach, Rajiv Yerra Jul 2005

Theses and Dissertations

Web documents that are either partially or completely duplicated in content are easily found on the Internet these days. Not only do these documents create redundant information on the Web, which makes filtering unique information take longer and consumes additional storage space, but they also degrade the efficiency of Web information retrieval. In this thesis, we present a new approach for detecting similar (HTML) Web documents and evaluate its performance. To detect similar documents, we first apply our sentence-based copy detection approach to determine whether sentences in any two documents should be treated as the same or different according to the degrees …


Constraint-Based Interpolation, Daniel David Goggins Jul 2005

Theses and Dissertations

Image reconstruction is the process of converting a sampled image into a continuous one prior to transformation and resampling. This reconstruction can be more accurate if two things are known: the process by which the sampled image was obtained and the general characteristics of the original image. We present a new reconstruction algorithm known as Constraint-Based Interpolation, which estimates the sampling functions found in cameras and analyzes properties of real world images in order to produce quality real-world image magnifications. To accomplish this, Constraint-Based Interpolation uses a sensor model that pushes the pixels in an interpolation to more closely match …


Cache Characterization And Performance Studies Using Locality Surfaces, Elizabeth Schreiner Sorenson Jul 2005

Theses and Dissertations

Today's processors commonly use caches to help overcome the disparity between processor and main memory speeds. Due to the principle of locality, most of the processor's requests for data are satisfied by the fast cache memory, resulting in a significant performance improvement. Methods for evaluating workloads and caches in terms of locality are valuable for cache design. In this dissertation, we present a locality surface which displays both temporal and spatial locality on one three-dimensional graph. We provide a solid, mathematical description of locality data and equations for visualization. We then use the locality surface to examine the locality of …


Establishing Public Confidence In The Viability Of Fingerprint Biometric Technology, Nathan Alan Green Jul 2005

Theses and Dissertations

The most common personal authentication techniques used for identity management employ a secret PIN or password that must be remembered. The challenge, for a given user, is that a multitude of such codes must be recalled over the course of the day for transactions involving distinct computer applications. Password mania prevails. Fingerprint biometric technology is an ideal alternate solution to this password recall problem. In spite of their availability for nearly thirty years, fingerprint biometric systems still remain uncommon in public sectors of industry such as education, government, and technology. Technology has improved sufficiently that false acceptance and rejection rates …


Accelerated Ray Traced Animations Exploiting Temporal Coherence, Darwin Tarry Baines Jul 2005

Theses and Dissertations

Ray tracing is a well-known technique for producing realistic graphics. However, the time necessary to generate images is unacceptably long. When producing the many frames that are necessary for animations, the time is magnified. Many methods have been proposed to reduce the calculations necessary in ray tracing. Much of the effort has attempted to reduce the number of rays cast or to reduce the number of intersection calculations. Both of these techniques exploit spatial coherence. These acceleration techniques are expanded not only to exploit spatial coherence but also to exploit temporal coherence in order to reduce calculations by treating animation …


Suitability Of The Nist Shop Data Model As A Neutral File Format For Simulation, Gregory Brent Harward Jul 2005

Theses and Dissertations

Due to their successful application in Internet-related fields, Extensible Markup Language (XML) and its related technologies are being explored as a revolutionary software file format technology used to provide increased interoperability in the discrete-event simulation (DES) arena. The National Institute of Standards and Technology (NIST) has developed an XML-based information model (XSD) called the Shop Data Model (SDM), which is used to describe the contents of a neutral file format (NFF) that is being promoted as a means to make manufacturing simulation technology more accessible to a larger group of potential customers. Using a two-step process, this thesis …


Task Localization, Similarity, And Transfer; Towards A Reinforcement Learning Task Library System, James Lamond Carroll Jul 2005

Theses and Dissertations

This thesis develops methods of task localization, task similarity discovery, and task transfer for eventual use in a reinforcement learning task library system, which can effectively “learn to learn,” improving its performance as it encounters various tasks over the lifetime of the learning system.


Time Invariance And Liquid State Machines, Eric Goodman, Dan A. Ventura Jul 2005

Faculty Publications

Time invariant recognition of spatiotemporal patterns is a common task of signal processing. Liquid state machines (LSMs) are a paradigm which robustly handle this type of classification. Using an artificial dataset with target pattern lengths ranging from 0.1 to 1.0 seconds, we train an LSM to find the start of the pattern with a mean absolute error of 0.18 seconds. Also, LSMs can be trained to identify spoken digits, 1-9, with an accuracy of 97.6%, even with scaling by factors ranging from 0.5 to 1.5.


Categorizing And Extracting Information From Multilingual Html Documents, Yiu-Kai D. Ng, Seungjin Lim Jul 2005

Faculty Publications

The amount of online information written in different natural languages and the number of non-English speaking Internet users have been increasing tremendously during the past decade. In order to provide high-performance access of multilingual information on the Internet, we have developed a data analysis and querying system (DatAQs) that (i) analyzes, identifies, and categorizes languages used in HTML documents, (ii) extracts information from HTML documents of interest written in different languages, (iii) allows the user to submit queries for retrieving extracted information in the same natural language provided by the query engine of DatAQs using a menu-driven user interface, and …


Detecting Similar Html Documents Using A Fuzzy Set Information Retrieval Approach, Yiu-Kai D. Ng, Rajiv Yerra Jul 2005

Faculty Publications

Web documents that are either partially or completely duplicated in content are easily found on the Internet these days. Not only do these documents create redundant information on the Web, which makes filtering unique information take longer and consumes additional storage space, but they also degrade the efficiency of Web information retrieval. In this paper, we present a new approach for detecting similar Web documents, especially HTML documents. Our detection approach determines the odd ratio of any two documents, which makes use of the degrees of resemblance of the documents, and graphically displays the locations of similar (not necessarily the …


Validating Human–Robot Interaction Schemes In Multitasking Environments, Jacob W. Crandall, Michael A. Goodrich, Curtis W. Nielsen, Dan R. Olsen Jr. Jul 2005

Faculty Publications

The ability of robots to autonomously perform tasks is increasing. More autonomy in robots means that the human managing the robot may have available free time. It is desirable to use this free time productively, and a current trend is to use this available free time to manage multiple robots. We present the notion of neglect tolerance as a means for determining how robot autonomy and interface design determine how free time can be used to support multitasking, in general, and multirobot teams, in particular. We use neglect tolerance to 1) identify the maximum number of robots that can be …