Open Access. Powered by Scholars. Published by Universities.®

Digital Commons Network


UNLV Retrospective Theses & Dissertations

Theses/Dissertations

1993

Algorithms

Articles 1 - 7 of 7


Performance Evaluation Of Distributed Mutual Exclusion Algorithms, Kenneth B Been Jan 1993


In any system in which concurrent processes share resources, mutual exclusion refers to the problem of guaranteeing the integrity of those resources by restricting their use to one process at a time. Due to the complex nature of distributed systems, distributed mutual exclusion algorithms are often not amenable to theoretical analysis for performance or even correctness. Experimental inquiries are therefore warranted. This thesis investigates seven well known distributed mutual exclusion algorithms in detail, and uses computer simulation to evaluate the performance and applicability of these various algorithms. Toward this end, a realistic and general model for evaluating distributed algorithms is proposed. …
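The core property the abstract defines can be illustrated on a single machine: without mutual exclusion, concurrent read-modify-write sequences can interleave and lose updates. A minimal sketch using Python threads (the shared counter and thread count are made-up example data, not from the thesis):

```python
import threading

# Hypothetical shared resource: a plain counter. Without mutual
# exclusion, concurrent read-modify-write interleavings can lose updates.
counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:  # at most one thread in the critical section at a time
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 — the lock preserves the resource's integrity
```

The distributed versions the thesis studies solve the same problem, but without a shared lock object: coordination must happen through message passing.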


Self-Stabilizing Deadlock Algorithms In Distributed Systems, Mitchell Elliott Flatebo Jan 1993


A self-stabilizing system is a network of processors, which, when started from an arbitrary (and possibly illegal) initial state, always returns to a legal state in a finite number of steps. Self-stabilization is an evolving paradigm in fault-tolerant computing. This research will be the first time self-stabilization is used in the areas of deadlock detection and prevention. Traditional deadlock detection algorithms have a process initiate a probe. If that probe travels around the system and is received by the initiator, there is a cycle in the system, and deadlock is detected. In order to prevent deadlocks, algorithms usually rank nodes …
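The probe scheme the abstract describes can be sketched directly: each blocked process forwards a probe along its wait-for edge, and if the probe returns to its initiator there is a cycle, hence deadlock. The wait-for graphs below are hypothetical example data, not from the thesis:

```python
# Probe-based deadlock detection sketch: forward a probe from the
# initiator along wait-for edges; deadlock iff the probe comes back.

def probe_detects_deadlock(wait_for, initiator):
    """Return True iff a probe sent by `initiator` returns to it."""
    current = wait_for.get(initiator)
    seen = set()
    while current is not None and current not in seen:
        if current == initiator:
            return True  # probe returned to initiator: cycle detected
        seen.add(current)
        current = wait_for.get(current)
    return False

# P1 waits for P2, P2 for P3, P3 for P1: a deadlock cycle.
cycle = {"P1": "P2", "P2": "P3", "P3": "P1"}
print(probe_detects_deadlock(cycle, "P1"))   # True
# P4 waits for P5, which is not blocked: no deadlock.
chain = {"P4": "P5"}
print(probe_detects_deadlock(chain, "P4"))   # False
```

A self-stabilizing variant would additionally guarantee that, from an arbitrary corrupted state (e.g. stale probes in transit), the system converges back to correct detection in finitely many steps.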


Formal Verification Of Distributed Deadlock Detection Algorithms, Brian Matt Johnston Jan 1993


The problem of distributed deadlock detection has undergone extensive study. Formal verification of deadlock detection algorithms in distributed systems, however, is an area of research that has largely been ignored. Instead, most proposed distributed deadlock detection algorithms have relied on informal or intuitive arguments or on simulation, or have neglected verification of correctness altogether. As a consequence, many of these algorithms have been shown incorrect. This research will abstract the notion of deadlock in terms of a temporal logic of actions and discuss the invariant and eventuality properties. The contributions of this research are the development of a distributed deadlock detection …
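In temporal-logic terms, the two kinds of property the abstract mentions can be stated roughly as follows (a generic sketch, not the thesis's actual formulas):

```latex
% Invariant (safety): a detection announcement is never spurious.
\Box\,(\mathit{detected} \Rightarrow \mathit{deadlocked})
% Eventuality (liveness): every deadlock is eventually announced.
\Box\,(\mathit{deadlocked} \Rightarrow \Diamond\,\mathit{detected})
```

The first rules out false positives at every step of every execution; the second guarantees progress, which informal arguments and simulation alone cannot establish.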


Practical Algorithms For Image Compression And Surface Estimation, Matthew Y Au Jan 1993


Practical Algorithms for Image Compression and Surface Estimation describes three algorithms for image compression and one algorithm for surface estimation that incorporates kriging and parametric cubic splines. Two of the image compression algorithms are innovative extensions of the Run Length Encoding image compression algorithm, and the third is an image compression technique based on kriging. In general, the modified Run Length Encoding algorithms improve the compression ratio by a factor of two while retaining fast decompression of the image. The algorithm based on kriging achieves a compression ratio of up to 250:1 and is unique in that the compressed image …
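For reference, the baseline the thesis extends is plain run-length encoding: runs of identical pixel values collapse to (value, count) pairs, which decode quickly. A minimal sketch over one image row (the row is made-up example data; the thesis's modified variants are not reproduced here):

```python
# Baseline run-length encoding over a row of pixel values.

def rle_encode(row):
    """Collapse runs of equal values into (value, count) pairs."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [(v, n) for v, n in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original row."""
    return [v for v, n in runs for _ in range(n)]

row = [0, 0, 0, 255, 255, 0, 0, 0, 0]
encoded = rle_encode(row)
print(encoded)                    # [(0, 3), (255, 2), (0, 4)]
assert rle_decode(encoded) == row
```

RLE compresses well exactly when images contain long constant runs, which is why extensions of it can trade a little encoder complexity for a roughly twofold ratio improvement while keeping decompression fast.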


The Use Of Synthesized Images To Evaluate The Performance Of Ocr Devices And Algorithms, Frank Robert Jenkins Jan 1993


This thesis will attempt to establish if synthesized images can be used to predict the performance of Optical Character Recognition (OCR) algorithms and devices. The value of this research lies in reducing the considerable costs associated with preparing test images for OCR research. The paper reports on a series of experiments in which synthesized images of text files in nine different fonts and sizes are input to eight commercial OCR devices. The method used to create the images is explained and a detailed analysis of the character and word confusion between the output and the true text files is presented. …
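The character-level comparison the abstract mentions amounts to aligning OCR output against ground truth and counting agreements. A sketch using sequence alignment from the standard library (the strings and the `char_accuracy` helper are illustrative, not the thesis's actual method):

```python
import difflib

def char_accuracy(truth, ocr_out):
    """Fraction of ground-truth characters matched in an alignment
    of the OCR output against the true text."""
    matcher = difflib.SequenceMatcher(None, truth, ocr_out, autojunk=False)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / len(truth)

truth = "the quick brown fox"
ocr   = "the qu1ck brovvn fox"   # typical OCR confusions: i->1, w->vv
print(f"{char_accuracy(truth, ocr):.2f}")
```

A full confusion analysis, as in the thesis, would additionally tabulate which characters were substituted for which, per font and point size.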


Algorithms For Automatic Parallelism, Optimization, And Optimal Processor Utilization, Katharine J Macke Jan 1993


In this thesis we first investigate the reaching definitions optimization. This compiler optimization collects and stores information about where a variable is defined and how long that definition of the variable stays alive before it is redefined. We compare the old iterative solution to a new algorithm that uses the partialout concept. The partialout solution decreases execution time by eliminating the multiple passes required in the iterative solution. Currently, compilers that find a data dependence between two statements in a loop do not parallelize the loop. Next we investigate automatic parallelism for these loops by breaking the loop into a …
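The baseline the thesis improves on is the classical iterative reaching-definitions analysis: propagate definition sets around the control-flow graph until nothing changes. A minimal sketch (the CFG, GEN, and KILL sets are hypothetical example data; the thesis's partialout variant is not reproduced):

```python
# Classical iterative reaching-definitions analysis on a tiny CFG:
# B1 -> B2 -> B3, plus a loop edge B3 -> B2. d1..d3 are definition
# sites; d2 and d3 redefine the variable d1 defines, so they kill d1.
preds = {"B1": [], "B2": ["B1", "B3"], "B3": ["B2"]}
gen   = {"B1": {"d1"}, "B2": {"d2"}, "B3": {"d3"}}
kill  = {"B1": set(), "B2": {"d1"}, "B3": {"d1"}}

def reaching_definitions(preds, gen, kill):
    """Iterate OUT[b] = GEN[b] | (IN[b] - KILL[b]) to a fixpoint."""
    blocks = list(preds)
    out = {b: set() for b in blocks}
    changed = True
    while changed:                       # the multiple passes the
        changed = False                  # partialout scheme avoids
        for b in blocks:
            in_b = set().union(*(out[p] for p in preds[b])) if preds[b] else set()
            new_out = gen[b] | (in_b - kill[b])
            if new_out != out[b]:
                out[b], changed = new_out, True
    return out

out = reaching_definitions(preds, gen, kill)
print(out["B3"])   # definitions reaching the end of B3
```

The loop edge B3 -> B2 is what forces the repeated passes: information discovered on one pass must flow backward around the cycle on the next.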


Distributed Mutual Exclusion Algorithms, Paul Banta Jan 1993


In this thesis we present three original algorithms which solve the distributed mutual exclusion problem. Two of the three solve the problem of allowing only one site at a time into the critical section. The third solves the more difficult problem of allowing a specific number of sites (k sites) into the critical section at a time. All three algorithms are "Token Based". That is, they make use of a token and a token queue in order to guarantee mutual exclusion. Only the site that currently has the token is allowed to enter its critical section in the 1-mutual exclusion …
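The token-and-queue idea can be sketched on a single machine: only the site holding the token may enter its critical section, and waiting sites line up in the token queue. The class and site names below are illustrative, not from the thesis (which passes the token via messages between sites):

```python
from collections import deque

class TokenMutex:
    """Single-machine sketch of token-based mutual exclusion:
    one token, one FIFO token queue of waiting sites."""

    def __init__(self, holder):
        self.holder = holder     # site currently owning the token
        self.queue = deque()     # sites waiting for the token

    def request(self, site):
        """A site asks to enter its critical section."""
        if site != self.holder:
            self.queue.append(site)

    def release(self):
        """Holder leaves its critical section; pass token to next waiter."""
        if self.queue:
            self.holder = self.queue.popleft()
        return self.holder

m = TokenMutex(holder="S1")
m.request("S2")
m.request("S3")
print(m.release())   # S2 now holds the token
print(m.release())   # then S3
```

Mutual exclusion follows from token uniqueness: there is exactly one token, so at most one site can hold it, and hence at most one site can be in its critical section. The k-site variant generalizes this to k tokens or a counting scheme.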