Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Louisiana State University · Computer Science · 1998

Articles 1 - 5 of 5

Techniques For Resolving Incomplete Systems In K-Systems Analysis., Gary J. Asmus Jan 1998

LSU Historical Dissertations and Theses

K-systems analysis is a generalization of reconstructability analysis (RA) in which any general, complete multivariate system (g-system) can be transformed into an isomorphic, dimensionless system (a K-system) whose properties are sufficient for analysis by probabilistic RA algorithms. In particular, a g-system consists of a set of states, each formed by assigning every variable a specific value from its finite set of possible values, together with an associated system function value. The g-system must be complete in that every possible state must have an associated system function value. K-systems analysis has been applied to a variety of systems, but many …
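
To make the g-system definition concrete, the following is a minimal sketch in Python, with made-up variable names and function values that are not taken from the dissertation: a complete g-system over two discrete variables, together with the completeness check described above. Resolving systems that fail this check, i.e. incomplete systems, is the subject the dissertation addresses.

```python
from itertools import product

# Hypothetical discrete variables and their finite sets of possible values.
domains = {
    "temperature": ("low", "high"),
    "pressure": ("low", "medium", "high"),
}

# A g-system: every state (one value per variable, in the order above)
# maps to a system function value.  The numbers are made up.
g_system = {
    ("low", "low"): 0.2,
    ("low", "medium"): 0.5,
    ("low", "high"): 0.9,
    ("high", "low"): 0.1,
    ("high", "medium"): 0.4,
    ("high", "high"): 0.7,
}

def is_complete(system, domains):
    """A g-system is complete when every possible state has an
    associated system function value."""
    all_states = set(product(*domains.values()))
    return set(system) == all_states

print(is_complete(g_system, domains))  # True: all 2 * 3 = 6 states are present
```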


Lossless Set Compression Of Correlated Information., Oleg Stanislavovich Pianykh Jan 1998

LSU Historical Dissertations and Theses

Set compression allows a set of similar (correlated) images to be compressed more efficiently than compressing the same images independently. Currently, set compression is performed with various inter-image predictive models that forecast common image properties from a few reference images. With sufficient inter-image correlation, one can predict any database image from a few templates, thereby avoiding inter-image redundancy and achieving much improved compression ratios. This research focused on two major aspects of this technique: the practical limits of predictive set compression and the theoretical estimates of its compression efficiency. This includes a review of the previous work in set …
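
To illustrate the predictive idea (a hedged sketch, not the dissertation's model), the example below predicts one image from a single reference image by plain differencing and losslessly compresses only the residual with zlib; the toy data, the trivial difference predictor, and the function names are assumptions made for the example.

```python
import zlib
import numpy as np

def compress_against_reference(image, reference):
    """Predict the image from the reference (here the 'prediction' is
    simply the reference itself) and losslessly compress the residual."""
    residual = image.astype(np.int16) - reference.astype(np.int16)
    return zlib.compress(residual.tobytes())

def decompress_against_reference(payload, reference):
    """Invert the step above: decode the residual and add the prediction back."""
    residual = np.frombuffer(zlib.decompress(payload), dtype=np.int16)
    residual = residual.reshape(reference.shape)
    return (reference.astype(np.int16) + residual).astype(np.uint8)

# Two strongly correlated 8-bit "images" (toy data, not real image files).
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-3, 4, size=(64, 64))
image = np.clip(reference.astype(np.int16) + noise, 0, 255).astype(np.uint8)

payload = compress_against_reference(image, reference)
independent = zlib.compress(image.tobytes())
assert np.array_equal(decompress_against_reference(payload, reference), image)
print(len(payload), "bytes with prediction vs", len(independent), "bytes independently")
```

On correlated inputs the residual carries far less information than the image itself, which is exactly the inter-image redundancy that set compression removes.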


Form: The Fortran Object Recovery Model. A Methodology To Extract Object-Oriented Designs From Imperative Code., Bonnie Lynn Achee Jan 1998

LSU Historical Dissertations and Theses

A majority of the legacy systems in use in the scientific and engineering application domains are coded in imperative languages, specifically COBOL or FORTRAN-77. These systems have an average age of 15 years or more and have undergone years of extensive maintenance. They suffer from poor or nonexistent documentation and from antiquated coding practices and paradigms (Chik94, Osbo90). The purpose of this research is to develop a reverse-engineering methodology to extract an object-oriented design from legacy systems written in imperative languages. This research defines a three-phase methodology that takes source code as input and produces an object-oriented design as output. The three phases of …


Assessing The Reuse Potential Of Objects., Maria Lorna Reyes Jan 1998

LSU Historical Dissertations and Theses

In this research, we investigate whether reusable classes can be characterized by object-oriented (OO) software metrics. Three class-level reuse measures for the OO paradigm are defined: inheritance-based reuse, inter-application reuse by extension, and inter-application reuse as a server. Using data from a software company, we collected metrics on Smalltalk classes. Among the 20 metrics collected are cyclomatic complexity, Lorenz complexity, lines of code, class coupling, reuse ratio, specialization ratio, and number of direct subclasses. We used stepwise regression to derive prediction models incorporating the 20 metrics as the independent variables and each reuse measure, taken separately, as the dependent variable. …
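
The modeling step lends itself to a small illustration. The sketch below implements a simplified forward stepwise selection in Python, adding one metric at a time while it still improves R² (classical stepwise regression uses F-tests or p-values instead); the metric names and the synthetic data are hypothetical, not the company data used in the dissertation.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def forward_stepwise(X, y, names, min_gain=0.01):
    """Greedily add the metric that most improves R^2; stop when the
    best remaining metric improves it by less than min_gain."""
    selected, best = [], 0.0
    remaining = list(range(X.shape[1]))
    while remaining:
        score, j = max((r_squared(X[:, selected + [j]], y), j) for j in remaining)
        if score - best < min_gain:
            break
        selected.append(j)
        remaining.remove(j)
        best = score
    return [names[j] for j in selected], best

# Toy data: 40 classes, 20 metrics, only three of which drive the reuse measure.
rng = np.random.default_rng(1)
names = [f"metric_{i}" for i in range(20)]          # hypothetical metric names
X = rng.normal(size=(40, 20))
reuse = X[:, 0] - 2 * X[:, 3] + 0.5 * X[:, 7] + rng.normal(scale=0.1, size=40)

chosen, r2 = forward_stepwise(X, reuse, names)
print(chosen, round(r2, 3))
```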


Directed Search In K-System Reconstruction., Christopher W. Branton Jan 1998

LSU Historical Dissertations and Theses

K-systems analysis is a factor analysis technique created by generalizing key reconstructability analysis definitions and algorithms. The method is applied to functions on systems of discrete variables to discover a set of factors that can explain the bulk of the function's variation from the mean. K-systems analysis uses principles of information theory to reveal interactions that are often masked by the assumptions implicit in traditional methods. The method has been used successfully to analyze systems in several disciplines. Despite this success, obstacles to the creation of a mature methodology still exist. Some issues and open …
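
To give a flavor of what "factors that explain the function's variation from the mean" means, here is a hedged, variance-based sketch; the actual method relies on information-theoretic measures and a directed search rather than the exhaustive scoring shown here, and the toy function and variable names are invented for the example.

```python
from itertools import combinations
import statistics

# A hypothetical function on two discrete variables (toy values).
variables = ["v1", "v2"]
f = {
    ("a", "x"): 4.0, ("a", "y"): 6.0,
    ("b", "x"): 1.0, ("b", "y"): 3.0,
}

def explained_variation(f, subset):
    """Fraction of the function's total squared deviation from its
    overall mean that is reproduced by the conditional means taken
    over the chosen subset of variable positions."""
    mean = statistics.fmean(f.values())
    total = sum((v - mean) ** 2 for v in f.values())
    groups = {}
    for state, value in f.items():
        groups.setdefault(tuple(state[i] for i in subset), []).append(value)
    group_mean = {k: statistics.fmean(vs) for k, vs in groups.items()}
    captured = sum(
        (group_mean[tuple(state[i] for i in subset)] - mean) ** 2 for state in f
    )
    return captured / total

# Score every non-empty subset of variables as a candidate factor.
for r in range(1, len(variables) + 1):
    for subset in combinations(range(len(variables)), r):
        print([variables[i] for i in subset],
              round(explained_variation(f, subset), 3))
```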