Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 31 - 60 of 89

Full-Text Articles in Physical Sciences and Mathematics

Parameter Estimation In Nonstationary M/M/S Queueing Models, Pensri Vajanaphanich May 1982

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

If either the arrival rate or the service rate in an M/M/S queue exhibits variability over time, then no steady state solution is available for examining the system behavior. The arrival and service rates can be represented through Fourier series approximations, which permits numerical approximation of the system characteristics over time.

An example of an M/M/S representation of the operations of emergency treatment at Logan Regional hospital is presented. It requires numerical integration of the differential equation for L(t), the expected number of customers in the system at time t.
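The idea of stepping such a system forward in time can be sketched as follows. This is a hedged illustration only: it uses a simple fluid approximation dL/dt = λ(t) − μ(t)·min(L, s) with Euler stepping, not the thesis's exact differential equation for L(t), and the rates below are made up.

```python
import math

def fluid_L(lam, mu, s, t_end, dt=0.001, L0=0.0):
    """Euler-integrate the fluid approximation dL/dt = lam(t) - mu(t)*min(L, s)
    for the expected number in system L(t) of an M/M/s queue."""
    L, t = L0, 0.0
    while t < t_end:
        L += dt * (lam(t) - mu(t) * min(L, s))
        t += dt
    return L

# One-term Fourier (sinusoidal) arrival rate over a 24-hour cycle, constant service rate
lam = lambda t: 4.0 + 2.0 * math.sin(2 * math.pi * t / 24.0)
mu = lambda t: 1.0
print(fluid_L(lam, mu, s=6, t_end=24.0))
```

With constant rates and λ < sμ the integration settles at the steady-state fluid level λ/μ, which is a quick sanity check on the scheme.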


Least Squares Estimation Of The Pareto Type I And Ii Distribution, Ching-Hua Chien May 1982

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

Estimation of the Pareto distribution can be computationally expensive, and the standard method is badly biased. In this work, an improved least squares derivation is used, and the resulting estimation is less biased. Numerical examples and figures are provided so that the solution can be observed more clearly. Furthermore, by varying the method of estimation, a comparison of the parameter estimators is given. The improved least squares derivation can be employed with confidence because it is economical and efficient.


Nonparametric Analysis Of Right Censored Data With Multiple Comparisons, Hwei-Weng Shih Jan 1982

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

This report demonstrates the use of a computer program, written in FORTRAN for the Burroughs B6800 computer at Utah State University, to perform Breslow's (1970) generalization of the Kruskal-Wallis test for right censored data. A pairwise multiple comparison procedure using Bonferroni's inequality is also introduced and demonstrated. Comparisons are also made with a parametric F test and the original Kruskal-Wallis test. Application of these techniques to two data sets indicates that there is little difference among the procedures, with the F test being slightly more liberal (too many differences) and the Kruskal-Wallis test corrected for ties being slightly more conservative …
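For reference, the original (uncensored, no-ties) Kruskal-Wallis statistic that the report uses as a baseline can be sketched in a few lines; this is a generic textbook implementation, not the report's FORTRAN program or Breslow's censored-data generalization.

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic for k independent samples (no ties assumed):
    H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, 1):
        rank_sums[gi] += rank
    return (12.0 / (n * (n + 1))
            * sum(rs * rs / len(g) for rs, g in zip(rank_sums, groups))
            - 3 * (n + 1))
```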


Explanation Of The Fast Fourier Transform And Some Applications, Alan Kazuo Endo May 1981

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

This report describes the Fast Fourier Transform and some of its applications. It describes the continuous Fourier transform and some of its properties. Finally, it describes the Fast Fourier Transform and its applications to hurricane risk analysis, ocean wave analysis, and hydrology.
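A minimal recursive radix-2 Cooley-Tukey FFT (a standard textbook formulation, not code from the report) illustrates the algorithm the report explains:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor times odd part
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out
```

The divide-and-conquer split over even- and odd-indexed samples is what reduces the O(n²) discrete Fourier transform to O(n log n).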


Evaluation Of Multivariate Homogenous Arma Model, Lucy Chienhua Tseng May 1980

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The purpose of this thesis is to study a restricted multivariate ARMA model, called the Homogeneous Model. This model is defined as one in which each univariate component of the multivariate model is of the same order in p and q as it is in the multivariate model.

From a mathematical standpoint, a multivariate ARMA model is homogeneous if, and only if, its coefficient matrices are diagonal. From a physical standpoint, the present observation of a phenomenon can be modeled only by its own past observations and its present and past "errors."

The estimation procedures are developed based on maximum likelihood …


Exact Analysis Of Variance With Unequal Variances, Noriaki Yanagi May 1980

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The purpose of this paper was to present an exact analysis of variance with unequal variances. Bishop presented a new procedure for the r-way layout ANOVA. In this paper, one- and two-way layout ANOVA are explained, and Bishop's method and the standard method are compared using a Monte Carlo method.


Extreme Value Distribution In Hydrology, Bill (Tzeng-Lwen) Chen May 1980

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The problems encountered when empirical fit is used as the sole criterion for choosing a distribution to represent annual flood data are discussed; some theoretical direction is needed for this choice. Extreme value theory is established as a viable tool for analyzing annual flood data. Extreme value distributions have been used in previous analyses of flood data, but no systematic investigation of the theory has previously been undertaken. Properties of the extreme value distributions are examined. The most appropriate distribution for flood data has not previously been fit to such data. The fit of the chosen extreme value distribution compares …


A Confidence Interval Estimate Of Percentile, How Coung Jou May 1980

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The confidence interval estimate of a percentile and its applications were studied. Three methods of estimating a confidence interval were introduced, and some properties of order statistics were reviewed. The Monte Carlo method, used to estimate the confidence interval, was the most important of the three. The generation of ordered random variables and the estimation of parameters are discussed in detail. A comparison of the three methods showed that the Monte Carlo method always works, whereas the K-S and the simplified methods do not.
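The order-statistic machinery underlying such intervals can be sketched directly: a distribution-free interval [X_(l), X_(u)] for the p-th quantile has coverage P(X_(l) ≤ ξ_p ≤ X_(u)) = F(u−1) − F(l−1), where F is the Binomial(n, p) CDF. This generic sketch illustrates the order-statistic approach only, not the thesis's K-S, simplified, or Monte Carlo procedures:

```python
import math

def percentile_ci(sorted_x, p, conf=0.95):
    """Distribution-free confidence interval for the p-th quantile via order
    statistics; returns the narrowest (by index span) pair meeting conf."""
    n = len(sorted_x)
    pmf = [math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    F, acc = [], 0.0
    for v in pmf:
        acc += v
        F.append(acc)                    # Binomial(n, p) CDF
    best = None
    for l in range(1, n + 1):
        for u in range(l + 1, n + 1):
            if F[u - 1] - F[l - 1] >= conf:   # coverage of [X_(l), X_(u)]
                if best is None or u - l < best[1] - best[0]:
                    best = (l, u)
                break                    # larger u only widens the interval
    if best is None:
        return None
    return sorted_x[best[0] - 1], sorted_x[best[1] - 1]
```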


Factorial Analysis Of Variance And Covariance On A Minicomputer, Ladonna Black Kemmerle Jan 1980

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Statistical analysis of large data sets is commonly performed on computers using one of the many available programs. Most of these programs have been written for computers with internal storage large enough to handle nearly any data set. Recently, however, there has been a trend to computers with more limited storage capabilities. New programs must be written or old programs adapted so that large data sets may also be analyzed on these smaller machines.

This report describes a program to analyze data from a balanced experiment of crossed and/or nested design. It was written for the Data General Nova minicomputer …


The Prior Distribution In Bayesian Statistics, Kai-Tang Chen May 1979

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

A major problem associated with Bayesian estimation is selecting the prior distribution. The more recent literature on the selection of the prior is reviewed. Very little of a general nature on the selection of the prior is found in the literature except for non-informative priors, and this class of priors is seen to have limited usefulness. A method of selecting an informative prior is generalized in this thesis to include estimation of several parameters using a multivariate prior distribution. The concepts required for quantifying prior information are based on intuitive principles. In this way, it can be understood and controlled by …


A Μ-Model Approach On The Cell Means: The Analysis Of Full, Design Models With Non-Orthogonal Data, Richard Van Koningsveld May 1979

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

This work considers the application of a µ-model approach on the cell means to a special yet important class of experimental designs. These include full factorial, completely nested, and mixed models with one or more observations per cell. By limiting attention to full models, an approach to the general data situation is developed which is both conceptually simple and computationally advantageous.

Conceptually, the method is simple because the design related effects are defined as if the cell means are single observations. This leads to a rather simple algorithm for generating main effect contrasts, from which associated interaction contrasts can also …


Estimation Of Floods When Runoff Originates From Nonhomogeneous Sources, David Ray Olson May 1979

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

Extreme value theory is used as a basis for deriving a distribution function for flood frequency analysis when runoff originates from nonhomogeneous sources. A modified least squares technique is used to estimate the parameters of the distribution function for eleven rivers. Goodness-of-fit statistics are computed and the distribution function is found to fit the data very well.

The derived distribution function is recommended as a base method for flood frequency analysis for rivers exhibiting nonhomogeneous sources of runoff if further investigation also proves to be positive.


The Evolution Of Ibm's Information Management System-- A Significant Data Base/Data Communications Software Product, Brent W. Anderson Jan 1979

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

In the early 1970's it was said that data base management systems (DBMS) would be to the 70's what COBOL was to the 60's. Clearly, recognition of the need to manage and effectively utilize data has resulted in significant efforts to develop computer hardware and software to meet this great challenge.


A Discussion Of An Empirical Bayes Multiple Comparison Technique, Donna Baranowski Jan 1979

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

This paper considers the application and comparison of Bayesian and non-Bayesian multiple comparison techniques applied to sets of chemical analysis data. Suggestions are also made as to which methods should be used.


Multicollinearity And The Estimation Of Regression Coefficients, John Charles Teed May 1978

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The precision of the estimates of the regression coefficients in a regression analysis is affected by multicollinearity. The effect of certain factors on multicollinearity and the estimates was studied. The response variables were the standard error of the regression coefficients and a standardized statistic that measures the deviation of the regression coefficient from the population parameter.

The estimates are not influenced by any one factor in particular, but rather some combination of factors. The larger the sample size, the better the precision of the estimates no matter how "bad" the other factors may be.

The standard error of the regression …


Interpretation Of Principal Components, Marwan A. Dabdoub May 1978

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The principal component analysis can be carried out two ways. First the R-mode:

R = K'K

and the second is the Q-mode:

Q = K K'

where K is a data matrix centered by column or by row. The most commonly used method is the R-mode.

It has been suggested that principal components computed from either the R-mode or the Q-mode may have the same interpretation. If this is true, then interpretation of the principal components could be put on a much more intuitive level in many applications. This will occur whenever one type of principal component …
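The correspondence between the two modes can be checked numerically: R = K'K and Q = KK' share their nonzero eigenvalues. A stdlib-only sketch using power iteration (illustrative; the matrix K below is an arbitrary made-up data matrix, not data from the thesis):

```python
def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def leading_eigenvalue(M, iters=500):
    """Power iteration; returns the Rayleigh quotient of the dominant eigenpair."""
    v = [1.0] * len(M)
    for _ in range(iters):
        w = matvec(M, v)
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return sum(a * b for a, b in zip(v, matvec(M, v)))

K = [[2.0, 0.0], [1.0, 1.0], [0.0, 2.0]]   # made-up 3x2 centered data matrix
Kt = [list(c) for c in zip(*K)]
R_mode = matmul(Kt, K)                     # K'K (2x2)
Q_mode = matmul(K, Kt)                     # KK' (3x3)
```

The leading eigenvalue of `R_mode` and `Q_mode` coincide; Q has one extra zero eigenvalue because KK' is 3×3 but of rank 2.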


Estimation Of Μy Using The General Regression Model (In Sampling), Michael R. Manieri Jan 1978

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The methods of ratio and regression estimators discussed by Cochran (1977) are given as background material and extended to the estimation of µy, the population mean of the Y's, using a general regression model.

The propagation of error technique given by Deming (1948) is used as an approximation to find the variance of the estimator of µy.

Examples are given for each of the various models. Variances of µy are calculated and compared.


Factor Analysis Method, Stephen Hauwah Kan Jan 1978

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The logical steps performed when doing a factor analysis can be classified into three operations. The first step concerns the exact mode of analysis and involves the type of centering, scaling, and formation of sums of squares. The second step involves extraction of initial factors. In the last step, the algebraic basis of the factors is rotated to obtain a more easily interpreted set of factors. At each step several different methods have been suggested and appear in the literature. Two primary modes of factor analysis are commonly used, and they are denoted as R-mode and …


Comparison Of The Fisher's Method Of Randomization With Other Tests Based On Ranks And The F-Test, Francisco J. González Jan 1978

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Classical statistical inference methods (parametric methods) have a common denominator, i.e. a population parameter (μ, σ, ρ) about which we wish to draw inferences from a random sample. Point estimators of the parameters (X̄, S, R) are selected. Their sampling distribution is used to construct hypothesis testing decision rules or confidence interval formulas. This is the reason for calling this method of obtaining inferences a parametric method. These methods are based on knowing the distribution of the population random variable, from which the sampling distribution of the point estimator is determined. In addition, it is generally assumed that the population, …


Specific Hypotheses In Linear Models And Their Power Function In Unbalanced Data, Seyed Mohtaba Taheri Jan 1977

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

A hypothesis is a statement or claim about the state of nature. Scientific investigators, market researchers, and governmental decision makers, among others, will often have hypotheses about a particular facet of nature, hypotheses that need verification or rejection for one purpose or another. Statisticians concerned with testing hypotheses from unbalanced data on the basis of linear models have discussed the difficulties involved for many years, but, probably because the problems are not easily resolved, there is as yet no satisfactory solution to these problems.


An Empirical Comparison Of Confidence Interval For Relative Potency, Catherine H. Lung Jan 1976

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Biological assays are essentially biological experiments. To compare the potencies of treatments on an agreed scale is generally of more interest than to compare the magnitude of effects of different treatments.

The relative potency, R = a/b, is defined as the ratio of the means of two equally effective doses, where a is the mean of A and b is the mean of B. It is an estimate of the potency of one preparation, A, relative to that of the other, B.

Different procedures have been proposed to obtain the values of R and its confidence interval. Three of these …
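The variance of such a ratio is commonly approximated by first-order propagation of error (the delta method): assuming a and b are independent, Var(R) ≈ Var(a)/b² + a²·Var(b)/b⁴. This is a generic illustration of that approximation, not one of the report's three procedures:

```python
def ratio_variance(a, var_a, b, var_b):
    """First-order propagation-of-error (delta method) variance of R = a/b,
    assuming a and b are independent estimates."""
    return var_a / b**2 + a**2 * var_b / b**4
```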


Linear Comparisons In Multivariate Analysis Of Variance, Hsin-Ming Tzeng Jan 1976

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The analysis of variance was created by Ronald Fisher in 1923. It is the most widely used and most basically useful approach to studying differences among treatment averages.


A Study Of Four Statistics, Used In Analysis Of Contingency Tables, In The Presence Of Low Expected Frequencies, Jane R. Post May 1975

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

Four statistics used for the analysis of categorical data were observed in the presence of many zero cell frequencies in two-way classification contingency tables. The purpose of this study was to determine the effect of many zero cell frequencies upon the distribution properties of each of the four statistics studied. It was found that Light and Margolin's C and Pearson's Chi-square statistic closely approximated the Chi-square distribution as long as less than one-third of the table cells were empty. It was found that the mean and variance of Kullback's 2I were larger than the expected values in the presence …
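For reference, Pearson's chi-square statistic, one of the four studied, is computed from observed counts and the usual row-by-column expected frequencies. A generic sketch (not the study's code; it assumes all expected frequencies are nonzero):

```python
def pearson_chi2(table):
    """Pearson's chi-square statistic for a two-way contingency table,
    with expected counts e_ij = (row total * column total) / grand total."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n
            stat += (obs - exp) ** 2 / exp
    return stat
```

Low expected frequencies, the focus of the study, make exactly the `exp` terms in this sum small and unstable, which is why the chi-square approximation degrades.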


An Evaluation Of Truncated Sequential Test, Ryh-Thinn Chang May 1975

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The development of sequential analysis has led to the proposal of tests that are more economical, in that the Average Sample Number (A.S.N.) of the sequential test is smaller than the sample size of the fixed-sample test. Although these tests usually have a smaller A.S.N. than the equivalent fixed-sample procedure, there still remains the possibility that an extremely large sample size will be necessary to make a decision. To remedy this, truncated sequential tests have been developed.

A method of truncation for testing a composite hypothesis is studied. This method is formed by mixing …


The Computation Of Eigenvalues And Eigenvectors Of An Nxn Real General Matrix, Yeh-Hao Ma Jan 1975

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The eigenvalues of the matrix eigenproblem Ax = λx are computed by the QR double-step method and the eigenvectors by inverse power method.

The matrix A is preliminarily scaled by the equilibration and normalization procedure. The scaled matrix is then reduced to an upper-Hessenberg form by Householder's method. The QR double-step iteration is performed on the upper-Hessenberg matrix. After all the eigenvalues are found, the inverse power method is performed on the upper-Hessenberg matrix to obtain the corresponding eigenvectors.

The program consists of five subroutines and is able to find the real and/or complex eigenvalues and eigenvectors of an n×n real matrix.


Program For Missing Data In The Multivariate Normal Distribution, Chi-Ping Lu Jan 1975

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Missing data can often cause many problems in research work. Therefore for carrying out analysis, some procedure for obtaining estimates in the presence of missing data should be applied. Various theories and techniques have been developed for different types of problems.

Analysis of the Multivariate Normal Distribution with missing data is one of the areas studied. It has been discussed earlier by Wilks (1932), Lord (1955), Edgett (1956), and Hartley (1958). They established some basic concepts and an outline of the approach to estimation.

In the last ten years, A. A. Afifi and R. M. Elashoff have also contributed …


Discriminant Function Analysis, Kuo Hsiung Su Jan 1975

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The technique of discriminant function analysis was originated by R. A. Fisher and first applied by Barnard (1935). Two very useful summaries of the recent work in this technique can be found in Hodges (1950) and in Tatsuoka and Tiedeman (1954). The techniques have been used primarily in the fields of anthropology, psychology, biology, medicine, and education, and have only begun to be applied to other fields in recent years.

Classification and discriminant function analyses are two phases in the attempt to predict which of several populations an observation might be a member of, on the basis of multivariate measurements. Both …


An Evaluation Of Bartlett's Chi-Square Approximation For The Determinant Of A Matrix Of Sample Zero-Order Correlation Coefficients, Stephen M. Hattori Jan 1975

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The single equation least-squares regression model has been extensively studied by economists and statisticians alike in order to determine the problems which arise when particular assumptions are violated. Much literature is available in terms of the properties and limitations of the model. However, on the multicollinearity problem, there has been little research, and consequently, limited literature is available when the problem is encountered. Farrar & Glauber (1967) present a collection of techniques to use in order to detect or diagnose the occurrence of multicollinearity within a regression analysis. They attempt to define multicollinearity in terms of departures from a hypothesized …


Multivariate Analysis Of Variance For Simple Designs, Yin-Yin Chen Jan 1975

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The analysis of variance is a well known tool for testing how treatments change the average response of experimental units. The essence of the procedure is to compare the variation among means of groups of units subjected to the same treatment with the within treatment variation. If the variation among means is large with respect to the within group variation we are likely to conclude that the treatments caused the variation and hence we say the treatments cause some change in the group means.

The usual analysis of variance checks how far apart the group means are in a single …


Comparison Of Transition Matrices Between Metropolitan And Non-Metropolitan Areas In The State Of Utah Using Juvenile Court Data, Sung-Ik Song May 1974

All Graduate Theses and Dissertations, Spring 1920 to Summer 2023

The purpose of this paper is to use Markov Chains for the study of youths referred to the juvenile court in the metropolitan and non-metropolitan areas of the state of Utah.

Two computer programs were written for creating case histories for each person referred to the court and for testing for the significance of the difference among several transition matrices.

Another computer program, written by Soo Hong Uh, was used for analyzing realizations of a Markov chain up to the 4th order; a third computer program, originally written by David White, was used for interpreting Markov chains.

The …
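The core estimation step in such an analysis, building a transition matrix from a case-history sequence, can be sketched as follows. This is a generic maximum-likelihood count-and-normalize illustration; the two states shown are hypothetical, not the court's actual referral categories.

```python
def transition_matrix(seq, states):
    """Maximum-likelihood estimate of a first-order Markov transition matrix
    from a single observed realization: count transitions, normalize rows."""
    counts = {a: {b: 0 for b in states} for a in states}
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    P = {}
    for a in states:
        total = sum(counts[a].values())
        P[a] = {b: (counts[a][b] / total if total else 0.0) for b in states}
    return P

# Hypothetical two-state case history
P = transition_matrix("AABAB", ["A", "B"])
```

Comparing metropolitan and non-metropolitan areas then amounts to testing whether two such estimated matrices differ significantly.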