Open Access. Powered by Scholars. Published by Universities.®
Physical Sciences and Mathematics Commons™
Articles 1 - 6 of 6
Full-Text Articles in Physical Sciences and Mathematics
Comparison Of The Performance Of Simple Linear Regression And Quantile Regression With Non-Normal Data: A Simulation Study, Marjorie Howard
Theses and Dissertations
Linear regression is a widely used method of analysis that is well understood across a wide variety of disciplines. In order to use linear regression, a number of assumptions must be met. These assumptions, specifically normality and homoscedasticity of the error distribution, can at best be met only approximately with real data. Quantile regression requires fewer assumptions, which offers a potential advantage over linear regression. In this simulation study, we compare the performance of linear (least squares) regression to quantile regression when these assumptions are violated, in order to investigate under what conditions quantile regression becomes the more advantageous method …
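The comparison the abstract describes can be sketched with a small simulation. This is an illustrative setup, not the thesis's actual design: the error distribution, sample sizes, and coefficients are assumptions, and the median (0.5-quantile) fit is approximated here by iteratively reweighted least squares rather than a true quantile-regression solver.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols(X, y):
    """Ordinary least squares via numpy's lstsq."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def median_reg(X, y, iters=30, eps=1e-8):
    """Approximate median (L1) regression by iteratively
    reweighted least squares with weights 1/|residual|."""
    b = ols(X, y)
    for _ in range(iters):
        w = np.sqrt(1.0 / np.maximum(np.abs(y - X @ b), eps))
        b = ols(X * w[:, None], y * w)
    return b

# Simulate y = 1 + 2x with heavy-tailed t(2) errors, a case where
# the normality assumption of least squares is violated.
reps, n = 200, 200
est_ols, est_l1 = [], []
for _ in range(reps):
    x = rng.uniform(0, 10, n)
    X = np.column_stack([np.ones(n), x])
    y = 1.0 + 2.0 * x + rng.standard_t(df=2, size=n)
    est_ols.append(ols(X, y)[1])   # slope estimate
    est_l1.append(median_reg(X, y)[1])

mse_ols = np.mean((np.array(est_ols) - 2.0) ** 2)
mse_l1 = np.mean((np.array(est_l1) - 2.0) ** 2)
```

Under t(2) errors the error variance is infinite, so the least-squares slope is far more variable than the median-regression slope, which is the kind of condition the study investigates.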
Examination And Comparison Of The Performance Of Common Non-Parametric And Robust Regression Models, Gregory F. Malek
Electronic Theses and Dissertations
ABSTRACT
Examination and Comparison of the Performance of Common Non-Parametric and Robust Regression Models
By
Gregory Frank Malek
Stephen F. Austin State University, Master's in Statistics Program,
Nacogdoches, Texas, U.S.A.
This work investigated common alternatives to the least-squares regression method in the presence of non-normally distributed errors. An initial literature review identified a variety of alternative methods, including Theil Regression, Wilcoxon Regression, Iteratively Re-Weighted Least Squares, Bounded-Influence Regression, and Bootstrapping methods. These methods were evaluated using a simple simulated example data set, as well as various real data sets, including math proficiency data, Belgian telephone call data, and faculty …
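One of the alternatives named above, Theil regression, is easy to demonstrate: it takes the median of all pairwise slopes, so a minority of gross outliers leaves the fit untouched. The data below are a made-up example, not one of the thesis's data sets.

```python
import numpy as np
from scipy.stats import theilslopes

# Clean linear data y = 3x + 1, then corrupt the five highest-x points.
x = np.arange(30, dtype=float)
y = 3.0 * x + 1.0
y[-5:] += 50.0                     # gross outliers

# Theil regression: median of all pairwise slopes (robust to outliers).
theil_slope, theil_intercept, _, _ = theilslopes(y, x)

# Ordinary least squares for comparison: pulled toward the outliers.
ols_slope, ols_intercept = np.polyfit(x, y, 1)
```

Because most point pairs are unaffected by the five corrupted values, the Theil slope recovers the true value 3 exactly here, while the least-squares slope is biased upward.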
Associated Hypotheses In Linear Models For Unbalanced Data, Carlos J. Soto
Theses and Dissertations
In factorial experiments, there are several natural hypotheses that can be tested. In a two-factor, or a × b, design, the three null hypotheses of greatest interest are the absence of each main effect and the absence of interaction. There are two ways to construct the numerator sum of squares for testing these, namely adjusted or sequential sums of squares (known as Type III and Type I, respectively, in SAS). Searle has pointed out that, for unbalanced data, a sequential sum of squares for one of these hypotheses is equal (with probability 1) to an adjusted sum …
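The distinction between sequential and adjusted sums of squares can be made concrete by fitting nested least-squares models directly. This is a minimal sketch with assumed cell counts and simulated responses, using effect coding; it is not taken from the thesis.

```python
import numpy as np

# Unbalanced 2x2 layout: the four cells have different sample sizes.
counts = {(0, 0): 3, (0, 1): 5, (1, 0): 4, (1, 1): 2}
rng = np.random.default_rng(1)
rows = []
for (i, j), m in counts.items():
    for _ in range(m):
        rows.append((i, j, 5.0 + 2.0 * i - 1.0 * j + rng.normal()))
data = np.array(rows)

a = np.where(data[:, 0] == 0, 1.0, -1.0)   # effect coding for factor A
b = np.where(data[:, 1] == 0, 1.0, -1.0)   # effect coding for factor B
y = data[:, 2]
one = np.ones_like(y)

def rss(X, y):
    """Residual sum of squares from a least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

# Sequential (Type I) SS for A: reduction when A enters first.
seq_A = rss(np.column_stack([one]), y) - rss(np.column_stack([one, a]), y)

# Adjusted (Type III) SS for A: reduction when A enters last,
# adjusting for B and the interaction.
full = np.column_stack([one, a, b, a * b])
adj_A = rss(np.column_stack([one, b, a * b]), y) - rss(full, y)
```

For balanced data the two quantities coincide; with the unbalanced counts above they generally differ, which is what makes the choice between them consequential.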
Curds And Whey: Little Miss Muffit's Contribution To Multivariate Linear Regression, John Cameron Kidd
Undergraduate Honors Capstone Projects
A common multivariate statistical problem is the prediction of two or more response variables using two or more predictor variables. The simplest model for this situation is the multivariate linear regression model. The standard least squares estimation for this model involves regressing each response variable separately on all the predictor variables. Breiman and Friedman [1] show how to take advantage of correlations among the response variables to increase the predictive accuracy for each of the response variables with an algorithm they call Curds and Whey. In this report, I describe an implementation of the Curds and Whey algorithm in …
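The separate-regressions baseline that Curds and Whey improves on can be sketched in a few lines: `lstsq` with a matrix right-hand side regresses each response column on the predictors independently. This is only the standard least-squares baseline on simulated data with assumed dimensions and coefficients; the Curds and Whey step itself, which shrinks the fitted values using canonical correlations between the responses, is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, q = 100, 3, 2              # samples, predictors, responses
X = rng.normal(size=(n, p))
B_true = np.array([[1.0, 0.5],
                   [0.0, 2.0],
                   [-1.0, 1.0]])  # p x q true coefficient matrix
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

# Separate least-squares fits: each column of Y is regressed on X
# independently, which is the standard multivariate LS estimator.
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_pred = X @ B_hat
```

Each column of `B_hat` is exactly what a univariate regression of that response on `X` would give; Curds and Whey then pools information across the correlated responses to shrink these predictions.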
Linear Regression Of The Poisson Mean, Duane Steven Brown
All Graduate Theses and Dissertations, Spring 1920 to Summer 2023
The purpose of this thesis was to compare two estimation procedures, the method of least squares and the method of maximum likelihood, on sample data obtained from a Poisson distribution. Point estimates of the slope and intercept of the regression line and point estimates of the mean squared error for both the slope and intercept were obtained. It is shown that least squares, the preferred method due to its simplicity, does yield results as good as maximum likelihood.
Also, confidence intervals were computed by Monte Carlo techniques and then were tested for accuracy. For the method of least squares, confidence …
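The two estimation procedures the thesis compares can be sketched on simulated Poisson data with a mean linear in x. This is an illustrative reconstruction, not the thesis's actual data or study design: the sample size, coefficients, and use of a generic optimizer for the likelihood are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 500
x = rng.uniform(1, 5, n)
mu_true = 2.0 + 1.0 * x           # Poisson mean linear in x
y = rng.poisson(mu_true)

# Method of least squares: ordinary line fit to the counts.
b_ls = np.polyfit(x, y, 1)        # returns [slope, intercept]

# Method of maximum likelihood: minimize the Poisson negative
# log-likelihood for mean a + b*x (constant terms dropped).
def nll(theta):
    mu = np.maximum(theta[0] + theta[1] * x, 1e-9)
    return np.sum(mu - y * np.log(mu))

res = minimize(nll, x0=[b_ls[1], b_ls[0]], method="Nelder-Mead")
b_ml = res.x                      # [intercept, slope]
```

With this setup both procedures recover the true intercept and slope closely, consistent with the thesis's finding that least squares performs comparably to maximum likelihood despite its simplicity.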
Linear Comparisons In Multivariate Analysis Of Variance, Hsin-Ming Tzeng
All Graduate Plan B and other Reports, Spring 1920 to Spring 2023
The analysis of variance was created by Ronald Fisher in 1923. It is the most widely used and fundamentally useful approach for studying differences among treatment averages.
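A linear comparison among treatment averages, the univariate building block of the multivariate comparisons in the title, can be sketched as follows. The group means, sample sizes, and contrast are assumptions chosen for illustration, not data from the report.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
means = (5.0, 5.0, 7.0)
groups = [rng.normal(loc=m, scale=1.0, size=20) for m in means]

# Overall F test for equality of the three treatment averages.
F, p = stats.f_oneway(*groups)

# A linear comparison (contrast): does group 3 differ from the
# average of groups 1 and 2?  Contrast coefficients c = (1, 1, -2).
c = np.array([1.0, 1.0, -2.0])
ybar = np.array([g.mean() for g in groups])
n = 20
mse = np.mean([g.var(ddof=1) for g in groups])   # pooled MSE, equal n
contrast = c @ ybar                              # estimate of c'mu
se = np.sqrt(mse * np.sum(c**2) / n)
t = contrast / se                                # t statistic for the contrast
```

With the chosen means the contrast targets c'μ = 5 + 5 - 14 = -4, so the comparison is strongly significant; in the multivariate setting the same contrast idea is applied simultaneously across several response variables.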