Open Access. Powered by Scholars. Published by Universities.®

Multicollinearity

Articles 1 - 5 of 5

Full-Text Articles in Physical Sciences and Mathematics

Multicollinearity And A Ridge Parameter Estimation Approach, Ghadban Khalaf, Mohamed Iguernane Nov 2016

Journal of Modern Applied Statistical Methods

One of the main goals of the multiple linear regression model, Y = Xβ + u, is to assess the importance of the independent variables in determining their predictive ability. In practical applications, however, inference about the regression coefficients can be difficult because the independent variables are correlated, and multicollinearity causes instability in the coefficients. A new estimator of the ridge regression parameter is proposed and evaluated by simulation in terms of mean squared error (MSE). Results of the simulation study indicate that the suggested estimator dominates the ordinary least squares (OLS) estimator and other ridge estimators with respect to …
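The article's proposed estimator is not specified in the abstract, so it is not reproduced here. As a minimal sketch of the kind of simulation comparison described, the following assumes a Hoerl-Kennard-Baldwin-style ridge parameter (using the true error variance for simplicity) and compares ridge and OLS by estimated MSE under strongly correlated predictors; all settings are illustrative, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (not from the article): n observations,
# p predictors with pairwise correlation rho, unit error variance.
n, p, rho, sigma = 50, 4, 0.95, 1.0
beta = np.ones(p)
cov = np.full((p, p), rho) + (1.0 - rho) * np.eye(p)
L = np.linalg.cholesky(cov)

reps = 500
mse_ols = mse_ridge = 0.0
for _ in range(reps):
    X = rng.standard_normal((n, p)) @ L.T        # correlated design
    y = X @ beta + sigma * rng.standard_normal(n)
    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)
    # Hoerl-Kennard-Baldwin-type choice of k (true sigma^2 assumed known)
    k = p * sigma**2 / (b_ols @ b_ols)
    b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_ridge - beta) ** 2) / reps

print(f"MSE(OLS)   = {mse_ols:.3f}")
print(f"MSE(ridge) = {mse_ridge:.3f}")
```

Under this degree of collinearity the ridge estimator typically shows a smaller simulated MSE than OLS, which is the general pattern the article's simulation study reports for its proposed estimator.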


Improved Ridge Estimator In Linear Regression With Multicollinearity, Heteroscedastic Errors And Outliers, Ashok Vithoba Dorugade Nov 2016

Journal of Modern Applied Statistical Methods

This paper introduces a new estimator of the ridge parameter k for ridge regression, which is then evaluated by Monte Carlo simulation. The performance of the proposed estimator is compared with other well-known estimators for models with heteroscedastic and/or correlated errors, outlier observations, non-normal errors, and multicollinearity. It is shown that the proposed estimator has a smaller MSE than the ordinary least squares (LS) estimator, the Hoerl and Kennard (1970) ridge estimator (RR), the jackknifed modified ridge (JMR) estimator, and the jackknifed ridge M‑estimator (JRM).


The Goldilocks Dilemma: Impacts Of Multicollinearity -- A Comparison Of Simple Linear Regression, Multiple Regression, And Ordered Variable Regression Models, Grayson L. Baird, Stephen L. Bieber May 2016

Journal of Modern Applied Statistical Methods

A common concern in applying multiple linear regression is the lack of independence among the predictors (multicollinearity). The main purpose of this article is to introduce an alternative regression method, originally outlined by Woolf (1951), which completely eliminates the relatedness between the predictors in a multiple-predictor setting.


Liu-Type Logistic Estimators With Optimal Shrinkage Parameter, Yasin Asar May 2016

Journal of Modern Applied Statistical Methods

Multicollinearity in logistic regression inflates the variance of the maximum likelihood estimator. In this study, Liu-type estimators are used to reduce the variance and overcome the multicollinearity by adapting some existing ridge regression estimators to the logistic regression model. A Monte Carlo simulation is given to evaluate the performance of these estimators when the optimal shrinkage parameter is used in the Liu-type estimators, along with an application to real data.
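The article's exact Liu-type estimator and its optimal shrinkage parameter are not given in the abstract and are not reproduced here. As a hedged sketch of the underlying shrinkage idea, the following computes the logistic MLE by Newton-Raphson and then applies a generic ridge-type adjustment β(k) = (X'ŴX + kI)⁻¹ X'ŴX β̂_MLE, where Ŵ is the estimated weight matrix; the data and the value k = 0.5 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.standard_normal(n)
x2 = x1 + 0.05 * rng.standard_normal(n)        # nearly collinear predictors
X = np.column_stack([np.ones(n), x1, x2])      # intercept + 2 predictors
p_true = 1.0 / (1.0 + np.exp(-(x1 + x2)))
y = rng.binomial(1, p_true)

# Newton-Raphson iterations for the logistic MLE
beta = np.zeros(3)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    W = mu * (1.0 - mu)
    H = X.T @ (W[:, None] * X)                 # Fisher information X'WX
    beta = beta + np.linalg.solve(H, X.T @ (y - mu))

# Recompute the weight matrix at the converged MLE
mu = 1.0 / (1.0 + np.exp(-X @ beta))
W = mu * (1.0 - mu)
H = X.T @ (W[:, None] * X)

# Ridge-type shrinkage of the MLE (k = 0.5 is an arbitrary choice)
k = 0.5
beta_k = np.linalg.solve(H + k * np.eye(3), H @ beta)
print(beta, beta_k)
```

Because (X'ŴX + kI)⁻¹ X'ŴX has all eigenvalues below one, the adjusted coefficient vector is always shorter than the MLE, which is the variance-reducing effect such shrinkage estimators exploit.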


Solution To The Multicollinearity Problem By Adding Some Constant To The Diagonal, Hanan Duzan, Nurul Sima Binti Mohamaed Shariff May 2016

Journal of Modern Applied Statistical Methods

Ridge regression is an alternative to ordinary least-squares (OLS) regression, and is believed to be superior to least squares in the presence of multicollinearity. The robustness of this method is investigated and a comparison is made with the least-squares method through simulation studies. Our results show that the system stabilizes in a region of k, where k is a positive quantity less than one whose value depends on the degree of correlation between the independent variables. The results also illustrate that k is a linear function of the correlation between the independent variables.
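The "constant added to the diagonal" can be made concrete with a small numeric example: ridge regression replaces the OLS normal equations (X'X)b = X'y with (X'X + kI)b = X'y. The data and the value k = 0.1 below are hypothetical illustrations, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.standard_normal(n)

XtX = X.T @ X
print(np.linalg.cond(XtX))                # very large: X'X is ill-conditioned

# Ridge: add a constant k (here 0 < k < 1, as in the region studied)
# to the diagonal of X'X before solving the normal equations
k = 0.1
b_ridge = np.linalg.solve(XtX + k * np.eye(2), X.T @ y)
print(b_ridge)
```

Even a small k makes the system numerically stable, while the two coefficients still sum to roughly the combined effect of the nearly duplicated predictors.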