Open Access. Powered by Scholars. Published by Universities.®

Wayne State University

Multicollinearity

Articles 1 - 17 of 17

Full-Text Articles in Physical Sciences and Mathematics

Sampling The Porridge: A Comparison Of Ordered Variable Regression With F And R2 And Multiple Linear Regression With Corrected F And R2 In The Presence Of Multicollinearity, Grayson L. Baird, Stephen L. Bieber Mar 2020

Journal of Modern Applied Statistical Methods

Differences between the multiple linear regression model with Corrected R2 and Corrected F and the ordered variable regression model with R2 and F in the presence of intercorrelation are illustrated with simulated and real-world data.


A New Liu Type Of Estimator For The Restricted Sur Estimator, Kristofer Månsson, B. M. Golam Kibria, Ghazi Shukur Mar 2020

Journal of Modern Applied Statistical Methods

A new Liu-type estimator for seemingly unrelated regression (SUR) models is proposed for estimating the parameter vector in the presence of multicollinearity when it is suspected to belong to a linear subspace. The dispersion matrices and the mean squared error (MSE) are derived. The new estimator may have a lower MSE than the traditional estimators, and simulations show that the new shrinkage estimator outperforms the commonly used estimators in the presence of multicollinearity.
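The single-equation Liu estimator underlying this SUR variant can be sketched briefly; the simulated near-collinear design and the choice d = 0.5 below are illustrative assumptions, not the paper's restricted SUR setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative near-collinear design (assumption, not the paper's data).
n, p = 100, 3
z = rng.normal(size=(n, 1))
X = 0.95 * z + 0.05 * rng.normal(size=(n, p))   # strongly correlated columns
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(size=n)

XtX = X.T @ X
I = np.eye(p)
b_ols = np.linalg.solve(XtX, X.T @ y)

def liu(d):
    """Liu (1993) estimator: (X'X + I)^{-1} (X'X + d I) b_OLS."""
    return np.linalg.solve(XtX + I, (XtX + d * I) @ b_ols)

b_liu = liu(0.5)
# Shrinkage: every eigen-component is scaled by (lam + d)/(lam + 1) < 1 for d < 1.
print(np.linalg.norm(liu(0.1)) <= np.linalg.norm(b_ols))
```

For d = 1 the estimator reduces exactly to OLS, and smaller d shrinks the solution more strongly.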


Regressions Regularized By Correlations, Stan Lipovetsky Jun 2018

Journal of Modern Applied Statistical Methods

Regularization of multiple regression by proportionality between the coefficients and the correlations of the predictors with the dependent variable is applied to the least squares objective and the normal equations, relaxing the exact equalities to obtain a robust solution. This technique produces models that are not prone to multicollinearity and is very useful in practical applications.


Monte Carlo Study Of Some Classification-Based Ridge Parameter Estimators, Adewale Folaranmi Lukman, Kayode Ayinde, Adegoke S. Ajiboye May 2017

Journal of Modern Applied Statistical Methods

The ridge estimator in the linear regression model requires a ridge parameter, k, for which many estimators have been proposed. In this study, estimators based on Dorugade (2014) and Adnan et al. (2014) were classified into different forms and various types using the idea of Lukman and Ayinde (2015), and some new ridge estimators were proposed. Results show that the proposed estimators based on Adnan et al. (2014) generally perform better than the existing ones.


Multicollinearity And A Ridge Parameter Estimation Approach, Ghadban Khalaf, Mohamed Iguernane Nov 2016

Journal of Modern Applied Statistical Methods

One of the main goals of the multiple linear regression model, Y = Xβ + u, is to assess the importance of the independent variables in determining their predictive ability. In practical applications, however, inference about the regression coefficients can be difficult because the independent variables are correlated, and multicollinearity causes instability in the coefficients. A new estimator of the ridge regression parameter is proposed and evaluated by simulation in terms of mean squared error (MSE). Results of the simulation study indicate that the suggested estimator dominates the ordinary least squares (OLS) estimator and other ridge estimators with respect to …
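A toy version of such a simulation comparison (not the paper's design or its proposed parameter) can be written as follows; the collinear design, k = 0.5, and the replication count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Compare estimation MSE of OLS and ridge, (X'X + kI)^{-1} X'y, under
# strong collinearity.  All settings here are illustrative assumptions.
n, p, reps = 50, 4, 200
beta = np.ones(p)
mse_ols = mse_ridge = 0.0
for _ in range(reps):
    z = rng.normal(size=(n, 1))
    X = z + 0.05 * rng.normal(size=(n, p))      # highly intercorrelated columns
    y = X @ beta + rng.normal(size=n)
    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)
    b_ridge = np.linalg.solve(XtX + 0.5 * np.eye(p), X.T @ y)
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_ridge - beta) ** 2) / reps

print(mse_ridge < mse_ols)   # ridge typically wins when X'X is near-singular
```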


Improved Ridge Estimator In Linear Regression With Multicollinearity, Heteroscedastic Errors And Outliers, Ashok Vithoba Dorugade Nov 2016

Journal of Modern Applied Statistical Methods

This paper introduces a new estimator of the ridge parameter k for ridge regression, evaluated by Monte Carlo simulation. The performance of the proposed estimator is compared with other well-known estimators for models with heteroscedastic and/or correlated errors, outlying observations, non-normal errors, and multicollinearity. The proposed estimator is shown to have a smaller MSE than the ordinary least squares (LS) estimator, the Hoerl and Kennard (1970) ridge (RR) estimator, the jackknifed modified ridge (JMR) estimator, and the jackknifed ridge M-estimator (JRM).


The Goldilocks Dilemma: Impacts Of Multicollinearity -- A Comparison Of Simple Linear Regression, Multiple Regression, And Ordered Variable Regression Models, Grayson L. Baird, Stephen L. Bieber May 2016

Journal of Modern Applied Statistical Methods

A common consideration concerning the application of multiple linear regression is the lack of independence among predictors (multicollinearity). The main purpose of this article is to introduce an alternative method of regression originally outlined by Woolf (1951), which completely eliminates the relatedness between the predictors in a multiple predictor setting.
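The orthogonalization idea can be illustrated with a generic sequential residualization; this is a Gram-Schmidt-style sketch in the spirit of ordered variable regression, not Woolf's (1951) exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical correlated predictors (assumption, not the paper's data).
n = 200
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)        # correlated with x1
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

def residualize(v, *others):
    """Remove from v its least-squares projection onto the given columns
    (an intercept is included so the residual is mean-centered)."""
    Z = np.column_stack([np.ones(len(v)), *others])
    return v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]

x2_perp = residualize(x2, x1)                    # part of x2 unrelated to x1
# After residualization the two predictors are exactly uncorrelated:
print(abs(np.corrcoef(x1, x2_perp)[0, 1]) < 1e-8)
```

Regressing y on x1 and x2_perp (instead of x1 and x2) removes the relatedness between the predictors by construction.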


Liu-Type Logistic Estimators With Optimal Shrinkage Parameter, Yasin Asar May 2016

Journal of Modern Applied Statistical Methods

Multicollinearity in logistic regression inflates the variance of the maximum likelihood estimator. In this study, Liu-type estimators are used to reduce the variance and overcome multicollinearity by adapting some existing ridge regression estimators to the logistic regression model. A Monte Carlo simulation evaluates the performance of these estimators when the optimal shrinkage parameter is used in the Liu-type estimators, along with an application to real data.
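A minimal numerical stand-in for shrinkage in logistic regression is an L2-penalized (ridge-type) Newton fit; the design and the shrinkage value k = 0.5 below are assumptions, and the paper's Liu-type estimators differ in form.

```python
import numpy as np

rng = np.random.default_rng(6)

# Collinear toy design and Bernoulli response (illustrative assumptions).
n, p = 200, 3
z = rng.normal(size=(n, 1))
X = z + 0.2 * rng.normal(size=(n, p))
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ np.ones(p))))).astype(float)

k = 0.5                                         # shrinkage parameter (arbitrary)
b = np.zeros(p)
for _ in range(25):                             # Newton-Raphson iterations
    mu = 1 / (1 + np.exp(-(X @ b)))
    W = mu * (1 - mu)                           # Bernoulli variance weights
    grad = X.T @ (y - mu) - k * b               # penalized score
    H = (X * W[:, None]).T @ X + k * np.eye(p)  # penalized information
    b = b + np.linalg.solve(H, grad)

print(np.all(np.isfinite(b)))
```

The penalty term k·I keeps the information matrix well-conditioned even when X'X is nearly singular, which is the variance-reduction mechanism the abstract refers to.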


Solution To The Multicollinearity Problem By Adding Some Constant To The Diagonal, Hanan Duzan, Nurul Sima Binti Mohamaed Shariff May 2016

Journal of Modern Applied Statistical Methods

Ridge regression is an alternative to ordinary least-squares (OLS) regression that is believed to be superior in the presence of multicollinearity. The robustness of this method is investigated and compared with the least squares method through simulation studies. Our results show that the system stabilizes in a region of k, where k is a positive quantity less than one whose value depends on the degree of correlation between the independent variables. The results also illustrate that k is a linear function of the correlation between the independent variables.


Robust Winsorized Shrinkage Estimators For Linear Regression Model, Nileshkumar H. Jadhav, D N. Kashid Nov 2014

Journal of Modern Applied Statistical Methods

In multiple linear regression, the ordinary least squares estimator is very sensitive to multicollinearity and to outliers in the response variable. To handle these problems, Winsorized shrinkage estimators are proposed and their performance is evaluated in terms of mean squared error.
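The general idea can be sketched as quantile clipping of the response followed by a ridge-type fit; the one-sided winsorization point, the design, and k = 0.1 are illustrative assumptions, not the paper's estimators.

```python
import numpy as np

rng = np.random.default_rng(3)

# Clean design with gross upward outliers injected into the response.
n, p = 100, 3
X = rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)
y[:10] += 100.0                                # gross outliers in the response

# One-sided winsorization: cap the upper tail at the 85th percentile
# (chosen to match the upward outliers above; an assumption of this sketch).
y_w = np.minimum(y, np.quantile(y, 0.85))

k = 0.1                                        # arbitrary shrinkage parameter
def ridge_fit(t):
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ t)

b_raw, b_win = ridge_fit(y), ridge_fit(y_w)
# Winsorizing pulls the estimate back toward the true coefficients (all 1s):
print(np.linalg.norm(b_win - 1) < np.linalg.norm(b_raw - 1))
```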


A Comparison Between Biased And Unbiased Estimators In Ordinary Least Squares Regression, Ghadban Khalaf Nov 2013

Journal of Modern Applied Statistical Methods

In recent years, different kinds of estimators have been proposed as alternatives to the Ordinary Least Squares (OLS) estimator for estimating the regression coefficients in the presence of multicollinearity. In the general linear regression model, Y = Xβ + e, it is known that multicollinearity makes statistical inference difficult and may even seriously distort it. Ridge regression, as viewed here, defines a class of estimators of β indexed by a scalar parameter k. Two methods of specifying k are proposed and evaluated in terms of Mean Square Error (MSE) by …


A Proposed Ridge Parameter To Improve The Least Square Estimator, Ghadban Khalaf Nov 2012

Journal of Modern Applied Statistical Methods

Ridge regression, a form of biased linear estimation, is a more appropriate technique than ordinary least squares (OLS) estimation when the explanatory variables in the linear regression model Y = Xβ + u are highly intercorrelated. Two proposed ridge regression parameters are evaluated from the mean square error (MSE) perspective. A simulation study was conducted to demonstrate the performance of the proposed estimators compared to the OLS, HK and HKB estimators. Results show that the suggested estimators outperform the OLS and the other estimators with regard to the ridge parameters in all situations examined.
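The HK and HKB choices mentioned in the comparison are the classical ridge-parameter formulas k_HK = s²/max(α̂ᵢ²) (Hoerl and Kennard, 1970) and k_HKB = p·s²/(β̂'β̂) (Hoerl, Kennard and Baldwin, 1975); the sketch below computes them on an assumed toy design, and the paper's own proposed parameters are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Collinear toy design (illustrative assumption).
n, p = 80, 3
z = rng.normal(size=(n, 1))
X = z + 0.1 * rng.normal(size=(n, p))
y = X @ np.ones(p) + rng.normal(size=n)

b = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS coefficients
s2 = np.sum((y - X @ b) ** 2) / (n - p)           # residual variance estimate
# alpha_hat: the OLS coefficients expressed in the eigenbasis of X'X.
_, Q = np.linalg.eigh(X.T @ X)
alpha = Q.T @ b

k_hk = s2 / np.max(alpha ** 2)                    # Hoerl & Kennard (1970)
k_hkb = p * s2 / (b @ b)                          # Hoerl, Kennard & Baldwin (1975)
print(k_hk > 0 and k_hkb > 0)
```

Either k then plugs into the ridge estimator (X'X + kI)⁻¹X'y.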


Improved Estimator In The Presence Of Multicollinearity, Ghadban Khalaf May 2012

Journal of Modern Applied Statistical Methods

The performances of two biased estimators for the general linear regression model under conditions of collinearity are examined and a new proposed ridge parameter is introduced. Using Mean Square Error (MSE) and Monte Carlo simulation, the resulting estimator’s performance is evaluated and compared with the Ordinary Least Square (OLS) estimator and the Hoerl and Kennard (1970a) estimator. Results of the simulation study indicate that, with respect to MSE criteria, in all cases investigated the proposed estimator outperforms both the OLS and the Hoerl and Kennard estimators.


Ridge Regression Based On Some Robust Estimators, Hatice Samkar, Ozlem Alpu Nov 2010

Journal of Modern Applied Statistical Methods

Robust ridge methods based on M, S, MM and GM estimators are examined in the presence of multicollinearity and outliers. GMWalker, with the LS estimator as the initial estimator, is used; the S and MM estimators are also used as initial estimators with the aim of evaluating the two alternatives as biased robust methods.


Multiple Regression In Pair Correlation Solution, Stan Lipovetsky May 2009

Journal of Modern Applied Statistical Methods

The behavior of the coefficients of ordinary least squares (OLS) regression is compared with that of coefficients regularized by the one-parameter ridge (Ridge-1) and two-parameter ridge (Ridge-2) regressions. The ridge models are not prone to multicollinearity. The fit quality of Ridge-2 does not decrease as the profile parameter increases, and the Ridge-2 model converges to a solution proportional to the coefficients of pair correlation between the dependent variable and the predictors. The Correlation-Regression (CORE) model yields meaningful coefficients and net effects for the individual impact of the predictors, high-quality model fit, and convenient analysis and interpretation of the regression. Simulation with three …


Entropy Criterion In Logistic Regression And Shapley Value Of Predictors, Stan Lipovetsky May 2006

Journal of Modern Applied Statistical Methods

An entropy criterion is used for constructing a binary response regression model with a logistic link. This approach yields a logistic model with coefficients proportional to the coefficients of linear regression. Based on this property, Shapley value estimation of the predictors' contributions is applied to obtain robust coefficients of the linear aggregate adjusted to the logistic model. This procedure produces a logistic regression with interpretable coefficients robust to multicollinearity. Numerical results demonstrate theoretical and practical advantages of the entropy-logistic regression.


Determining Predictor Importance In Multiple Regression Under Varied Correlational And Distributional Conditions, Tiffany A. Whittaker, Rachel T. Fouladi, Natasha J. Williams Nov 2002

Journal of Modern Applied Statistical Methods

This study examines the performance of eight methods for determining predictor importance under varied correlational and distributional conditions. The proportion of times each method correctly identified the dominant predictor was recorded. Results indicated that the newer importance methods proposed by Budescu (1993) and Johnson (2000) outperformed commonly used importance methods.
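Budescu's (1993) dominance-style importance can be read as a Shapley decomposition of R²: each predictor's importance is its average increment to R² over all orders in which predictors can enter the model. A small sketch on assumed data:

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(5)

# Assumed data: x1 dominant, x2 correlated with x1, x3 nearly irrelevant.
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 + 0.5 * x2 + 0.1 * x3 + rng.normal(size=n)

def r2(cols):
    """R^2 of the OLS regression of y on the given predictor columns."""
    if not cols:
        return 0.0
    Z = np.column_stack([np.ones(n), X[:, list(cols)]])
    resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

p = X.shape[1]
importance = np.zeros(p)
orders = list(permutations(range(p)))
for order in orders:                 # average marginal R^2 over all entry orders
    entered = []
    for j in order:
        importance[j] += r2(entered + [j]) - r2(entered)
        entered.append(j)
importance /= len(orders)

# The Shapley increments telescope, so they sum to the full-model R^2.
print(np.isclose(importance.sum(), r2(list(range(p)))))
```

The dominant predictor is simply the one with the largest average increment, which is the quantity the study's success rates are based on.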