Open Access. Powered by Scholars. Published by Universities.®

Wayne State University

Ridge regression

Articles 1 - 10 of 10

Full-Text Articles in Physical Sciences and Mathematics

Multicollinearity And A Ridge Parameter Estimation Approach, Ghadban Khalaf, Mohamed Iguernane Nov 2016

Journal of Modern Applied Statistical Methods

One of the main goals of the multiple linear regression model, Y = Xβ + u, is to assess the importance of the independent variables and their predictive ability. In practice, however, inference about the regression coefficients can be difficult when the independent variables are correlated, because multicollinearity causes instability in the coefficient estimates. A new estimator of the ridge regression parameter is proposed and evaluated by simulation in terms of mean squared error (MSE). Results of the simulation study indicate that the suggested estimator dominates the ordinary least squares (OLS) estimator and other ridge estimators with respect to …
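The comparison this abstract describes can be sketched in a few lines. The snippet below is a minimal illustration with synthetic collinear data and an arbitrary fixed ridge constant k, not the authors' proposed estimator: it contrasts the OLS and ridge estimators by Monte Carlo MSE.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 50, 4, 0.5          # sample size, predictors, illustrative ridge constant
beta = np.ones(p)             # true coefficients

def make_X():
    # Predictors share a common factor, so they are strongly intercorrelated.
    z = rng.standard_normal((n, 1))
    return z + 0.1 * rng.standard_normal((n, p))

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

reps = 200
mse_ols = mse_ridge = 0.0
for _ in range(reps):
    X = make_X()
    y = X @ beta + rng.standard_normal(n)
    mse_ols += np.sum((ols(X, y) - beta) ** 2) / reps
    mse_ridge += np.sum((ridge(X, y, k) - beta) ** 2) / reps

print(f"MSE(OLS)={mse_ols:.3f}  MSE(ridge)={mse_ridge:.3f}")
```

Under this degree of collinearity the ridge estimator's squared-error loss is far below that of OLS, which is the pattern the simulation studies in these articles quantify.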


Solution To The Multicollinearity Problem By Adding Some Constant To The Diagonal, Hanan Duzan, Nurul Sima Binti Mohamaed Shariff May 2016

Journal of Modern Applied Statistical Methods

Ridge regression is an alternative to ordinary least-squares (OLS) regression. It is believed to be superior to least-squares regression in the presence of multicollinearity. The robustness of this method is investigated and compared with the least squares method through simulation studies. Our results show that the system stabilizes in a region of k, where k is a positive quantity less than one whose value depends on the degree of correlation between the independent variables. The results also illustrate that k is a linear function of the correlation between the independent variables.
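The "constant added to the diagonal" is the ridge parameter k in β̂(k) = (X'X + kI)⁻¹X'y. A minimal sketch with two nearly collinear synthetic predictors (not the article's data) shows the coefficient vector settling down as k moves through (0, 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
z = rng.standard_normal(n)
# Two nearly collinear synthetic predictors (correlation close to 1).
X = np.column_stack([z + 0.05 * rng.standard_normal(n),
                     z + 0.05 * rng.standard_normal(n)])
y = X @ np.array([1.0, 1.0]) + rng.standard_normal(n)

# Ridge solution: add the constant k to the diagonal of X'X before solving.
def ridge(k):
    return np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

for k in (0.0, 0.01, 0.1, 0.5):
    print(f"k={k:4.2f}  beta={ridge(k).round(3)}")
```

At k = 0 (OLS) the two coefficients can be far apart because only their sum is well determined; as k grows, the unstable difference component is shrunk and the estimates stabilize.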


Ridge Regression And Ill-Conditioning, Ghadban Khalaf, Mohamed Iguernane Nov 2014

Journal of Modern Applied Statistical Methods

Hoerl and Kennard (1970) suggested the ridge regression estimator as an alternative to the Ordinary Least Squares (OLS) estimator in the presence of multicollinearity. This article proposes new methods for estimating the ridge parameter in the case of ordinary ridge regression. A simulation study evaluates the performance of the proposed estimators based on the Mean Squared Error (MSE) criterion and indicates that, under certain conditions, the proposed estimators perform well compared with the OLS estimator and another well-known ridge estimator.


A Comparison Between Biased And Unbiased Estimators In Ordinary Least Squares Regression, Ghadban Khalaf Nov 2013

Journal of Modern Applied Statistical Methods

Over the years, various estimators have been proposed as alternatives to the Ordinary Least Squares (OLS) estimator for estimating the regression coefficients in the presence of multicollinearity. In the general linear regression model, Y = Xβ + e, it is known that multicollinearity makes statistical inference difficult and may even seriously distort the inference. Ridge regression, as viewed here, defines a class of estimators of β indexed by a scalar parameter k. Two methods of specifying k are proposed and evaluated in terms of Mean Square Error (MSE) by …


A Proposed Ridge Parameter To Improve The Least Square Estimator, Ghadban Khalaf Nov 2012

Journal of Modern Applied Statistical Methods

Ridge regression, a form of biased linear estimation, is a more appropriate technique than ordinary least squares (OLS) estimation when the explanatory variables in the linear regression model Y = Xβ + u are highly intercorrelated. Two proposed ridge regression parameters are evaluated from the mean square error (MSE) perspective. A simulation study was conducted to compare the performance of the proposed estimators with the OLS, HK, and HKB estimators. Results show that the suggested estimators outperform the OLS estimator and the other ridge estimators in all situations examined.
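The HK and HKB benchmarks mentioned here have widely cited closed forms built from the OLS fit. The sketch below uses those standard textbook formulas on synthetic data; it does not reproduce the article's own proposed parameters.

```python
import numpy as np

def hk_hkb(X, y):
    """Hoerl-Kennard (HK) and Hoerl-Kennard-Baldwin (HKB) ridge parameters,
    in their commonly cited forms based on the OLS solution."""
    n, p = X.shape
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta_ols
    sigma2 = resid @ resid / (n - p)            # unbiased error-variance estimate
    # Canonical coefficients: OLS solution rotated into the eigenbasis of X'X.
    _, vecs = np.linalg.eigh(X.T @ X)
    alpha = vecs.T @ beta_ols
    k_hk = sigma2 / np.max(alpha ** 2)          # HK:  sigma^2 / max_i alpha_i^2
    k_hkb = p * sigma2 / (alpha @ alpha)        # HKB: p * sigma^2 / alpha'alpha
    return k_hk, k_hkb

# Synthetic collinear example, purely for illustration.
rng = np.random.default_rng(2)
z = rng.standard_normal(80)
X = np.column_stack([z + 0.1 * rng.standard_normal(80) for _ in range(3)])
y = X @ np.array([1.0, 0.5, 0.5]) + rng.standard_normal(80)
print(hk_hkb(X, y))
```

Since Σαᵢ² ≤ p·max αᵢ², the HKB value is always at least as large as the HK value, i.e. HKB shrinks at least as aggressively.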


Improved Estimator In The Presence Of Multicollinearity, Ghadban Khalaf May 2012

Journal of Modern Applied Statistical Methods

The performances of two biased estimators for the general linear regression model under conditions of collinearity are examined and a new proposed ridge parameter is introduced. Using the Mean Square Error (MSE) criterion and Monte Carlo simulation, the resulting estimator's performance is evaluated and compared with the Ordinary Least Squares (OLS) estimator and the Hoerl and Kennard (1970a) estimator. Results of the simulation study indicate that, with respect to MSE, in all cases investigated the proposed estimator outperforms both the OLS and the Hoerl and Kennard estimators.


Ridge Regression Based On Some Robust Estimators, Hatice Samkar, Ozlem Alpu Nov 2010

Journal of Modern Applied Statistical Methods

Robust ridge methods based on M, S, MM and GM estimators are examined in the presence of multicollinearity and outliers. The GMWalker estimator, which uses the LS estimator as the initial estimator, is employed; S and MM estimators are also used as initial estimators, with the aim of evaluating the two alternatives as biased robust methods.
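As a rough illustration of the general idea of robust ridge estimation (not the GM, S, or MM procedures studied in the article), an M-type fit via iteratively reweighted least squares with Huber weights can be combined with the ridge diagonal term:

```python
import numpy as np

def huber_ridge(X, y, k=0.1, c=1.345, n_iter=50):
    # Illustrative sketch only: IRLS with Huber weights, keeping the ridge
    # constant k on the diagonal at every step. The article's GM/S/MM
    # procedures are more involved than this.
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)   # plain ridge start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745        # robust scale (MAD)
        w = np.minimum(1.0, c * s / (np.abs(r) + 1e-12))        # Huber weights
        Xw = X.T * w                                            # X' W (W diagonal)
        beta = np.linalg.solve(Xw @ X + k * np.eye(p), Xw @ y)
    return beta

# Synthetic collinear data with a few gross outliers.
rng = np.random.default_rng(3)
z = rng.standard_normal(60)
X = np.column_stack([z, z + 0.05 * rng.standard_normal(60)])
y = X @ np.array([1.0, 1.0]) + rng.standard_normal(60)
y[:5] += 15.0                                                   # inject outliers
print(huber_ridge(X, y).round(3))
```

The Huber weights cap the influence of the contaminated observations, while the ridge term handles the near-collinearity of the two predictors, which is the combination these robust ridge methods target.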


Nonlinear Parameterization In Bi-Criteria Sample Balancing, Stan Lipovetsky May 2010

Journal of Modern Applied Statistical Methods

Sample balancing is widely used in applied research to adjust sample data to correspond more closely to Census statistics. The classic Deming-Stephan iterative proportional approach finds the weights of observations by fitting the cross-tables of sample counts to known margins. This work considers a bi-criteria objective for finding weights with the maximum possible effective base size. The approach is presented as a ridge regression with an exponential nonlinear parameterization that produces nonnegative weights for sample balancing.
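The classic Deming-Stephan step itself is short. A sketch on a 2x2 cross-table with hypothetical margins (the article's bi-criteria ridge extension is not implemented here):

```python
import numpy as np

# Iterative proportional fitting (raking): rescale a sample cross-table
# until its margins match known population targets.
counts = np.array([[30.0, 20.0],
                   [10.0, 40.0]])           # observed sample counts
row_targets = np.array([60.0, 40.0])        # hypothetical Census row margins
col_targets = np.array([45.0, 55.0])        # hypothetical Census column margins

table = counts.copy()
for _ in range(100):
    table *= (row_targets / table.sum(axis=1))[:, None]   # match row margins
    table *= (col_targets / table.sum(axis=0))[None, :]   # match column margins

weights = table / counts    # per-cell balancing weights for the observations
print(table.round(3))
```

Alternating the row and column rescalings converges to a table with both margins matched, and the cell-wise ratios to the original counts are the observation weights the abstract refers to.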


Multiple Regression In Pair Correlation Solution, Stan Lipovetsky May 2009

Journal of Modern Applied Statistical Methods

The behavior of the coefficients of ordinary least squares (OLS) regression is compared with the coefficients regularized by the one-parameter ridge (Ridge-1) and two-parameter ridge (Ridge-2) regressions. The ridge models are not prone to multicollinearity. The fit quality of Ridge-2 does not decrease as the profile parameter increases, and the Ridge-2 model converges to a solution proportional to the coefficients of pair correlation between the dependent variable and the predictors. The Correlation-Regression (CORE) model provides meaningful coefficients and net effects for the individual impact of the predictors, high-quality model fit, and convenient analysis and interpretation of the regression. Simulation with three …


Applications Of Some Improved Estimators In Linear Regression, B. M. Golam Kibria Nov 2005

Journal of Modern Applied Statistical Methods

The problem of estimating the regression coefficients of the restricted linear model under multicollinearity is discussed. Some improved estimators are considered, including the unrestricted ridge regression estimator (URRE), restricted ridge regression estimator (RRRE), shrinkage restricted ridge regression estimator (SRRRE), preliminary test ridge regression estimator (PTRRE), and restricted Liu estimator (RLIUE). They were compared based on the sampling variance-covariance criterion. The RRRE dominates the other ridge estimators whether or not the restriction holds. A numerical example is provided. The RRRE performed equivalently to, or better than, the RLIUE in the sense of having smaller sampling variance.