International Journal of Advances in Scientific Research and Engineering (ijasre) E-ISSN: 2454-8006 DOI: 10.31695/IJASRE.2019.33217 Volume 5, Issue 5, May 2019 www.ijasre.net Licensed Under Creative Commons Attribution CC BY

Investigating the Effects of Multicollinearity on the Model Parameters of the Ordinary Least Squares Estimator

Nusirat Funmilayo Gatta 1 and Banjoko Alabi Waheed 2
1,2 Faculty of Physical Sciences, Department of Statistics, University of Ilorin, Ilorin, Kwara State, Nigeria

ABSTRACT

This study investigated the effects of multicollinearity on the model parameters of the ordinary least squares regression model. The aim was to examine the impact of multicollinearity on the efficiency of the classical ordinary least squares (OLS) estimator. Data were simulated from a multivariate normal distribution with mean zero and a specified variance-covariance matrix at sample sizes 25, 50, 100, 200, 500, and 1000. To assess the asymptotic efficiency and consistency of the regression models in the presence of multicollinearity, the evaluation criteria used were the variance, absolute bias, mean square error (MSE), and mean square error of prediction (MSEP). Results from the analysis revealed that the OLS estimator is not efficient in the presence of multicollinearity, given its large MSE, MSEP, and absolute bias.

Keywords: Ordinary Least Squares, Multicollinearity, Mean Square Error, Absolute Bias, Mean Square Error of Prediction
_______________________________________________________________________________________________________

1. INTRODUCTION

The problem that often arises when the OLS assumption "that the independent variables are independent of each other in the regression model, in the sense that they do not move together in the same pattern" is disregarded is known as the multicollinearity problem. The term multicollinearity refers to a situation in which there are perfect or near-perfect linear relationships among some (or all) of the independent variables in the model.
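The simulation design described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual code: the true coefficients, the correlation level, the noise variance, and the number of replications are assumed values chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ols(n, rho=0.95, beta=(1.0, 2.0, 3.0), sigma=1.0, reps=500):
    """Simulate two correlated predictors from a multivariate normal
    distribution with mean zero, fit OLS repeatedly, and return the
    Monte Carlo MSE and absolute bias of the coefficient estimates.
    All parameter defaults are illustrative assumptions."""
    cov = np.array([[1.0, rho], [rho, 1.0]])  # variance-covariance matrix
    estimates = []
    for _ in range(reps):
        X = rng.multivariate_normal(np.zeros(2), cov, size=n)
        y = beta[0] + X @ np.array(beta[1:]) + rng.normal(0.0, sigma, n)
        Xd = np.column_stack([np.ones(n), X])       # design matrix with intercept
        b = np.linalg.lstsq(Xd, y, rcond=None)[0]   # OLS estimate
        estimates.append(b)
    est = np.array(estimates)
    mse = np.mean((est - np.array(beta)) ** 2, axis=0)
    abs_bias = np.abs(est.mean(axis=0) - np.array(beta))
    return mse, abs_bias

mse, abs_bias = simulate_ols(n=25)
```

Re-running the sketch with a near-zero rho shows the MSE of the slope estimates shrinking sharply, which is the efficiency loss the study quantifies.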
Exact multicollinearity occurs when there is a perfect linear relationship among the explanatory variables; it violates the assumption that the data matrix X has full rank. In the case of perfect multicollinearity, the correlation coefficient between the affected variables is equal to unity, and the inverse of X′X does not exist since the determinant of X′X is zero, making the OLS estimate impossible to compute. Recall the Gauss-Markov theorem, which states that among all linear unbiased estimators, the least squares estimator has the smallest variance. Consider a regression model that has two explanatory variables and a constant. The variance of the parameter estimate β̂₁ is

V(β̂₁) = σ² / [(1 − r₁₂²) Σᵢ (xᵢ₁ − x̄₁)²]        (1)

where r₁₂ is the correlation between X₁ and X₂. If the two variables are perfectly correlated (r₁₂ = ±1), then the variance is inestimable. The case of an exact linear relationship among the predictors is a serious failure of the assumptions of the model, not of the data. When the predictors are highly but not perfectly correlated, so that the columns of X are close to being linearly dependent, the variables suffer from a severe multicollinearity problem, although the regression model retains all its assumed properties. The most common implication of severe multicollinearity is that the individual parameter estimates will not be precise and the method of OLS breaks down (Bruce, 2008). Consider the regression model with two explanatory variables:

Y = β₀ + β₁X₁ + β₂X₂ + ε        (2)
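The variance-inflation effect in equation (1) can be made concrete with a short numerical sketch: the sampling variance of β̂₁ grows by the factor 1 / (1 − r₁₂²) as the correlation between the two predictors increases. The correlation values printed below are illustrative assumptions, not values from the study.

```python
def slope_variance_factor(r12):
    """Return the factor 1 / (1 - r12^2) by which the variance of a
    slope estimate is inflated when the two predictors have
    correlation r12 (equation (1))."""
    return 1.0 / (1.0 - r12 ** 2)

# Illustrative correlation values: inflation explodes as r12 -> 1.
for r in (0.0, 0.5, 0.9, 0.99):
    print(f"r12 = {r:4.2f}  variance inflation = {slope_variance_factor(r):8.2f}")
```

For r₁₂ = 0.99 the factor exceeds 50, and at r₁₂ = ±1 it is undefined, which is exactly the inestimable-variance case of perfect multicollinearity noted above.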