<P> In statistics, multicollinearity (also collinearity) is a phenomenon in which one predictor variable in a multiple regression model can be linearly predicted from the others with a substantial degree of accuracy. In this situation the coefficient estimates of the multiple regression may change erratically in response to small changes in the model or the data. Multicollinearity does not reduce the predictive power or reliability of the model as a whole, at least within the sample data set; it only affects calculations regarding individual predictors. That is, a multivariate regression model with collinear predictors can indicate how well the entire bundle of predictors predicts the outcome variable, but it may not give valid results about any individual predictor, or about which predictors are redundant with respect to others. </P> <P> In the case of perfect multicollinearity (in which one independent variable is an exact linear combination of the others) the design matrix X has less than full rank, and therefore the moment matrix XᵀX cannot be inverted. Under these circumstances, for the general linear model y = Xβ + ε, the ordinary least-squares estimator β̂_OLS = (XᵀX)⁻¹Xᵀy does not exist. </P>
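The rank-deficiency argument above can be verified numerically. The following is a minimal sketch (assuming NumPy; the data and the coefficients in the linear combination are made up for illustration): a design matrix whose last column is an exact linear combination of two others loses full rank, so the moment matrix XᵀX becomes singular and the OLS inversion breaks down.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)

# Perfect multicollinearity: the last column is an exact linear
# combination of the two preceding predictors (x3 = x1 + 2*x2).
X = np.column_stack([np.ones(n), x1, x2, x1 + 2 * x2])

# X has 4 columns but only rank 3: less than full rank.
rank = np.linalg.matrix_rank(X)
print(rank, X.shape[1])  # 3 4

# The moment matrix X^T X is therefore singular (up to floating-point
# rounding); its condition number blows up, so the inverse required by
# the OLS formula beta_hat = (X^T X)^(-1) X^T y is not defined.
cond = np.linalg.cond(X.T @ X)
print(cond)  # astronomically large
```

In practice, numerical libraries detect this through the rank or condition number rather than by attempting the inversion, since floating-point rounding can make an exactly singular matrix appear merely near-singular.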
