The Coefficient of Determination, also known as R Squared, is interpreted as the goodness of fit of a regression. The higher the coefficient of determination, the greater the proportion of the variance in the dependent variable that is explained by the independent variable. The coefficient of determination is an overall measure of the usefulness of a regression. For example, suppose you are looking at an ANOVA table and you see that your R2 is given as 0.95. This means that 95% of the variation in the dependent variable is explained by the regression. That is a good regression.

Now, looking at a different ANOVA table, you see that you have a Coefficient of Determination, or R2, of 0.50. That means that only 50% of the variation in the dependent variable is explained by the independent variable. That is a much weaker regression.

The Coefficient of Determination can be calculated as the regression sum of squares, RSS, divided by the total sum of squares, SST:

Coefficient of Determination = RSS / SST
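As a sketch, the formula above can be computed directly from a fitted regression line. The data values below are made up purely for illustration, and numpy is assumed to be available:

```python
import numpy as np

# Hypothetical data: y roughly linear in x (illustrative values only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit y = b0 + b1*x by ordinary least squares.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# RSS here is the regression (explained) sum of squares,
# SST the total sum of squares, matching the formula above.
rss = np.sum((y_hat - y.mean()) ** 2)
sst = np.sum((y - y.mean()) ** 2)
r_squared = rss / sst
print(round(r_squared, 4))
```

Because these made-up points lie almost exactly on a line, the ratio comes out very close to 1.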

Some things to consider about the Coefficient of Determination, aka R2: the regression must be examined for multicollinearity. Multicollinearity, meaning correlated independent variables, can have the effect of inflating R Squared (the Coefficient of Determination).

A related problem is that the Coefficient of Determination will increase as you add more independent variables, even if those independent variables do not help explain the variation in the dependent variable. This brings us to the topic of Adjusted R Squared, or the Adjusted Coefficient of Determination, which corrects for this problem. The Adjusted R Squared can be negative, but it is always less than or equal to the Coefficient of Determination.
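A common formula for the Adjusted R Squared, with n observations and k independent variables, applies a penalty for each added variable. A minimal sketch, using example values chosen for illustration:

```python
def adjusted_r_squared(r_squared, n, k):
    """Adjusted R Squared: penalizes extra predictors.

    n = number of observations, k = number of independent variables.
    """
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# With R2 = 0.95, 20 observations, and 3 predictors (illustrative values):
print(round(adjusted_r_squared(0.95, 20, 3), 4))
```

Note that the result, about 0.9406, is slightly below the raw R2 of 0.95, as the adjustment guarantees.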

The Adjusted R Square is not always a better measure than the R Square, but it generally is when you have more than one independent variable: it rewards newly added variables only if they genuinely explain more of the variation. The Adjusted R Square is also the better measure when working with samples rather than the full population.
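To see the difference between the two measures, a small simulation (made-up data, numpy assumed) adds a pure-noise variable to a regression. The R Square can only stay the same or go up, while the Adjusted R Square applies a penalty and will often go down:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
noise = rng.normal(size=n)                    # unrelated to y by construction
y = 2.0 * x1 + rng.normal(scale=0.5, size=n)  # y depends on x1 only

def r_squared(X, y):
    # OLS fit with an intercept; R2 = 1 - SSE/SST.
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

def adjusted(r2, n, k):
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

r2_one = r_squared(x1.reshape(-1, 1), y)                 # x1 alone
r2_two = r_squared(np.column_stack([x1, noise]), y)      # x1 plus noise

print("R2:      ", round(r2_one, 4), "->", round(r2_two, 4))
print("Adjusted:", round(adjusted(r2_one, n, 1), 4), "->",
      round(adjusted(r2_two, n, 2), 4))
```

The raw R2 never decreases when a variable is added; the adjusted version only rises if the new variable pulls its weight.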