<P> In statistics, the coefficient of determination, denoted R² or r² and pronounced "R squared", is the proportion of the variation in the dependent variable that is predictable from the independent variable(s). </P>
<P> It is a statistic used in the context of statistical models whose main purpose is either the prediction of future outcomes or the testing of hypotheses, on the basis of other related information. It provides a measure of how well observed outcomes are replicated by the model, based on the proportion of total variation of outcomes explained by the model. </P>
<P> There are several definitions of R² that are only sometimes equivalent. One class of such cases includes that of simple linear regression, where r² is used instead of R². When an intercept is included, then r² is simply the square of the sample correlation coefficient (i.e., r) between the observed outcomes and the observed predictor values. If additional regressors are included, R² is the square of the coefficient of multiple correlation. In both such cases, the coefficient of determination ranges from 0 to 1. </P>
<P> Important cases where the computational definition of R² can yield negative values, depending on the definition used, arise where the predictions being compared to the corresponding outcomes have not been derived from a model-fitting procedure using those data, and where linear regression is conducted without including an intercept. Additionally, negative values of R² may occur when fitting non-linear functions to data. Where negative values arise, the mean of the data provides a better fit to the outcomes than do the fitted function values, according to this particular criterion. </P>
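The computational definition discussed above, R² = 1 − SS_res / SS_tot, can be sketched in a few lines; this is a minimal illustration (not from the source), showing both the usual 0-to-1 case and the negative-value case where predictions fit worse than the mean of the data:

```python
def r_squared(y, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean_y = sum(y) / len(y)
    # Total variation of the outcomes around their mean
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    # Residual variation of the outcomes around the predictions
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_pred))
    return 1 - ss_res / ss_tot

y = [1.0, 2.0, 3.0, 4.0]
good = [1.1, 1.9, 3.2, 3.8]   # predictions close to the data
bad = [4.0, 3.0, 2.0, 1.0]    # predictions not derived from fitting these data

print(r_squared(y, good))  # near 1: model explains most variation
print(r_squared(y, bad))   # negative: the mean fits better than the predictions
```

A negative value here does not indicate an arithmetic error; it simply means SS_res exceeds SS_tot, i.e. the constant mean predictor outperforms the supplied predictions under this criterion.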

In simple linear regression with an intercept, r² is the square of the sample correlation coefficient between the observed outcomes and the observed predictor values.