<P> In statistical data analysis, the total sum of squares (TSS or SST) is a quantity that appears as part of a standard way of presenting results of such analyses. It is defined as the sum, over all observations, of the squared differences of each observation from the overall mean. </P> <P> In statistical linear models (particularly in standard regression models), the TSS is the sum of the squared differences between the dependent variable and its mean: </P> <Dl> <Dd> \( \mathrm{TSS} = \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2 \) </Dd> </Dl>
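The definition above can be sketched numerically; this is a minimal illustration on a small hypothetical sample, not part of the original article:

```python
import numpy as np

# Hypothetical sample of observations of the dependent variable y
y = np.array([2.0, 4.0, 6.0, 8.0])

# Overall mean of the observations
y_bar = y.mean()

# Total sum of squares: sum of squared deviations from the mean
tss = np.sum((y - y_bar) ** 2)
print(tss)  # 20.0
```

Here the mean is 5.0, so the deviations are −3, −1, 1, 3 and their squares sum to 20.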

Total variation (SST, the total sum of squares) is made up of two parts: the explained sum of squares (ESS), which measures the variation accounted for by the model, and the residual sum of squares (RSS), which measures the variation left unexplained.
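The two-part decomposition can be verified numerically for an ordinary least-squares fit with an intercept; the data below are a hypothetical example, not from the original article:

```python
import numpy as np

# Hypothetical data for an ordinary least-squares fit
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])

# Fit y = a*x + b by least squares
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

# For OLS with an intercept, TSS = ESS + RSS holds exactly
print(np.isclose(tss, ess + rss))  # True
```

Note that the decomposition is guaranteed only for least-squares fits that include an intercept term; for other estimators the cross term need not vanish.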