Explained sum of squares
In statistics, the explained sum of squares (ESS) is the sum of squared deviations of the predicted values from the mean of the response variable in a standard regression model, for example $y_i = a + bx_i + \varepsilon_i$, where $y_i$ is the response variable, $x_i$ is the explanatory variable, $a$ and $b$ are coefficients, $i$ indexes the observations from 1 to $n$, and $\varepsilon_i$ is the error term. In general, the greater the ESS, the more of the variation in the data the estimated model explains.
If $\hat{a}$ and $\hat{b}$ are the estimated coefficients, then

$\hat{y}_i = \hat{a} + \hat{b}x_i$

is the predicted value of the response. The ESS is the sum of the squared differences between the predicted values and the grand mean $\bar{y}$:

$\mathrm{ESS} = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2.$
For ordinary least squares regression with an intercept term, the decomposition holds: total sum of squares = explained sum of squares + residual sum of squares.
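This decomposition can be checked numerically. The following sketch (an illustration added here, not part of the original article; the simulated data and NumPy usage are assumptions) fits a simple least-squares line with an intercept and verifies that TSS = ESS + RSS:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)        # y_i = a + b*x_i + eps_i

X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # estimated coefficients (a_hat, b_hat)
y_hat = X @ coef                               # predicted values

ess = np.sum((y_hat - y.mean()) ** 2)          # explained sum of squares
rss = np.sum((y - y_hat) ** 2)                 # residual sum of squares
tss = np.sum((y - y.mean()) ** 2)              # total sum of squares
print(np.isclose(tss, ess + rss))              # True: decomposition holds with an intercept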
Type I SS
Type I estimates of the sum of squares explained by each variable in a model are obtained when sums of squares are calculated sequentially, one term at a time (e.g. with the model $Y = aX_1 + bX_2 + cX_3$). The sum of squares for $X_1$ is calculated using the model $Y = aX_1$, the sum of squares for $X_2$ using the model $Y = aX_1 + bX_2$, and the sum of squares for $X_3$ using the full model $Y = aX_1 + bX_2 + cX_3$. Each term's sum of squares is therefore the additional variation explained when that term enters, so Type I SS depend on the order of the terms; a numeric sketch follows.
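As an illustrative sketch (not from the article; the simulated data, the added intercept column, and the NumPy-based rss helper are assumptions), the Type I sum of squares for each term is the drop in residual sum of squares when that term is added in order:

import numpy as np

def rss(X, y):
    # Residual sum of squares of a least-squares fit of y on the columns of X.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ coef) ** 2)

rng = np.random.default_rng(1)
n = 100
X1, X2, X3 = rng.normal(size=(3, n))
y = 1.0 + 2.0 * X1 + 0.5 * X2 + rng.normal(size=n)

ones = np.ones(n)
models = [np.column_stack(cols) for cols in (
    [ones],                  # intercept only
    [ones, X1],              # add X1
    [ones, X1, X2],          # add X2
    [ones, X1, X2, X3],      # add X3 (full model)
)]
rss_seq = [rss(M, y) for M in models]
type1 = [rss_seq[k] - rss_seq[k + 1] for k in range(3)]  # sequential SS for X1, X2, X3
print(type1)

Because each value depends on what was already in the model, reordering the terms changes the Type I sums of squares.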
Type III SS
The Type III sum of squares for a variable is calculated by comparing the full model to the full model with that variable removed. It is the same as the Type I SS when the variable is the last one entered into the model.
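A companion sketch (reusing the simulated data and assumed rss helper from the Type I sketch above; again an illustration rather than the article's own method): each variable's Type III sum of squares is the increase in residual sum of squares when that variable alone is dropped from the full model.

import numpy as np

def rss(X, y):
    # Residual sum of squares of a least-squares fit of y on the columns of X.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ coef) ** 2)

rng = np.random.default_rng(1)                 # same data as the Type I sketch
n = 100
X1, X2, X3 = rng.normal(size=(3, n))
y = 1.0 + 2.0 * X1 + 0.5 * X2 + rng.normal(size=n)

full = np.column_stack([np.ones(n), X1, X2, X3])
rss_full = rss(full, y)
for j, name in enumerate(["X1", "X2", "X3"], start=1):
    reduced = np.delete(full, j, axis=1)       # drop that predictor's column
    print(name, rss(reduced, y) - rss_full)    # Type III SS for the variable

Compared against the Type I sketch, the value for X3 (the variable entered last there) coincides, as stated above.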