In [[statistics]], the '''explained sum of squares''' ('''ESS'''), alternatively known as the '''model sum of squares''' or '''sum of squares due to regression''' ('''SSR''' – not to be confused with the [[residual sum of squares]] (RSS), also called the sum of squares of errors), is a quantity used in describing how well a model, often a [[regression analysis|regression model]], represents the data being modelled. In particular, the explained sum of squares measures how much variation there is in the modelled values; this is compared to the [[total sum of squares]] (TSS), which measures how much variation there is in the observed data, and to the [[residual sum of squares]], which measures the variation in the modelling errors.
 
==Definition==
The explained sum of squares is the sum of the squares of the deviations of the predicted values from the mean of the observed data:
 
:<math>\text{ESS} = \sum_{i=1}^n \left(\hat{y}_i - \bar{y}\right)^2.</math>
:where <math>\hat{y}_i</math> is the value estimated by the regression line.<ref>{{Cite web|title=Sum of Squares - Definition, Formulas, Regression Analysis|url=https://corporatefinanceinstitute.com/resources/knowledge/other/sum-of-squares/|access-date=2020-06-11|website=Corporate Finance Institute|language=en-US}}</ref>
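
For illustration, the three sums of squares can be computed directly from a fitted regression line. The following is a minimal sketch in Python, assuming NumPy is available and using hypothetical data with an ordinary least-squares fit that includes an intercept:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical observed data (for illustration only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit a simple linear regression y ≈ a + b*x by ordinary least squares.
b, a = np.polyfit(x, y, 1)          # slope, intercept
y_hat = a + b * x                   # values estimated by the regression line

y_bar = y.mean()                    # mean of the observed data
ess = np.sum((y_hat - y_bar) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)      # residual sum of squares
tss = np.sum((y - y_bar) ** 2)      # total sum of squares

print(ess, rss, tss)
</syntaxhighlight>

Because the fit includes an intercept, the printed values satisfy TSS&nbsp;=&nbsp;ESS&nbsp;+&nbsp;RSS up to floating-point rounding, as discussed below.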
 
In some cases (see below): [[total sum of squares]] ('''TSS''')&nbsp;=&nbsp;'''explained sum of squares''' ('''ESS''')&nbsp;+&nbsp;[[residual sum of squares]] ('''RSS''').
 
==Partitioning in simple linear regression==
The following equality, stating that the total sum of squares (TSS) equals the residual sum of squares (SSE, the sum of squared errors of prediction) plus the explained sum of squares (SSR, the sum of squares due to regression), holds generally in simple linear regression:
 
:<math>\sum_{i=1}^n \left(y_i - \bar{y}\right)^2 = \sum_{i=1}^n \left(y_i - \hat{y}_i\right)^2 + \sum_{i=1}^n \left(\hat{y}_i - \bar{y}\right)^2.</math>
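
One way to see why this holds is to write <math>y_i - \bar{y} = (y_i - \hat{y}_i) + (\hat{y}_i - \bar{y})</math> and expand the square, which gives

:<math>\sum_{i=1}^n \left(y_i - \bar{y}\right)^2 = \sum_{i=1}^n \left(y_i - \hat{y}_i\right)^2 + \sum_{i=1}^n \left(\hat{y}_i - \bar{y}\right)^2 + 2\sum_{i=1}^n \left(y_i - \hat{y}_i\right)\left(\hat{y}_i - \bar{y}\right).</math>

When the regression is fitted by [[ordinary least squares]] and includes an intercept term, the normal equations imply that the residuals <math>y_i - \hat{y}_i</math> sum to zero and are orthogonal to the fitted values, so the cross term vanishes and the partition follows.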