Explained sum of squares

{{Expert-subject|Statistics|date=September 2009}}
 
In [[statistics]], the '''explained sum of squares (ESS)''', alternatively known as the '''model sum of squares''' or '''sum of squares due to regression (SSR)''', is a quantity used in describing how well a model, often a [[regression analysis|regression model]], represents the data being modelled. The explained sum of squares measures the variation in the modelled (fitted) values; it is compared with the [[total sum of squares]], which measures the variation in the observed data, and with the [[residual sum of squares]], which measures the variation in the modelling errors.
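The relationship among these three quantities can be illustrated with a small numerical sketch. The example below (the data values are invented for illustration, not taken from the article) fits a simple linear regression by ordinary least squares and computes ESS, the residual sum of squares, and the total sum of squares; for a linear regression with an intercept term, the total sum of squares decomposes exactly into the explained and residual parts.

```python
def ols_fit(x, y):
    """Ordinary least squares slope and intercept for one predictor."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return intercept, slope

# Illustrative data (hypothetical values, roughly linear with noise).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

a, b = ols_fit(x, y)
y_hat = [a + b * xi for xi in x]          # fitted (modelled) values
y_bar = sum(y) / len(y)                   # mean of the observed data

# Explained sum of squares: variation of the fitted values about the mean.
ess = sum((yh - y_bar) ** 2 for yh in y_hat)
# Residual sum of squares: variation of the modelling errors.
rss = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
# Total sum of squares: variation of the observed data about the mean.
tss = sum((yi - y_bar) ** 2 for yi in y)

# For OLS with an intercept, TSS = ESS + RSS.
assert abs(tss - (ess + rss)) < 1e-9
print(ess, rss, tss)
```

Here ESS accounts for most of TSS because the data are nearly linear; the ratio ESS/TSS is the familiar coefficient of determination R².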
 
It is important not to read too much into the word "explained": the regression is no more than a ''[[hypothesis]]'', an attempt to account for the variability of the data. Not only may a linear regression '''[[Overfitting|overfit]]''' the data in question, but '''[[correlation does not imply causation]]'''.
 
==Definition==