In mathematics, a moment matrix is a special symmetric square matrix whose rows and columns are indexed by monomials. The entries of the matrix depend only on the product of the indexing monomials (cf. Hankel matrix).
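For example, in one variable, with rows and columns indexed by the monomials 1, x, x^2 and entries drawn from a sequence of moments m_0, m_1, m_2, \ldots, the entry in the row indexed by x^i and the column indexed by x^j is m_{i+j}, giving the Hankel matrix

M = \begin{bmatrix} m_0 & m_1 & m_2 \\ m_1 & m_2 & m_3 \\ m_2 & m_3 & m_4 \end{bmatrix}.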

Moment matrices play an important role in polynomial fitting, polynomial optimization (since positive semidefinite moment matrices correspond to polynomials which are sums of squares)[1] and econometrics.[2]
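One direction of this correspondence is immediate: if a polynomial p can be written as p(x) = v(x)^\mathsf{T} Q\, v(x) for a vector of monomials v(x) and a positive semidefinite matrix Q, then factoring Q = L^\mathsf{T} L gives

p(x) = v(x)^\mathsf{T} L^\mathsf{T} L\, v(x) = \lVert L\, v(x) \rVert^2 = \sum_i \big( (L\, v(x))_i \big)^2,

which exhibits p as a sum of squares of polynomials.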

Application in regression

A multiple linear regression model can be written as

y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + u

where y is the explained variable, x_1, x_2, \ldots, x_k are the explanatory variables, u is the error, and \beta_0, \beta_1, \ldots, \beta_k are unknown coefficients to be estimated. Given observations \{ y_i, x_{i1}, x_{i2}, \ldots, x_{ik} \}_{i=1}^{n}, we have a system of n linear equations that can be expressed in matrix notation.[3]

\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_{11} & x_{12} & \cdots & x_{1k} \\ 1 & x_{21} & x_{22} & \cdots & x_{2k} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n1} & x_{n2} & \cdots & x_{nk} \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}

or

y = X \beta + u

where y and u are each a vector of dimension n, X is the design matrix of order n \times (k+1), and \beta is a vector of dimension k+1. Under the Gauss–Markov assumptions, the best linear unbiased estimator of \beta is the linear least squares estimator b = (X^\mathsf{T} X)^{-1} X^\mathsf{T} y, involving the two moment matrices X^\mathsf{T} X and X^\mathsf{T} y defined as

X^\mathsf{T} X = \sum_{i=1}^{n} x_i x_i^\mathsf{T}

and

X^\mathsf{T} y = \sum_{i=1}^{n} x_i y_i

where X^\mathsf{T} X is a square normal matrix of dimension (k+1) \times (k+1), X^\mathsf{T} y is a vector of dimension (k+1) \times 1, and x_i^\mathsf{T} denotes the i-th row of X.
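As a concrete illustration, the following is a minimal sketch in Python with NumPy (the synthetic data, variable names, and parameter values are illustrative assumptions, not taken from the references) that forms the two moment matrices and solves the normal equations for the least squares estimator:

```python
import numpy as np

# Synthetic data: n observations of k explanatory variables (illustrative values).
rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # design matrix, intercept column first
beta_true = np.array([1.0, 2.0, -0.5, 0.3])                 # k+1 coefficients
y = X @ beta_true + rng.normal(scale=0.1, size=n)           # y = X beta + u

# The two moment matrices.
XtX = X.T @ X   # (k+1) x (k+1) square normal matrix
Xty = X.T @ y   # vector of dimension k+1

# Least squares estimator b = (X'X)^{-1} X'y, computed by solving the
# normal equations rather than forming the explicit inverse.
b = np.linalg.solve(XtX, Xty)
print(b)  # approximately recovers beta_true
```

Solving the normal equations with np.linalg.solve is cheaper and numerically more stable than inverting X^\mathsf{T} X explicitly.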

References

  1. ^ Lasserre, Jean-Bernard (2010). Moments, Positive Polynomials and Their Applications. London: Imperial College Press. ISBN 978-1-84816-446-8. OCLC 624365972.
  2. ^ Goldberger, Arthur S. (1964). "Classical Linear Regression". Econometric Theory. New York: John Wiley & Sons. pp. 156–212. ISBN 0-471-31101-4.
  3. ^ Huang, David S. (1970). Regression and Econometric Methods. New York: John Wiley & Sons. pp. 52–65. ISBN 0-471-41754-8.
