In statistics, Basu's theorem states that any boundedly complete minimal sufficient statistic is independent of any ancillary statistic. This is a 1955 result of Debabrata Basu.[1]

It is often used in statistics as a tool to prove independence of two statistics, by first demonstrating one is complete sufficient and the other is ancillary, then appealing to the theorem.[2] An example of this is to show that the sample mean and sample variance of a normal distribution are independent statistics, which is done in the Example section below. This property (independence of sample mean and sample variance) characterizes normal distributions.

Statement

Let $\{P_\theta : \theta \in \Theta\}$ be a family of distributions on a measurable space $(X, \mathcal{A})$, and let $T$ be a statistic that maps from $(X, \mathcal{A})$ to some measurable space $(Y, \mathcal{B})$. If $T$ is a boundedly complete sufficient statistic for $\theta$, and $A$ is ancillary to $\theta$, then, conditional on $\theta$, $T$ is independent of $A$. That is, $T \perp A \mid \theta$.
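For reference, the three properties invoked above and in the proof below can be written in the same notation (standard definitions, added here only as a reminder; $g$ denotes an arbitrary bounded measurable function):

Bounded completeness of $T$: $\operatorname{E}_\theta[g(T)] = 0$ for all $\theta \in \Theta$ implies $g(T) = 0$ almost surely under every $P_\theta$.

Sufficiency of $T$: the conditional distribution $P_\theta(\,\cdot \mid T = t)$ admits a version that does not depend on $\theta$.

Ancillarity of $A$: the distribution $P_\theta \circ A^{-1}$ of $A$ is the same for every $\theta \in \Theta$.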

Proof

Let $P_\theta^T$ and $P_\theta^A$ be the marginal distributions of $T$ and $A$ respectively.

Denote by $A^{-1}(B)$ the preimage of a set $B$ under the map $A$. For any measurable set $B$ in the codomain of $A$ we have

$$P_\theta^A(B) \;=\; P_\theta\!\left(A^{-1}(B)\right) \;=\; \int_Y P_\theta\!\left(A^{-1}(B) \mid T = t\right) P_\theta^T(dt).$$

The distribution $P_\theta^A$ does not depend on $\theta$ because $A$ is ancillary. Likewise, $P_\theta\!\left(A^{-1}(B) \mid T = t\right)$ does not depend on $\theta$ because $T$ is sufficient. Therefore

$$\int_Y \left[ P^A(B) - P\!\left(A^{-1}(B) \mid T = t\right) \right] P_\theta^T(dt) \;=\; 0 \qquad \text{for all } \theta \in \Theta.$$

Note that the integrand (the function inside the integral) is a function of $t$ and not of $\theta$; it is also bounded, being a difference of two probabilities. Therefore, since $T$ is boundedly complete, the function

$$g(t) \;=\; P^A(B) - P\!\left(A^{-1}(B) \mid T = t\right)$$

is zero for $P_\theta^T$-almost all values of $t$, and thus

$$P\!\left(A^{-1}(B) \mid T = t\right) \;=\; P^A(B)$$

for almost all $t$. Therefore, $A$ is independent of $T$.
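Spelling out the final step (an expansion added here; it follows immediately from the last display and the definition of conditional probability given $T$): for any measurable $C \subseteq Y$ and any measurable set $B$ in the codomain of $A$,

$$P_\theta\left(T \in C,\ A \in B\right) \;=\; \int_C P\!\left(A^{-1}(B) \mid T = t\right) P_\theta^T(dt) \;=\; P^A(B)\, P_\theta^T(C),$$

which is precisely the statement that $T$ and $A$ are independent under every $P_\theta$.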

Example

Independence of sample mean and sample variance of a normal distribution

Let $X_1, X_2, \ldots, X_n$ be independent, identically distributed normal random variables with mean $\mu$ and variance $\sigma^2$.

Then, with respect to the parameter $\mu$ (with $\sigma^2$ treated as a fixed, known constant), one can show that

$$\bar{X} \;=\; \frac{1}{n} \sum_{i=1}^{n} X_i,$$

the sample mean, is a complete and sufficient statistic – it carries all of the information about $\mu$ contained in the sample, and no more – and

$$s^2 \;=\; \frac{1}{n-1} \sum_{i=1}^{n} \left(X_i - \bar{X}\right)^2,$$

the sample variance, is an ancillary statistic – its distribution does not depend on $\mu$.
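One way to see the ancillarity (a standard argument, sketched here): write $Z_i = X_i - \mu$, so that the $Z_i$ are $N(0, \sigma^2)$ whatever the value of $\mu$. Since $X_i - \bar{X} = Z_i - \bar{Z}$,

$$s^2 \;=\; \frac{1}{n-1}\sum_{i=1}^{n}\left(Z_i - \bar{Z}\right)^2$$

is a function of the $Z_i$ alone, and hence its distribution does not involve $\mu$.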

Therefore, from Basu's theorem it follows that these statistics are independent of each other, conditional on $\mu$ and $\sigma^2$ (that is, for each fixed value of the parameters).

This independence result can also be proven by Cochran's theorem.

Further, this property (that the sample mean and sample variance of the normal distribution are independent) characterizes the normal distribution – no other distribution has this property.[3]
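As an informal numerical illustration (not a substitute for the theorem), one can simulate repeated samples and compare the empirical correlation between the sample mean and the sample variance for normal data with that for a skewed distribution such as the exponential; the sample size, number of replications, and seed below are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed
n, reps = 10, 100_000            # sample size and number of replications

# Normal samples: by Basu's theorem the sample mean and sample variance
# are independent, so their empirical correlation should be near zero.
x = rng.normal(loc=1.0, scale=2.0, size=(reps, n))
print("normal:      corr =", np.corrcoef(x.mean(axis=1), x.var(axis=1, ddof=1))[0, 1])

# Exponential samples (skewed): the sample mean and sample variance are
# dependent, and their empirical correlation is clearly away from zero.
y = rng.exponential(scale=1.0, size=(reps, n))
print("exponential: corr =", np.corrcoef(y.mean(axis=1), y.var(axis=1, ddof=1))[0, 1])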

Notes

  1. Basu (1955)
  2. Ghosh, Malay; Mukhopadhyay, Nitis; Sen, Pranab Kumar (2011), Sequential Estimation, Wiley Series in Probability and Statistics, vol. 904, John Wiley & Sons, p. 80, ISBN 9781118165911: "The following theorem, due to Basu ... helps us in proving independence between certain types of statistics, without actually deriving the joint and marginal distributions of the statistics involved. This is a very powerful tool and it is often used ..."
  3. Geary, R. C. (1936). "The Distribution of "Student's" Ratio for Non-Normal Samples". Supplement to the Journal of the Royal Statistical Society. 3 (2): 178–184. doi:10.2307/2983669. JFM 63.1090.03. JSTOR 2983669.

References