
Consistency and Relative Entropy

Notice
 
  • We consider again the hypotheses
    $H_0\colon\ \theta = \theta_0$ versus $H_1\colon\ \theta = \theta_1$
    for two arbitrary but fixed parameter values $\theta_0, \theta_1 \in \Theta$ with $\theta_0 \neq \theta_1$.
  • For the sake of simplicity, we assume that the distributions $P_{\theta_0}$ and $P_{\theta_1}$ are absolutely continuous and that the densities $f(\cdot\,;\theta_0)$ and $f(\cdot\,;\theta_1)$ are positive everywhere.
  • It follows in particular that the relative entropy
        $H(\theta_0,\theta_1) = E_{\theta_0}\log\dfrac{f(X_1;\theta_0)}{f(X_1;\theta_1)}$    (47)
    is well defined, with $0 < H(\theta_0,\theta_1) \le \infty$; see the worked example below.
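
For illustration (this example is not contained in the original text): for exponentially distributed sample variables with densities $f(x;\theta) = \theta e^{-\theta x}$, $x > 0$, which are positive on the common support $(0,\infty)$, one obtains

    $H(\theta_0,\theta_1) = E_{\theta_0}\log\dfrac{f(X_1;\theta_0)}{f(X_1;\theta_1)} = \log\dfrac{\theta_0}{\theta_1} + \dfrac{\theta_1}{\theta_0} - 1$,

which is strictly positive for $\theta_0 \neq \theta_1$, because $\log t < t - 1$ for all $t \neq 1$ (here with $t = \theta_1/\theta_0$).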


In the literature, the following limit theorem is called Stein's Lemma (in honor of Charles Stein, Professor Emeritus of Statistics, Stanford University, USA).

Theorem 4.3   Let $\alpha \in (0,1)$. For each sample size $n \ge 1$, let $\varphi_n\colon \mathbb{R}^n \to [0,1]$ be a Neyman-Pearson test with $E_{\theta_0}\varphi_n(X_1,\ldots,X_n) = \alpha$. Then

    $\lim\limits_{n\to\infty} \dfrac{1}{n}\log E_{\theta_1}\bigl(1 - \varphi_n(X_1,\ldots,X_n)\bigr) = -\,H(\theta_0,\theta_1)$.    (48)
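
As a numerical illustration of (48) (a sketch that is not part of the original script; the choice of normal distributions, the level $\alpha = 0.05$ and the use of scipy are assumptions made only for this demo): for $X_i \sim N(\theta,1)$ with $\theta_0 = 0$ and $\theta_1 = 1$, the Neyman-Pearson test rejects $H_0$ for large sample means, the type II error probability has the closed form $\beta_n = \Phi(z_{1-\alpha} - \sqrt{n})$, and $H(\theta_0,\theta_1) = 1/2$.

    # Numerical check of (48) for X_i ~ N(theta, 1) with theta_0 = 0, theta_1 = 1.
    # Here H(theta_0, theta_1) = 1/2, so (1/n) log beta_n should tend to -0.5.
    # Illustrative sketch only; distributions and level are assumptions.
    from scipy.stats import norm

    alpha = 0.05
    z = norm.ppf(1 - alpha)  # critical value of the level-alpha Neyman-Pearson test

    for n in [10, 100, 1_000, 10_000, 100_000]:
        # The test rejects H_0 iff mean(X) > z / sqrt(n); under theta_1 = 1 this
        # gives beta_n = P(mean(X) <= z / sqrt(n)) = Phi(z - sqrt(n)).
        log_beta_n = norm.logcdf(z - n ** 0.5)  # log beta_n, evaluated stably
        print(n, log_beta_n / n)  # approaches -H = -0.5

The convergence is slow: the exact exponent is $-(\sqrt{n}-z_{1-\alpha})^2/2 + O(\log n)$, so the correction to $-1/2$ is of order $1/\sqrt{n}$.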

Proof
 
  • We use considerations similar to those in the proof of Theorem 2.10.
  • Let $\beta_n = E_{\theta_1}\bigl(1 - \varphi_n(X_1,\ldots,X_n)\bigr)$ denote the probability of an error of the second kind, and let
        $Z_n = \dfrac{1}{n}\log T(X_1,\ldots,X_n) = -\,\dfrac{1}{n}\sum\limits_{i=1}^n \log\dfrac{f(X_i;\theta_0)}{f(X_i;\theta_1)}$,    (49)

    in which $T(X_1,\ldots,X_n) = \prod_{i=1}^n f(X_i;\theta_1)/f(X_i;\theta_0)$ is the likelihood quotient defined in (28).
  • Then $0 < \beta_n \le 1$ holds for each $n \ge 1$, because the densities are positive everywhere and $\alpha < 1$, so that $\log\beta_n$ is well defined, and
  • according to Theorem 4.2 there are constants $c_n > 0$ and $a_n \in [0,1]$ such that the Neyman-Pearson tests $\varphi_n$ with
        $\varphi_n(x_1,\ldots,x_n) = \begin{cases} 1, & \text{if } T(x_1,\ldots,x_n) > c_n, \\ a_n, & \text{if } T(x_1,\ldots,x_n) = c_n, \\ 0, & \text{if } T(x_1,\ldots,x_n) < c_n \end{cases}$    (50)

    satisfy $E_{\theta_0}\varphi_n(X_1,\ldots,X_n) = \alpha$.

  • We first show that
        $\limsup\limits_{n\to\infty} \dfrac{1}{n}\log\beta_n \le -\,H(\theta_0,\theta_1)$.    (51)

  • From (50) it follows that $\varphi_n(x_1,\ldots,x_n) = 1$ whenever $T(x_1,\ldots,x_n) > c_n$, and so too that
    $\beta_n = E_{\theta_0}\bigl((1-\varphi_n(X_1,\ldots,X_n))\,T(X_1,\ldots,X_n)\,\mathbf{1}\{T(X_1,\ldots,X_n) \le c_n\}\bigr) \le c_n$,
    if one passes from $E_{\theta_1}$ to $E_{\theta_0}$ via the density ratio $T(X_1,\ldots,X_n)$ of $P_{\theta_1}$ with respect to $P_{\theta_0}$.
  • This results in the validity of the inequalities
    $\dfrac{1}{n}\log\beta_n \le \dfrac{1}{n}\log c_n$ and $\alpha = E_{\theta_0}\varphi_n(X_1,\ldots,X_n) \le P_{\theta_0}\bigl(Z_n \ge \tfrac{1}{n}\log c_n\bigr)$.
  • Thus, in order to prove (51), it suffices to show that for every $\varepsilon > 0$ there is a natural number $n_0$ such that $\tfrac{1}{n}\log c_n \le -H(\theta_0,\theta_1) + \varepsilon$ for each $n \ge n_0$ (if $H(\theta_0,\theta_1) = \infty$, replace $-H(\theta_0,\theta_1)+\varepsilon$ by an arbitrary $-M < 0$ in what follows).
  • But this follows from the fact that
    $\alpha \le P_{\theta_0}\bigl(Z_n \ge \tfrac{1}{n}\log c_n\bigr)$ for each $n \ge 1$ and
    $P_{\theta_0}\bigl(Z_n \ge -H(\theta_0,\theta_1)+\varepsilon\bigr) \to 0$ for each $\varepsilon > 0$: as soon as the latter probability is smaller than $\alpha$, the former inequality forces $\tfrac{1}{n}\log c_n < -H(\theta_0,\theta_1)+\varepsilon$. The last convergence results from (49) and from the strong law of large numbers, see the proof of Theorem 2.10.
  • To end the proof, we show that
        $\liminf\limits_{n\to\infty} \dfrac{1}{n}\log\beta_n \ge -\,H(\theta_0,\theta_1)$.    (52)

  • Here we can assume without loss of generality that $H(\theta_0,\theta_1) < \infty$, since otherwise (52) holds trivially.
  • For each $\varepsilon > 0$ it then holds that
        $\beta_n \;\ge\; E_{\theta_1}\bigl((1-\varphi_n(X_1,\ldots,X_n))\,\mathbf{1}\{Z_n \ge -H(\theta_0,\theta_1)-\varepsilon\}\bigr)$
        $\phantom{\beta_n} \;\ge\; e^{-n(H(\theta_0,\theta_1)+\varepsilon)}\, E_{\theta_0}\bigl((1-\varphi_n(X_1,\ldots,X_n))\,\mathbf{1}\{Z_n \ge -H(\theta_0,\theta_1)-\varepsilon\}\bigr)$
        $\phantom{\beta_n} \;\ge\; e^{-n(H(\theta_0,\theta_1)+\varepsilon)}\,\bigl(1-\alpha-P_{\theta_0}(Z_n < -H(\theta_0,\theta_1)-\varepsilon)\bigr) \;\ge\; e^{-n(H(\theta_0,\theta_1)+\varepsilon)}\,\dfrac{1-\alpha}{2}$

    for each sufficiently large $n$, where the last inequality results from the strong law of large numbers (cf. Theorem WR-5.15), because $Z_n \to -H(\theta_0,\theta_1)$ with probability 1 under $P_{\theta_0}$ and hence $P_{\theta_0}(Z_n < -H(\theta_0,\theta_1)-\varepsilon) \to 0$; a numerical sketch of this convergence follows after the proof.
  • It follows that
    $\liminf\limits_{n\to\infty}\dfrac{1}{n}\log\beta_n \;\ge\; -\bigl(H(\theta_0,\theta_1)+\varepsilon\bigr) + \lim\limits_{n\to\infty}\dfrac{1}{n}\log\dfrac{1-\alpha}{2} \;=\; -H(\theta_0,\theta_1)-\varepsilon$
    for each $\varepsilon > 0$, and hence $\liminf\limits_{n\to\infty}\frac{1}{n}\log\beta_n \ge -H(\theta_0,\theta_1)$.
  • This proves (52); together with (51), it yields the assertion (48).
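
The strong-law step $Z_n \to -H(\theta_0,\theta_1)$ $P_{\theta_0}$-a.s., on which both parts of the proof rely, can also be simulated directly. A minimal sketch (not part of the original script), again assuming $N(0,1)$ versus $N(1,1)$ sample variables, so that $H(\theta_0,\theta_1) = 1/2$:

    # SLLN step of the proof: Z_n = (1/n) log T(X_1,...,X_n) -> -H(theta_0, theta_1)
    # P_{theta_0}-a.s., simulated for N(0,1) vs. N(1,1), where H = 1/2.
    # Illustrative sketch only; the concrete densities are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    theta0, theta1 = 0.0, 1.0
    x = rng.normal(theta0, 1.0, size=100_000)  # one long sample path under H_0

    # log( f(x; theta_1) / f(x; theta_0) ) for unit-variance normal densities
    log_ratio = ((x - theta0) ** 2 - (x - theta1) ** 2) / 2.0

    for n in (100, 1_000, 10_000, 100_000):
        print(n, log_ratio[:n].mean())  # Z_n, approaches -H = -0.5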

Notice
 
  • By (48), the probability of an error of the second kind of the Neyman-Pearson tests decays exponentially fast, $\beta_n = e^{-n(H(\theta_0,\theta_1)+o(1))}$ as $n \to \infty$; the decay rate is the relative entropy $H(\theta_0,\theta_1)$ and does not depend on the level $\alpha \in (0,1)$.
Example
Normally distributed sample variables
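
In brief (a standard computation sketched here for orientation; the worked details belong to the linked example): for $X_i \sim N(\theta,\sigma^2)$ with known variance $\sigma^2 > 0$,

    $H(\theta_0,\theta_1) = E_{\theta_0}\log\dfrac{f(X_1;\theta_0)}{f(X_1;\theta_1)} = \dfrac{(\theta_1-\theta_0)^2}{2\sigma^2}$,

so by Theorem 4.3 the type II error probabilities of the Neyman-Pearson tests satisfy $\beta_n = \exp\bigl(-n\,(\theta_1-\theta_0)^2/(2\sigma^2) + o(n)\bigr)$.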


Roland Maier 2003-03-06