Testing a Point Null Hypothesis: The
  Irreconcilability of P Values and Evidence

       JAMES O. BERGER and THOMAS SELLKE


                           25.02.2013




Content


   1   INTRODUCTION
   2   POSTERIOR PROBABILITIES AND ODDS
   3   LOWER BOUNDS ON POSTERIOR PROBABILITIES
            Introduction
            Lower Bounds for GA = {All Distributions}
            Lower Bounds for GS = {Symmetric Distributions}
            Lower Bounds for GUS = {Unimodal, Symmetric Distributions}
            Lower Bounds for GNOR = {Normal Distributions}
   4   MORE GENERAL HYPOTHESES AND CONDITIONAL CALCULATIONS
            General Formulation
            More General Hypotheses
   5   CONCLUSIONS
1. Introduction




  The paper studies the problem of testing a point null hypothesis;
  of interest is the relationship between the P value and conditional
  and Bayesian measures of evidence against the null hypothesis.

  ∗ The overall conclusion is that P values can be highly
  misleading measures of the evidence provided by the data
  against the null hypothesis.




1. Introduction




  Consider the simple situation of observing a random quantity
  X having density f(x | θ), θ ∈ R. It is desired to test the
  null hypothesis H0 : θ = θ0 versus the alternative hypothesis
  H1 : θ ≠ θ0. For a classical test statistic T(X), the P value of the
  observed data x is

                        p = Pr_{θ=θ0}(T(X) ≥ T(x))




1. Introduction

  Example

  Suppose that X = (X1, ..., Xn), where the Xi are iid N(θ, σ²).
  Then the usual test statistic is

                        T(X) = √n | X̄ − θ0 | / σ,

  where X̄ is the sample mean, and

                        p = 2(1 − Φ(t)),

  where Φ is the standard normal cdf and

                        t = T(x) = √n | x̄ − θ0 | / σ
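
  As a quick numerical illustration (a minimal Python sketch, not part of the
  slides; the data are simulated and purely illustrative), the two-sided P value
  of this example follows directly from t:

import numpy as np
from scipy.stats import norm

def two_sided_p(x, theta0, sigma):
    """Two-sided P value for H0: theta = theta0 against H1: theta != theta0, sigma known."""
    n = len(x)
    t = np.sqrt(n) * abs(np.mean(x) - theta0) / sigma
    return 2 * (1 - norm.cdf(t)), t

# illustrative data: 20 observations simulated from N(0.5, 1), testing theta0 = 0
rng = np.random.default_rng(0)
x = rng.normal(loc=0.5, scale=1.0, size=20)
p, t = two_sided_p(x, theta0=0.0, sigma=1.0)
print(f"t = {t:.3f}, p = {p:.4f}")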




1. Introduction


  We presume that the classical approach is the report of p,
  rather than the report of a Neyman-Pearson error probability.
  This is because
      Most statisticians prefer the use of P values, feeling it to be
      important to indicate how strong the evidence against H0 is.
      The alternative measures of evidence we consider are based on
      knowledge of x.
  There are several well-known criticisms of testing a point null
  hypothesis.
      One is the issue of 'statistical' versus 'practical' significance:
      one can get a very small p even when | θ − θ0 | is so small
      as to make θ equivalent to θ0 for practical purposes.
      Another well-known criticism is the Jeffreys-Lindley paradox.

1. Introduction



  Example
  Consider a Bayesian who chooses the prior distribution on θ that
  gives probability 0.5 to each of H0 and H1 and spreads the mass on H1
  out according to an N(θ0, σ²) density. It will be seen in Section 2 that
  the posterior probability, Pr(H0 | x), of H0 is given by

       Pr(H0 | x) = [1 + (1 + n)^(−1/2) exp{t²/[2(1 + 1/n)]}]^(−1)




1. Introduction




               Table 1 : Pr(H0 | x) for Jeffreys-Type Prior

                                           n
       p       t        1       5      10      20      50     100    1,000
      .10    1.645     .42     .44     .47     .56     .65     .72     .89
      .05    1.960     .35     .33     .37     .42     .52     .60     .82
      .01    2.576     .21     .13     .14     .16     .22     .27     .53
     .001    3.291    .086    .026    .024    .026    .034    .045    .124

  The conflict between p and Pr(H0 | x) is apparent.
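
  The entries of Table 1 follow directly from the displayed formula for
  Pr(H0 | x); a minimal Python sketch (not from the slides) that reproduces
  them for π0 = 0.5:

import numpy as np
from scipy.stats import norm

def pr_H0_jeffreys(t, n, pi0=0.5):
    """Posterior probability of H0 under the Jeffreys-type N(theta0, sigma^2) prior on H1."""
    bayes_factor = (1 + n) ** 0.5 * np.exp(-0.5 * t**2 / (1 + 1.0 / n))
    return 1.0 / (1.0 + (1 - pi0) / pi0 / bayes_factor)

for p in (0.10, 0.05, 0.01, 0.001):
    t = norm.ppf(1 - p / 2)               # two-sided: p = 2(1 - Phi(t))
    row = [pr_H0_jeffreys(t, n) for n in (1, 5, 10, 20, 50, 100, 1000)]
    print(f"p={p:5.3f}  t={t:.3f}  " + "  ".join(f"{v:.3f}" for v in row))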




1. Introduction

  Example
  Again consider a Bayesian who gives each hypothesis prior probabil-
  ity 0.5, but now suppose that he decides to spread out the mass on
  H1 in the symmetric fashion that is as favorable to H1 as possible.
  The corresponding values of Pr(H0 | x) are determined in Section 3
  and are given in Table 2 for certain values of t.


             Table 2 : Pr(H0 | x) for a Prior Biased Toward H1

                      P value (p)        t       Pr(H0 | x)
                          .10          1.645        .340
                          .05          1.960        .227
                          .01          2.576        .068
                         .001          3.291       .0088

1. Introduction


  Example (A Likelihood Analysis)
  It is common to perceive the comparative evidence provided by x for
  two possible parameter values, θ1 and θ2, as being measured by the
  likelihood ratio

                        lx(θ1 : θ2) = f(x | θ1) / f(x | θ2)

   A lower bound on the comparative evidence would be

             lx = inf_θ lx(θ0 : θ) = f(x | θ0) / sup_θ f(x | θ) = exp{−t²/2}




1. Introduction


          Values of lx for various t are given in Table 3

            Table 3 : Bounds on the Comparative Likelihood

                                             Likelihood ratio
                 P value (p)         t       lower bound (lx)
                     .10           1.645           .258
                     .05           1.960           .146
                     .01           2.576           .036
                    .001           3.291          .0044
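
  Since lx = exp{−t²/2}, the bound in Table 3 is a one-line computation;
  a minimal sketch (not from the slides):

import numpy as np
from scipy.stats import norm

# likelihood-ratio lower bound l_x = exp(-t^2/2) for the two-sided P values of Table 3
for p in (0.10, 0.05, 0.01, 0.001):
    t = norm.ppf(1 - p / 2)
    print(f"p={p:5.3f}  t={t:.3f}  l_x={np.exp(-0.5 * t**2):.4f}")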




2. Posterior probabilities and odds




  Let 0 < π0 < 1 denote the prior probability of H0, and let π1 = 1 − π0
  denote the prior probability of H1. Suppose that the mass on H1 is
  spread out according to the density g(θ).

      Realistic hypothesis: H0 : | θ − θ0 | ≤ b
      Prior probability π0 would be assigned to {θ : | θ − θ0 | ≤ b}
  (To a Bayesian, a point null test is typically reasonable only when
  the prior distribution is of this form.)




2. Posterior probabilities and odds
      Noting that the marginal density of X is

                    m(x) = f(x | θ0)π0 + (1 − π0)mg(x),

      where
                    mg(x) = ∫ f(x | θ)g(θ) dθ,

      the posterior probability of H0 is given by

                    Pr(H0 | x) = f(x | θ0)π0 / m(x)

                               = [1 + ((1 − π0)/π0) × mg(x)/f(x | θ0)]^(−1)

      Also of interest is the posterior odds ratio of H0 to H1, which is

                    Pr(H0 | x)/(1 − Pr(H0 | x)) = (π0/(1 − π0)) × f(x | θ0)/mg(x)
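
      A minimal numerical sketch of this formula (not from the slides),
      assuming the normal likelihood of Example 1 and an arbitrary
      user-supplied density g, with mg(x) evaluated by quadrature:

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def posterior_prob_H0(xbar, n, theta0, sigma, g, pi0=0.5):
    """Pr(H0 | x) for a point null, with the mass on H1 spread out as density g(theta)."""
    f = lambda theta: norm.pdf(xbar, loc=theta, scale=sigma / np.sqrt(n))  # likelihood of the sufficient statistic
    m_g, _ = quad(lambda th: f(th) * g(th), -np.inf, np.inf)               # m_g(x) = integral of f(x|theta) g(theta)
    return 1.0 / (1.0 + (1 - pi0) / pi0 * m_g / f(theta0))

# hypothetical check against the Jeffreys-type prior g = N(theta0, sigma^2)
theta0, sigma, n = 0.0, 1.0, 20
t = 1.960
xbar = theta0 + t * sigma / np.sqrt(n)
g = lambda th: norm.pdf(th, loc=theta0, scale=sigma)
print(posterior_prob_H0(xbar, n, theta0, sigma, g))   # about .42 (cf. Table 1, p = .05, n = 20)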

2. Posterior probabilities and odds



  The posterior odds ratio of H0 to H1 is

       Pr(H0 | x)/(1 − Pr(H0 | x))  =  π0/(1 − π0)  ×  f(x | θ0)/mg(x)
             (posterior odds)          (prior odds)    (Bayes factor Bg(x))

  Interest in the Bayes factor centers around the fact that it does
  not involve the prior probabilities of the hypotheses and hence is
  sometimes interpreted as the actual odds of the hypotheses implied
  by the data alone.




2. Posterior probabilities and odds
  Example (Jeffreys-Lindley paradox)
  Suppose that π0 is arbitrary and g is again N(θ0, σ²). Since a suffi-
  cient statistic for θ is X̄ ~ N(θ, σ²/n), we have that mg(x̄) is an
  N(θ0, σ²(1 + n^(−1))) density. Thus

      Bg(x) = f(x̄ | θ0) / mg(x̄)

            = [2πσ²/n]^(−1/2) exp{−(n/2)(x̄ − θ0)²/σ²}
              / ([2πσ²(1 + n^(−1))]^(−1/2) exp{−(x̄ − θ0)²/[2σ²(1 + n^(−1))]})

            = (1 + n)^(1/2) exp{−t²/[2(1 + n^(−1))]}

  and

      Pr(H0 | x) = [1 + (1 − π0)/(π0 Bg)]^(−1)

                 = [1 + ((1 − π0)/π0) × (1 + n)^(−1/2) exp{t²/[2(1 + n^(−1))]}]^(−1)
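
  The paradox is easy to see numerically: holding t fixed at 1.96 (so p ≈ .05)
  while n grows, Pr(H0 | x) tends to 1. A minimal sketch (not from the slides):

import numpy as np

def pr_H0(t, n, pi0=0.5):
    """Pr(H0 | x) under g = N(theta0, sigma^2), as a function of t and n."""
    Bg = np.sqrt(1 + n) * np.exp(-0.5 * t**2 / (1 + 1.0 / n))
    return 1.0 / (1.0 + (1 - pi0) / (pi0 * Bg))

# Jeffreys-Lindley paradox: with t fixed (p fixed at about .05), the posterior
# probability of H0 tends to 1 as the sample size grows.
for n in (1, 10, 100, 10_000, 1_000_000):
    print(f"n = {n:>9,d}   Pr(H0 | x) = {pr_H0(1.96, n):.3f}")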
3. Lower bounds on posterior probabilities
3.1 Introduction



   This section will examine some lower bounds on Pr(H0 | x)
   when g(θ), the distribution of θ given that H1 is true, is
   allowed to vary within some class of distributions G:
        GA = {all distributions}
        GS = {all distributions symmetric about θ0}
        GUS = {all unimodal distributions symmetric about θ0}
        GNOR = {all N(θ0, τ²) distributions, 0 ≤ τ² < ∞}
   Even though these G's are supposed to consist only of distributions
   on {θ : θ ≠ θ0}, it will be convenient to allow them to include
   distributions with mass at θ0, so the lower bounds we compute are
   always attained.



3. Lower bounds on posterior probabilities
3.1 Introduction



   Letting
                        Pr(H0 | x, G) = inf_{g∈G} Pr(H0 | x)

   and
                        B(x, G) = inf_{g∈G} Bg(x),

   we see immediately from the formulas above that

                        B(x, G) = f(x | θ0) / sup_{g∈G} mg(x)

   and
                        Pr(H0 | x, G) = [1 + ((1 − π0)/π0) × 1/B(x, G)]^(−1)


3. Lower bounds on posterior probabilities
3.2 Lower bounds for GA ={All distributions}




   Theorem
   Suppose that a maximum likelihood estimate of θ, θ̂(x), exists for the
   observed x. Then

                        B(x, GA) = f(x | θ0) / f(x | θ̂(x))

   and

                        Pr(H0 | x, GA) = [1 + ((1 − π0)/π0) × f(x | θ̂(x))/f(x | θ0)]^(−1)




3. Lower bounds on posterior probabilities
3.2 Lower bounds for GA ={All distributions}




   Example
   In this situation, we have

                        B(x, GA) = e^(−t²/2)

   and
                        Pr(H0 | x, GA) = [1 + ((1 − π0)/π0) e^(t²/2)]^(−1)

   For several choices of t, Table 4 gives the corresponding two-sided
   P values, p, and the values of Pr(H0 | x, GA), with π0 = 0.5.



3. Lower bounds on posterior probabilities
3.2 Lower bounds for GA ={All distributions}



   For several choices of t, Table 4 gives the corresponding
   two-sided P values, p, and the values of Pr(H0 | x, GA), with
   π0 = 0.5.

     Table 4 : Comparison of P values and Pr(H0 | x, GA) when π0 = 0.5

            P value (p)        t       Pr(H0 | x, GA)        Pr(H0 | x, GA)/(p·t)
                .10          1.645          .205                     1.25
                .05          1.960          .128                     1.30
                .01          2.576          .035                     1.36
               .001          3.291         .0044                     1.35
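
   A minimal sketch (not from the slides) reproducing Table 4 from
   B(x, GA) = e^(−t²/2):

import numpy as np
from scipy.stats import norm

def pr_H0_GA(t, pi0=0.5):
    """Lower bound Pr(H0 | x, GA) over all priors g on the alternative."""
    return 1.0 / (1.0 + (1 - pi0) / pi0 * np.exp(0.5 * t**2))

for p in (0.10, 0.05, 0.01, 0.001):
    t = norm.ppf(1 - p / 2)
    pr = pr_H0_GA(t)
    print(f"p={p:5.3f}  t={t:.3f}  Pr(H0|x,GA)={pr:.4f}  ratio={pr / (p * t):.2f}")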




3. Lower bounds on posterior probabilities
3.2 Lower bounds for GA ={All distributions}




   Theorem
   For t > 1.68 and π0 = 0.5 in Example 1,

                        Pr(H0 | x, GA)/(p·t) > √(π/2) ≈ 1.253.

   Furthermore,

                        lim_{t→∞} Pr(H0 | x, GA)/(p·t) = √(π/2).
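
   A minimal numerical check of the theorem (not from the slides): the ratio
   stays above √(π/2) ≈ 1.253 and decreases toward it slowly as t grows.

import numpy as np
from scipy.stats import norm

# Pr(H0 | x, GA)/(p t) for pi0 = 0.5, using the tail function norm.sf for accuracy
ratio = lambda t: (1.0 / (1.0 + np.exp(0.5 * t**2))) / (2.0 * norm.sf(t) * t)

for t in (2.0, 5.0, 10.0, 20.0):
    print(f"t = {t:4.1f}   Pr(H0|x,GA)/(p t) = {ratio(t):.4f}")
print("sqrt(pi/2) =", round(np.sqrt(np.pi / 2), 4))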




3. Lower bounds on posterior probabilities
3.3 Lower bounds for GS ={Symmetric distributions}


   There is a large gap between Pr(H0 | x, GA) and Pr(H0 | x) for
   the Jeffreys-type single prior analysis. This reinforces the
   suspicion that using GA unduly biases the conclusion against
   H0 and suggests use of more reasonable classes of priors.
   Theorem
   With G2PS = {all symmetric two-point distributions},

                        sup_{g∈G2PS} mg(x) = sup_{g∈GS} mg(x),
   so
                        B(x, G2PS) = B(x, GS)
   and
                        Pr(H0 | x, G2PS) = Pr(H0 | x, GS)




3. Lower bounds on posterior probabilities
3.3 Lower bounds for GS ={Symmetric distributions}



   Example
   If t ≤ 1, a calculus argument shows that the symmetric two-point
   distribution that maximizes mg(x) is the degenerate "two-point"
   distribution putting all mass at θ0. Thus B(x, GS) = 1 and
   Pr(H0 | x, GS) = π0 for t ≤ 1.
   If t > 1, then mg(x) is maximized by a nondegenerate element
   of G2PS. For moderately large t, the maximum value of mg(x) for
   g ∈ G2PS is very well approximated by taking g to be the two-point
   distribution putting equal mass at θ̂(x) and at 2θ0 − θ̂(x), so

             B(x, GS) ≈ φ(t) / [0.5φ(0) + 0.5φ(2t)] ≈ 2 exp{−t²/2}




3. Lower bounds on posterior probabilities
3.3 Lower bounds for GS ={Symmetric distributions}


   Example
   For t ≥ 1.645, the first approximation is accurate to within 1 in the
   fourth significant digit and the second approximation to within 2 in
   the third significant digit.
   Table 5 gives the value of Pr(H0 | x, GS) for several choices of t.

     Table 5 : Comparison of P values and Pr(H0 | x, GS) when π0 = 0.5

           P value (p)        t       Pr(H0 | x, GS)        Pr(H0 | x, GS)/(p·t)
               .10          1.645          .340                     2.07
               .05          1.960          .227                     2.31
               .01          2.576          .068                     2.62
              .001          3.291         .0088                     2.68
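
   By the theorem above, B(x, GS) can be obtained by optimizing over symmetric
   two-point priors; a minimal sketch (not from the slides) that reproduces
   Table 5 numerically:

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def B_GS(t):
    """B(x, GS): minimize Bg over symmetric two-point priors at theta0 +/- k*sigma/sqrt(n)."""
    m2pt = lambda k: 0.5 * (norm.pdf(t - k) + norm.pdf(t + k))   # m_g(x), common factor sqrt(n)/sigma dropped
    res = minimize_scalar(lambda k: -m2pt(k), bounds=(0.0, t + 10), method="bounded")
    return norm.pdf(t) / m2pt(res.x)

for p in (0.10, 0.05, 0.01, 0.001):
    t = norm.ppf(1 - p / 2)
    B = B_GS(t)
    pr = 1.0 / (1.0 + 1.0 / B)            # pi0 = 0.5
    print(f"p={p:5.3f}  t={t:.3f}  B(x,GS)={B:.3f}  Pr(H0|x,GS)={pr:.3f}")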



3. Lower bounds on posterior probabilities
3.4 Lower bounds for GUS ={Unimodal, Symmetric distributions}




   Minimizing Pr(H0 | x) over all symmetric priors still involves
   considerable bias against H0. A further 'objective' restriction,
   which would seem reasonable to many, is to require the prior
   to be unimodal, or nonincreasing in | θ − θ0 |.
   Theorem
                        sup_{g∈GUS} mg(x) = sup_{g∈US} mg(x),

   with US = {all symmetric uniform distributions},
   so B(x, GUS) = B(x, US) and Pr(H0 | x, GUS) = Pr(H0 | x, US).




3. Lower bounds on posterior probabilities
3.4 Lower bounds for GUS ={Unimodal, Symmetric distributions}


   Theorem
   If t ≤ 1 in Example 1, then B(x, GUS) = 1 and Pr(H0 | x, GUS) = π0.
   If t > 1, then
                        B(x, GUS) = 2φ(t) / [φ(K + t) + φ(K − t)]

   and

       Pr(H0 | x, GUS) = [1 + ((1 − π0)/π0) × (φ(K + t) + φ(K − t))/(2φ(t))]^(−1),

   where K > 0 is the half-width (in units of σ/√n) of the symmetric
   uniform prior that maximizes mg(x).

   Figures 1 and 2 give values of K and B for various values of t in
   this problem.

3. Lower bounds on posterior probabilities
3.4 Lower bounds for GUS ={Unimodal, Symmetric distributions}

   [Figures 1 and 2: K and B(x, GUS) plotted against t.]
3. Lower bounds on posterior probabilities
3.4 Lower bounds for GUS ={Unimodal, Symmetric distributions}




   Table 6 gives Pr(H0 | x, GUS) for some specific important
   values of t and π0 = 0.5.

    Table 6 : Comparison of P values and Pr(H0 | x, GUS) when π0 = 0.5

           P value (p)        t       Pr(H0 | x, GUS)        Pr(H0 | x, GUS)/(p·t)
               .10          1.645          .390                      2.37
               .05          1.960          .290                      2.96
               .01          2.576          .109                      4.23
              .001          3.291          .018                      5.47
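
   By the theorem of Section 3.4, the GUS bound reduces to an optimization over
   symmetric uniform priors; a minimal sketch (not from the slides) reproducing
   the Pr(H0 | x, GUS) column of Table 6:

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def B_GUS(t):
    """B(x, GUS): by the theorem above, it suffices to optimize over symmetric uniform priors."""
    if t <= 1:
        return 1.0
    # m_g(x) for the uniform prior on (theta0 - K*sigma/sqrt(n), theta0 + K*sigma/sqrt(n)),
    # with the common factor sqrt(n)/sigma dropped
    m_unif = lambda K: (norm.cdf(K + t) + norm.cdf(K - t) - 1.0) / (2.0 * K)
    res = minimize_scalar(lambda K: -m_unif(K), bounds=(1e-6, t + 20), method="bounded")
    return norm.pdf(t) / m_unif(res.x)

for p in (0.10, 0.05, 0.01, 0.001):
    t = norm.ppf(1 - p / 2)
    B = B_GUS(t)
    pr = 1.0 / (1.0 + 1.0 / B)            # pi0 = 0.5
    print(f"p={p:5.3f}  t={t:.3f}  Pr(H0|x,GUS)={pr:.3f}")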




3. Lower bounds on posterior probabilities
3.5 Lower bounds for GNOR ={Normal distributions}

   We have seen that minimizing Pr(H0 | x) over g ∈ GUS is the same
   as minimizing over g ∈ US. Although using US is much more
   reasonable than using GA, there is still some residual bias against
   H0 involved in using US.
   Theorem
   If t ≤ 1 in Example 1, then B(x, GNOR) = 1 and Pr(H0 | x, GNOR) =
   π0. If t > 1, then

                        B(x, GNOR) = √e · t · e^(−t²/2)

   and

       Pr(H0 | x, GNOR) = [1 + ((1 − π0)/π0) × exp{t²/2}/(√e · t)]^(−1)


3. Lower bounds on posterior probabilities
3.5 Lower bounds for GNOR ={Normal distributions}




   Table 7 gives Pr(H0 | x, GNOR) for several values of t.

    Table 7 : Comparison of P values and Pr(H0 | x, GNOR) when π0 = 0.5

           P value (p)        t       Pr(H0 | x, GNOR)        Pr(H0 | x, GNOR)/(p·t)
               .10          1.645           .412                      2.50
               .05          1.960           .321                      3.28
               .01          2.576           .133                      5.16
              .001          3.291          .0235                      7.14
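
   A minimal sketch (not from the slides) reproducing Table 7 from the closed
   form above (valid for t > 1):

import numpy as np
from scipy.stats import norm

def pr_H0_GNOR(t, pi0=0.5):
    """Lower bound Pr(H0 | x, GNOR) over normal priors N(theta0, tau^2), for t > 1."""
    B = np.sqrt(np.e) * t * np.exp(-0.5 * t**2)
    return 1.0 / (1.0 + (1 - pi0) / pi0 / B)

for p in (0.10, 0.05, 0.01, 0.001):
    t = norm.ppf(1 - p / 2)
    print(f"p={p:5.3f}  t={t:.3f}  Pr(H0|x,GNOR)={pr_H0_GNOR(t):.4f}")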




4. More general hypotheses and conditional calculations
4.1 General formulation




   Consider the Bayesian calculation of Pr(H0 | A), where H0 is
   of the form H0 : θ ∈ Θ0 and A is the set in which x is known
   to reside. Letting π0 and π1 denote the prior probabilities of
   H0 and H1, and g0 and g1 the densities on Θ0 and Θ1, we have

                   Pr(H0 | A) = [1 + ((1 − π0)/π0) × mg1(A)/mg0(A)]^(−1),

   where
                   mgi(A) = ∫_{Θi} Prθ(A) gi(θ) dθ.




4. More general hypotheses and conditional calculations
4.1 General formulation




   For the general formulation, one can determine lower bounds
   on Pr(H0 | A) by choosing classes G0 and G1 of possible g0 and g1,
   respectively, calculating

                   B(A, G0, G1) = inf_{g0∈G0} mg0(A) / sup_{g1∈G1} mg1(A),

   and defining

                   Pr(H0 | A, G0, G1) = [1 + ((1 − π0)/π0) × 1/B(A, G0, G1)]^(−1).




4. More general hypotheses and conditional calculations
4.2 More general hypotheses




   Assume in this section that A = {x}. The lower bounds can
   be applied to a variety of generalizations of point null
   hypotheses. If Θ0 is a small set about θ0, the general lower
   bounds turn out to be essentially equivalent to the point null
   lower bounds.

   In Example 1, suppose that the hypotheses were H0 : θ ∈ (θ0 −
   b, θ0 + b) and H1 : θ ∉ (θ0 − b, θ0 + b). If | t − √n b/σ | ≥ 1
   and G0 = G1 = GS, then B(x, G0, G1) and Pr(H0 | x, G0, G1) are
   exactly the same as B and Pr for testing the point null.




5. Conclusion


  Compare B(x, GUS) with the P value calculated at (t − 1)+ instead of
  at t; this last will be called the P value of (t − 1)+. A figure in the
  paper shows that this lower bound is close to the P value that would
  be obtained if we replaced t by (t − 1)+. The implication is that the
  usual rule of thumb,
  "t = 1 means only mild evidence against H0, t = 2 means
  significant evidence against H0, t = 3 means highly significant
  evidence against H0, and t = 4 means overwhelming evidence
  against H0,"
  should at least be replaced by the rule of thumb
  "t = 1 means no evidence against H0, t = 2 means only mild
  evidence against H0, t = 3 means significant evidence against
  H0, and t = 4 means highly significant evidence against H0."




References




  Edwards, W., Lindman, H., and Savage, L. J. (1963). Bayesian
  statistical inference for psychological research. Psychological
  Review, 70, 193-242.
  Ghosh, J. K., Purkayastha, S., and Samanta, T. Role of P-values
  and other measures of evidence in Bayesian analysis. Handbook of
  Statistics, Vol. 25.
  Berger, J. O. Statistical Decision Theory and Bayesian Analysis.
  Springer Series in Statistics.





Más contenido relacionado

La actualidad más candente

Reading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li ChenluReading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li ChenluChristian Robert
 
Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)
Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)
Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)IJERA Editor
 
Discussion of Persi Diaconis' lecture at ISBA 2016
Discussion of Persi Diaconis' lecture at ISBA 2016Discussion of Persi Diaconis' lecture at ISBA 2016
Discussion of Persi Diaconis' lecture at ISBA 2016Christian Robert
 
Reformulation of Nash Equilibrium with an Application to Interchangeability
Reformulation of Nash Equilibrium with an Application to InterchangeabilityReformulation of Nash Equilibrium with an Application to Interchangeability
Reformulation of Nash Equilibrium with an Application to InterchangeabilityYosuke YASUDA
 
Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...
Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...
Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...Christian Robert
 
Statistics symposium talk, Harvard University
Statistics symposium talk, Harvard UniversityStatistics symposium talk, Harvard University
Statistics symposium talk, Harvard UniversityChristian Robert
 
A factorization theorem for generalized exponential polynomials with infinite...
A factorization theorem for generalized exponential polynomials with infinite...A factorization theorem for generalized exponential polynomials with infinite...
A factorization theorem for generalized exponential polynomials with infinite...Pim Piepers
 
Advanced Microeconomics - Lecture Slides
Advanced Microeconomics - Lecture SlidesAdvanced Microeconomics - Lecture Slides
Advanced Microeconomics - Lecture SlidesYosuke YASUDA
 
Optimization Approach to Nash Euilibria with Applications to Interchangeability
Optimization Approach to Nash Euilibria with Applications to InterchangeabilityOptimization Approach to Nash Euilibria with Applications to Interchangeability
Optimization Approach to Nash Euilibria with Applications to InterchangeabilityYosuke YASUDA
 
Statistics (1): estimation, Chapter 2: Empirical distribution and bootstrap
Statistics (1): estimation, Chapter 2: Empirical distribution and bootstrapStatistics (1): estimation, Chapter 2: Empirical distribution and bootstrap
Statistics (1): estimation, Chapter 2: Empirical distribution and bootstrapChristian Robert
 
Introduction to Decision Making Theory
Introduction to Decision Making TheoryIntroduction to Decision Making Theory
Introduction to Decision Making TheoryYosuke YASUDA
 
On the vexing dilemma of hypothesis testing and the predicted demise of the B...
On the vexing dilemma of hypothesis testing and the predicted demise of the B...On the vexing dilemma of hypothesis testing and the predicted demise of the B...
On the vexing dilemma of hypothesis testing and the predicted demise of the B...Christian Robert
 
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...Alexander Decker
 
testing as a mixture estimation problem
testing as a mixture estimation problemtesting as a mixture estimation problem
testing as a mixture estimation problemChristian Robert
 
Approximating Bayes Factors
Approximating Bayes FactorsApproximating Bayes Factors
Approximating Bayes FactorsChristian Robert
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Christian Robert
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?Christian Robert
 
ABC based on Wasserstein distances
ABC based on Wasserstein distancesABC based on Wasserstein distances
ABC based on Wasserstein distancesChristian Robert
 

La actualidad más candente (20)

Reading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li ChenluReading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li Chenlu
 
Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)
Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)
Fixed Point Results In Fuzzy Menger Space With Common Property (E.A.)
 
Reading Neyman's 1933
Reading Neyman's 1933 Reading Neyman's 1933
Reading Neyman's 1933
 
Discussion of Persi Diaconis' lecture at ISBA 2016
Discussion of Persi Diaconis' lecture at ISBA 2016Discussion of Persi Diaconis' lecture at ISBA 2016
Discussion of Persi Diaconis' lecture at ISBA 2016
 
Reformulation of Nash Equilibrium with an Application to Interchangeability
Reformulation of Nash Equilibrium with an Application to InterchangeabilityReformulation of Nash Equilibrium with an Application to Interchangeability
Reformulation of Nash Equilibrium with an Application to Interchangeability
 
Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...
Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...
Statistics (1): estimation Chapter 3: likelihood function and likelihood esti...
 
Statistics symposium talk, Harvard University
Statistics symposium talk, Harvard UniversityStatistics symposium talk, Harvard University
Statistics symposium talk, Harvard University
 
A factorization theorem for generalized exponential polynomials with infinite...
A factorization theorem for generalized exponential polynomials with infinite...A factorization theorem for generalized exponential polynomials with infinite...
A factorization theorem for generalized exponential polynomials with infinite...
 
Advanced Microeconomics - Lecture Slides
Advanced Microeconomics - Lecture SlidesAdvanced Microeconomics - Lecture Slides
Advanced Microeconomics - Lecture Slides
 
Optimization Approach to Nash Euilibria with Applications to Interchangeability
Optimization Approach to Nash Euilibria with Applications to InterchangeabilityOptimization Approach to Nash Euilibria with Applications to Interchangeability
Optimization Approach to Nash Euilibria with Applications to Interchangeability
 
Statistics (1): estimation, Chapter 2: Empirical distribution and bootstrap
Statistics (1): estimation, Chapter 2: Empirical distribution and bootstrapStatistics (1): estimation, Chapter 2: Empirical distribution and bootstrap
Statistics (1): estimation, Chapter 2: Empirical distribution and bootstrap
 
Introduction to Decision Making Theory
Introduction to Decision Making TheoryIntroduction to Decision Making Theory
Introduction to Decision Making Theory
 
On the vexing dilemma of hypothesis testing and the predicted demise of the B...
On the vexing dilemma of hypothesis testing and the predicted demise of the B...On the vexing dilemma of hypothesis testing and the predicted demise of the B...
On the vexing dilemma of hypothesis testing and the predicted demise of the B...
 
the ABC of ABC
the ABC of ABCthe ABC of ABC
the ABC of ABC
 
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
 
testing as a mixture estimation problem
testing as a mixture estimation problemtesting as a mixture estimation problem
testing as a mixture estimation problem
 
Approximating Bayes Factors
Approximating Bayes FactorsApproximating Bayes Factors
Approximating Bayes Factors
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
 
ABC based on Wasserstein distances
ABC based on Wasserstein distancesABC based on Wasserstein distances
ABC based on Wasserstein distances
 

Destacado

Reading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimatorsReading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimatorsChristian Robert
 
Asymptotics for discrete random measures
Asymptotics for discrete random measuresAsymptotics for discrete random measures
Asymptotics for discrete random measuresJulyan Arbel
 
Species sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian NonparametricsSpecies sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian NonparametricsJulyan Arbel
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsJulyan Arbel
 
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketingBayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketingJulyan Arbel
 
Presentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paperPresentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paperChristian Robert
 
Dependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian NonparametricsDependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian NonparametricsJulyan Arbel
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsJulyan Arbel
 
Gelfand and Smith (1990), read by
Gelfand and Smith (1990), read byGelfand and Smith (1990), read by
Gelfand and Smith (1990), read byChristian Robert
 
Testing point null hypothesis, a discussion by Amira Mziou
Testing point null hypothesis, a discussion by Amira MziouTesting point null hypothesis, a discussion by Amira Mziou
Testing point null hypothesis, a discussion by Amira MziouChristian Robert
 
Reading Efron's 1979 paper on bootstrap
Reading Efron's 1979 paper on bootstrapReading Efron's 1979 paper on bootstrap
Reading Efron's 1979 paper on bootstrapChristian Robert
 
Reading the Lasso 1996 paper by Robert Tibshirani
Reading the Lasso 1996 paper by Robert TibshiraniReading the Lasso 1996 paper by Robert Tibshirani
Reading the Lasso 1996 paper by Robert TibshiraniChristian Robert
 
Hypothesis testing and p values 06
Hypothesis testing and p values  06Hypothesis testing and p values  06
Hypothesis testing and p values 06DrZahid Khan
 

Destacado (17)

Reading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimatorsReading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimators
 
Asymptotics for discrete random measures
Asymptotics for discrete random measuresAsymptotics for discrete random measures
Asymptotics for discrete random measures
 
Species sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian NonparametricsSpecies sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian Nonparametrics
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian Nonparametrics
 
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketingBayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
 
Nested sampling
Nested samplingNested sampling
Nested sampling
 
Presentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paperPresentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paper
 
Dependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian NonparametricsDependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian Nonparametrics
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian Nonparametrics
 
Bayesian Classics
Bayesian ClassicsBayesian Classics
Bayesian Classics
 
Gelfand and Smith (1990), read by
Gelfand and Smith (1990), read byGelfand and Smith (1990), read by
Gelfand and Smith (1990), read by
 
Testing point null hypothesis, a discussion by Amira Mziou
Testing point null hypothesis, a discussion by Amira MziouTesting point null hypothesis, a discussion by Amira Mziou
Testing point null hypothesis, a discussion by Amira Mziou
 
Reading Efron's 1979 paper on bootstrap
Reading Efron's 1979 paper on bootstrapReading Efron's 1979 paper on bootstrap
Reading Efron's 1979 paper on bootstrap
 
Reading the Lasso 1996 paper by Robert Tibshirani
Reading the Lasso 1996 paper by Robert TibshiraniReading the Lasso 1996 paper by Robert Tibshirani
Reading the Lasso 1996 paper by Robert Tibshirani
 
slides Céline Beji
slides Céline Bejislides Céline Beji
slides Céline Beji
 
Hypothesis testing and p values 06
Hypothesis testing and p values  06Hypothesis testing and p values  06
Hypothesis testing and p values 06
 
P value
P valueP value
P value
 

Similar a Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013

Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?
Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?
Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?jemille6
 
Probability/Statistics Lecture Notes 4: Hypothesis Testing
Probability/Statistics Lecture Notes 4: Hypothesis TestingProbability/Statistics Lecture Notes 4: Hypothesis Testing
Probability/Statistics Lecture Notes 4: Hypothesis Testingjemille6
 
Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)
Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)
Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)jemille6
 
Likelihoodist vs. significance tester w bernoulli trials
Likelihoodist vs. significance tester w bernoulli trialsLikelihoodist vs. significance tester w bernoulli trials
Likelihoodist vs. significance tester w bernoulli trialsjemille6
 
Data mining assignment 2
Data mining assignment 2Data mining assignment 2
Data mining assignment 2BarryK88
 
chapter_8_20162.pdf
chapter_8_20162.pdfchapter_8_20162.pdf
chapter_8_20162.pdfSumitRijal1
 
Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...Advanced-Concepts-Team
 
Categorical data analysis full lecture note PPT.pptx
Categorical data analysis full lecture note  PPT.pptxCategorical data analysis full lecture note  PPT.pptx
Categorical data analysis full lecture note PPT.pptxMinilikDerseh1
 
An overview of Bayesian testing
An overview of Bayesian testingAn overview of Bayesian testing
An overview of Bayesian testingChristian Robert
 
Problem_Session_Notes
Problem_Session_NotesProblem_Session_Notes
Problem_Session_NotesLu Mao
 
Fuzzy relations
Fuzzy relationsFuzzy relations
Fuzzy relationsnaugariya
 
Fisher, neyman pearson and beyond
Fisher, neyman pearson and beyondFisher, neyman pearson and beyond
Fisher, neyman pearson and beyondMuhammad Asfar
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)IJERD Editor
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Christian Robert
 

Similar a Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013 (20)

Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?
Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?
Excursion 4 Tour II: Rejection Fallacies: Whose Exaggerating What?
 
Probability/Statistics Lecture Notes 4: Hypothesis Testing
Probability/Statistics Lecture Notes 4: Hypothesis TestingProbability/Statistics Lecture Notes 4: Hypothesis Testing
Probability/Statistics Lecture Notes 4: Hypothesis Testing
 
Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)
Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)
Mayo Slides: Part I Meeting #2 (Phil 6334/Econ 6614)
 
Likelihoodist vs. significance tester w bernoulli trials
Likelihoodist vs. significance tester w bernoulli trialsLikelihoodist vs. significance tester w bernoulli trials
Likelihoodist vs. significance tester w bernoulli trials
 
Data mining assignment 2
Data mining assignment 2Data mining assignment 2
Data mining assignment 2
 
chapter_8_20162.pdf
chapter_8_20162.pdfchapter_8_20162.pdf
chapter_8_20162.pdf
 
Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...
 
Categorical data analysis full lecture note PPT.pptx
Categorical data analysis full lecture note  PPT.pptxCategorical data analysis full lecture note  PPT.pptx
Categorical data analysis full lecture note PPT.pptx
 
An overview of Bayesian testing
An overview of Bayesian testingAn overview of Bayesian testing
An overview of Bayesian testing
 
2018 MUMS Fall Course - Essentials of Bayesian Hypothesis Testing - Jim Berge...
2018 MUMS Fall Course - Essentials of Bayesian Hypothesis Testing - Jim Berge...2018 MUMS Fall Course - Essentials of Bayesian Hypothesis Testing - Jim Berge...
2018 MUMS Fall Course - Essentials of Bayesian Hypothesis Testing - Jim Berge...
 
Problem_Session_Notes
Problem_Session_NotesProblem_Session_Notes
Problem_Session_Notes
 
U unit7 ssb
U unit7 ssbU unit7 ssb
U unit7 ssb
 
Fuzzy relations
Fuzzy relationsFuzzy relations
Fuzzy relations
 
eatonmuirheadsoaita
eatonmuirheadsoaitaeatonmuirheadsoaita
eatonmuirheadsoaita
 
Fisher, neyman pearson and beyond
Fisher, neyman pearson and beyondFisher, neyman pearson and beyond
Fisher, neyman pearson and beyond
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)
 
JSM 2011 round table
JSM 2011 round tableJSM 2011 round table
JSM 2011 round table
 
JSM 2011 round table
JSM 2011 round tableJSM 2011 round table
JSM 2011 round table
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?
 
fuzzy_measures.ppt
fuzzy_measures.pptfuzzy_measures.ppt
fuzzy_measures.ppt
 

Más de Christian Robert

Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceChristian Robert
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinChristian Robert
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?Christian Robert
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Christian Robert
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking componentsChristian Robert
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihoodChristian Robert
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)Christian Robert
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerChristian Robert
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussionChristian Robert
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceChristian Robert
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsChristian Robert
 
Poster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conferencePoster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conferenceChristian Robert
 

Más de Christian Robert (20)

Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de France
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael Martin
 
discussion of ICML23.pdf
discussion of ICML23.pdfdiscussion of ICML23.pdf
discussion of ICML23.pdf
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?
 
restore.pdf
restore.pdfrestore.pdf
restore.pdf
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking components
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihood
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like sampler
 
eugenics and statistics
eugenics and statisticseugenics and statistics
eugenics and statistics
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
asymptotics of ABC
asymptotics of ABCasymptotics of ABC
asymptotics of ABC
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussion
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergence
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
 
Poster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conferencePoster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conference
 

Último

Sulphonamides, mechanisms and their uses
Sulphonamides, mechanisms and their usesSulphonamides, mechanisms and their uses
Sulphonamides, mechanisms and their usesVijayaLaxmi84
 
ARTERIAL BLOOD GAS ANALYSIS........pptx
ARTERIAL BLOOD  GAS ANALYSIS........pptxARTERIAL BLOOD  GAS ANALYSIS........pptx
ARTERIAL BLOOD GAS ANALYSIS........pptxAneriPatwari
 
How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17Celine George
 
Narcotic and Non Narcotic Analgesic..pdf
Narcotic and Non Narcotic Analgesic..pdfNarcotic and Non Narcotic Analgesic..pdf
Narcotic and Non Narcotic Analgesic..pdfPrerana Jadhav
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvRicaMaeCastro1
 
Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1GloryAnnCastre1
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfJemuel Francisco
 
How to Make a Duplicate of Your Odoo 17 Database
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013

  • 1. Testing a Point Null Hypothesis: The Irreconcilability of P Values and Evidence. JAMES O. BERGER and THOMAS SELLKE. 25.02.2013
  • 2–6. Content
    1 INTRODUCTION
    2 POSTERIOR PROBABILITIES AND ODDS
    3 LOWER BOUNDS ON POSTERIOR PROBABILITIES
        Introduction
        Lower Bounds for GA = {All Distributions}
        Lower Bounds for GS = {Symmetric Distributions}
        Lower Bounds for GUS = {Unimodal, Symmetric Distributions}
        Lower Bounds for GNOR = {Normal Distributions}
    4 MORE GENERAL HYPOTHESES AND CONDITIONAL CALCULATIONS
        General Formulation
        More General Hypotheses
    5 CONCLUSIONS
  • 7. 1. Introduction
    The paper studies the problem of testing a point null hypothesis; of interest is the relationship between the P value and conditional and Bayesian measures of evidence against the null hypothesis.
    * The overall conclusion is that P values can be highly misleading measures of the evidence provided by the data against the null hypothesis.
  • 8. 1. Introduction
    Consider the simple situation of observing a random quantity X having density f(x | θ), θ ∈ Θ ⊆ R. It is desired to test the null hypothesis H0: θ = θ0 versus the alternative hypothesis H1: θ ≠ θ0. For a test statistic T(X) and observed data x, the P value is
        p = Pr_{θ=θ0}( T(X) ≥ T(x) ).
  • 9. 1. Introduction
    Example 1. Suppose that X = (X1, ..., Xn), where the Xi are iid N(θ, σ²) with σ² known. Then the usual test statistic is
        T(X) = √n |X̄ − θ0| / σ,
    where X̄ is the sample mean, and
        p = 2(1 − Φ(t)),
    where Φ is the standard normal cdf and
        t = T(x) = √n |x̄ − θ0| / σ.
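    As a quick numerical illustration of this setup, here is a minimal sketch (our own code; the sample figures are invented for illustration):

```python
from math import erf, sqrt

def phi_cdf(z):
    """Standard normal cdf Phi(z)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def two_sided_p(xbar, theta0, sigma, n):
    """Two-sided P value in the normal example:
    t = sqrt(n) |xbar - theta0| / sigma and p = 2(1 - Phi(t))."""
    t = sqrt(n) * abs(xbar - theta0) / sigma
    return t, 2.0 * (1.0 - phi_cdf(t))

# Hypothetical numbers: n = 25 observations with mean 0.392, sigma = 1, theta0 = 0.
t, p = two_sided_p(xbar=0.392, theta0=0.0, sigma=1.0, n=25)
print(f"t = {t:.3f}, p = {p:.3f}")   # t = 1.960, p = 0.050
```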
  • 10. 1. Introduction
    We presume that the classical approach is the report of p, rather than the report of a Neyman–Pearson error probability. This is because most statisticians prefer the use of P values, feeling it important to indicate how strong the evidence against H0 is. The alternative measures of evidence we consider are based on knowledge of x.
    There are several well-known criticisms of testing a point null hypothesis. One is the issue of 'statistical' versus 'practical' significance: one can get a very small p even when | θ − θ0 | is so small as to make θ equivalent to θ0 for practical purposes. Another well-known criticism is the Jeffreys–Lindley paradox.
  • 11. 1. Introduction
    Example 1 (continued). Consider a Bayesian who chooses the prior distribution on θ which gives probability 0.5 to each of H0 and H1 and spreads the mass on H1 out according to an N(θ0, σ²) density. It will be seen in Section 2 that the posterior probability of H0, Pr(H0 | x), is given by
        Pr(H0 | x) = [1 + (1 + n)^(−1/2) exp{ t² / (2(1 + 1/n)) }]^(−1).
  • 12. 1. Introduction
    Table 1: Pr(H0 | x) for the Jeffreys-Type Prior
                                            n
      p       t        1     5     10    20    50    100   1,000
     .10    1.645     .42   .44   .47   .56   .65   .72    .89
     .05    1.960     .35   .33   .37   .42   .52   .60    .82
     .01    2.576     .21   .13   .14   .16   .22   .27    .53
     .001   3.291    .086  .026  .024  .026  .034  .045   .124
    The conflict between p and Pr(H0 | x) is apparent.
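    The entries in Table 1 follow directly from the formula on slide 11; a minimal sketch that recomputes them (function name is ours):

```python
from math import exp

def pr_H0_jeffreys(t, n):
    """Posterior probability of H0 under the Jeffreys-type prior:
    pi0 = 0.5 and an N(theta0, sigma^2) prior on theta under H1 (slide 11)."""
    return 1.0 / (1.0 + (1.0 + n) ** (-0.5) * exp(t * t / (2.0 * (1.0 + 1.0 / n))))

for t in (1.645, 1.960, 2.576, 3.291):
    row = [round(pr_H0_jeffreys(t, n), 2) for n in (1, 5, 10, 20, 50, 100, 1000)]
    print(t, row)
# e.g. t = 1.960 gives [0.35, 0.33, 0.37, 0.42, 0.52, 0.6, 0.82],
# matching the p = .05 row of Table 1.
```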
  • 13. 1. Introduction
    Example 1 (continued). Again consider a Bayesian who gives each hypothesis prior probability 0.5, but now suppose that he decides to spread out the mass on H1 in the symmetric fashion that is as favorable to H1 as possible. The corresponding values of Pr(H0 | x) are determined in Section 3 and are given in Table 2 for certain values of t.
    Table 2: Pr(H0 | x) for a Prior Biased Toward H1
      P value (p)    t       Pr(H0 | x)
        .10        1.645       .340
        .05        1.960       .227
        .01        2.576       .068
        .001       3.291       .0088
  • 14. 1. Introduction
    Example (A Likelihood Analysis). It is common to perceive the comparative evidence provided by x for two possible parameter values, θ1 and θ2, as being measured by the likelihood ratio
        l_x(θ1 : θ2) = f(x | θ1) / f(x | θ2).
    A lower bound on the comparative evidence would be
        l_x = inf_θ l_x(θ0 : θ) = f(x | θ0) / sup_θ f(x | θ) = exp{ −t²/2 }.
  • 15. 1. Introduction
    Values of l_x for various t are given in Table 3.
    Table 3: Bounds on the Comparative Likelihood
      P value (p)    t       Likelihood ratio lower bound (l_x)
        .10        1.645       .258
        .05        1.960       .146
        .01        2.576       .036
        .001       3.291       .0044
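    The bound depends only on t; a one-line check (our own snippet) that the tabulated values are exp(−t²/2):

```python
from math import exp

# Likelihood-ratio lower bound from slide 14: l_x = exp(-t^2 / 2)
for t in (1.645, 1.960, 2.576, 3.291):
    print(t, round(exp(-t * t / 2.0), 4))
# 1.645 -> 0.2585, 1.960 -> 0.1465, 2.576 -> 0.0362, 3.291 -> 0.0044
```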
  • 16. 2. Posterior probabilities and odds
    Let 0 < π0 < 1 denote the prior probability of H0, let π1 = 1 − π0 denote the prior probability of H1, and suppose that the mass on H1 is spread out according to the density g(θ).
    A more realistic hypothesis would be H0: | θ − θ0 | ≤ b, with prior probability π0 assigned to {θ : | θ − θ0 | ≤ b}. (To a Bayesian, a point null test is typically reasonable only when the prior distribution is of this form.)
  • 17. 2. Posterior probabilities and odds
    Noting that the marginal density of X is
        m(x) = π0 f(x | θ0) + (1 − π0) m_g(x),   where   m_g(x) = ∫ f(x | θ) g(θ) dθ,
    the posterior probability of H0 is given by
        Pr(H0 | x) = π0 f(x | θ0) / m(x)
                   = [1 + ((1 − π0)/π0) · (m_g(x) / f(x | θ0))]^(−1).
    Also of interest is the posterior odds ratio of H0 to H1, which is
        Pr(H0 | x) / (1 − Pr(H0 | x)) = (π0 / (1 − π0)) · (f(x | θ0) / m_g(x)).
  • 18. 2. Posterior probabilities and odds
    The posterior odds ratio of H0 to H1 is
        Pr(H0 | x) / (1 − Pr(H0 | x)) = (π0 / (1 − π0)) · (f(x | θ0) / m_g(x)),
    i.e., posterior odds = prior odds × Bayes factor, with Bayes factor B_g(x) = f(x | θ0) / m_g(x).
    Interest in the Bayes factor centers around the fact that it does not involve the prior probabilities of the hypotheses, and hence it is sometimes interpreted as the actual odds of the hypotheses implied by the data alone.
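    The relation between a Bayes factor and Pr(H0 | x) is purely arithmetic; a small helper (our own naming) makes the later tables easy to reproduce:

```python
def pr_H0_from_bayes_factor(B, pi0=0.5):
    """Posterior probability of H0 given Bayes factor B = f(x|theta0)/m_g(x)
    and prior probability pi0 of H0 (slide 17):
    Pr(H0 | x) = [1 + ((1 - pi0)/pi0) * (1/B)]^(-1)."""
    return 1.0 / (1.0 + ((1.0 - pi0) / pi0) / B)

print(pr_H0_from_bayes_factor(B=0.146))   # about 0.127, essentially the Table 4 entry for t = 1.96
```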
  • 19. 2. Posterior probabilities and odds
    Example (Jeffreys–Lindley paradox). Suppose that π0 is arbitrary and g is again N(θ0, σ²). Since a sufficient statistic for θ is X̄ ~ N(θ, σ²/n), we have that m_g(x̄) is an N(θ0, σ²(1 + n^(−1))) density. Thus
        B_g(x) = f(x̄ | θ0) / m_g(x̄)
               = [2πσ²/n]^(−1/2) exp{ −n(x̄ − θ0)² / (2σ²) } / ( [2πσ²(1 + n^(−1))]^(−1/2) exp{ −(x̄ − θ0)² / [2σ²(1 + n^(−1))] } )
               = (1 + n)^(1/2) exp{ −t² / (2(1 + n^(−1))) },
    and
        Pr(H0 | x) = [1 + (1 − π0)/(π0 B_g)]^(−1)
                   = [1 + ((1 − π0)/π0) (1 + n)^(−1/2) exp{ t² / (2(1 + n^(−1))) }]^(−1).
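    The paradox is that, for fixed t (hence a fixed P value), Pr(H0 | x) → 1 as n → ∞. A minimal sketch (our own code) with π0 = 0.5 and t = 1.96, i.e. p = .05 throughout:

```python
from math import exp, sqrt

def pr_H0_lindley(t, n, pi0=0.5):
    """Pr(H0 | x) for the N(theta0, sigma^2) prior on H1 (formula from slide 19)."""
    B = sqrt(1.0 + n) * exp(-t * t / (2.0 * (1.0 + 1.0 / n)))
    return 1.0 / (1.0 + (1.0 - pi0) / (pi0 * B))

for n in (10, 100, 10_000, 1_000_000):
    print(n, round(pr_H0_lindley(1.96, n), 3))
# Pr(H0 | x) climbs toward 1 (roughly .37, .60, .94, .99) even though p = .05 at every n.
```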
  • 20. 3. Lower bounds on posterior probabilities
    3.1 Introduction
    This section examines some lower bounds on Pr(H0 | x) when g(θ), the distribution of θ given that H1 is true, is allowed to vary within some class of distributions G:
        GA = {all distributions}
        GS = {all distributions symmetric about θ0}
        GUS = {all unimodal distributions symmetric about θ0}
        GNOR = {all N(θ0, τ²) distributions, 0 ≤ τ² < ∞}
    Even though these G's are supposed to consist only of distributions on {θ : θ ≠ θ0}, it will be convenient to allow them to include distributions with mass at θ0, so the lower bounds we compute are always attained.
  • 21. 3. Lower bounds on posterior probabilities
    3.1 Introduction
    Letting
        Pr(H0 | x, G) = inf_{g ∈ G} Pr(H0 | x)   and   B(x, G) = inf_{g ∈ G} B_g(x),
    we see immediately from the formulas in Section 2 that
        B(x, G) = f(x | θ0) / sup_{g ∈ G} m_g(x)
    and
        Pr(H0 | x, G) = [1 + ((1 − π0)/π0) · (1 / B(x, G))]^(−1).
  • 22. 3. Lower bounds on posterior probabilities
    3.2 Lower bounds for GA = {All distributions}
    Theorem. Suppose that a maximum likelihood estimate of θ, θ̂(x), exists for the observed x. Then
        B(x, GA) = f(x | θ0) / f(x | θ̂(x))
    and
        Pr(H0 | x, GA) = [1 + ((1 − π0)/π0) · (f(x | θ̂(x)) / f(x | θ0))]^(−1).
  • 23. 3. Lower bounds on posterior probabilities
    3.2 Lower bounds for GA = {All distributions}
    Example 1 (continued). In this situation we have
        B(x, GA) = e^(−t²/2)
    and
        Pr(H0 | x, GA) = [1 + ((1 − π0)/π0) e^(t²/2)]^(−1).
    For several choices of t, Table 4 gives the corresponding two-sided P values, p, and the values of Pr(H0 | x, GA), with π0 = 0.5.
  • 24. 3. Lower bounds on posterior probabilities
    3.2 Lower bounds for GA = {All distributions}
    Table 4: Comparison of P values and Pr(H0 | x, GA) when π0 = 0.5
      P value (p)    t      Pr(H0 | x, GA)    Pr(H0 | x, GA)/(p·t)
        .10        1.645        .205               1.25
        .05        1.960        .128               1.30
        .01        2.576        .035               1.36
        .001       3.291        .0044              1.35
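    A short sketch (our own code) recomputing Table 4, including the ratio that the next theorem bounds below by √(π/2) ≈ 1.2533:

```python
from math import exp, sqrt, pi, erf

def phi_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def pr_H0_GA(t, pi0=0.5):
    """Lower bound over GA = {all distributions}: B = exp(-t^2/2) (slide 23)."""
    B = exp(-t * t / 2.0)
    return 1.0 / (1.0 + (1.0 - pi0) / pi0 / B)

for t in (1.645, 1.960, 2.576, 3.291):
    p = 2.0 * (1.0 - phi_cdf(t))          # two-sided P value
    pr = pr_H0_GA(t)
    print(f"p={p:.3f}  t={t}  Pr={pr:.4f}  Pr/(p*t)={pr / (p * t):.2f}")
print("sqrt(pi/2) =", round(sqrt(pi / 2.0), 4))   # limiting value of the ratio
```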
  • 25. 3. Lower bounds on posterior probabilities
    3.2 Lower bounds for GA = {All distributions}
    Theorem. For t > 1.68 and π0 = 0.5 in Example 1,
        Pr(H0 | x, GA) / (p·t) > √(π/2) ≈ 1.253.
    Furthermore,
        lim_{t→∞} Pr(H0 | x, GA) / (p·t) = √(π/2).
  • 26. 3. Lower bounds on posterior probabilities
    3.3 Lower bounds for GS = {Symmetric distributions}
    There is a large gap between Pr(H0 | x, GA) and Pr(H0 | x) for the Jeffreys-type single-prior analysis. This reinforces the suspicion that using GA unduly biases the conclusion against H0, and suggests the use of more reasonable classes of priors.
    Theorem. With G2PS denoting the symmetric two-point distributions (mass 1/2 at each of θ0 − r and θ0 + r, r ≥ 0),
        sup_{g ∈ G2PS} m_g(x) = sup_{g ∈ GS} m_g(x),
    so B(x, G2PS) = B(x, GS) and Pr(H0 | x, G2PS) = Pr(H0 | x, GS).
  • 27. 3. Lower bounds on posterior probabilities
    3.3 Lower bounds for GS = {Symmetric distributions}
    Example 1 (continued). If t ≤ 1, a calculus argument shows that the symmetric two-point distribution that maximizes m_g(x) is the degenerate "two-point" distribution putting all mass at θ0. Thus B(x, GS) = 1 and Pr(H0 | x, GS) = π0 for t ≤ 1.
    If t > 1, then m_g(x) is maximized by a nondegenerate element of G2PS. For moderately large t, the maximum value of m_g(x) over g ∈ G2PS is very well approximated by taking g to be the two-point distribution putting equal mass at θ̂(x) and at 2θ0 − θ̂(x), so
        B(x, GS) ≈ φ(t) / [0.5 φ(0) + 0.5 φ(2t)] ≈ 2 exp{ −0.5 t² }.
  • 28. 3. Lower bounds on posterior probabilities
    3.3 Lower bounds for GS = {Symmetric distributions}
    For t ≥ 1.645, the first approximation is accurate to within 1 in the fourth significant digit, and the second approximation to within 2 in the third significant digit. Table 5 gives the values of Pr(H0 | x, GS) for several choices of t.
    Table 5: Comparison of P values and Pr(H0 | x, GS) when π0 = 0.5
      P value (p)    t      Pr(H0 | x, GS)    Pr(H0 | x, GS)/(p·t)
        .10        1.645        .340               2.07
        .05        1.960        .227               2.31
        .01        2.576        .068               2.62
        .001       3.291        .0088              2.68
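    A sketch (our own code) of the GS bound via the two-point approximation from slide 27:

```python
from math import exp, sqrt, pi

def phi_pdf(z):
    """Standard normal density."""
    return exp(-z * z / 2.0) / sqrt(2.0 * pi)

def pr_H0_GS(t, pi0=0.5):
    """Approximate lower bound over symmetric priors (slide 27):
    B(x, GS) ~ phi(t) / [0.5 phi(0) + 0.5 phi(2t)] for t > 1."""
    B = phi_pdf(t) / (0.5 * phi_pdf(0.0) + 0.5 * phi_pdf(2.0 * t))
    return 1.0 / (1.0 + (1.0 - pi0) / pi0 / B)

for t in (1.645, 1.960, 2.576, 3.291):
    print(t, round(pr_H0_GS(t), 4))   # approximately .340, .227, .068, .0088 (Table 5)
```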
  • 29. 3. Lower bounds on posterior probabilities
    3.4 Lower bounds for GUS = {Unimodal, Symmetric distributions}
    Minimizing Pr(H0 | x) over all symmetric priors still involves considerable bias against H0. A further 'objective' restriction, which would seem reasonable to many, is to require the prior to be unimodal, i.e., non-increasing in | θ − θ0 |.
    Theorem. With US = {all uniform distributions symmetric about θ0},
        sup_{g ∈ GUS} m_g(x) = sup_{g ∈ US} m_g(x),
    so B(x, GUS) = B(x, US) and Pr(H0 | x, GUS) = Pr(H0 | x, US).
  • 30. 3. Lower bounds on posterior probabilities
    3.4 Lower bounds for GUS = {Unimodal, Symmetric distributions}
    Theorem. If t ≤ 1 in Example 1, then B(x, GUS) = 1 and Pr(H0 | x, GUS) = π0. If t > 1, then
        B(x, GUS) = 2φ(t) / [φ(K + t) + φ(K − t)]
    and
        Pr(H0 | x, GUS) = [1 + ((1 − π0)/π0) · (φ(K + t) + φ(K − t)) / (2φ(t))]^(−1),
    where K > 0 is the half-width (in units of σ/√n) of the maximizing symmetric uniform prior. Figures 1 and 2 give values of K and B for various values of t in this problem.
  • 31. 3. Lower bounds on posterior probabilities
    3.4 Lower bounds for GUS = {Unimodal, Symmetric distributions}
    [Figures 1 and 2: values of K and of the lower bound B(x, GUS) as functions of t.]
  • 32. 3. Lower bounds on posterior probabilities
    3.4 Lower bounds for GUS = {Unimodal, Symmetric distributions}
    Table 6 gives Pr(H0 | x, GUS) for some specific important values of t and π0 = 0.5.
    Table 6: Comparison of P values and Pr(H0 | x, GUS) when π0 = 0.5
      P value (p)    t      Pr(H0 | x, GUS)    Pr(H0 | x, GUS)/(p·t)
        .10        1.645        .390               1.44
        .05        1.960        .290               1.51
        .01        2.576        .109               1.64
        .001       3.291        .018               1.66
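    Using the reduction of slide 29, the GUS bound can be computed by maximizing m_g over symmetric uniform priors numerically; a sketch under that reduction (the grid search over the standardized half-width K is our own choice):

```python
from math import exp, sqrt, pi, erf

def phi_pdf(z):
    return exp(-z * z / 2.0) / sqrt(2.0 * pi)

def phi_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def pr_H0_GUS(t, pi0=0.5):
    """Lower bound over unimodal symmetric priors, via symmetric uniforms (slide 29).
    In standardized units, a uniform prior of half-width K gives m_g proportional to
    [Phi(t + K) - Phi(t - K)] / (2K); maximize over K by grid search."""
    if t <= 1.0:
        return pi0
    best = max((phi_cdf(t + K) - phi_cdf(t - K)) / (2.0 * K)
               for K in (0.001 * i for i in range(1, 20000)))
    B = phi_pdf(t) / best
    return 1.0 / (1.0 + (1.0 - pi0) / pi0 / B)

for t in (1.645, 1.960, 2.576, 3.291):
    print(t, round(pr_H0_GUS(t), 3))   # approximately .39, .29, .11, .018 (compare Table 6)
```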
  • 33. 3. Lower bounds on posterior probabilities
    3.5 Lower bounds for GNOR = {Normal distributions}
    We have seen that minimizing Pr(H0 | x) over g ∈ GUS is the same as minimizing over g ∈ US. Although using US is much more reasonable than using GA, there is still some residual bias against H0 involved in using US.
    Theorem. If t ≤ 1 in Example 1, then B(x, GNOR) = 1 and Pr(H0 | x, GNOR) = π0. If t > 1, then
        B(x, GNOR) = √e · t · e^(−t²/2)
    and
        Pr(H0 | x, GNOR) = [1 + ((1 − π0)/π0) · e^(t²/2) / (√e · t)]^(−1).
  • 34. 3. Lower bounds on posterior probabilities
    3.5 Lower bounds for GNOR = {Normal distributions}
    Table 7 gives Pr(H0 | x, GNOR) for several values of t.
    Table 7: Comparison of P values and Pr(H0 | x, GNOR) when π0 = 0.5
      P value (p)    t      Pr(H0 | x, GNOR)    Pr(H0 | x, GNOR)/(p·t)
        .10        1.645        .412               1.52
        .05        1.960        .321               1.67
        .01        2.576        .133               2.01
        .001       3.291        .0235              2.18
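    The GNOR bound of slide 33 is a one-liner; a quick reproduction of Table 7 (our own code):

```python
from math import exp, sqrt, e

def pr_H0_GNOR(t, pi0=0.5):
    """Lower bound over normal priors N(theta0, tau^2) (slide 33):
    B(x, GNOR) = sqrt(e) * t * exp(-t^2/2) for t > 1."""
    B = 1.0 if t <= 1.0 else sqrt(e) * t * exp(-t * t / 2.0)
    return 1.0 / (1.0 + (1.0 - pi0) / pi0 / B)

for t in (1.645, 1.960, 2.576, 3.291):
    print(t, round(pr_H0_GNOR(t), 3))   # approximately .412, .321, .133, .024 (Table 7)
```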
  • 35. 4. More general hypotheses and conditional calculations
    4.1 General formulation
    Consider the Bayesian calculation of Pr(H0 | A), where H0 is of the form H0: θ ∈ Θ0 and A is the set in which x is known to reside. Letting π0 and π1 denote the prior probabilities of H0 and H1, and g0 and g1 the densities on Θ0 and Θ1, we have
        Pr(H0 | A) = [1 + ((1 − π0)/π0) · (m_g1(A) / m_g0(A))]^(−1),
    where
        m_gi(A) = ∫_Θi Pr_θ(A) g_i(θ) dθ,   i = 0, 1.
  • 36. 4. More general hypotheses and conditional calculations
    4.1 General formulation
    For the general formulation, one can determine lower bounds on Pr(H0 | A) by choosing classes G0 and G1 of priors g0 and g1, respectively, calculating
        B(A, G0, G1) = inf_{g0 ∈ G0} m_g0(A) / sup_{g1 ∈ G1} m_g1(A),
    and defining
        Pr(H0 | A, G0, G1) = [1 + ((1 − π0)/π0) · (1 / B(A, G0, G1))]^(−1).
  • 37. 4. More general hypotheses and conditional calculations
    4.2 More general hypotheses
    Assume in this section that A = {x}. The lower bounds can be applied to a variety of generalizations of point null hypotheses. If Θ0 is a small set about θ0, the general lower bounds turn out to be essentially equivalent to the point null lower bounds.
    In Example 1, suppose that the hypotheses were H0: θ ∈ (θ0 − b, θ0 + b) and H1: θ ∉ (θ0 − b, θ0 + b). If | t − √n·b/σ | ≥ 1 and G0 = G1 = GS, then B(x, G0, G1) and Pr(H0 | x, G0, G1) are exactly the same as B and Pr for testing the point null.
  • 38. 5. Conclusion
    A figure compares B(x, GUS) with the P value calculated at (t − 1)+ instead of at t (this last will be called the P value of (t − 1)+); it shows that this comparative likelihood is close to the P value that would be obtained if we replaced t by (t − 1)+.
    The implication is that the usual rule of thumb,
        t = 1 means only mild evidence against H0, t = 2 means significant evidence, t = 3 means highly significant evidence, and t = 4 means overwhelming evidence against H0,
    should at least be replaced by the rule of thumb:
        t = 1 means no evidence against H0, t = 2 means only mild evidence, t = 3 means significant evidence, and t = 4 means highly significant evidence against H0.
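    A quick check of this closing comparison (our own code), recovering B(x, GUS) from the Table 6 entries via B = Pr/(1 − Pr) when π0 = 0.5:

```python
from math import erf, sqrt

def phi_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# B(x, GUS) from Table 6, compared with the P value computed at (t - 1)+ instead of t.
table6 = {1.645: 0.390, 1.960: 0.290, 2.576: 0.109, 3.291: 0.018}
for t, pr in table6.items():
    B = pr / (1.0 - pr)
    p_shifted = 2.0 * (1.0 - phi_cdf(max(t - 1.0, 0.0)))
    print(f"t={t}: B(x, GUS)={B:.3f}   P value of (t-1)+ = {p_shifted:.3f}")
# The two columns track each other (e.g. 0.122 vs 0.115 at t = 2.576), which is the
# sense in which t should be read as roughly one unit weaker evidence than p suggests.
```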
  • 39. 5. Conclusion
    t = 1 means no evidence against H0, t = 2 means only mild evidence against H0, t = 3 means significant evidence against H0, and t = 4 means highly significant evidence against H0.
  • 40. References
    Edwards, W., Lindman, H., and Savage, L. J. (1963). Bayesian statistical inference for psychological research. Psychological Review 70, 193-242.
    Ghosh, J., Purkayastha, S., and Samanta, T. Role of P-values and other Measures of Evidence in Bayesian Analysis. Handbook of Statistics, Vol. 25.
    Berger, J. O. Statistical Decision Theory and Bayesian Analysis. Springer Series in Statistics.