>>>>> "cspark" == cspark  <[EMAIL PROTECTED]>
>>>>>     on Wed, 22 Mar 2006 05:52:13 +0100 (CET) writes:

    cspark> Full_Name: Chanseok Park
    cspark> Version: R 2.2.1
    cspark> OS: RedHat EL4
    cspark> Submission from: (NULL) (130.127.112.89)

    cspark> pbinom(any negative value, size, prob) should be
    cspark> zero.  But I got the following results.  I mean, if
    cspark> a negative value is close to zero, then pbinom()
    cspark> calculates pbinom(0, size, prob).

    > pbinom( -2.220446e-22, 3, .1)
    [1] 0.729
    > pbinom( -2.220446e-8, 3, .1)
    [1] 0.729
    > pbinom( -2.220446e-7, 3, .1)
    [1] 0

Yes, all the [dp]* functions for distributions that are discrete with
mass only on the integers do *round* their 'x' to integers.

I could well argue that the current behavior is *not* a bug,
since we do treat "x close to an integer" as that integer, and hence
   pbinom(eps, size, prob)  with  eps "very close to 0" should give
   pbinom(0,   size, prob)
as it now does.
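
To make that concrete with the numbers from the report (pbinom(0, 3, .1)
= P(X <= 0) = 0.9^3 = 0.729):

    > pbinom(0,            3, .1)   # x = 0 exactly
    [1] 0.729
    > pbinom(-2.220446e-8, 3, .1)   # x "rounded" to 0, hence the same value
    [1] 0.729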

However, for aesthetic reasons, I agree that we should test for
"< 0" first (and return 0 in that case) and only round otherwise.
I'll change this for R-devel (i.e., R 2.3.0 in about a month).
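
In R code, the intended logic is roughly the following sketch (the real
change is in the C sources underlying pbinom(); pbinom_patched() and the
scalar-only handling are just for illustration):

    ## sketch only, not the actual C-level fix; handles a single x
    pbinom_patched <- function(x, size, prob) {
        if (x < 0)
            return(0)                    # test "< 0" first and give 0 then
        pbinom(round(x), size, prob)     # only round otherwise
    }

    pbinom_patched(-2.220446e-8, 3, .1)  # 0, as it should be
    pbinom_patched( 2.220446e-8, 3, .1)  # 0.729 (x rounded to 0)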

    cspark> dbinom() also behaves similarly.

Yes, similarly, but differently.
I have changed it (for R-devel) as well, so that it behaves the same
way as the other d*() functions, e.g., dpois() and dnbinom(), do.
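
For illustration (not part of the patch itself), this is the kind of
behaviour those d*() functions show for negative 'x', and which
dbinom() will now share:

    > dpois(-1, 1)          # negative x: zero probability mass
    [1] 0
    > dnbinom(-1, 3, .1)    # likewise for dnbinom()
    [1] 0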


Martin Maechler, ETH Zurich

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
