https://issues.apache.org/ooo/show_bug.cgi?id=121421

--- Comment #13 from orcmid <[email protected]> ---
(In reply to comment #12)

> Ignoring MAX_RAND is a very bad idea.
That would only be the case if the % failed in the initialization, and there is
no indication that it has.  A serious implementation would have to treat the
seeding far more carefully.  The %-operation I used to simplify things does put
some bias into the seed values, but that didn't seem important for a
proof-of-concept.  (There is bias in the restriction of seeds to under 30000
already.  I simply don't believe that comment in the Fortran code.)

> I also found my implementation is incomplete: to avoid precision issues the
> complete implementation does some adjustments that involve subtracting.
> That may be the cause of Microsoft's bug.
This is apparently not a problem on systems where sizeof(int) > 3.  If it
were, we would see negative IX, IY, and IZ values at some point.  It might be
safer to declare those as longs, though.  It will be interesting to see
whether that changes anything.
> 
> The algorithm is rather old and thought for 16 bit architectures. There has
> been a revision in 2007 (which obviously didn't make it into Office 2003). I
> will update this soon.
I noticed that too.  The maximum periods of the individual IX, IY, and IZ
series are terribly small.  Do you have a reference to any analysis of the 2007
version and the rationale for the choice of its parameters?  (In this case,
watching out for multiplication range errors and %-bugs will also be more
important.)
