[EMAIL PROTECTED] (Ton Hospel) writes:

> In article <[EMAIL PROTECTED]>,
>       Eugene van der Pijll <[EMAIL PROTECTED]> writes:
> > On 03 July 2002, Steffen Mueller wrote:
> > <snip>
> >> This really isn't a new rule, I think. The rules never stated that your
> >> entry isn't supposed to work on a very large number of nodes. In fact, we
> >> deliberately did not limit the number of nodes. Hence, we cannot allow
> >> hardcoded limits.
> > 
> > What do you mean by hardcoded limits? Code like 'if($x++>100){exit}'?
> > Would 'if($x++>$^T){exit}' be OK? Would using the stash size as a limit
> > be OK, as someone did in the Cantor/Kolakoski match?
> > 
> 
> The way I normally interpret it is roughly like this:
> (This is only my opinion; I'm not a referee here. For all I know they
> completely disagree.)
> 
> Suppose you use some algorithm that duplicates the input in 10 arrays,
> and one array element costs something like 40 bytes. On a machine with
> 2**32 = 4*10**9 bytes of memory, that means you can process roughly
> 10**7 elements. The rules say you have "ample" play memory, so it's OK
> to hardcode a limit of 10**7. What's more, if this 10-array method is
> "reasonable", other people may also hardcode 10**7, even if their
> programs could in principle handle more, because using 10**7 lets them
> shave a stroke.
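
A minimal sketch of that back-of-envelope arithmetic (the 10 copies, the ~40 bytes per array element, and the 2**32 bytes of memory are the figures assumed in the quoted argument, not measured values):

```python
# Rough check of the memory argument: an algorithm that keeps
# 10 copies of the input, at about 40 bytes per array element,
# on a machine with 2**32 bytes of "ample" play memory.
bytes_total = 2**32          # ~4*10**9 bytes of memory
copies = 10                  # input duplicated into 10 arrays
bytes_per_element = 40       # assumed cost of one array element
elements = bytes_total // (copies * bytes_per_element)
print(elements)              # -> 10737418, i.e. roughly 10**7
```

So a hardcoded limit of 10**7 (or, one stroke shorter, 1e7) is about what this memory budget supports.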

If they have any sense, they'd hardcode it as 1e7...

-- 
Piers

   "It is a truth universally acknowledged that a language in
    possession of a rich syntax must be in need of a rewrite."
         -- Jane Austen?
