On Saturday, 9 February 2013 at 01:07:13 UTC, H. S. Teoh wrote:
On Sat, Feb 09, 2013 at 01:37:34AM +0100, John Colvin wrote:
On Friday, 8 February 2013 at 21:07:58 UTC, Era Scarecrow wrote:
>On Friday, 8 February 2013 at 12:27:50 UTC, John Colvin wrote:
>>On Friday, 8 February 2013 at 06:59:20 UTC, Marco Leise wrote:
>>>
>>>So right now we can handle 20! = 2,432,902,008,176,640,000
>>>permutations. If every check took only 20 clock cycles of a 4 GHz
>>>CPU, it would take you ~386 years to go through the list.
>>>For the average human researcher this is plenty of time.
>>
>>On a modern supercomputer this would take well under 2 months.
>>(I calculated it as ~44 days on Minerva at Warwick, UK). 19!
>>would take less than 3 days.
>>
>>In a parallel setting, such large numbers are assailable.
>
>If we have such a large number of computations, then either cent
>will have to be implemented, use BigInt, or an internal array that
>handles locational information. That would remove the limitations
>of 20 to either 255, or 65535 (assuming you EVER need that many).
>Course rummaging through the array for the next computation
>becomes more difficult the larger the number of elements.

Seeing as 61! is of the order of the number of atoms in the
observable universe, I don't think there's much need to plan for
any higher than that!
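
(Sanity-checking the figures above: 2,432,902,008,176,640,000 checks
x 20 cycles each / 4x10^9 cycles per second is about 1.2x10^10
seconds, i.e. roughly 386 years on one core, so the single-core
estimate holds up; 19! is a factor of 20 smaller, consistent with
the cluster figures.)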

That doesn't prevent mathematicians from grappling with numbers like Graham's number (see the Wikipedia entry), the magnitude of which exploded my perception of infinity several times over, and it's still *finite*.
;-)

Iterating over such inconceivably huge numbers is, of course, a fool's
errand.

But being able to *index* a large set of permutations is actually
valuable. In this sense, bearophile's factorial method, suitably
extended to some BigInt index, is superior, not because you want to iterate over the entire gigantic list of possibilities, but because using BigInt allows you to index a particular entry in that list without
having to go through them all.
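
As a rough sketch of the idea in D (illustrative only, not
bearophile's actual code; nthPermutation is a made-up name): decode
the index as factoradic digits, each digit picking one remaining
element.

import std.algorithm.mutation : remove;
import std.bigint : BigInt;

// Returns the permutation of `items` at lexicographic index `k`
// (0-based), decoded via the factorial number system (Lehmer code).
// Cost is O(n^2) array operations no matter how large k is -- it
// never iterates over the k permutations that precede it.
T[] nthPermutation(T)(T[] items, BigInt k)
{
    auto pool = items.dup;
    T[] result;
    BigInt fact = 1;
    foreach (i; 1 .. pool.length)
        fact *= i;                          // fact = (n-1)!
    foreach_reverse (i; 0 .. pool.length)
    {
        // The next factoradic digit selects one remaining element.
        auto digit = cast(size_t) (k / fact).toLong();
        k %= fact;
        result ~= pool[digit];
        pool = pool.remove(digit);
        if (i > 0)
            fact /= i;                      // i! -> (i-1)!
    }
    return result;
}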


T

This is a fair point. Being able to obtain the kth permutation of a large set would indeed be useful, even if iterating over all of them is not feasible. For example, you might want to examine only the permutations at indices k = 2^n, because some a priori knowledge tells you that they contain the solution you're looking for.

In that case we'd want to be able to index with an effectively arbitrarily sized index. I don't have any experience with BigInt, but I presume it's the correct tool for the job.
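
For what it's worth, a usage sketch with std.bigint, reusing the
hypothetical nthPermutation helper sketched above:

void main()
{
    import std.array : array;
    import std.bigint : BigInt;
    import std.range : iota;
    import std.stdio : writeln;

    // 61 elements, so 61! orderings -- far beyond ulong, but any
    // single index that fits in a BigInt can still be looked up.
    auto items = iota(0, 61).array;
    auto k = BigInt(2) ^^ 100;          // index 2^100, well below 61!
    writeln(nthPermutation(items, k));  // O(n^2), not O(k)
}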
