> Well, one thing it is, is Perl that allows you to use the OS memory
> allocation functions.  This allows you to write Perl programs that can
> use more than 3GB of addressable RAM, although you are still limited to
> 3GB per single data structure.  The prerequisite is that your operating
> system and architecture must be 64-bit, e.g. Sparc/Solaris.  By its very
> nature, though, your Perl programs become a bit slower overall, because
> you must use more than 32 bits to address memory, i.e. more bits to
> shuffle for every memory address.

s/3GB/4GB/g;

Yes, people have (just for the fun of it) had Perl scalars greater
than 4 gigabytes.  Whether there's any real slowdown, I don't know,
but my gut instinct says that Perl spends most of its time elsewhere
and 64-bitness doesn't really cause that much of a slowdown.

64-bitness is not only about the 4GB data size limit; there's also
the feature of having 64-bit integers, which is related (and a
precursor) to supporting files larger than 2GB/4GB (depending on
whether you measure by signed or unsigned offsets).  Having 64-bit
integers means that Perl can keep numbers as integers longer before
slipping into floating point.

> Someone else may be able to provide better details, but that is the gist of
> it.

Nowadays (this mailing list is a bit dusty, try Perl 5.8.0 or 5.6.1)
getting integer 64-bitness is a question of

        Configure -Duse64bitint

and the data size 64-bitness

        Configure -Duse64bitall

Warning: either, and especially the latter, will produce Perls that
are binary-incompatible with Perls Configured without any -Duse64*
option.

-- 
Jarkko Hietaniemi <[EMAIL PROTECTED]> http://www.iki.fi/jhi/ "There is this special
biologist word we use for 'stable'.  It is 'dead'." -- Jack Cohen
