On Thursday, Nov 13, 2003, at 08:40 US/Pacific, NIPP, SCOTT V (SBCSI) wrote:


I am an SA for several HP-UX systems. Recently a user had a problem
running a Perl job: he indicated that it would die at about 1GB. Now, I
believe that the HP-UX 11i kernel parameter 'maxdsiz' is the limit he is
running into here, but I am not entirely sure. The real question is: if
this is indeed the parameter, it is limited to slightly under 4GB, and
the user needs approximately 5GB. So, can Perl use 64-bit memory, with
its insanely huge limit, so this user can run this job? Any feedback is
most appreciated. Thanks.

p0: we'll skip over the 'why that' question, though if memory is part of the problem, he might want to consider porting the code to a language with a tighter memory model, such as C89/C99, to begin with...

p1: when you do 'perl -V', the output shows the options that
the version of perl you are using was built with.

part of the challenge is whether your underlying C compiler
will actually support building a 64-bit version:
cf
<http://www.xav.com/perl/lib/Pod/perldelta.html#64bit%20support>

so you will need to check that it was built with

-Duse64bitint

(and, for an address space past 4GB, -Duse64bitall as well, since
it is the pointer size rather than the integer size that gates how
much memory perl can address).
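
a quick way to interrogate an existing binary, assuming a
reasonably modern perl (the -V:varname form queries a single
Config variable):

    perl -V:use64bitint
    perl -V:use64bitall
    perl -V:ptrsize

if ptrsize comes back 8, the pointers are 64-bit and addressing
more than 4GB is at least on the table.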

That having been said, perl can only do what the underlying OS
allows, and what the actual perl executable was built to do.

p2: as an illustration, I ran into wacky behavior in perl code
that formerly worked: when I hauled it over to a box whose file
system allowed files larger than 2 gigabytes, the code 'freaked'.
So I went back and rebuilt the actual /usr/bin/perl with
-Duselargefiles (USE_LARGE_FILES) so that it would detect, and
correctly return on,

if ( -f $file )...

when $file happened to be over 2 gigabytes. The problem was
not in the 'perl text' passed to /usr/bin/perl, but in the
actual /usr/bin/perl executable itself.
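
a minimal sketch of that check, assuming you have a file over
2GB lying around (the path below is made up):

    #!/usr/bin/perl
    use strict;

    # hypothetical path -- point it at any file over 2GB
    my $file = '/path/to/some_huge_file';

    if ( -f $file ) {
        printf "plain file, %d bytes\n", -s _;   # -s _ reuses the stat from -f
    } else {
        print "not seen as a plain file -- this perl was\n";
        print "likely built without large file support\n";
    }

('perl -V:uselargefiles' will also tell you whether the binary
claims to have large file support at all.)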

HTH.

ciao
drieux

---

