Hi,
I'm a software developer from China.
I have a Perl project that runs very well on Windows XP.
The version of ActivePerl on Windows XP is as follows:
------------------------------------------------------------------------
Summary of my perl5 (revision 5 version 10 subversion 0) configuration:
  Platform:
    osname=MSWin32, osvers=5.00, archname=MSWin32-x86-multi-thread
------------------------------------------------------------------------
Now I need to migrate it to the AIX platform, but after launching it on
AIX I get the following error in my log file:
------------------------------------------------------------------------
Getting content of:
   http://xxxx/xxx/xxx.tar.gz
   Succeed
Out of memory!
------------------------------------------------------------------------
This is the version of Perl on AIX:
------------------------------------------------------------------------
Summary of my perl5 (revision 5 version 8 subversion 8) configuration:
  Platform:
    osname=aix, osvers=5.3.0.0, archname=aix-thread-multi
------------------------------------------------------------------------
This is the relevant code:
-------------------------------------------------------------------------------------------------------
use LWP::UserAgent;

# Create the shared user agent the first time it is needed.
sub initWebApp {
   if ( !$fvt_download ) {
      $fvt_download = LWP::UserAgent->new;
      $fvt_download->agent("FMTFVT_AUTO_TEST/1.0.0 ");
   }
}

# Fetch a URL and return its body as a string ("" on failure).
sub getWebContent {
   if ( scalar(@_) < 1 ) {
      return "";
   }

   initWebApp() if !$fvt_download;

   print "Getting content of:\n   " . $_[0] . "\n";

   my $download_file = $fvt_download->get( $_[0] );

   if ( !$download_file->is_success ) {
      print "   Failed\n";
      return "";
   }
   print "   Succeed\n";

   # NOTE: the entire response body is returned as one in-memory scalar.
   return $download_file->content;
}

# Download a file and save it under its basename in the current directory.
sub downloadDist {
   my @file_path = split( /\//, $_[0] );
   my $file_name = $file_path[-1];

   my $web_content = getWebContent( $_[0] );

   if ( $web_content eq "" ) {
      return 0;
   }

   print "Saving web content to: $file_name\n";
   open( my $wrf, '>', $file_name ) or die "Cannot open $file_name: $!";
   binmode $wrf;
   print $wrf $web_content;
   close($wrf);

   return 1;
}
-------------------------------------------------------------------------------------------------------
I want to download a file from somewhere; the file is about 40 MB.
I tried doubling the memory limit, but the error occurred all the
same.
Do you have any ideas about this (other than upgrading Perl to a newer
version)?
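One direction I am considering (only a sketch, not yet tested on AIX or
Perl 5.8.8): since content() returns the whole 40 MB body as a single
Perl scalar, maybe I can avoid holding it in memory at all by letting
LWP stream the response straight to disk with its ':content_file'
option, something like this hypothetical rework of downloadDist:
------------------------------------------------------------------------
# Sketch only: stream the response body to disk as it arrives,
# so the full file never has to live in a Perl scalar.
sub downloadDistToFile {
   my @file_path = split( /\//, $_[0] );
   my $file_name = $file_path[-1];

   initWebApp() if !$fvt_download;

   print "Getting content of:\n   " . $_[0] . "\n";

   # ':content_file' makes get() write the body to $file_name
   # instead of keeping it in the response object.
   my $response = $fvt_download->get( $_[0], ':content_file' => $file_name );

   if ( !$response->is_success ) {
      print "   Failed\n";
      return 0;
   }
   print "   Succeed\n";
   return 1;
}
------------------------------------------------------------------------
($fvt_download->mirror( $_[0], $file_name ) looks like it would do much
the same thing.) Is that a sane workaround on this version of LWP, or
am I missing something simpler?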
Thanks in advance.

Thanks & Best Regards
YangBo (杨波)
IBM China System & Technology Laboratory in Shanghai
Office Phone: 86-21-60922638
Internet ID: ybyan...@cn.ibm.com
Seat NO: 5B-W073
Address: 5/F, Building 10, 399 Keyuan Road, Zhangjiang Hi-Tech Park,
Pudong New District, Shanghai 201203
