Hi,

I have a script to download some large binary files from a web site.
Everything works OK initially, but for the bigger files (~100 MB) I am
getting a timeout. I am guessing that this is an issue with the web server
I am connecting to, but I was wondering if there is something I can change
in my script to fix or work around this problem? Or is there a better way
to do this?

(I have pasted the script below.) The error I get is "500 read timeout".

Thanks for any help

Adam

##################################################

#!/usr/bin/perl -w

use strict;
use LWP::UserAgent;
use HTTP::Request;

my @files = ('Malaria110699_10PA.tar.gz', 'PfSpzS4_10PA.tar.gz');

# One agent is enough for all the downloads.
my $ua = LWP::UserAgent->new;

foreach my $file (@files) {
    print "downloading $file\t";

    # Passing a filename as the second argument saves the response
    # body directly to that file instead of keeping it in memory.
    my $request  = HTTP::Request->new(GET => "<URL>");
    my $response = $ua->request($request, $file);

    if ($response->is_error) {
        print "\n", $response->status_line, "\n";
    }
    else {
        print "done\n";
    }
}

##################################################
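In case it is simply LWP's default 180-second read timeout expiring on the slow transfers, here is a minimal sketch of one possible workaround: raise the agent's timeout and use mirror(), which streams the body straight to disk and skips files that are already present. The 1800-second value is an arbitrary guess, and "<URL>" is still a placeholder for the real download address.

```perl
#!/usr/bin/perl -w

use strict;
use LWP::UserAgent;

my @files = ('Malaria110699_10PA.tar.gz', 'PfSpzS4_10PA.tar.gz');

# Allow up to 30 minutes per read before giving up
# (LWP::UserAgent's default timeout is 180 seconds).
my $ua = LWP::UserAgent->new(timeout => 1800);

foreach my $file (@files) {
    print "downloading $file\t";

    # mirror() writes the response straight to $file and sends an
    # If-Modified-Since header, so an unchanged file is not re-fetched.
    my $response = $ua->mirror("<URL>", $file);

    if ($response->is_error) {
        print "\n", $response->status_line, "\n";
    }
    else {
        print "done\n";
    }
}
```

No idea whether the server itself is also dropping the connection, in which case a longer client-side timeout will not help and a resuming downloader (e.g. sending Range headers) would be needed instead.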
