On Nov 16, 2007 6:28 AM, <[EMAIL PROTECTED]> wrote:
> Hi everyone,
>
> I get an HTML page with a 400 error code (Bad Request) when running
> the following code:
>
> #!/usr/bin/perl -w
> use strict;
> use warnings;
> use WWW::Mechanize;
>
> my $mech = WWW::Mechanize->new( agent => 'Mozilla 2.0.0.9' );
>
> $mech->get('http://www.aeroporto.fvg.it/tab/fmarrb.php');
> my $arrivi = $mech->content;
>
> print "Content-type: text/html\n\n";
> print $arrivi;
>
> exit;
>
> When I ask for this page directly from a browser (Firefox or IE) it
> works fine... you can see the frame as a stand-alone page, which is exactly
> what I want to get... but I cannot!
> Do you have any good suggestions?
> Thanks in advance,
> Livius
Interesting... I tried downloading the page with the wget utility, and I also get "Forbidden", even when ignoring robots.txt. I tried to browse to the page in lynx: forbidden as well. I think elinks worked...
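
Since browsers and elinks get through while Mech, wget, and lynx are rejected, the server is probably filtering on request headers (most likely the User-Agent, possibly the Referer as well). One thing worth trying is sending a full browser-style User-Agent string instead of 'Mozilla 2.0.0.9'. A rough, untested sketch is below; the exact User-Agent string, the Referer value, and the error-handling branch are just illustrative guesses, not something the site is known to require:

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Use a full browser-like User-Agent string (this one is only an example).
my $mech = WWW::Mechanize->new(
    agent     => 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.9) Gecko/20071025 Firefox/2.0.0.9',
    autocheck => 0,    # don't die on HTTP errors, so we can inspect the status
);

# Some sites also check the Referer header; this value is a guess.
$mech->add_header( Referer => 'http://www.aeroporto.fvg.it/' );

$mech->get('http://www.aeroporto.fvg.it/tab/fmarrb.php');

if ( $mech->success ) {
    print "Content-type: text/html\n\n";
    print $mech->content;
}
else {
    print "Content-type: text/plain\n\n";
    print "Request failed with status ", $mech->status, "\n";
}

WWW::Mechanize also has agent_alias(), e.g. $mech->agent_alias('Windows Mozilla'), which sets a canned browser-like User-Agent for you, if you'd rather not hard-code a string.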