Ken,
The difference is that when you run the script from the command line, it
executes as the file's /owner/, whereas when you access it through the
browser, the web server runs it as /Other/. Looking at the output you
sent, I believe the script is not running to completion. Try setting the
permissions to 707 or 777 to start with. You may also have to create a
temporary directory to test with.
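A quick way to confirm which user the script runs as in each case is a
tiny CGI like the one below (just a sketch; the getpwuid lookups are
standard Perl):

#!/bin/perl
# whoami.cgi - report which user this script executes as
use strict;
use warnings;
print "Content-type: text/plain\n\n";
my $real      = getpwuid($<);  # real user (you, on the command line)
my $effective = getpwuid($>);  # effective user (the web server's user under CGI)
print "real=$real effective=$effective\n";

Run it once from the shell and once through the browser; if the two
reports differ, that confirms the owner/Other difference.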
Let me know if you have any questions,
Vishwam
Vishwam Annam
Wright State University Libraries
120 Paul Laurence Dunbar Library
3640 Colonel Glenn Hwy.
Dayton, OH 45435
Office: 937-775-3262
FAX: 937-775-2356
Ken Irwin wrote:
Hi all,
I'm moving to a new web server and struggling to get it configured properly. The problem of the
moment: having a Perl CGI script call another web page in the background and make decisions
based on its content. On the old server I used an antique Perl script called hcat
(from the Pelican book, http://oreilly.com/openbook/webclient/ch04.html); I've also tried
curl and LWP::Simple.
In all three cases, I get the same behavior: it works just fine on the command
line, but when called by the web server through a CGI script, the LWP (or other
socket connection) gets no results. It sounds like a permissions thing, but I
don't know what kind of permissions setting to tinker with. In the test script
below, my command line outputs:
Content-type: text/plain
Getting URL: http://www.npr.org
885 lines
Whereas the web output just says "Getting URL: http://www.npr.org" and
doesn't even get to the "Couldn't get" error message.
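One caveat I should note: die writes to STDERR, which under CGI usually
lands in the server's error log rather than the browser, so that message
may never appear in the web output regardless. While testing, the
standard CGI::Carp module can redirect fatal errors to the browser; a
one-line sketch:

use CGI::Carp qw(fatalsToBrowser);  # send die output to the browser instead of the error log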
Any clue how I can make use of a web page's contents from w/in a CGI script?
(The actual application has to do with exporting data from our catalog, but I
need to work out the basic mechanism first.)
Here's the script I'm using.
#!/bin/perl
use strict;
use warnings;
use LWP::Simple;

print "Content-type: text/plain\n\n";

my $url = "http://www.npr.org";
print "Getting URL: $url\n";

my $content = get($url);                 # returns undef on any failure
die "Couldn't get $url" unless defined $content;

my @lines = split(/\n/, $content);
my $i = 0;
$i++ foreach @lines;                     # count the lines
print "\n\n$i lines\n\n";
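In case it matters, here is a variant using LWP::UserAgent that I could
try next, since it reports why a fetch fails instead of just returning
undef (a sketch; the timeout value is arbitrary):

#!/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

print "Content-type: text/plain\n\n";
my $url = "http://www.npr.org";
print "Getting URL: $url\n";

my $ua = LWP::UserAgent->new(timeout => 30);
my $response = $ua->get($url);

if ($response->is_success) {
    my @lines = split(/\n/, $response->decoded_content);
    print scalar(@lines), " lines\n";
} else {
    # status_line includes connect errors, e.g. "500 Can't connect to ..."
    print "Failed: ", $response->status_line, "\n";
}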
Any ideas?
Thanks
Ken