File Writing and CGI

2003-11-21 Thread Gohaku
Hi everyone,
I am trying to write to a file when running the following Perl/CGI 
script:
#!/usr/bin/perl
print "Content-type: text/html\n\n";
print "Hi";
open(FILE,">hello.txt") || die("Cannot Open File");
print FILE "Hello";

I have run the script from the command line and hello.txt does appear 
but if I run the script from a Web Browser, hello.txt does not get 
created.
I have a feeling I need to change or add something to httpd.conf but 
don't know what it could be.
Thanks in advance.
-Gohaku



Re: File Writing and CGI

2003-11-21 Thread Gohaku
When I tried use CGI::Carp qw(fatalsToBrowser)
I saw the following:
Internal Server Error
I also ran the script from a browser as Root and I still got the same 
message.

On Friday, November 21, 2003, at 02:21 AM, Thilo Planz wrote:

use CGI::Carp qw(fatalsToBrowser)



Re: File Writing and CGI

2003-11-21 Thread gene
On Nov 20, 2003, at 11:41 PM, Gohaku wrote:

When I tried use CGI::Carp qw(fatalsToBrowser)
I saw the following:
Internal Server Error
I also ran the script from a browser as Root and I still got the same 
message.

Make sure that the script is executable by the user that apache runs 
as.  Something like:
chmod 744 my_script
sudo chown apache my_script

I'm actually not running apache on my machine, so I don't know if the 
apache user is 'apache'. If not, adjust the command above accordingly.



Re: File Writing and CGI

2003-11-21 Thread Sherm Pendley
On Nov 21, 2003, at 2:03 AM, Gohaku wrote:

Hi everyone,
I am trying to write to a file when running the following Perl/CGI 
script:
#!/usr/bin/perl
print "Content-type: text/html\n\n";
print "Hi";
open(FILE,">hello.txt") || die("Cannot Open File");
print FILE "Hello";

I have run the script from the command line and hello.txt does 
appear but if I run the script from a Web Browser, hello.txt does 
not get created.
This is a *very* frequently asked question. Take a look at the Perl 
FAQ. It's on your machine already, as a man page - see 'man perlfaq9'. 
Or, view it online at 
http://www.perldoc.com/perl5.8.0/pod/perlfaq9.html. There's also the 
"Troubleshooting Perl CGI Scripts" page at 
http://www.perl.org/troubleshooting_CGI.html. Having said that...

I'm assuming that the script is executable, since you've indicated it 
runs OK from a shell prompt. Aside from that, there are two issues that 
immediately come to mind with the script as written: The path to the 
file you're writing to, and permissions.

First of all, you need to fully specify the location of the file you 
want to write to. That is, instead of just supplying the file name, you 
need to give the directory as well. For example, you might want to use 
/tmp/hello.txt instead of just hello.txt.

Also, the file must be writable by the user and/or group the web server 
is running as. On Mac OS X, the user and group are both "www". To 
append to an existing file, the file must be writable; to create a new 
file, the directory in which the file will be created must be writable. 
One of the best ways to do this is to change the ownership of the 
file/directory to "www". That allows the web server to write to the 
file, without having to make the file world-writable.

When I tried use CGI::Carp qw(fatalsToBrowser)
An excellent idea. The standard "500 - Internal Server Error" is arguably 
the most confusing, least useful error message ever written in the history 
of programming.

But, the die() statement in the script above only tells you it couldn't 
open the file. That's good, but it could be better - it could also tell 
you *why* it couldn't open the file. The $! variable contains an 
explanation of the latest error. You could include this in your die() 
message, like die("Error opening hello.txt: $!"). Or, you could simply 
call die($!) to report the system error by itself.

I also ran the script from a browser as Root and I still got the same 
message.
I hope I'm misunderstanding you... Are you saying you changed the 
User parameter in httpd.conf to run the server as root? I hope not - 
running Apache as root is a huge, major, gaping security problem. You 
should never do that.

sherm--



Re: File Writing and CGI

2003-11-21 Thread Thilo Planz
I am trying to write to a file when running the following Perl/CGI 
script:
#!/usr/bin/perl
print "Content-type: text/html\n\n";
print "Hi";
open(FILE,">hello.txt") || die("Cannot Open File");
print FILE "Hello";

I have run the script from the command line and hello.txt does 
appear but if I run the script from a Web Browser, hello.txt does 
not get created.
I have a feeling I need to change or add something to httpd.conf but 
don't know what it could be.
It could be a problem of permissions.
If it is, your script should die and you should find an error message 
in your log file (probably /private/var/log/httpd/error_log).

The web server is running under a different user account than yourself 
and has only limited permissions (for security reasons).

You should also include the error message in your die message:
die "cannot open file: $!";
Other hints:

- use strict
- close the file when done
- use CGI::Carp qw(fatalsToBrowser)
The last one will send error messages to your web browser, so you do 
not have to go through the error log file.

Cheers,

Thilo



Re: File Writing and CGI

2003-11-21 Thread Thilo Planz
When I tried use CGI::Carp qw(fatalsToBrowser)
I saw the following:
Internal Server Error
What does it say in the server error log ( /private/var/log/httpd/ ) ?

And can you still run it from the command line?
How do you run it from the command line?
 perl script.cgi
or
 ./script.cgi
(The latter needs to work for CGI; permissions should be set to 755 
(chmod a+x script.cgi).)


I also ran the script from a browser as Root and I still got the same 
message.
That will have no effect.
No matter what user is running the browser (or if that browser is local 
or on a remote machine),
the CGI will always be executed with the web server's permissions.



Thilo



Re: Apache::DBI on panther

2003-11-21 Thread Ray Zimmerman
Are you sure you're using the same version of perl, with the same 
@INC for both cases?  It sounds to me like when it's run under 
mod_perl it may be using a different DBD::mysql  client library or 
something ...

Just an idea ...

At 11:55 PM -0500 11/20/03, Chris Devers wrote:
Okay, let me try a reduced version of this problem.

I have a short script that does nothing but establish a connection to
MySQL via DBI/DBD::MySQL:
$ cat test.pl
#!/usr/bin/perl -wT
use DBI;

my $host = 'localhost';
my $db   = [];
my $user = [];
my $pass = [];
my $dsn  = "DBI:mysql:host=$host;database=$db";
my $dbh = DBI->connect( $dsn, $user, $pass )
  or die "Cannot connect to database: \n$DBI::errstr\n $!";
print qq[Content-type: text/plain

If you can read this then DBI connected.
];
--
 Ray Zimmerman  / e-mail: [EMAIL PROTECTED] / 428-B Phillips Hall
  Sr Research  /   phone: (607) 255-9645  /  Cornell University
   Associate  /  FAX: (815) 377-3932 /   Ithaca, NY  14853


Re: File Writing and CGI

2003-11-21 Thread Jeremy Mates
* Thilo Planz [EMAIL PROTECTED]
 - use strict

And taint mode!

perldoc perlsec

 - close the file when done

And check the exit status of the close on the written file, as that is
when you learn whether the disk is full or whether subsequent processing
on the file should be avoided due to possible corruption.

close FILE or log_and_email_and_pretty_page_for_user_before_exit();


Re: File Writing and CGI

2003-11-21 Thread Chris Devers
On Fri, 21 Nov 2003, Gohaku wrote:

 I am trying to write to a file when running the following Perl/CGI
 script:

   #!/usr/bin/perl
   print "Content-type: text/html\n\n";
   print "Hi";
   open(FILE,">hello.txt") || die("Cannot Open File");
   print FILE "Hello";

The advice others have given is all valid -- you should always use
strict, warnings, and taint mode, so the opening of your script is:

   #!/usr/bin/perl -wT
   use strict;

But in any case that's been said already.

I'd also add that you should amend your `die` statement so that the output
actually indicates why it failed, by appending the `$!` error report
variable after the statement -- like this:

   open(FILE,">hello.txt") || die("Cannot Open File: $!");

That way, a more specific report is produced for you. This report will
show up in your error log in /var/log/httpd/error_log (by default). Or, as
has also been suggested, you can turn on CGI::Carp to force the error
message to appear in the browser as well as the error log.

Putting everything together, try something like this:

   #!/usr/bin/perl -wT

   use strict;
   use CGI::Carp qw[ fatalsToBrowser ];

   print "Content-type: text/html\n\n";
   print "Hi";

   open(FILE,">hello.txt") || die("Cannot Open File: $!");
   print FILE "Hello";
   close(FILE);


As has been suggested, my guess is that when you get the $! error report
turned on, you'll see a message indicating that the problem had to do with
file permissions: hello.txt needs to be writable by the account that
Apache runs under, which by default is www. The easiest way to fix this is
probably to assign ownership of that file to the Apache account:

$ sudo chown www hello.txt
$ sudo chmod u+w hello.txt

This changes ownership, then sets user [owner] write permissions.

Chances aren't bad that the script will work after that.



-- 
Chris Devers


Re: File Writing and CGI

2003-11-21 Thread Jeremy Mates
* Chris Devers [EMAIL PROTECTED]
 Putting everything together, try something like this:
 
#!/usr/bin/perl -wT

# clean up env for taint mode
sub BEGIN {
  delete @ENV{qw(IFS CDPATH ENV BASH_ENV)};
  $ENV{'PATH'} = '/bin:/usr/bin';
}

use strict;
use CGI::Carp qw[ fatalsToBrowser ];
 
print "Content-type: text/html\n\n";
print "Hi";
 
open(FILE,">hello.txt") || die("Cannot Open File: $!");

I find '||' far less readable than 'or', and far more likely to cause
precedence problems. Though I do write fairly ()-free Perl code.

print FILE "Hello";

# proper error checking on close of write to file
close FILE or die "Problem Writing File: $!";


expected to be defined in a dynamic image

2003-11-21 Thread Chris Nandor
[Fri Nov 21 07:27:55 2003] [notice] child pid 22666 exit signal Trace/BPT trap (5)
dyld: /usr/local/apache/bin/httpd Undefined symbols:
/Users/pudge/.cpan/build/libapreq-1.3/blib/arch/auto/Apache/Request/Request.bundle undefined reference to _ApacheRequest_post_params expected to be defined in a dynamic image


I am seeing quite a lot of errors in Apache, preventing me from starting it
up, regarding undefined references with "expected to be defined in a
dynamic image".  This is perl 5.8.2, darwin-2level.  I am tempted to try to
disable two-level namespaces in perl ... anyone?  Thanks,

-- 
Chris Nandor  [EMAIL PROTECTED]http://pudge.net/
Open Source Development Network[EMAIL PROTECTED] http://osdn.com/


Re: expected to be defined in a dynamic image

2003-11-21 Thread william ross
I've run into these. The answer has been to specify explicitly  
libraries that are normally left implicit. I found the failures odd,  
too. if a libdir is incorrect, make fails. if the directory is correct  
but the library is unspecified, make appears to work but all tests  
fail.

(eg. to build Image::Magick properly you have to specify -lpng -lxml2  
-ljpeg and so on in the LIBS, and to build DBD::Pg you have to specify  
-lssl -lcrypto, though they both normally work without the extra nudges.)

In your case it looks like a broken link to libapreq. i had similar  
errors with libapreq built through CPAN.pm, but I vaguely recall that  
rebuilding it by hand seemed to work. It wasn't a new apache, though:  
just the standard one, and i was so beset with error messages at the  
time that it's hard to remember what worked.

but most things still just work, for me, including other XSive modules  
and heavy linkers like DBD::mysql. I've only had trouble linking to  
fink- and cpan-built libraries, now that i think of it. Perhaps there  
is a subtle mismatch of compilers or configurations somewhere?

best

will





On 21 Nov 2003, at 15:47, Chris Nandor wrote:

[Fri Nov 21 07:27:55 2003] [notice] child pid 22666 exit signal Trace/BPT trap (5)
dyld: /usr/local/apache/bin/httpd Undefined symbols:
/Users/pudge/.cpan/build/libapreq-1.3/blib/arch/auto/Apache/Request/Request.bundle undefined reference to _ApacheRequest_post_params expected to be defined in a dynamic image

I am seeing quite a lot of errors in Apache, preventing me from  
starting it
up, regarding undefined references with expected to be defined in a
dynamic image.  This is perl 5.8.2, darwin-2level.  I am tempted to  
try to
disable two-level namespaces in perl ... anyone?  Thanks,

--
Chris Nandor  [EMAIL PROTECTED]http://pudge.net/
Open Source Development Network[EMAIL PROTECTED] http://osdn.com/



Re: File Writing and CGI

2003-11-21 Thread Chris Devers
On Fri, 21 Nov 2003, Jeremy Mates wrote:

 * Chris Devers [EMAIL PROTECTED]

 open(FILE,">hello.txt") || die("Cannot Open File: $!");

 I find '||' far less readable than 'or', and far more likely to cause
 precedence problems. Though I do write fairly ()-free Perl code.

Agreed, but I was trying not to nitpick :)

I was trying to focus on the 'structural' issues, not 'stylistic' ones,
but now that you mention it, the English versions of some of the operators
('or' instead of '||', 'and' instead of '&&') do seem to make code much
more readable.

open(FILE,">hello.txt") or die("Cannot Open File: $!");


-- 
Chris Devers


Re: File Writing and CGI

2003-11-21 Thread Doug McNutt
The || and or operators differ in precedence, which can change which 
operands end up being evaluated. It probably doesn't matter in this 
particular case but don't be fooled into thinking they are the same.

|| and && are short-circuit operators that may not evaluate all of their operands.

Programming Perl, 3rd edition, page 102

At 11:42 -0500 11/21/03, Chris Devers wrote:
On Fri, 21 Nov 2003, Jeremy Mates wrote:

 * Chris Devers [EMAIL PROTECTED]

  open(FILE,">hello.txt") || die("Cannot Open File: $!");

 I find '||' far less readable than 'or', and far more likely to cause
 precedence problems. Though I do write fairly () free Perl code.

Agreed, but I was trying not to nitpick :)

I was trying to focus on the 'structural' issues, not 'stylistic' ones,
but now that you mention it, the English versions of some of the operators
('or' instead of '||', 'and' instead of '&&') do seem to make code much
more readable.

open(FILE,">hello.txt") or die("Cannot Open File: $!");


-- 
-- As a citizen of the USA if you see a federal outlay expressed in $billion then 
multiply it by 4 to get your share in dollars. --


Re: File Writing and CGI

2003-11-21 Thread Bruce Van Allen
OK, one more thing to add to the tutorial the OP just got on CGI scripting:

In the CGI execution context (webserver), it's best to put your output last. As soon 
as you do this:

print "Content-type: text/html\n\n"; 
print "..." ;  

don't rely on the webserver to do much more for you after it's done printing the http 
output.

My experience across many different commercial and institutional web servers 
leads me to at minimum wrap up all system-related actions -- like file ops -- 
before printing output. And the best approach is: output last. Otherwise, I find 
webservers inconsistent as far as when they let go of a CGI process. Perhaps 
someone who knows more about Apache or IIS could comment.


  - Bruce

__bruce__van_allen__santa_cruz__ca__


Re: expected to be defined in a dynamic image

2003-11-21 Thread Chris Nandor
In article [EMAIL PROTECTED],
 [EMAIL PROTECTED] (William Ross) wrote:

 I've run into these. The answer has been to specify explicitly  
 libraries that are normally left implicit. I found the failures odd,  
 too. if a libdir is incorrect, make fails. if the directory is correct  
 but the library is unspecified, make appears to work but all tests  
 fail.
 
 (eg. to build Image::Magick properly you have to specify -lpng -lxml2  
 -ljpeg and so on in the LIBS, and to build DBD::Pg you have to specify  
 -lssl -lcrypto, though it both normally work without the extra nudges.)
 
 In your case it looks like a broken link to libapreq. i had similar  
 errors with libapreq built through CPAN.pm, but I vaguely recall that  
 rebuilding it by hand seemed to work. It wasn't a new apache, though:  
 just the standard one, and i was so beset with error messages at the  
 time that it's hard to remember what worked.

But ... libapreq is what I am building/testing.  I can't link libapreq to 
itself ... so I have no idea what to do here.  Here's the complete error:

dyld: /usr/local/apache/bin/httpd Undefined symbols:
/Users/pudge/.cpan/build/libapreq-1.3/blib/arch/auto/Apache/Request/Request.bundle undefined reference to _ApacheRequest_post_params expected to be defined in a dynamic image
/Users/pudge/.cpan/build/libapreq-1.3/blib/arch/auto/Apache/Request/Request.bundle undefined reference to _ApacheRequest_query_params expected to be defined in a dynamic image
[Fri Nov 21 12:48:52 2003] [notice] child pid 1858 exit signal Trace/BPT trap (5)


 but most things still just work, for me, including other XSive modules  
 and heavy linkers like DBD::mysql. I've only had trouble linking to  
 fink- and cpan-built libraries, now that i think of it. Perhaps there  
 is a subtle mismatch of compilers or configurations somewhere?

Nope.  Everything built fresh with same compilers, same perl version.

I really hope I don't need to back out of this two-level namespace thing and 
rebuild everything again.

-- 
Chris Nandor  [EMAIL PROTECTED]http://pudge.net/
Open Source Development Network[EMAIL PROTECTED] http://osdn.com/


Re: File Writing and CGI

2003-11-21 Thread Chris Devers

Note:
  I'm cc'ing this back to the list, so that
  others can correct anything I get wrong :)


On Fri, 21 Nov 2003, Gohaku wrote:

 On Friday, November 21, 2003, at 09:40 AM, Chris Devers wrote:

  Apache runs under, which by default is www. The easiest way to fix
  this is
  probably to assign ownership of that file to the Apache account:
 
  $ sudo chown www hello.txt
  $ sudo chmod u+w hello.txt
 
  Chances aren't bad that the script will work after that.


 You're right, the script worked.

Glad to hear that.

 But what if I want to create a new
 file?  How do I do that?
 I ask because I would like to move these scripts to a webhost.
 Thanks.

One of the other commenters touched on this -- Sherm Pendley, I think
(he's a smart guy -- his posts are worth reading closely :).

Basically, if you want to work on an existing file, then you have to
verify the permissions on that file before Apache sends a script off to
work on it. That was the situation here.

On the other hand, if you want to create new files, then Apache needs to
have permission to work on the directory itself.

<arcana caveat="someone may wish to improve on this explanation">

According to purist Unix design philosophy, everything in the system is to
be treated as a file, where a file is defined as something like a stream
of data that you can perform various standard operations on (read from it,
write to it, etc).

(By way of comparison, according to purist relational database design
philosophy, everything in the system is to be treated as a table -- which
means that every time you issue a SQL statement on one table, the result
set you get back is effectively a new table -- and that in turn is why
it's usually trivial to do something like INSERT INTO foo SELECT * FROM
bar and it'll just work. But I digress from my digression.)

So anyway, everything in Unix is supposed to be a file. One example of
things that are files, even if they don't seem to be, is directories --
because a directory is just a file that contains a list of other files
(and some of which might themselves be directory-files, with lists of
their own).

</arcana>

Why does this matter? Because it means that the same basic operations & 
rules apply to directories as they do to plain files. In particular, all
directories have file ownership & permission settings, the side effects of
which are often kind of easy to work out if you get the general idea about
how permissions work on simple files.
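For example (the output will vary by system), the mode bits on a directory look just like those on a plain file, with a leading 'd':

```shell
# Directories have owner, group, and rwx bits like any file;
# a 'w' bit on the directory is what permits creating or
# removing entries inside it:
ls -ld /tmp
```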

In this case, that means that the user account that Apache runs under, www
by default on Mac OS X, needs to be able to write to the directory-file in
question, in order to add new data to that file -- that is, to add new
files to that directory. Or remove them, or just to make any changes in
general.

So, if you want to work on /Library/WebServer/Documents/hello/data --

$ sudo chown www /Library/WebServer/Documents/hello/data
$ sudo chmod u+w /Library/WebServer/Documents/hello/data

And then the Apache user will be able to create or remove files from that
directory. (And of course, any files created will be made with that user's
default ownership & permissions, so you mostly don't have to worry about
that aspect of things -- creation of the files themselves was the real
problem at hand here.)


Make sense?

As I say, someone may wish to revise my description of the Unix
"everything is a file" notion, but the basic idea is pretty simple, and
once you get your head around it, it makes a lot of system behavior a lot
less surprising and a lot more predictable.



-- 
Chris Devers


Re: expected to be defined in a dynamic image

2003-11-21 Thread Chris Nandor
In article [EMAIL PROTECTED],
 [EMAIL PROTECTED] (Chris Nandor) wrote:

 dyld: /usr/local/apache/bin/httpd Undefined symbols:
 /Users/pudge/.cpan/build/libapreq-1.3/blib/arch/auto/Apache/Request/Request.bundle undefined reference to _ApacheRequest_post_params expected to be defined in a dynamic image
 /Users/pudge/.cpan/build/libapreq-1.3/blib/arch/auto/Apache/Request/Request.bundle undefined reference to _ApacheRequest_query_params expected to be defined in a dynamic image
 [Fri Nov 21 12:48:52 2003] [notice] child pid 1858 exit signal Trace/BPT trap (5)

For kicks, I decided to rebuild this one extension with the old LDDLFLAGS 
(-flat_namespace -undefined suppress).  The errors are now:

dyld: /usr/local/apache/bin/httpd Undefined symbols:
_ApacheRequest_post_params
_ApacheRequest_query_params

I don't know if this is instructive in any way.

-- 
Chris Nandor  [EMAIL PROTECTED]http://pudge.net/
Open Source Development Network[EMAIL PROTECTED] http://osdn.com/


Re: File Writing and CGI

2003-11-21 Thread Chris Devers
On Fri, 21 Nov 2003, Bruce Van Allen wrote:

 My experience across many different commercial and institutional
 web servers leads me to at minimum wrap up all system-related actions --
 like file ops -- before printing output. And the best approach is:
 output last. Otherwise, I find webservers inconsistent as far as when
 they let go of a CGI process. Perhaps someone who knows more about
 Apache or IIS could comment.


Really? Interesting.

I thought it was considered best to try to send back data early & often,
and to flush output by setting $| to 1 -- the idea being that a script
that takes a long time to produce results might lead to timeout errors if
the web client gives up, but if you keep sending back data as it becomes
available then the client will tend to keep the connection open.

Have I had this wrong? Is it better to save up all output for the end?



-- 
Chris Devers


Re: expected to be defined in a dynamic image

2003-11-21 Thread william ross
On 21 Nov 2003, at 20:54, Chris Nandor wrote:

In article [EMAIL PROTECTED],
 [EMAIL PROTECTED] (William Ross) wrote:
snip

In your case it looks like a broken link to libapreq. i had similar
errors with libapreq built through CPAN.pm, but I vaguely recall that
rebuilding it by hand seemed to work. It wasn't a new apache, though:
just the standard one, and i was so beset with error messages at the
time that it's hard to remember what worked.
But ... libapreq is what I am building/testing.  I can't link libapreq 
to
itself ... so I have no idea what to do here.  Here's the complete 
error:
i would have said that the errors were coming from a failure to link 
Apache::Request to libapreq. in a diffident, mumbled sort of a way.

I had similar trouble when in CPAN.pm but it all eventually worked for 
me when i downloaded libapreq from http://httpd.apache.org/apreq/ and 
approached it that way round.

?

best

will



Re: expected to be defined in a dynamic image

2003-11-21 Thread Chris Nandor
At 22:01 + 2003.11.21, william ross wrote:
On 21 Nov 2003, at 20:54, Chris Nandor wrote:

 In article [EMAIL PROTECTED],
  [EMAIL PROTECTED] (William Ross) wrote:

 snip

 In your case it looks like a broken link to libapreq. i had similar
 errors with libapreq built through CPAN.pm, but I vaguely recall that
 rebuilding it by hand seemed to work. It wasn't a new apache, though:
 just the standard one, and i was so beset with error messages at the
 time that it's hard to remember what worked.

 But ... libapreq is what I am building/testing.  I can't link libapreq
 to
 itself ... so I have no idea what to do here.  Here's the complete
 error:

i would have said that the errors were coming from a failure to link
Apache::Request to libapreq. in a diffident, mumbled sort of a way.

But Apache::Request is a part of libapreq.

-- 
Chris Nandor  [EMAIL PROTECTED]http://pudge.net/
Open Source Development Network[EMAIL PROTECTED] http://osdn.com/


Re: File Writing and CGI

2003-11-21 Thread Bruce Van Allen
On 11/21/03 Chris Devers wrote:

On Fri, 21 Nov 2003, Bruce Van Allen wrote:

 My experience across many different commercial and institutional
 web servers leads me to at minimum wrap up all system-related actions 
 like file ops -- before printing output. And the best approach is:
 output last. Otherwise, I find webservers inconsistent as far as when
 they let go of a CGI process. Perhaps someone who knows more about
 Apache or IIS could comment.

I thought it was considered best to try to send back data early  often,
and to flush output by setting $| to 1 -- the idea being that a script
that takes a long time to produce results might lead to timeout errors 
if
the web client gives up, but if you keep sending back data as it becomes
available then the client will tend to keep the connection open.

Have I had this wrong? 

Well, that's all correct. 

Is it better to save up all output for the end?

Most of my web apps have 100% dynamic output, and the main way I cope with timeout 
issues is to keep things fast. It could be that my experience is from particular 
combinations of web server, web browser, and my scripts' operations. Or maybe I've 
drawn a wrong conclusion.

To be more specific, I noticed inconsistencies with some web apps that logged state & 
activity; if I waited to write to the logs after all http output was printed, my tests 
seemed to show that not every action was properly logged; if the actions were logged 
before final http output, no problems. If my logging routines involved time-consuming 
things like looking up and manipulating data, especially over a network, things would 
get worse. I always $|++, use strict, check return values, and I test relentlessly. 
The inconsistent results lead me to conclude that factors outside my control were 
causing the web server to sometimes drop or ignore the trailing parts of my scripts 
after printing output.

"Inconsistent" means I don't see the pattern, not that there isn't one.

Thinking about it, another reason I habitually delay printing even http headers is 
that a script's result state might be expressed as a redirect, an image, or as plain 
text, rather than html. OTOH, a quick look shows that I have a bunch of CGIs deployed 
whose first action is to print the usual "Content-type: text/html\n\n" line, long 
before doing anything with the input or any data, which follows the standard CGI 
advice.

So, no, Chris, you're not wrong. But my "what works" practice is to do as much as 
possible before output. Maybe it's more correct to say "before final output" but I 
don't know how the webserver would know that a script has no more output. Is it 
because the client breaks the connection once it gets the html output?

Again, perhaps someone who understands more about how Apache (or IIS) controls CGI 
execution could comment.


  - Bruce

__bruce__van_allen__santa_cruz__ca__


Re: expected to be defined in a dynamic image

2003-11-21 Thread william ross
On 21 Nov 2003, at 22:15, Chris Nandor wrote:

At 22:01 + 2003.11.21, william ross wrote:
On 21 Nov 2003, at 20:54, Chris Nandor wrote:

In article [EMAIL PROTECTED],
 [EMAIL PROTECTED] (William Ross) wrote:
snip

In your case it looks like a broken link to libapreq. i had similar
errors with libapreq built through CPAN.pm, but I vaguely recall 
that
rebuilding it by hand seemed to work. It wasn't a new apache, 
though:
just the standard one, and i was so beset with error messages at the
time that it's hard to remember what worked.
But ... libapreq is what I am building/testing.  I can't link 
libapreq
to
itself ... so I have no idea what to do here.  Here's the complete
error:
i would have said that the errors were coming from a failure to link
Apache::Request to libapreq. in a diffident, mumbled sort of a way.
But Apache::Request is a part of libapreq.
ah. i dimly grasp the wrongness of this end of the stick and let go of 
it as if burnt.

i assumed that libapreq was the name of the C library - what the apache 
site refers to as a 'Generic Apache Request Library for manipulating 
request data via the apache api', and that Apache::Request was a 
wrapper around that to expose those functions in perl. from what you're 
saying i glean that actually libapreq is just the name of the whole 
thing? oops.

beg your pardon.

will



Re: File Writing and CGI

2003-11-21 Thread Chris Devers
On Fri, 21 Nov 2003, Bruce Van Allen wrote:

 Most of my web apps have 100% dynamic output, and the main way I cope
 with timeout issues is to keep things fast. It could be that my
 experience is from particular combinations of web server, web browser,
 and my scripts' operations. Or maybe I've drawn a wrong conclusion.

I think that's it right there.

If everything is *really* dynamic, and you can't even know a priori if the
result is going to be simple html, some other content type, a redirect, or
some other server response, then okay, I can see delaying output.

I think though that I'd try to write things in such a way that that
particular decision is as front-loaded as possible -- that is, try to
figure out which branch you're going to end up following before doing
anything else, then try to let the web client know if possible, then get
on to doing the rest of the output.

There are also simple html tricks that can help perceived speeds, e.g. if
your page layout depends on tables -- hopefully not these days, but I'm
sure it still happens -- then put some content before the tables, or try
to break the page into a vertical stack of tables or something along those
lines. That way it at least *looks* like something is happening to the web
user. I'd assume that with CSS layouts you don't need to be nearly as
clever in page construction, but haven't actually seen any tests to
verify this assumption.

 To be more specific, I noticed inconsistencies with some web apps that
 logged state & activity;

That is a valid point; I don't know what the best habit there would be. If
it's not too noisy, I personally would consider logging twice for anything
that I expected to take a long time -- first as a ping, with maybe the
parameters to be processed (if that's a reasonable amount of data to log),
and then again after processing with the finished results.

If you want to get real fancy, you could get into database style ACID
techniques, where everything is handled as a transaction and you put in
mechanisms for things like commits & rollbacks, and various commit logs
and so on -- but then at that point, the application may be complex enough
that it may start to make sense to actually let a database engine start
doing the work for you rather than hand-rolling such mechanisms. Maybe.

 Thinking about it, another reason I habitually delay printing even http
 headers is that a script's result state might be expressed as a
 redirect, an image, or as plain text, rather than html.

See above about branching; that's how I think I'd try to cope with this.

 OTOH, a quick
 look shows that I have a bunch of CGIs deployed whose first action is to
 print the usual "Content-type: text/html\n\n" line, long before doing
 anything with the input or any data, which follows the standard CGI
 advice.

...and that, generally, is my approach. Print the headers, try to print
the header part of the html (title, stylesheet, etc), and then just the
most generic part of the top of the actual page -- that for the visual
feedback for the user more than anything else.

 So, no, Chris, you're not wrong. But my "what works" practice is to do
 as much as possible before output. Maybe it's more correct to say
 "before final output" but I don't know how the webserver would know that
 a script has no more output. Is it because the client breaks the
 connection once it gets the html output?

I got an interesting mail about this offlist which I won't try to quote
or summarize unless the sender minds. Andrew? Care to comment?

 Again, perhaps someone who understands more about how Apache (or IIS)
 controls CGI execution could comment.

In general, I think your opening paragraph nailed it: there are a lot of
variables to consider, and there may be no general best practice. In
many cases, my hunch is still to send data as you have it, but there can
obviously be cases where this doesn't work in practice.

As with everything else in web programming, the only real rule of thumb is
probably just one word: Test.




-- 
Chris Devers
still baffled by the apache::dbi errors