OK, it looks like your setup is fine.

I can't seem to reproduce your problem, though. It does appear that the 
unbuffered output doesn't work; I'm looking into that. Meanwhile, I've 
devised a better test that verifies that the requests aren't serialized. 
If you run this handler from two clients at the same time, you should 
see the lines in /tmp/test123 being interleaved by two processes. If 
that doesn't happen and you instead get the first process's 100 lines 
followed by the second process's 100 lines, then we do indeed have a 
problem.

sub handler {
     my $r = shift;
     $r->content_type('text/plain');
     local $| = 1;   # unbuffer the client connection (currently selected fh)

     open my $fh, ">>", "/tmp/test123" or die "can't append to /tmp/test123: $!";
     # unbuffer $fh too, so the two processes' lines interleave in the file
     my $oldfh = select($fh); $| = 1; select($oldfh);

     my $i = 0;
     while ($i < 100) {
       $r->print("$$: $i \n");   # to the client
       print $fh "$$: $i \n";    # to the shared file, tagged with our pid
       sleep 1;
       $i++;
     }

     $r->print(__PACKAGE__);
     close $fh;

     return Apache::OK;
}
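If you'd like to sanity-check the interleaving idea outside Apache first, here is a standalone sketch of the same test (the file name, line count, and delays here are my own arbitrary choices, not from the handler above): it forks two writers that append pid-tagged lines to one file with unbuffered output, then counts how often adjacent lines come from different pids.

```perl
#!/usr/bin/perl
# Two forked writers append "$$: $i" lines to the same file;
# the parent then checks whether their writes interleaved.
use strict;
use warnings;

my $file = "/tmp/test123-sim";   # hypothetical scratch file
unlink $file;

for my $child (1 .. 2) {
    defined(my $pid = fork) or die "fork failed: $!";
    next if $pid;                # parent keeps forking
    open my $fh, ">>", $file or die "can't append to $file: $!";
    my $oldfh = select($fh); $| = 1; select($oldfh);   # unbuffer $fh
    for my $i (0 .. 29) {
        print $fh "$$: $i \n";
        select undef, undef, undef, 0.02;   # sleep 20 ms
    }
    close $fh;
    exit 0;
}
wait for 1 .. 2;                 # reap both children

# Count how many times consecutive lines come from different pids.
open my $in, "<", $file or die "can't read $file: $!";
my ($prev, $switches) = ("", 0);
while (<$in>) {
    my ($pid) = /^(\d+):/ or next;
    $switches++ if $prev && $pid ne $prev;
    $prev = $pid;
}
close $in;

print "pid switches: $switches\n";
print $switches > 1 ? "interleaved\n" : "serialized\n";
```

With unbuffered appends and 20 ms pauses the writes should alternate freely, so a healthy run reports many pid switches; a "serialized" verdict would mirror the problem described above.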

If this indeed doesn't work, please try the suggestion at:
http://perl.apache.org/docs/1.0/guide/debug.html#Using_the_Perl_Trace
so you can see where the second process is stalled. If you don't get 
anything from that approach, try attaching to the second process with 
gdb and see where it is stuck at the C-call level. In any case, I 
suggest you test with prefork and this setup:

<IfModule prefork.c>
StartServers         2
MinSpareServers      2
MaxSpareServers      2
MaxClients           2
MaxRequestsPerChild  0
</IfModule>

__________________________________________________________________
Stas Bekman            JAm_pH ------> Just Another mod_perl Hacker
http://stason.org/     mod_perl Guide ---> http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com
