I've suggested one thing below that may get your first approach going.

There still must be a better way though... creating a new instance of
the Perl interpreter is messy.

Gary Nielson wrote:
> 
> Thank you! I had not realized you had sent a previous email as well. I
> tried this solution and so far, it has not worked -- and I have some
> questions about that. But I did find another solution that does work, and
> I wanted to share that and see if there is any disadvantage to this
> approach, if there might be more performance degradation this way or not.
> 
> First, the problems I encountered with the first approach.
> I realized that the newsedit, encrypted perl application relies on
> a variable passed to the apache Web server before initiating
> the cgi script (see below). When I invoked:
> 
> [-
> sub newsedit
> {
>         my ($self, $department) = @_;
>         unshift(@INC,
>         '/webServer/virtualDW/cannonschool.org/cgi-bin/newsEdit2') ;

Instead of tacking on ?name=Homepage when executing a CGI via a system
call, or from the command line, I believe you need to do something like
this:

          $ENV{QUERY_STRING} = "name=Homepage";

If that doesn't work, maybe try adding QUERY_STRING="name=Homepage" in
front of /usr/bin/perl on the quoted open line below (there's a fuller
sketch after that quoted code):

>         open FH, ("/usr/bin/perl /webServer/virtualDW/cannonschool.org/cgi-bin/newsEdit2/department.cgi?name=Homepage|")
>                 or die "Cannot start cgi";
>         @output = <FH> ;
>         close FH ;
>         print OUT @output ;
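
Putting those two pieces together, the open might look something like this
(paths copied from your code above; setting REQUEST_METHOD is only a guess
on my part, in case department.cgi checks it -- an untested sketch):

        # pass the parameter through the CGI environment, not the URL
        $ENV{QUERY_STRING}   = "name=Homepage";
        $ENV{REQUEST_METHOD} = "GET";   # assumption: drop this if the script doesn't care

        my $cgi = "/webServer/virtualDW/cannonschool.org/cgi-bin/newsEdit2/department.cgi";
        open FH, "/usr/bin/perl $cgi |"
                or die "Cannot start cgi: $!";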

You may want to change this code. Currently it reads all of the output
of the CGI into an array (@output) before printing it out. You could
change it to print each line as soon as it arrives from the pipe:

while (<FH>) {
        print OUT $_;   # send each line on as soon as it arrives
}
close FH;

This will use less memory, and the browser will get data as it becomes
available, rather than in one huge chunk after waiting two seconds.

> }
> -]
> 
> it did not work and I suspected that was why. I do not know for sure. Does
> this make sense?
> 
> When invoked through exec cgi, it passes the following to the apache web
> server:
> 
> <!--#set var="name" value="Homepage" -->
> <!--#exec cgi="/cgi-bin/newsEdit2/department.cgi" -->
> 
> But then I thought of another way. I do not know if this is a good
> solution or not, but it does work:
> 
> [-
> sub writepage
>         {
>         use LWP::Simple;
>         my ($self, $department) = @_;
>         $graburl = "http://www.cannonschool.org/cgi-bin/newsEdit2/department.cgi?name=$department";
>         print OUT (get $graburl);
>         }
> -]
> 
> I invoke it in an html file as:
> 
> [- $subs = $param[0];
>    $subs->writepage ("Homepage");
> -]
> 
> and it comes up. Now I am wondering if, by having to make a DNS request to
> the domain, I am creating more work and therefore slowing down execution
> more than if I could actually get the first solution to work right. Or
> whether invoking another instance of perl, in either case, amounts to about
> the same amount of work. What do you think, and if you think the first
> approach is the better way, what am I still doing wrong?

Try to get the first approach to work; it will be much faster and
easier on the server. With the second approach, two HTTP requests are
made for every request to the Embperl page.
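
For what it's worth, here is roughly how I would write the first approach
with both changes folded in (same paths and OUT handle as in your newsedit
sub; I'm assuming $department maps straight onto the "name" CGI parameter
-- an untested sketch):

sub newsedit
{
        my ($self, $department) = @_;
        my $cgi = "/webServer/virtualDW/cannonschool.org/cgi-bin/newsEdit2/department.cgi";

        # hand the parameter to the CGI through its environment;
        # no need for the unshift(@INC, ...) since the CGI runs in its own process
        $ENV{QUERY_STRING} = "name=$department";

        open FH, "/usr/bin/perl $cgi |" or die "Cannot start cgi: $!";
        while (<FH>) {
                print OUT $_;   # stream each line to the client as it arrives
        }
        close FH;
}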

-- 

Regards,

Wim Kerkhoff, Software Engineer
Merilus, Inc.  -|- http://www.merilus.com
Email: [EMAIL PROTECTED]
