Thanks, Eric.  I may give that a go, although a stand-alone process is starting 
to look like a better way...

-Kevin

> 
> From: Eric Rybski <[EMAIL PROTECTED]>
> Date: 2006/12/06 Wed AM 12:15:32 EST
> To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
> CC: [EMAIL PROTECTED],  [EMAIL PROTECTED]
> Subject: Re: Threads running in cgi scripts...
> 
> Kevin,
> 
>    Yes, that error correlates directly with a CGI timing out. (See
> http://httpd.apache.org/docs/2.0/mod/core.html#timeout for how that
> timeout is controlled.)
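> 
>    For reference, the relevant httpd.conf setting is just the following
> (the value shown is the Apache 2.0 default, not a recommendation):
> 
>     # httpd.conf
>     Timeout 300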
> 
>    If you still want to try to use persistent perl threads, you'll
> probably need to use mod_perl2 with the PerlChildInitHandler directive
> to spawn your "server" thread in each apache child process.  See the
> following link for an example of how you might do this: 
> http://www.gossamer-threads.com/lists/modperl/modperl/77651#77651
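> 
>    A rough, untested sketch of what that might look like (the package
> name, the work done in the loop, and the sleep interval are all
> placeholders):
> 
>     # MyApp/ChildInit.pm -- hypothetical handler module
>     package MyApp::ChildInit;
> 
>     use strict;
>     use warnings;
>     use threads;
>     use Apache2::Const -compile => 'OK';
> 
>     # Runs once in each apache child process as it starts up.
>     sub handler {
>         # Spawn a detached "server" thread that lives as long as
>         # this child process does.
>         threads->create(\&_server_loop)->detach();
>         return Apache2::Const::OK;
>     }
> 
>     sub _server_loop {
>         while (1) {
>             # ... load or refresh your shared data here ...
>             sleep 60;
>         }
>     }
> 
>     1;
> 
>    and in httpd.conf:
> 
>     PerlChildInitHandler MyApp::ChildInit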
> 
>    Once again, though, probably the most efficient solution is to use
> an external datasource for your data, and connect to it whenever you
> need it.  If using a separate server is too much (like memcached or a
> database server), then you could try BerkeleyDB
> (http://search.cpan.org/~pmqs/BerkeleyDB-0.31/) or SQLite
> (http://search.cpan.org/~msergeant/DBD-SQLite-1.13/lib/DBD/SQLite.pm). 
> Each can store your data in a separate data source and access it using
> in-memory shared libraries.  BerkeleyDB has the advantage of having a
> perltie interface, such that with a little work, you can abstract away
> the access to your data in a way that looks like you're using standard
> perl hashes and arrays.
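> 
>    The tie usage looks roughly like this (the file name and keys are
> made up):
> 
>     use strict;
>     use warnings;
>     use BerkeleyDB;
> 
>     # Tie a plain perl hash to an on-disk Berkeley DB file.
>     tie my %data, 'BerkeleyDB::Hash',
>         -Filename => 'lookup.db',
>         -Flags    => DB_CREATE
>         or die "Cannot open lookup.db: $! $BerkeleyDB::Error\n";
> 
>     # Reads and writes now look like ordinary hash access, but the
>     # values live in the db file, outside any one apache child.
>     $data{answer} = 42;
>     print "answer = $data{answer}\n";
> 
>     untie %data;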
> 
>    Of course, this depends on how difficult it would be to convert the
> file data to a database format and keep it updated.  Thus, I'd probably
> recommend you try the PerlChildInitHandler mod_perl2 idea first.
> 
> Regards,
> Eric
> 
> --- "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> 
> > Eric,
> > 
> > Thanks for the help.  One other thing that might shed some insight
> > into the problem is that I seem to get this error in the Apache log
> > when the thread finally dies:  (70007) The timeout specified has
> > expired: ap_content_length_filter: apr_bucket_read() failed.
> > 
> > One other question -- if I make my server in a .pm file, which creates
> > the thread, and I 'require' that .pm file in more than one script,
> > each server will be in a separate thread of execution, right?  Is
> > there any 'common memory' style of execution I can use for a daemon,
> > such that I can run one instance of the server, yet access it using
> > function calls?  Too much to ask perhaps?
> > 
> > Thanks again,
> > 
> > -Kevin
> > 
> > Eric Rybski wrote:
> > > Kevin,
> > >
> > >    If you are using a CGI script to start a perl thread, then it is
> > > highly likely apache is timing out the CGI hosting your thread.
> > > (Apache threads should block on an executed CGI until it returns, or
> > > apache times it out, whichever occurs first.)
> > >
> > >    First of all, AFAIK, fork() is not an option on Win32 as it is
> > > emulated using multiple perl interpreters in a process (e.g.
> > > ithreads).  See http://perldoc.perl.org/perlport.html and
> > > http://perldoc.perl.org/perlfork.html regarding this.
> > >
> > >    (Note that if you were running on a platform with native fork(),
> > > you would still need to isolate the STDIN, STDOUT, and STDERR pipes
> > > of the child process to truly daemonize it and allow the parent
> > > apache thread to return.)
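> > >
> > >    On a unix-type platform the usual recipe is something like the
> > > following (a simplified sketch; it will not work as-is on Win32):
> > >
> > >     use strict;
> > >     use warnings;
> > >     use POSIX 'setsid';
> > >
> > >     # Classic daemonization: fork, let the parent return to apache,
> > >     # then detach the child from the parent's std handles.
> > >     sub daemonize {
> > >         defined(my $pid = fork()) or die "fork failed: $!";
> > >         exit 0 if $pid;                 # parent exits immediately
> > >         setsid() or die "setsid failed: $!";
> > >         open STDIN,  '<',  '/dev/null' or die "stdin: $!";
> > >         open STDOUT, '>',  '/dev/null' or die "stdout: $!";
> > >         open STDERR, '>&', \*STDOUT    or die "stderr: $!";
> > >     }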
> > >
> > >    Additionally, if your apache is configured to run multiple
> > > servers (i.e. StartServers, MinSpareServers, and/or MaxSpareServers
> > > greater than 1), then the thread "data server" you are trying to
> > > create will exist as a separate, isolated instance in each apache
> > > child process.  This may or may not be an issue for what you are
> > > trying to do with your thread "server".
> > >
> > >    What is the purpose of using a long-running thread?  Just to
> > > provide access to read-only data instead of reading it from a file
> > > every time?
> > >
> > > You may wish to re-approach the problem using a data server
> > > independent from both apache and perl.  I recommend you check out
> > > http://www.danga.com/memcached/ as it's fast, reliable, fairly easy
> > > to set up, and will persist data between *all* threads and processes
> > > without any trickery.  There is also a lightweight perl API to use
> > > it, which implements the Cache::Cache interface:
> > > http://search.cpan.org/~bradfitz/Cache-Memcached-1.18/
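> > >
> > > A minimal example of that API (server address, key, and data are
> > > made up):
> > >
> > >     use strict;
> > >     use warnings;
> > >     use Cache::Memcached;
> > >
> > >     # Connect to a running memcached instance.
> > >     my $memd = Cache::Memcached->new({
> > >         servers => ['127.0.0.1:11211'],
> > >     });
> > >
> > >     # Store the parsed file data once, with a one-hour expiry...
> > >     my %parsed_data = ( example_key => 'example_value' );
> > >     $memd->set('lookup_table', \%parsed_data, 3600);
> > >
> > >     # ...then any CGI in any apache child can fetch the same copy.
> > >     my $table = $memd->get('lookup_table');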
> > >
> > > Hope this helps a bit.
> > >
> > > Regards,
> > > Eric
> > >
> > > --- "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> > >
> > > > I am trying to write a utility that contains a server which spins
> > > > up in a cgi script.  I am using a thread (ithreads), do a check to
> > > > see if it is running, and then start the server.  I have the
> > > > thread creation in a module.  I am running on Win32, using an
> > > > apache server, all relatively latest and greatest of everything.
> > > >
> > > > So, my server dies after some time of successful operation.  Is
> > > > this due to Apache limiting how long a process can run?  I am
> > > > considering a fork/exec (daemon-style), but would prefer not to,
> > > > as this poses other problems.  If I do use fork, is the only way
> > > > to communicate with it through IPC (message queue, socket, etc.)?
> > > >
> > > > Also, I don't know how to handle static data in perl.  I have a
> > > > function in my module that looks up data out of a file.  How can I
> > > > look it up once and keep it in memory thereafter?  Make it shared
> > > > and part of my ever-running thread, or is there another way?
> > > > Every time I enter my module for a function call, this data gets
> > > > read from a file.
> > > > 
> > > > Thanks in advance,
> > > > 
> > > > -Kevin
> > > > 
> > > >
> > 
> > 
> 
> 
