Hi Guys

I am in serious difficulty here attempting to port a legacy system. We use a
custom tag system with template pages and a custom parser. This has been
working for several years in a mod_cgi environment, but due to
performance problems it is being ported to mod_perl. The parser has been
ported and works fine, as do most of the libraries.

The problem I am facing is with our database definition files. These are
custom files which are required at run time. Each file consists of a long
series of subroutine calls whose arguments define fields, tables, etc.
They are used in conjunction with a series of internal libraries to
provide information for displaying data, handling file upload locations,
and so on. The subroutines store their data in global variables.
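
For context, a stripped-down definition file of the kind I mean might look
something like this (all names here are hypothetical; the real files are
internal):

```perl
# customers.def -- hypothetical example of one of our definition files.
# Each call below is a subroutine that exists in the *calling* module's
# namespace and pushes its arguments into global configuration variables.

define_table('customers',
    label  => 'Customers',
    upload => '/data/uploads/customers',   # file upload location
);

define_field('customers', 'name',
    type     => 'text',
    required => 1,
);

define_field('customers', 'logo',
    type => 'file',
);
```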

The subroutines called exist in the calling module's namespace. Used as
supplied, they cause a significant memory leak (~120K per request). I
have spent the past few days trying to make the system function as
expected, with no success. Ideally this data could be read and compiled
once at server start. I experimented with IPC::Shareable, but anything I
attempted with it in my startup.pl segfaulted httpd, and the server
would not start.
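
The direction I have been trying to take (sketched below, with
hypothetical package and path names) is to read each definition file once
in startup.pl and eval it into a fixed package, so the compiled data is
in place before Apache forks and the children inherit it, rather than
re-requiring the files on every request:

```perl
# startup.pl fragment -- load all definition files once, before Apache
# forks its children. Package name, sub names, and paths are hypothetical.
package OpenConnect::Defs;

use strict;
use warnings;

our %TABLES;   # populated by the define_* subs below
our %FIELDS;

# The define_* subs that the definition files call must exist in the
# package that evals the files, so they are declared here.
sub define_table { my ($name, %opts) = @_; $TABLES{$name} = \%opts }
sub define_field { my ($table, $name, %opts) = @_; $FIELDS{$table}{$name} = \%opts }

sub load_dir {
    my ($dir) = @_;
    for my $file (glob "$dir/*.def") {
        # Slurp the file and compile it inside this package.
        my $code = do { local (@ARGV, $/) = ($file); <> };
        eval "package OpenConnect::Defs;\n$code";
        die "Failed to compile $file: $@" if $@;
    }
}

1;
```

The idea is that data loaded before the fork is shared copy-on-write
across children; if anything mutates the globals per request, each child
gets its own diverging copy, which may be related to the per-child
behaviour I describe below.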

I have got myself to a point where I can get the data into memory and
work with it for some requests. With three child processes running, the
data loads for the first three requests, the next three have no data,
then one request segfaults, then no data, then the data appears again. I
am slowly going mad with this. I can get things working without all of
this by just using DBI directly, but word from above says the definition
files must be integrated.

I know there is not much to go on in this email, but if anyone wants to
help, respond by email and I can give you more information.

-- 
John Reid
Senior Analyst/Programmer
Open Connect (Ireland) Ltd
http://www.openconnect.ie/
