"Anthony Gardner" <[EMAIL PROTECTED]> writes:

> I have a problem.
> 
> I want to use data in XML format and store it in memory, for the
> obvious reasons of speed and to utilise XML's capabilities. I also
> want to load the data at server start-up time.
> 
> The problem arises with the size of the data. I end up with six
> children, each with 150MB of data. Needless to say, I've run out of
> memory and the error_log reports errors about not being able to
> fork.
> 
> I really would like to keep the data in XML format and not use a
> DB. How can I make the children reference a single copy of the data
> rather than each holding their own?
> 
> I've tried assigning the $doc from $xp->parsefile to IPC::Shareable
> but that keels over under the load.
> 
> I could maybe get away with not loading the data at startup time, but
> how would the children reference it then?

I think you're going about the problem wrong.  If you have that much
data, it makes sense to store it in a database.  Leave as much of it
on the file system as possible instead of in process memory -- that
way each process can utilize the OS's caching of the file itself and
not worry about Perl-level memory sharing.

Even _one_ instance of 150MB of data is inefficient.  As nice as XML
is, for some things it just isn't appropriate.  You could write a
script to translate the XML into a database, perhaps using MySQL or
PostgreSQL.
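Something along these lines would do it -- a rough, untested sketch
using XML::Twig and DBI, where the 'record' element and the table and
column names are just placeholders for whatever your data actually
looks like.  The point is that it streams the XML into the table
without ever holding a whole document in memory:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use XML::Twig;
    use DBI;

    # Placeholder DSN, credentials, and schema -- adjust to taste.
    my $dbh = DBI->connect('dbi:mysql:mydata', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });
    my $sth = $dbh->prepare(
        'INSERT INTO records (id, name, body) VALUES (?, ?, ?)');

    my $twig = XML::Twig->new(
        twig_handlers => {
            record => sub {
                my ($t, $elt) = @_;
                $sth->execute($elt->att('id'),
                              $elt->first_child_text('name'),
                              $elt->first_child_text('body'));
                $t->purge;   # discard what we've handled; memory stays flat
            },
        },
    );

    $twig->parsefile('data.xml');   # one of your 150MB files
    $dbh->commit;
    $dbh->disconnect;

Run that once per file at load time and the children never need to
parse XML at all.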

I think you'll continue to bang your head against a wall unless you
change your strategy; slurping 150MB of data at a time isn't the
appropriate way to solve almost any problem.  Just use DBI; you'll be
happier for it :)
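
Inside a mod_perl child, a lookup then costs one indexed query instead
of a walk over a giant in-memory tree.  Roughly like this -- again the
table and column names are made up, and if you load Apache::DBI at
server startup the connection gets cached per child rather than
reopened on every request:

    use strict;
    use DBI;

    sub lookup_record {
        my ($id) = @_;   # e.g. pulled from the request args
        my $dbh = DBI->connect('dbi:mysql:mydata', 'user', 'pass',
                               { RaiseError => 1 });
        # Returns a hashref of the row, or undef if there's no match.
        return $dbh->selectrow_hashref(
            'SELECT name, body FROM records WHERE id = ?', undef, $id);
    }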

Chip

-- 
Chip Turner                   [EMAIL PROTECTED]
                              Programmer, ZFx, Inc.  www.zfx.com
                              PGP key available at wwwkeys.us.pgp.net
