I could, but it would make things very difficult. Some of the entities around
id #100 could be affected by entities around id #11000, so widely separated
parts of the file would need to be manipulated at the same time.
Unfortunately, I don't think this is a top-to-bottom change for the
information at hand.
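
A streaming approach may still cope with those cross-references if the job is
split into two passes with PHP's XMLReader and XMLWriter, which handle one
node at a time instead of building the whole tree in memory the way SimpleXML
does. The sketch below assumes a flat collection of <entity id="..."> records
under a <collection> root; the file names, the "status" attribute, and the
needsChange()/transform() helpers are hypothetical stand-ins for the real
schema and business rule.

<?php
// Pass 1: stream through the file once and build a small in-memory index of
// just the facts the cross-references need -- here, hypothetically, each
// entity's "status" attribute keyed by id -- instead of a 230MB DOM tree.
$index = array();
$reader = new XMLReader();
$reader->open('collection.xml');               // hypothetical file name
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'entity') {
        $index[$reader->getAttribute('id')] = $reader->getAttribute('status');
    }
}
$reader->close();

// Pass 2: stream again, rewriting one entity at a time with the full index
// available, and write the result out incrementally with XMLWriter.
$reader = new XMLReader();
$reader->open('collection.xml');
$writer = new XMLWriter();
$writer->openUri('collection.new.xml');
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('collection');           // assumed root element name

while ($reader->read() && $reader->name !== 'entity') {
    // advance to the first <entity>
}
while ($reader->name === 'entity') {
    // Inflate only this one record; a single entity is small enough for SimpleXML.
    $node = simplexml_load_string($reader->readOuterXml());
    if (needsChange($node, $index)) {          // hypothetical business rule
        transform($node, $index);              // hypothetical
    }
    // saveXML() on the element node serializes it without an XML declaration.
    $dom = dom_import_simplexml($node);
    $writer->writeRaw($dom->ownerDocument->saveXML($dom));
    $reader->next('entity');                   // skip this subtree, go to next sibling
}

$writer->endElement();
$writer->endDocument();
$reader->close();

Peak memory then stays around the size of the id index plus a single entity,
rather than the whole 230+MB document.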

On Wed, Apr 23, 2008 at 4:36 PM, Bastien Koert <[EMAIL PROTECTED]> wrote:

>
>
> On 4/23/08, Steve Gula <[EMAIL PROTECTED]> wrote:
> >
> > I work for a company that has chosen to use XML (Software AG Tamino XML
> > database) as its storage system for an enterprise application. We need to
> > make a system-wide change to information within the database that isn't
> > feasible to do through our application's user interface. My solution was
> > to unload the XML collection in question, open it, manipulate it, then
> > write it back out. Problem is it's a 230+MB file, and even with PHP's max
> > mem set to 4096MB (of 8GB available to the system) SimpleXML claims to
> > still run out of memory. Can anyone recommend a better way for handling a
> > large amount of XML data? Thanks.
> >
>
> Can you chunk the data in any way, breaking it into smaller, more
> manageable pieces?
>
> --
>
> Bastien
>
> Cat, the other other white meat




-- 
--Steve Gula

(this email address is used for list communications only, direct contact at
this email address is not guaranteed to be read)
