The scenario: I have 500 text files in a directory.  Each file
contains up to 4000 XML documents of several different types, simply
appended one after another.  The file sizes range from 20MB to 60MB,
and each individual XML document is about 10 KB.  I hit an
out-of-memory error after roughly 5-6 files.  If I restart the
script, it processes 5-6 more and then throws another memory error.

My script goes through the directory and loads each file, one by one,
into a string.  I explode the string so that each XML document ends up
in its own array element.  I then create a new SDO object using the
correct XSD for that XML type and load the XML string from the array.
I pull out the data, massage it, and write it to a MySQL database.
Finally, I clean up the created object and strings, and move on to the
next array element.
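
In outline, the loop looks roughly like this (simplified; the
delimiter, the XSD lookup, and the database code are placeholders for
the real logic, not my actual function names):

```php
<?php
// Sketch of the processing loop described above.
// xsdForType() and insertIntoMysql() are hypothetical placeholders.
$files = glob('/path/to/dir/*.txt');
foreach ($files as $file) {
    $contents = file_get_contents($file);       // 20-60 MB string
    $docs = explode('<?xml', $contents);        // one XML doc per element
    unset($contents);                           // release the big string

    foreach ($docs as $body) {
        if (trim($body) === '') { continue; }
        $xml = '<?xml' . $body;                 // restore the stripped prolog
        $das = SDO_DAS_XML::create(xsdForType($xml)); // pick XSD by doc type
        $doc = $das->loadString($xml);          // SDO document object
        insertIntoMysql($doc);                  // massage data, INSERT into MySQL
        unset($doc, $das, $xml);                // clean up before the next doc
    }
}
```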

PHP's memory_get_usage() reports only a little more than the size of
the current file I am processing, i.e. up to about 60MB.  My PHP
memory_limit is set to 512 MB and the machine has 8 GB of RAM.

Thanks
Chris


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"phpsoa" group.
To post to this group, send email to [EMAIL PROTECTED]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.co.uk/group/phpsoa?hl=en
-~----------~----~----~----~------~----~------~--~---
