On 2/22/07, cdouglas <[EMAIL PROTECTED]> wrote:
> The scenario is that I have 500 text files in a directory.  Each file
> contains up to 4000 XML documents of several different types just
> appended one after another.  The file size is between 20MB and 60MB.
> Each individual XML document is about 10k bytes. I get the out-of-memory
> error roughly after 5-6 files. If I restart the script, it will
> process 5-6 more and then throw another memory error.
> My script goes through the directory and loads each file one by one
> into a string.  I explode it so that I get each XML doc into its own
> array element.  I then create a new SDO object using the correct XSD
> for the XML type and load the XML string from the array.  I pull out
> data, massage it, and then write it to a MySQL database. I then clean
> up the created object and strings and go on to the next array element.
> PHP's memory_get_usage() reports only a little more than the size of
> the current file I am processing, which is up to 60MB.  My PHP memory
> limit is set to 512MB and the machine has 8GB of RAM.
> Thanks
> Chris

Thanks for that, Chris.
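
Just to check I've understood the shape of your loop, here's a minimal
sketch of what I think you're doing. The XSD ('doc.xsd'), the field
('someField') and the table ('docs') are placeholder names I've made up,
and I'm guessing at how you split the string:

<?php
// Minimal sketch of the loop as described. The XSD ('doc.xsd'), the
// field ('someField') and the table ('docs') are placeholder names.
$das = SDO_DAS_XML::create('doc.xsd');        // build the DAS once, not per document
$db  = mysqli_connect('localhost', 'user', 'pass', 'mydb');

foreach (glob('/path/to/files/*.txt') as $file) {
    $contents = file_get_contents($file);      // whole 20-60MB file as one string
    // one array element per XML document; guessing you split on the declaration
    $docs = explode('<?xml version="1.0"?>', $contents);
    unset($contents);                           // release the big string early

    foreach ($docs as $chunk) {
        if (trim($chunk) === '') {
            continue;                           // skip the empty piece before the first doc
        }
        $document = $das->loadString('<?xml version="1.0"?>' . $chunk);
        $root     = $document->getRootDataObject();
        // pull out and massage the data, then write it away
        $value = mysqli_real_escape_string($db, $root->someField);
        mysqli_query($db, "INSERT INTO docs (field) VALUES ('$value')");
        unset($document, $root);                // drop references before the next doc
    }
    unset($docs);
    printf("%s: %d bytes in use\n", $file, memory_get_usage());
}
?>

One thing worth knowing while you test: memory_get_usage() only counts
memory allocated through PHP's own allocator, so a leak inside the SDO
extension itself wouldn't show up there, which would fit the numbers
you're seeing.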

I started testing with very small files, which highlighted some performance
issues we have with loading the files in the first place, plus a memory
leak, both of which I passed over to the Tuscany team. I'll increase my
file sizes to match what you describe and see what happens now.
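
To generate files in that shape, I'll use something along these lines
(element names and counts are made up for illustration; ~10k per document
and 4000 documents gives roughly 40MB per file):

<?php
// Throwaway generator for test data: files of a few thousand ~10k XML
// documents appended back to back. Names and counts are illustrative.
$payload = str_repeat('x', 10000);               // ~10k bytes per document
for ($f = 0; $f < 10; $f++) {
    $fh = fopen(sprintf('test_%03d.txt', $f), 'w');
    for ($d = 0; $d < 4000; $d++) {              // ~40MB per file
        fwrite($fh, "<?xml version=\"1.0\"?>\n<doc id=\"$d\">$payload</doc>\n");
    }
    fclose($fh);
}
?>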


