I am developing a product that extracts large amounts of data from public
databases, crafts URLs from that data, then extracts information from the
resulting pages and creates Archetypes objects from that information.

The problem is that after about 2000 records I run into memory/swap space
issues (yes, there are hardware limitations).  Can anyone recommend a good
strategy for batching this process so that I don't get into swap space
issues?
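
For concreteness, here is a rough sketch of the kind of batching I have in
mind, assuming the usual Plone/ZODB setup; build_url, extract_fields, and
create_archetype are hypothetical placeholders standing in for my real
helpers, not actual code:

    # Rough sketch only: build_url, extract_fields, and create_archetype are
    # placeholders, and this assumes a Plone/ZODB environment where the
    # transaction package and portal._p_jar are available.
    import urllib2
    import transaction

    BATCH_SIZE = 200  # tune to whatever fits comfortably in RAM

    def process_in_batches(records, portal):
        """Process records in fixed-size chunks so memory use stays bounded."""
        for start in range(0, len(records), BATCH_SIZE):
            for record in records[start:start + BATCH_SIZE]:
                url = build_url(record)            # placeholder URL builder
                page = urllib2.urlopen(url).read()
                data = extract_fields(page)        # placeholder page scraper
                create_archetype(portal, data)     # placeholder AT factory call
            # Flush the finished batch: commit the transaction and shrink the
            # ZODB object cache so objects from earlier batches can be freed.
            transaction.commit()
            portal._p_jar.cacheMinimize()

Is committing and minimizing the cache per batch like this the right idea, or
is there a better pattern?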

Thanks in advance.

matthew
