Hi,

perhaps it would be better for an article export to access the database
directly rather than going through the OXID framework. (I use this
approach myself when exporting large numbers of articles.)
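For illustration, a minimal sketch of such a direct export. Everything here
is an assumption you would have to adapt: the DSN and credentials, the output
file name, and the column selection (OXID, OXARTNUM, OXTITLE, OXPRICE on the
oxarticles table match the standard shop schema, but check yours). The key
point is the unbuffered query, so rows are streamed one at a time instead of
the whole result set being held in PHP memory:

```php
<?php
// Minimal direct-DB export sketch. DSN, credentials, column list and
// output path are placeholders; adjust to your installation.
$pdo = new PDO('mysql:host=localhost;dbname=oxid', 'user', 'pass');

// Stream rows from MySQL instead of buffering the whole result set
// in PHP memory; this keeps memory usage roughly constant.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$out  = fopen('articles.csv', 'w');
$stmt = $pdo->query(
    "SELECT OXID, OXARTNUM, OXTITLE, OXPRICE
       FROM oxarticles
      WHERE OXACTIVE = 1"
);
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);   // one row at a time, no object instantiation
}
fclose($out);
```

You lose the framework's field handling (multishop fields, variants, price
logic), so this only fits exports where the raw table values are enough.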

BTW, how long does it take to export your 1700 articles (before the
script runs out of memory)?

Thorsten


> Hi everyone,

> Actually, I'm writing an export that is run by a cronjob. There
> are about 3200 articles to be exported, but after 1700 articles the
> script runs out of memory (ca. 130 MB). My export class inherits from
> the DynExportBase class and uses its full functionality.
> I clear these caches after each call of the "nextTick" function:
> oxUtilsObject::getInstance()->resetInstanceCache();
> oxUtils::getInstance()->cleanStaticCache();

> After preparing the export, the script uses 8.9 MB of memory (I
> measured this with memory_get_usage), and for each call of the
> native getOneArticle the memory usage grows by about 80-120 KB
> (measured after unsetting the fetched article).

> Does anyone have an idea where I could look to improve memory
> usage, or know in which places OXID is caching?

> While searching for the memory problem, I noticed that in
> DynExportBase::_initArticle articles are loaded via oxNew rather than
> oxNewArticle. Why? oxNewArticle caches instances of all loaded
> articles; is there something similar for other objects like categories,
> countries, etc.?
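For reference, when I clear caches batch-wise the loop looks roughly like
the sketch below. The batch size and variable names are illustrative, and
I am writing the getOneArticle signature from memory of the 4.x
DynExportBase, so double-check it against your version; the cache-clearing
calls are the ones from your mail:

```php
// Rough sketch of a batch-wise cache reset inside the export loop.
// $iBatch is an assumed tuning knob, not an OXID setting.
$iBatch = 100;
$blContinue = true;
for ($i = 0; $blContinue; $i++) {
    $oArticle = $this->getOneArticle($i, $blContinue);
    // ... write $oArticle to the export file ...
    unset($oArticle);
    if ($i % $iBatch === 0) {
        // Same calls as in your snippet, just less frequent:
        oxUtilsObject::getInstance()->resetInstanceCache();
        oxUtils::getInstance()->cleanStaticCache();
    }
}
```

If memory still grows even with these resets, the leak is probably outside
the instance cache, which is another argument for the direct-DB route above.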


> _______________________________________________
> dev-general mailing list
> [email protected]
> http://dir.gmane.org/gmane.comp.php.oxid.general
