Hello,
I've noticed that when a script reads an exceptionally large
directory (~1000 or so files), parses an exceptionally large file (~1000 or
so lines), or repeatedly runs external system commands, the memory
used for that step is not freed when the step has finished.
For example, if I parse a file, extract some of its fields, and write values
from those fields to external files, the MEM Usage indicator in the Task
Manager shoots up and stays there until the batch script (in this case, a
wizard) is finished or cancelled. If I then press the "< Back"
button on the wizard and perform the step again, the MEM Usage
shoots up again, compounding the usage. It also, as you would expect,
takes noticeably longer to perform the same step.
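For what it's worth, the general shape of the step is sketched below in Python
(the file and field names are made up for illustration; the real step is an ER
Mapper wizard script). Reading line by line and closing handles as soon as the
step ends is the behaviour I would have expected:

```python
import os
import tempfile

# Stand-in for the real data file (hypothetical contents).
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "input.txt")
with open(src, "w") as f:
    f.write("1,foo,10\n2,bar,20\n3,baz,30\n")

out_path = os.path.join(workdir, "field2.txt")

# Parse the file a line at a time, pick out one field, and write it
# to an external file. The "with" blocks guarantee the handles are
# closed, and the per-line buffers become reclaimable, as soon as
# this step finishes -- nothing is held for the life of the script.
with open(src) as fin, open(out_path, "w") as fout:
    for line in fin:
        fields = line.rstrip("\n").split(",")
        fout.write(fields[1] + "\n")
```

The question is whether the equivalent wizard step releases its memory the
same way, or holds everything until the whole script exits.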
Is there a way to free the memory used by a step like this once it has finished?
Thanks,
Lance
-----------------------------------------------------------
To make changes to your subscription, please visit our website,
http://www.ermapper.com/technicl/ermapperl/index.htm