I am running optimizations using different algorithms from the NLopt.jl 
package. Memory slowly builds up until the Julia process is killed. I thought 
the problem might be that the NLopt Opt objects leak some memory, so after 
each optimization run (there is a big loop running multiple optimization runs 
one after another) I tried the following code to release the NLopt.Opt object 
saved in the "opt" field of my nlopt object:

  # Overwrite the opt object to try to give back its memory. There seem to
  # be memory leaks when we have long-running opt runs:
  NLopt.destroy(nlopt.opt) # Not sure what the effect of this is, but we try...
  nlopt.opt = nothing
  gc()

but memory keeps building. I guess the leak could be in my own code in the 
fitness function that NLopt calls out to (it is large and complex, and thus 
hard to distill down to a small example), but that is ordinary Julia code and 
should be garbage collected. I realize this is hard to debug without more 
concrete code, but if anyone has ideas on why the Julia process might slowly 
but continuously increase its memory use (I'm running on a MacBook Pro with 
Yosemite), I'd appreciate any tips/pointers, or suggestions on how to debug 
further.
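
One thing I am considering, in case it helps narrow things down, is logging 
the process's resident memory after each run to see whether the growth 
correlates with particular algorithms or run lengths. Roughly the sketch below 
(run_one_optimization and optimization_specs are just placeholders for my 
actual code; the RSS is read via ps since I am on OS X):

  # Hypothetical helper: ask the OS for this process's resident set size (KB).
  current_rss_kb() = strip(readall(`ps -o rss= -p $(getpid())`))

  for (i, spec) in enumerate(optimization_specs)
      nlopt = run_one_optimization(spec)   # placeholder for one 15-minute NLopt run
      NLopt.destroy(nlopt.opt)             # same cleanup attempt as above
      nlopt.opt = nothing
      gc()
      println("run $i: RSS = $(current_rss_kb()) KB")
  end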

Each NLopt run is on the order of 15 minutes with around 2000 function 
evaluations.

Regards,

Robert Feldt 
