Hello,

We're hitting the numberOfBuffers limit.

In a fairly complex job we perform a lot of operations, with many operators, over 
many folders (datehours).

To split the job into smaller "batches" (and so limit the required 
"numberOfBuffers"), I loop over the batches (handling the datehours 3 by 3); for 
each batch I create a new env and then call the execute() method, roughly as in 
the sketch below.
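For reference, here is a minimal sketch of what the loop looks like (the input 
paths, output paths and the per-batch operator chain are placeholders here, the 
real job has far more operators):

import java.util.Arrays;
import java.util.List;

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class BatchedJob {

    public static void main(String[] args) throws Exception {
        // Datehour folders passed on the command line, e.g. "2016051000 2016051001 ..."
        List<String> datehours = Arrays.asList(args);

        // Handle the datehours 3 by 3, with a fresh environment per batch.
        for (int i = 0; i < datehours.size(); i += 3) {
            List<String> batch = datehours.subList(i, Math.min(i + 3, datehours.size()));

            // New environment for each batch, in the hope that its network
            // buffers are released once execute() returns.
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Union the 3 datehour folders of this batch into one DataSet.
            DataSet<String> input = env.readTextFile("hdfs:///logs/" + batch.get(0));
            for (int j = 1; j < batch.size(); j++) {
                input = input.union(env.readTextFile("hdfs:///logs/" + batch.get(j)));
            }

            // ... the actual chain of operators goes here ...
            input.writeAsText("hdfs:///output/batch-" + i);

            env.execute("batch starting at " + batch.get(0));
        }
    }
}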

However, it looks like there is no cleanup: after a while, if the number of 
batches is too large, an error is raised saying that numberOfBuffers isn't high 
enough. It looks like some kind of leak. Is there a way to clean the buffers up?
