I have a process that I want to run on every folder on a file system.  The
operation has to happen in the deepest child folders first and then work
back up to the parents.  Some folders contain many subfolders, which is
where parallelism fits well.  However, all child folders must be processed
before their parent.

A recursive script works well: the operations on all the children always
happen before the parent, no matter how deep the folder tree goes.  But
when combined with 'parallel', in some cases I end up with too many
processes running at once, choking the system.
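To make the problem concrete, here is a minimal sketch of that recursive pattern (the function name `process_tree` and the `echo` stand-in for the real per-folder job are my own inventions; `xargs -P` stands in for `parallel` to keep the sketch dependency-free, but both cap jobs per invocation, not globally):

```shell
#!/bin/bash
# Recursive, children-before-parent traversal. Every recursion level
# starts its own parallelizer with its own independent job limit, so the
# caps multiply: with 4 jobs per level and a tree three levels deep, up
# to 4 * 4 * 4 processes can be running at once.
process_tree() {
    local dir=$1
    # Recurse into each immediate subfolder, up to 4 at a time -- but
    # that is 4 *per level*, not 4 in total.
    find "$dir" -mindepth 1 -maxdepth 1 -type d -print0 |
        xargs -0 -r -P 4 -n 1 bash -c 'process_tree "$1"' _
    echo "done: $dir"   # stand-in for the real per-folder operation
}
export -f process_tree  # the child shells spawned by xargs need it too
```

The ordering is correct (a folder's `echo` only runs after `xargs` has finished all its children), but the total process count is unbounded in the tree depth, which matches the choking behavior described above.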

Is there a way to make multiple 'parallel' processes aware of each other,
so that they never run more than a specified number of jobs at once, while
still satisfying the dependency order?  Or maybe there's a better way to
approach this problem?
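One way to keep the child-before-parent order with a single, global job cap is to drop the recursion and process the tree level by level, deepest depth first: folders at the same depth are independent of one another, so one parallelizer per level is safe, and finishing each level before starting the shallower one guarantees every child runs before its parent. A sketch under those assumptions (the function name is hypothetical; `xargs -P` keeps it self-contained, and `parallel -j N` would work the same way; `-printf '%d'` is GNU find):

```shell
#!/bin/bash
# process_bottom_up ROOT JOBS CMD...
# Runs CMD on every directory under ROOT, deepest level first, with at
# most JOBS concurrent invocations in total.
process_bottom_up() {
    local root=$1 jobs=$2; shift 2
    local maxdepth d
    # GNU find's -printf '%d' prints each directory's depth below ROOT.
    maxdepth=$(find "$root" -type d -printf '%d\n' | sort -rn | head -n 1)
    for d in $(seq "$maxdepth" -1 0); do
        # One job-capped parallelizer per level; the loop does not
        # advance to the shallower level until this one is finished.
        find "$root" -mindepth "$d" -maxdepth "$d" -type d -print0 |
            xargs -0 -P "$jobs" -n 1 "$@"
    done
}

# Demo on a throwaway tree; 'echo' stands in for the real job.
demo=$(mktemp -d)
mkdir -p "$demo/a/b"
process_bottom_up "$demo" 4 echo
rm -r "$demo"
```

This is slightly stricter than necessary (an unrelated deep branch waits for the level barrier), but it bounds concurrency at exactly JOBS while preserving the dependency order.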

Thanks,
-Chip
