Hi!

I have about 3 million PNG files across some directories and I would
like to optimize them.
I wrote a shell script with this function:

optimize () {
        # Quote "$1" so paths with spaces or other special characters survive
        pngquant --force --speed 1 --ext .png 16 "$1"
        optipng -o7 -quiet "$1"
}

And I am trying to run it against all the PNGs with this:

find "$TILEDIR" -type f -name "*.png" | \
        parallel --env optimize -j +0 --eta optimize

But there is one problem: parallel starts eating all my RAM, and I
simply can't let it run. After it uses 8 GB of RAM I have to kill it
(it seems to wait for the full output of find and store all the file
paths before calling optimize() on them).

Is there a way to accomplish this task without having problems with
the memory usage, please?
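One direction I was wondering about (untested at this scale, and I may
be missing why parallel needs the whole list up front): xargs reads its
input incrementally, so it should never hold all 3 million paths in
memory at once. Below is a self-contained sketch of that plumbing, with
optimize stubbed out as an echo (so it runs without pngquant/optipng)
and a throwaway temp directory standing in for $TILEDIR:

```shell
#!/usr/bin/env bash
# Sketch: stream paths from find straight into xargs, which launches
# one worker per path and reads stdin as it goes instead of buffering.
set -euo pipefail

tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT

# Stand-ins for the real PNG tiles (hypothetical test data).
for i in 1 2 3; do : > "$tmp/tile$i.png"; done

# Stub "optimize" so the sketch runs without pngquant/optipng installed;
# the real function would do the pngquant + optipng calls here.
optimize() { echo "optimized: $1"; }
export -f optimize

# -print0 / -0 keep odd filenames safe; -P runs one worker per core;
# -n 1 hands each worker a single path, mirroring optimize()'s "$1".
result=$(find "$tmp" -type f -name '*.png' -print0 |
        xargs -0 -P "$(nproc)" -n 1 bash -c 'optimize "$1"' _ |
        sort)
echo "$result"
```

The export -f is needed so the bash -c child shells can see the
function; whether this actually avoids the buffering I'm seeing with
parallel is exactly what I'd like to confirm.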

The parallel version is 20140722.

Thank you!

Best regards,
Nelson
