On 07/27/2009 11:10 AM, Marc T. wrote:
>> On Mon, 2009-07-27 at 16:30 +0200, Marc T. wrote:
>>> I have a similar problem:
>>> I have to build a batch script that slices over 6,000 very large
>>> images into seven pieces each.
>> David's Batch Processor should be able to do the job, not by slicing but
>> by cropping each slice out of the image one at a time. Not the most
>> efficient technique, but it should work.
>> -- David
> thanks for the quick reply,
> but I really need to start the process from the command line, to
> integrate it into a bigger workflow.
> Efficiency is a big issue as well;
> otherwise I could just stick with ImageMagick.
> I basically just need to know how I can start Python batch scripts from
> the command line -- I can't find anything via Google...
> maybe it's too simple..
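For the seven-piece slicing itself, the crop geometry can be computed up front,
whatever tool ends up doing the cropping. A minimal Python sketch (my own
illustration, not anyone's actual script) that assumes equal-width vertical
strips, with the last strip absorbing any remainder; `slice_boxes` is just a
hypothetical helper name:

```python
def slice_boxes(width, height, pieces=7):
    """Split a width x height image into `pieces` vertical strips.

    Returns (left, upper, right, lower) crop boxes. The last strip
    absorbs any remainder so the strips cover the full width.
    """
    step = width // pieces
    boxes = []
    for i in range(pieces):
        left = i * step
        right = width if i == pieces - 1 else (i + 1) * step
        boxes.append((left, 0, right, height))
    return boxes
```

Feed each box to whatever cropper you settle on (GIMP, ImageMagick, etc.);
the geometry step is the same either way.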
I am speaking _without_ much experience here, and I am not the one who
wrote the scripts, but we process thousands of images in large & small
batches using ImageMagick. We use Perl, which has an ImageMagick module
(PerlMagick).
We have one Perl script that does the actual work, plus an outer wrapper
Perl script that finds the files that need to be processed.
We run it from the command line.
As far as I know, we are starting ImageMagick only once.
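For what it's worth, the wrapper/worker split is easy to reproduce in other
languages. A hypothetical Python sketch (not our Perl code) that builds the
ImageMagick `convert -crop` command line for one output tile; it assumes the
ImageMagick command-line tools are installed:

```python
import subprocess

def crop_command(src, dst, w, h, x, y):
    # ImageMagick crop: WxH+X+Y cuts a w-by-h region at offset (x, y).
    # +repage resets the virtual canvas so the tile has a clean origin.
    return ["convert", src, "-crop", f"{w}x{h}+{x}+{y}", "+repage", dst]

def run_crop(src, dst, w, h, x, y):
    # One process per tile; fine for thousands of images, if not maximally fast.
    subprocess.run(crop_command(src, dst, w, h, x, y), check=True)
```

The wrapper would loop over the source files and call `run_crop` once per
slice.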
The process looks at about 40,000 potential source tiff images and makes
sure that none are newer than the roughly 160,000 target jpeg images. If
any tiffs are newer (or the jpegs don't exist), we make 4 jpegs of
different pixel dimensions from each tiff, using a complicated algorithm
that scales the output dimensions from the input image size (physical
dimensions as a 300 dpi tiff).
If no images need to be made, then the whole thing runs in about 3
minutes or so.
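The freshness check itself is just mtime comparisons. A rough Python
equivalent of that part of the wrapper (a sketch with a hypothetical helper
name, not our actual script):

```python
import os

def needs_rebuild(src, targets):
    """True if any target jpeg is missing or older than the source tiff."""
    src_mtime = os.path.getmtime(src)
    for t in targets:
        if not os.path.exists(t) or os.path.getmtime(t) < src_mtime:
            return True
    return False
```

When nothing has changed, the whole run reduces to stat calls, which is why
the no-op case finishes in a few minutes even over 40,000 sources.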
(This is on a fairly old RedHat 8 linux server -- if it was running on
my new workstation and if the files themselves were on my new
workstation, it would at least double the speed.)
When images do need to be made, if the source tiffs are small (i.e. 300
KB), then they only take a couple seconds each. If the source tiffs are
large (i.e. for us 10-20 MB), then they take maybe 6-10 seconds each
depending upon what else is happening on the system. Again, these
speeds could surely be at least doubled on better hardware.
Regarding your really big images > 1 GB: if you are working with stuff
like that, then you need to be working with hardware that has enough
memory. Images that large (perhaps these are satellite photos?) are
surely "important", so get a system that can do the job. Even my
lowly workstation has 4 GB of RAM, two 1 TB hard disks, and dual
quad-core (2 GHz, I think?) processors. I think I spent $700 on it, not
including the monitor. The hardware cost is NOT the problem; the
problem is the skilled time to configure this kind of stuff to _really_
work well.
Look at Perl with ImageMagick.
Gimp-user mailing list