"Cory Teshera-Sterne" <ctste...@gmail.com> wrote

I'm wondering if I'm going to get myself in hot water memory- or
performance-wise since these scripts are scheduled to run on thousands of
files fairly often.

It shouldn't be worse than your bash scripts, since those implicitly
start subprocesses anyway. And if you are moving some of the
functionality into Python it should be better overall.

Do I need to explicitly close the subprocesses, or clean
them up, or something

Mostly they should just close down by themselves.
Keep a check with a tool like top when testing to ensure you
don't get something stuck spinning in a loop, but it should be
OK most of the time.
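As a rough sketch of what I mean (the child command here is just an
illustrative stand-in): calling communicate() both reads the output and
waits for the child to exit, so the process is fully reaped and there is
nothing left to clean up by hand.

```python
import subprocess
import sys

# Start a short-lived child process; using sys.executable keeps
# the example portable instead of depending on a shell command.
proc = subprocess.Popen(
    [sys.executable, '-c', 'print("100 50")'],
    stdout=subprocess.PIPE,
)

# communicate() reads all the output and then waits for the child
# to exit, so no explicit cleanup is needed afterwards.
out, _ = proc.communicate()
print(out)              # b'100 50\n'
print(proc.returncode)  # 0 once the child has exited cleanly
```

If you skip reading stdout entirely, at least call proc.wait() so the
child doesn't linger as a zombie.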

An example is a call to the command-line image manipulation program,
ImageMagick:
s = subprocess.Popen(['identify', '-format', '%w %h\n', 'testimg.jpg'],
                     stdout=subprocess.PIPE)

If it's image manipulation, have you checked out the PIL package?
It can do most of what ImageMagick can do from Python. It's not
in the standard library but Google will find it for you. The only snag is
that I'm not sure whether it's on Python 3 yet, if you are using v3...

# output should be the dimensions of testimg.jpg in pixels, ie, 100 50

PIL should definitely be able to manage that kind of thing.
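For instance, getting the dimensions with no subprocess at all might
look something like this (assuming PIL, or its fork Pillow, is
installed; the image here is created on the fly just so the snippet is
self-contained):

```python
from PIL import Image  # PIL, or its modern fork Pillow

# Create a small test image so the example is self-contained.
Image.new('RGB', (100, 50)).save('testimg.jpg')

# Image.open() only reads the file header, so asking for the
# size is cheap even on large images.
width, height = Image.open('testimg.jpg').size
print('%d %d' % (width, height))  # 100 50
```

That avoids one process launch per file, which matters when you are
looping over thousands of them.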

There are also Python bindings to drive ImageMagick from within
Python, although I don't know if they are regularly maintained.

HTH,

--
Alan Gauld
Author of the Learn to Program web site
http://www.alan-g.me.uk/


_______________________________________________
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
http://mail.python.org/mailman/listinfo/tutor
