Ken Moffat wrote:
> First comment: you've done it once, is it really important enough to spend more time trying to optimise it?
Well, I haven't done it yet. The problem is that I fear I won't be able to have a backup, so before trashing anything, I want to be sure there won't be any problems. All I can do is test on a subset of those thousands of files, and test again.
On the other hand, this is a way for me to learn a few things about scripting, generic tools, and healthy ways to use them, hence my RFC.
> Second comment: from here, it looks like a problem for 'awk' (printing fields from the input) - just get the data for the file into one line with constant delimiters.
From what I've read so far, I was afraid of something like that :(
Thank you for the solution you're proposing; I'll ask questions once I've tried to understand it.
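If I follow the idea, it might look something like this (probably not exactly what you mean, completely untested on my side, and assuming jhead prints its values as "Label : value" lines, which is what my field counting below also relies on):

for FILE in *.{jpg,JPG} ; do
    jhead "$FILE" | awk -F' *: ' '
        /^File name/  { name = $2 }
        /^File size/  { size = $2 }
        /^Date\/Time/ { date = $2 }
        /^Resolution/ { reso = $2 }
        END { printf "NAME: %s\tSIZE: %s\tDATE: %s\tRESO: %s\n", name, size, date, reso }
    '
done

Since each value is picked out by its label rather than by its position, a missing line would just leave that variable empty instead of moving everything around.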
I have found another one ("by myself", he says proudly!), which is:
for FILE in *.{jpg,JPG} ; do
    # keep only the four lines we care about from jhead's output
    DATA=`jhead "$FILE" | grep -F -e "File name" -e "File size" -e "Date/Time" -e "Resolution"`
    # $DATA is expanded unquoted on purpose: the runs of spaces collapse to
    # single spaces, so the field numbers given to cut stay the same for every file
    EXIF_NAME=`echo $DATA | cut -d ' ' -f 4`
    EXIF_SIZE=`echo $DATA | cut -d ' ' -f 8`
    EXIF_DATE=`echo $DATA | cut -d ' ' -f 12,13`
    EXIF_RESO=`echo $DATA | cut -d ' ' -f 16,18`
    printf "NAME: %s\tSIZE: %s\tDATE: %s %s\tRESO: %s x %s\n" $EXIF_NAME $EXIF_SIZE $EXIF_DATE $EXIF_RESO
done
I still have one problem with this: if a file is missing some of that information, the variable values are "shifted to the left", which is annoying.
The obvious solution is to keep only the files with correct tags; I'm playing with that right now (see the sketch below), but any better idea is welcome.
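What I'm playing with looks roughly like this (a rough sketch, not seriously tested): count how many of the four lines grep actually found, and skip the file when any of them is missing:

for FILE in *.{jpg,JPG} ; do
    DATA=`jhead "$FILE" | grep -F -e "File name" -e "File size" -e "Date/Time" -e "Resolution"`
    # "$DATA" is quoted here so the newlines survive and wc -l counts the matched lines
    if [ `echo "$DATA" | wc -l` -ne 4 ] ; then
        echo "skipping $FILE: some EXIF data is missing" >&2
        continue
    fi
    # ... the cut/printf part from above goes here, as before
done

That way the cut field numbers should always point at the right words, at least as long as the file names contain no spaces.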
