On 11/28/2006 8:54 AM, Carl Lowenstein wrote:
On 11/21/06, Karl Cunningham <[EMAIL PROTECTED]> wrote:
Gus Wirth wrote:
#!/bin/bash
ISOFILE=$1
[ -z "${ISOFILE}" ] && ISOFILE="/data/cdr/data.iso"
SPEED=$2
[ -z "${SPEED}" ] && SPEED="16"
cdrecord -v -fs=32m -dao -overburn speed=${SPEED} dev=/dev/hdc -data "${ISOFILE}"
echo "Please stand by...  calculating md5sum..."
md5sum "${ISOFILE}" /dev/cdrom

This is nice if you have an infinitely fast CPU.  Otherwise, you are
computing the MD5 sums of two large images and then, it seems to me,
visually comparing them on two consecutive lines of screen output.
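One way around the eyeball comparison, as a rough sketch (the function
name and temp paths here are mine, not from the script above): have the
script compute both sums itself and compare them, reading only the
image's size worth of data back from the drive:

```shell
# Sketch only -- compare MD5 sums in the script instead of by eye.
# Limits the read from the device to the image's own size, so extra
# blocks past the end of the burn don't change the sum.
check_md5() {
  local image=$1 device=$2
  local size sum_img sum_dev
  size=$(stat -c %s "$image")                          # image size in bytes
  sum_img=$(md5sum < "$image" | cut -d' ' -f1)
  sum_dev=$(head -c "$size" "$device" | md5sum | cut -d' ' -f1)
  [ "$sum_img" = "$sum_dev" ]                          # 0 if they match
}
```

Call it as, e.g., check_md5 "${ISOFILE}" /dev/cdrom and test the exit
status, rather than printing two sums for a human to compare.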

Why not just do a byte-by-byte compare of the two images with cmp(1)?
Read the two images in parallel, running at CD-reader speed, with no
time taken out to compute the MD5 sums.  It also finishes early if an
error is detected early.

Things to note:  some CD drives don't know they have come to the end
of the image when reading, and continue for a few more blocks.  This
would change the comparison, whether MD5 or cmp.  So we want to
actually count the data as it is read, and stop at the end of the image.

Not all images are ISO9660 even if we call them ".iso".  So the tool
/usr/bin/isosize won't always work for determining the image size.
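Putting those two caveats together, a minimal sketch (a hypothetical
helper of my own, using the image file on disk as the size reference
since isosize can't be trusted for non-ISO9660 images):

```shell
# Sketch only -- byte-by-byte verify with cmp, stopping at the image's end.
# head -c limits the read to exactly the image's size, so drive run-out
# blocks past the end of the burn don't spoil the compare; cmp itself
# exits at the first differing byte, so a bad burn fails fast.
verify_burn() {
  local image=$1 device=$2
  local size
  size=$(stat -c %s "$image")            # size of the image file in bytes
  head -c "$size" "$device" | cmp -s - "$image"
}
```

Usage would be something like verify_burn "${ISOFILE}" /dev/cdrom,
checking the exit status.  head -c is used here instead of dd with a
block count simply because it needs no block-size arithmetic.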


   carl

Carl,

Thanks for the notes, and the code. I'll try using cmp.

Karl

--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-newbie