On Thu, 2008-12-04 at 07:10 +0000, Mick wrote:
> Almost every time I split a large file >1G into say 200k chunks, then ftp it 
> to a server and then:

That's thousands of files!  Have you gone mad?!

> 
>  cat 1 2 3 4 5 6 7 > completefile ; md5sum -c completefile

> it fails.  Checking the split files in turn I often find one or two chunks 
> that fail on their own md5 checks.  Despite that, the concatenated file often 
> works (e.g. if it is a video file it'll play alright).

Let me understand this. Are [1..7] the split files or the checksums of
the split files?  If the former, then 'md5sum -c completefile' will fail
with "no properly formatted MD5 checksum lines found" or similar, because
"completefile" is not a list of checksums.  If the latter, then how are
you generating [1..7]?  If you are using the split(1) command to split
the file and are not passing at least "-a 3" to it, then your file is
going to be truncated, because the default suffix length is too small to
accommodate the thousands of output files needed to split a 1GB+ file
into 200k chunks. You should get an error like "split: output file
suffixes exhausted."
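
For what it's worth, here is a minimal sketch of the workflow I would
expect, assuming GNU split and md5sum, and using the hypothetical names
"bigfile" and "chunk_" purely for illustration:

  # split into 200k pieces; -d -a 4 gives fixed-width numeric suffixes,
  # enough for the ~5000+ chunks a 1GB+ file produces
  split -b 200k -d -a 4 bigfile chunk_

  # record a checksum for every chunk, plus one for the whole file
  md5sum chunk_* > chunks.md5
  md5sum bigfile > bigfile.md5

  # ... ftp chunk_*, chunks.md5 and bigfile.md5 (binary mode) ...

  # on the server: verify each chunk, reassemble, verify the whole
  md5sum -c chunks.md5
  cat chunk_* > bigfile
  md5sum -c bigfile.md5

The fixed-width suffixes matter twice over: they keep split from running
out of names, and they make "cat chunk_*" glob back in the right order.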

Maybe if you give the exact commands used I might understand this
better.

I have a feeling that this is not the most efficient method of file
transfer.
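
If the server end lets you run md5sum, transferring the file whole and
checking it in one step is simpler (a sketch, assuming a binary-mode ftp
transfer and the same hypothetical names as above):

  md5sum bigfile > bigfile.md5
  # ftp bigfile and bigfile.md5 in binary mode, then on the server:
  md5sum -c bigfile.md5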
