What's the best way to copy a large directory tree (around 3TB in
total) with a combination of large and small files? The files
currently reside on my NAS which is on my LAN (connected via gigabit
ethernet) and are mounted on my system as an NFS share. I would like to
copy all files/directories to an external hard disk connected via USB.

I care about speed, but I also care about reliability, making sure
that every file is copied, that all metadata is preserved and that
errors are handled gracefully. I've done some research, and I am
currently thinking of using tar, rsync, or a combination of the two.
Something like:

tar --ignore-failed-read -C "$SRC" -cpf - . | tar -C "$DEST" -xpvf -

(as far as I can tell, --ignore-failed-read only applies in create mode,
so it belongs on the reading tar, not the extracting one)

to copy everything initially, and then

rsync -ahSD --ignore-errors --force --delete --stats "$SRC/" "$DEST/"

to verify everything with rsync afterwards.
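Concretely, I was thinking of wrapping the two steps roughly like this
(a sketch only -- the function names and mount points are made up, and
I've added -c to the rsync pass so it compares file checksums rather
than just size and mtime):

```shell
#!/bin/sh
# Sketch: two-step copy-then-verify for a large directory tree.

# Step 1: bulk copy via a tar pipe. --ignore-failed-read keeps the
# creating tar going past unreadable files instead of aborting.
initial_copy() {
    src=$1 dest=$2
    tar --ignore-failed-read -C "$src" -cpf - . | tar -C "$dest" -xpf -
}

# Step 2: verification/cleanup pass. -c forces a checksum comparison
# (slower, but catches silently corrupted copies); --delete removes
# anything at the destination that no longer exists in the source.
verify_copy() {
    src=$1 dest=$2
    rsync -ahSDc --ignore-errors --force --delete --stats "$src/" "$dest/"
}

# Usage (hypothetical mount points):
#   initial_copy /mnt/nas/data /mnt/usb/backup
#   verify_copy  /mnt/nas/data /mnt/usb/backup
```

Without -c, rsync's verification pass would only re-copy files whose
size or mtime differ, which may not be a strong enough check here.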

What do you guys think about this? Am I missing something? Are there
better tools for this? Or other useful options for tar and rsync that
I am missing?

Cheers

--
Aryan
_______________________________________________
luv-main mailing list
[email protected]
http://lists.luv.asn.au/listinfo/luv-main
