On Sat February 10 2007 08:42, Janus wrote:
> I need a simple system for regular backups. The easiest would be to make
> one big tar file out of my home directory and copy it to an external disk
> via scp.
>
> But is there a max size limit for tar files? Can I trust tar and make one
> big tar file, e.g. 25 GB of my entire home directory?
Hi Janus,

I tried what you're contemplating here and was not at all satisfied, the
principal reason being that huge tarballs are sluggish to deal with when it
comes time to recover something inadvertently broken or lost.

I now use rsync to create mirrored snapshots of directories and partitions,
both from a scheduled script and from the command line whenever I need one.
When you sync between a local and a remote machine, rsync defaults to
running over ssh. And it's reasonably easy to use, with practice. Examples:

rsync -av /home/carl/ /mnt/homebak
  --> 'a' is for 'archive' (preserve all the original attributes); 'v' is
      verbose so you can observe progress; the trailing '/' on the source
      means "don't copy the directory itself, just everything underneath
      it"; no trailing '/' seems to be required for local targets...
      works for me ;-)

rsync -av --delete /home/carl/ /mnt/homebak
  --> same as above, but truly 'syncs' the target to the source by deleting
      files and directories which no longer exist on the source.

rsync -av /home/carl/ [EMAIL PROTECTED]:~/carlbackup/
  --> defaults to ssh; will prompt for a password if needed. Add the
      '--delete' parameter to maintain a true mirror of the source (items
      deleted locally since the last snapshot will be deleted at the
      target).

Of course, YMMV and all that...

hth & regards,
Carl
--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
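[The "scheduled script" approach Carl describes could look something like the
sketch below. The function name, paths, and log handling are illustrative
assumptions, not Carl's actual script; only the rsync flags come from his
examples above.]

```shell
#!/bin/sh
# snapshot: mirror one directory into another, as in Carl's examples.
# Give the source a trailing slash so only its contents are copied.
snapshot() {
    src="$1"    # e.g. /home/carl/  (hypothetical path)
    dest="$2"   # e.g. /mnt/homebak (hypothetical path)
    # -a = archive mode (preserve permissions, times, ownership, links)
    # -v = verbose, so progress is visible in the log
    # --delete = remove files at dest that no longer exist at src,
    #            keeping dest a true mirror of src
    rsync -av --delete "$src" "$dest"
}
```

Run it from cron, e.g. `snapshot /home/carl/ /mnt/homebak`, or point the
second argument at a remote like `[EMAIL PROTECTED]:~/carlbackup/` and rsync
will go over ssh as described above.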
