> I hate to be a bit lazy, but lacking all the fancy Linux
> distros, I was wondering if somebody could distill the
> essential commands needed to grab everything in bulk data
> format so that I'd just have a few .tgz files or something.
> Sort of like:
>
>     1) SF provides nightly tarballs of our repository for backup purposes.
>          http://cvs.sourceforge.net/cvstarballs/leaf-cvsroot.tar.gz
>     2) Grab all released files
>          wget -m http://prdownloads.sourceforge.net/leaf/

BTW: You can rsync these from downloads.sourceforge.net::sourceforge/leaf/
(an example invocation is in the sketch below).

>     3) Rsync some stuff
>          rsync blah
>     4) Back all that up to tape.
>
> Then I could at least guarantee to have it properly archived.
> I could probably do that once a week or two if the download
> went fast enough.  If it were 6 GB, that'd take me 11 hours
> downloading via my 150 KB/s pipe.  Backup to tape would take
> about 30 min :)  Same for verify.

I'm working on exactly that, but it will probably be a few more days until
I've got it down to a science.
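
Roughly, I have something like the following in mind (just an untested
sketch at this point; the staging directory, tape device, and rsync/wget
options are assumptions you'd want to adjust for your setup):

    #!/bin/sh
    # Weekly LEAF archive sketch (untested; adjust paths to taste).
    # Assumes a local staging area and a SCSI tape drive at /dev/st0.
    BACKUP=/var/backups/leaf
    mkdir -p $BACKUP
    cd $BACKUP || exit 1

    # 1) Nightly CVS tarball SF provides for backup purposes
    wget -N http://cvs.sourceforge.net/cvstarballs/leaf-cvsroot.tar.gz

    # 2) Released files: either mirror over HTTP...
    #      wget -m -np http://prdownloads.sourceforge.net/leaf/
    #    ...or rsync the module mentioned above (faster on repeat runs)
    rsync -av --delete downloads.sourceforge.net::sourceforge/leaf/ released/

    # 3) Any other rsync-able bits (web pages, docs) would go here.

    # 4) Dump the staging area to tape, then verify against disk
    tar -cvf /dev/st0 .
    tar -dvf /dev/st0

Since rsync only transfers changed files, runs after the first one should
take far less than the ~11 hours the initial download would need.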

I just got my new 1U server appliance, and have to play with the new toy a
bit...

In case you missed these, CompGeeks was selling an Intel 1U web server
appliance (i.e. "headless") with a 750 MHz P3, Adaptec 29160, 9 GB Quantum
drive, 256 MB RAM, and 2 onboard NICs...all for $635.  I wish I'd bought
three instead of one (I didn't know they had 64-bit Adaptec 29160s in them
at the time!!!), but then I might never get back to LEAF work ;-)

Charles Steinkuehler
http://lrp.steinkuehler.net
http://c0wz.steinkuehler.net (lrp.c0wz.com mirror)
