> BTW, I'm curious why you all are mastering images with such large files?
>
> I hope you're not putting large tar.gz files in them.  Considering that:
>
> A) All it takes is a single byte error to destroy your entire archive
>    from that point forward, and
> B) This double-archiving (remember, CD images are a type of archive
>    format) greatly increases the time it takes to access files
>
> As such, using big, single-file tar.gzs is not recommended.
That is _exactly_ what I'm doing.  When I was deciding on a backup
mechanism (tape vs. DVD), DVD seemed a lot cheaper.  I just assumed that
the tape approach, tar.gz, would also be appropriate here.  I guess it's
not really a good idea in this case.

> If you feel you must "double archive," at least use something like
> afio (cpio-compatible), which does per-file compression within the
> archive.  In addition to removing tar.gz's "single byte, total
> corruption" issue, it also lets you break an archive into multiple,
> independent archives (_unlike_ using split, which still requires you
> to have all the pieces).

OK, I'll look into afio.  Thanks for the info.

> Either that, or copy the tree you want to master, recursively gzip,
> bzip2 or lzop the files themselves, and then master that.  In addition
> to massively decreasing susceptibility to single-byte errors, you can
> directly browse your tree on CD and easily restore individual files.
> If you'd prefer this, I have a script that will do this for you (it
> was published in the April 2002 edition of SysAdmin).

That sounds like a good idea too.  Actually, I would like to read the
whole article.  Do you know if it is possible to buy the issue as a PDF
somewhere?  If not, I will just buy a hard copy.

Thanks a lot for the info.

Scott Talbert

_______________________________________________
Dvdrtools-users mailing list
[EMAIL PROTECTED]
http://mail.nongnu.org/mailman/listinfo/dvdrtools-users
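[Editor's note: the "copy the tree, compress each file, then master" approach
discussed in the quoted reply can be sketched as a short shell script.  This
is a minimal illustration, not the script from the SysAdmin article; the
function name and paths are made up, and the mkisofs invocation at the end is
only an example of the mastering step.]

```shell
#!/bin/sh
# Sketch: stage a copy of a tree with per-file compression before
# mastering it onto a CD/DVD.  Because every file is its own .gz,
# a single bad byte on the disc damages one file, not the whole backup.

compress_tree() {
    src="$1"    # tree to back up (originals are left untouched)
    stage="$2"  # staging copy that gets compressed in place

    cp -a "$src" "$stage"   # copy the tree, preserving attributes
    gzip -r "$stage"        # recursively gzip every regular file
}

# After staging, the tree can be mastered directly, e.g.:
#   mkisofs -r -J -o backup.iso "$stage"
# and the resulting image still lets you browse and restore
# individual files from the mounted disc.
```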
