i have about 5 GB worth of stuff that i back up periodically (weekly to an archive server, but about quarterly to CD, for extra security). within this data there are subcategories, so it hasn't been too hard to simply make one CD for each subcategory. the data has grown so large, however, that it is impossible to fit an entire category on one CD. so, i am faced with a very difficult problem... how to cut the data up into 650 MB chunks, in a way that is easy, fast, and makes sense for accessing later.

i think it would be really nice if there were a program that would take a directory (with all of the sub-directories within it)... yes, all 5 GB worth... and simply start burning... filling up one CD after the next... prompting you to insert the next disc when each one fills up... writing the files to disc in the order they appear within the directory... while preserving the file structure. once done, you might have part of one directory on one CD and part of it on another, but the directory structure would be retained and consistent, so that you could find everything.

does something like this exist? or is there an easy way to accomplish this kind of backing up?

-wade



____________________
BYU Unix Users Group http://uug.byu.edu/ ___________________________________________________________________
List Info: http://uug.byu.edu/cgi-bin/mailman/listinfo/uug-list