If the best-guess technique is used, almost all of
the discs would not be filled to capacity. The data
being burned is kept online via Pioneer jukeboxes in
our data center (700 discs each). So not filling each
disc to its full capacity means we are not maximizing
our use of square footage in the data center, and we
are not using the full capacity of the jukeboxes or
the media. While best-guess would improve performance,
it would waste a lot of money. This is why I opted to
go with the slow technique instead of best guess. I
appreciate the suggestion.

I guess the perfect-world solution would be to have
some option for mkisofs -print-size to store its run
information, and another option to add a file (or
files) to that previous run. That way the problem
could be solved sort of incrementally, without
re-reading all the old files again. On CD-R this isn't
too bad, but we do a lot of DVD-R stuff too. The DVD-Rs
are holding 17,000 to 21,000 files per disc; that's a
lot of files to re-read each time! To help with this I
am getting a SCSI hard drive.
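For what it's worth, the two-stage selection loop described in this thread (cheap running total first, expensive accurate check only near the end) can be sketched roughly like this. This is only an illustration: the function names, the file list, and the flat 2% overhead stand-in are all my own inventions; in the real program the accurate check would shell out to "mkisofs -print-size <DIR>" on the staging directory, as quoted below.

```python
import subprocess

def iso_size_bytes(staging_dir):
    """Accurate check via mkisofs. -print-size reports the image size
    in 2048-byte sectors; this parsing assumes the last token of the
    output is that sector count."""
    out = subprocess.check_output(
        ["mkisofs", "-quiet", "-print-size", staging_dir])
    return int(out.split()[-1]) * 2048

def select_files(candidates, capacity, accurate_size, rough_threshold=0.9):
    """Pick files until the disc would overflow.

    candidates: list of (name, size_in_bytes) tuples.
    accurate_size: callable(selected) -> projected image size in bytes;
      only invoked once the raw running total passes
      rough_threshold * capacity, to minimize expensive -print-size runs.
    """
    selected, raw_total = [], 0
    for name, size in candidates:
        if raw_total + size > capacity:
            break
        selected.append((name, size))
        raw_total += size
        if raw_total >= rough_threshold * capacity:
            if accurate_size(selected) > capacity:
                selected.pop()   # last file pushed us over; drop it
                break
    return selected

# Stand-in for the mkisofs call so the sketch is self-contained:
# assume a flat 2% filesystem overhead (a made-up number, for
# illustration only -- real ISO 9660 overhead varies with file count).
def fake_accurate_size(selected):
    return int(sum(size for _, size in selected) * 1.02)

files = [("file%d" % i, 100_000_000) for i in range(10)]  # 10 x 100 MB
chosen = select_files(files, capacity=650_000_000,
                      accurate_size=fake_accurate_size)
print("selected %d files" % len(chosen))
```

With a real accurate_size that calls mkisofs, the loop makes the expensive check only for files added after the rough total reaches 90% of the media size, which matches the "only do the accurate check when close to full" optimization described below.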

Anyone else with an idea?

--- James Pearson <[EMAIL PROTECTED]> wrote:
> >I have the same problem. My system is automatically
> >receiving files over the network and archiving them to
> >CD-R. I wanted a way to automatically determine in my
> >own program when the CD-R would be full when selecting
> >files to be put on the CD-R. For this to be accurate
> >you need to know the filesystem overhead (as you
> >clearly state). I tried looking at the mkisofs code to
> >figure this out, but it was more complicated than I was
> >willing to do, so instead I used the command "mkisofs
> >-print-size <DIR>" to determine how big an ISO of that
> >directory would be. I then compare it to my disc size;
> >if there is more room I add another file. I do this in
> >my program after each file I select to put on the CD-R
> >to make sure I don't overrun the media size. This is
> >terribly inefficient but I don't know what else to do.
> >To help minimize the number of calls to -print-size I
> >changed my program so I only call it after the size of
> >the files equals 90%+ of the media size (only do the
> >accurate size check when the rough size check is
> >getting close to full). It is still slow but
> >tolerable. Any other suggestions would be appreciated.
> 
> This question comes up from time to time ....
> 
> There is no easy way to accurately estimate the
> overhead - except by
> running mkisofs with the -print-size option....
> 
> All I would suggest is using a 'safe' overhead
> percentage based on CDs
> you've already written.
> 
> James Pearson
> 
> 
> -- 
> To UNSUBSCRIBE, email to
> [EMAIL PROTECTED]
> with a subject of "unsubscribe". Trouble? Contact
> [EMAIL PROTECTED]
> 



