On Sat, 27 Oct 2007 15:29:07 -0500 Loye Young <[EMAIL PROTECTED]> wrote:
> > Your analogy with meteorites is not correct.
> >
> > --
> > Peter van der Does
>
> You are so close to the trees that you are missing the forest. The
> comparison is between a full CD that has low or no compression versus a
> full CD that has very high compression, written using the same physical
> device. My original observation was that there is a tradeoff between
> getting more on the CD via higher compression on the one hand, and
> getting lower failure rates on CDs with lower compression on the other.
> This is in accord with observation and with theoretical analysis.
>
> When speaking of compression, we aren't actually speaking about disk
> surface that has no data written to it. Instead, we are really referring
> to whole sections of disk real estate that are filled with zeros or
> other repetitive, unimportant data. If you need to make better use of
> areas that are literally unpopulated, you employ defragmentation, which
> is a related but different technique needed for antiquated file systems.
>
> On a typical CD, the entire CD is populated, as Soren rightly mentioned,
> with zeros and ones. The physical device makes the same number of
> reads/writes whether the data is compressed or not, and the error rate
> is the same either way. On the surface, it would appear that compression
> doesn't introduce significantly more error.
>
> The difference is that on an uncompressed CD, much of what is written is
> not important. Text files, for instance, are mostly a bunch of zeros at
> the physical layer. Compression uses algorithms to represent all those
> zeros in a shorthand way, so that the device doesn't actually have to
> write each one of them. This frees up disk real estate for more
> information. The consequence is that the compressed disk has a higher
> density of important bits and bytes on the same disk.
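The "shorthand way" of representing runs of zeros described above is essentially run-length encoding, the same scheme mentioned later in this thread for old drive firmware. A minimal sketch in Python (the function names are illustrative, not any real CD-writing API):

```python
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Collapse runs of identical bytes into (byte, run_length) pairs."""
    runs: list[tuple[int, int]] = []
    for b in data:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    """Expand (byte, run_length) pairs back into the original bytes."""
    return b"".join(bytes([b]) * n for b, n in runs)

# A "text file" that is mostly zeros at the physical layer:
raw = b"hello" + b"\x00" * 1000 + b"world"
runs = rle_encode(raw)
assert rle_decode(runs) == raw
# The 1000-byte run of zeros is stored as one (0, 1000) pair, freeing
# disc space for more data, and so raising the density of important bytes.
```

This is the simplest possible compressor; real formats (gzip, zip) do much better, but the effect on the disc is the same: fewer written bytes per byte of payload.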
> Assuming that the device has a constant error rate, and assuming that
> the CD is filled to the same capacity, it is more likely that the
> errors on compressed disks will affect something important and cause a
> failure, simply because there is more important data on the CD.

This is relative: what is important and what is not. As I said earlier,
it all depends on what you need. Say Ubuntu ships a version with GNOME
and KDE combined and, for the sake of this example, everything fits on
one disc. If there are errors in the KDE files and not in the GNOME
files, it doesn't really matter to the person who wants to install
GNOME.

To supply the same amount of data uncompressed, you need more CDs,
resulting in more chances of a read/write error. Typical compression
achieves a ratio of about 2.5:1 nowadays, which means you need about
2.5 times as many CDs to hold the same amount of data as one compressed
CD.

> I actually remember when "floppy disks" were flexible 12 inch disks
> and how amazed everyone was to get so much information on 5 1/4 inch
> disks. Engineers have made remarkable progress over the last 30 years.
> Much of the heavy lifting to make that possible was the improvements
> in error prevention, detection, and correction necessitated by the
> compression.

The fact that more data could be written on a 5 1/4 inch, and later a
3 1/2 inch, floppy had to do with miniaturization of the components,
not with compression. I know that DLT tapes have the option of hardware
compression: they compress the data before writing it to tape. And back
in the day you could buy a massive 30 MB hard drive, which usually was
a 20 MB drive whose firmware did RLE. On CDs there is no hardware
compression.

--
Peter van der Does

GPG key: E77E8E98
IRC: Ganseki on irc.freenode.net
Blog: http://blog.avirtualhome.com
Jabber ID: [EMAIL PROTECTED]

GetDeb Package Builder
http://www.getdeb.net - Software you want for Ubuntu
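To put rough numbers on the failure-rate tradeoff argued in this thread, here is a back-of-the-envelope sketch. The per-bit error rate and the 2.5:1 compression ratio are illustrative assumptions only, not measured values for any real drive:

```python
# Back-of-the-envelope comparison: one compressed CD versus the set of
# uncompressed CDs needed to hold the same data.  All numbers are
# assumptions chosen for illustration.
p_bit_error = 1e-12          # assumed chance any given bit is misread
cd_bits = 700 * 8 * 10**6    # ~700 MB CD, in bits
ratio = 2.5                  # assumed typical compression ratio

# Probability that at least one bit on a single full CD is bad:
p_cd_bad = 1 - (1 - p_bit_error) ** cd_bits

# The uncompressed copy of the same data spans `ratio` times as many
# discs, so there are proportionally more chances for a bad disc:
p_any_of_set_bad = 1 - (1 - p_cd_bad) ** ratio

print(f"one disc bad: {p_cd_bad:.4f}")
print(f"any of {ratio} discs bad: {p_any_of_set_bad:.4f}")
```

For small per-disc failure probabilities the set's failure chance is roughly `ratio` times the single disc's, which matches the intuition above: the compressed disc concentrates the risk, the uncompressed set multiplies the exposures, and neither side gets the error rate for free.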
--
ubuntu-server mailing list
[email protected]
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server
More info: https://wiki.ubuntu.com/ServerTeam
