On 2015-08-05 17:06, Glen Hermannsfeldt (Contractor) wrote:
> 8 bits per byte, all ones.
> ...
> (and the third compression increased the original from 90 to 110 bytes.)
> That's typical. If each compression reduced the size of the file, every
> file could be compressed with enough iterations to a single byte.
A colleague has proposed this as a compression technique: Regard any file
as a stream of ones and zeroes representing a binary number. Iteratively
subtract 1 from the number. Most of the time its length doesn't change;
some small fraction of the time a borrow ripples all the way to the left
and it gets one bit shorter. With sufficient iterations the file is
reduced to a single bit.

The algorithm is reversible; no information is lost. Simply add 1 as many
times as you subtracted 1 and the original file is restored perfectly.

(Leading zeroes spoil the technique. Devising a repair is left as an
exercise for the student.)

--
gil

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
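P.S. The scheme is easy enough to try. Here is a toy sketch in Python (the
function names and the one-byte test input are my own, purely for
illustration; only tiny inputs are feasible, since the number of
decrements grows exponentially with the file's length in bits, and the
iteration count you must store takes just as many bits as the file it
"compresses"):

```python
def compress(data: bytes) -> int:
    """'Compress' a file: treat it as one big binary number and count
    how many subtractions of 1 it takes to reach a single bit (1)."""
    n = int.from_bytes(data, "big")
    # Leading zero bytes spoil the technique: int.from_bytes drops
    # them silently, so they cannot be recovered on decompression.
    count = 0
    while n > 1:
        n -= 1
        count += 1
    return count

def decompress(count: int, length: int) -> bytes:
    """Restore the original: add 1 back exactly as many times as
    compress() subtracted it."""
    n = 1
    for _ in range(count):
        n += 1
    return n.to_bytes(length, "big")

data = b"\x07"                 # the number 7: six decrements reach 1
c = compress(data)             # c == 6
assert decompress(c, len(data)) == data
```

Lossless, just as advertised; the only catch is where to keep the count.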
