Greg Rose <[EMAIL PROTECTED]> writes:

>The scariest thing, though... at first I put in an unkeyed RC4 generator
>for the self-test data, but accidentally ran the FIPS test on a straight
>counter output... and it passed (even version 1)! I'd always assumed that
>something in the regularity of a counter would trigger it. Running through
>the buffer, XORing consecutive bytes, makes it fail quite handily, but
>might also have the undesirable effect of hiding a bias in the data, if
>there was one. I'm thinking of suggesting to NIST that a stronger test
>would be to run the test on the raw data, and then on the data after
>XORing consecutive bytes... if it's really random stuff it should pass
>both, and in the meantime this would catch all sorts of failures. Any
>comments?
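
(A minimal sketch of the two-pass idea Greg describes, in Python. The
fips_140_tests() function is a hypothetical stand-in for the real FIPS
140-1 monobit/poker/runs/long-runs battery, not part of the original
suggestion:)

    def xor_consecutive(data: bytes) -> bytes:
        """XOR each byte with its successor: b[0]^b[1], b[1]^b[2], ..."""
        return bytes(a ^ b for a, b in zip(data, data[1:]))

    def passes_both(sample: bytes) -> bool:
        # Truly random data should survive both passes.  A bare counter
        # passes the raw tests, but its first difference takes only a
        # handful of values (1, 3, 7, 15, ...), which fails immediately.
        # fips_140_tests() is assumed, not a real library call.
        return fips_140_tests(sample) and fips_140_tests(xor_consecutive(sample))
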
General-purpose data compressors (which make rather nice entropy
estimators) also have problems with counting events. The Calgary
compression corpus (the Dhrystone of the compression world) includes a
file, geo, in which every fourth byte is a zero. No standard compressor
will pick this up: they all realise that zeroes occur with ~25%
probability, but not that one occurs at every fourth byte (plus a few
more scattered in between). There will always be data patterns which
appear obvious to a human but aren't easily picked up by automated
tests, so I don't know how far it's worth chasing this thing.

Peter.
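
(An illustration of Peter's point, not from the post itself: forcing
every fourth byte of a random buffer to zero gives data that an exact
positional model would compress to 75% of raw size, while zlib, whose
Huffman coder only sees a literal occurring ~25% of the time, lands
noticeably higher, roughly 85-90% in practice:)

    import os, zlib

    raw = bytearray(os.urandom(1 << 16))
    raw[::4] = bytes(len(raw[::4]))           # zero every fourth byte

    # zlib exploits the frequency of zero but not its position mod 4,
    # so this comes out well above 0.75 * 65536 bytes.
    print(len(zlib.compress(bytes(raw), 9)))

    # De-interleaving by hand recovers the structure: drop the known
    # zeroes and what remains is incompressible random data at exactly
    # 75% of raw.
    stripped = bytes(b for i, b in enumerate(raw) if i % 4)
    print(len(zlib.compress(stripped, 9)))    # ~49152, nothing left to squeeze
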
