The naive countermeasure to timing attacks is to add a random delay,
but of course an attacker can average that out by repeating the computation.
I have never heard anyone propose a delay that is derived from the input,
and perhaps from some per-machine secret, so that it is unpredictable to
an attacker but constant for any given input.  Of course this wouldn't
prevent averaging across different inputs drawn from some subset, but it
would prevent averaging over repeated queries on the same value.  Perhaps
something more clever could be done to prevent averaging across subsets --
for example, the timing of the actual computation could be used as an
input to the delay function.
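The idea above can be sketched in a few lines.  This is a hypothetical
illustration, not a vetted defense: the secret value, the delay range, and
the `pad_with_delay` wrapper are all my own inventions, chosen only to show
how an HMAC of the input under a per-machine secret yields a delay that an
attacker cannot predict but that repeats exactly for the same input.

```python
import hmac
import hashlib
import time

# Hypothetical per-machine secret; in a real deployment this would be
# generated once per machine and stored like other key material.
MACHINE_SECRET = b"example-per-machine-secret"

def keyed_delay(input_bytes: bytes, max_delay_s: float = 0.01) -> float:
    """Map HMAC(machine_secret, input) into [0, max_delay_s).

    The result is unpredictable without the secret, but identical every
    time the same input is seen, so repeated measurements of one input
    cannot be averaged to strip the delay out.
    """
    digest = hmac.new(MACHINE_SECRET, input_bytes, hashlib.sha256).digest()
    # Interpret the first 8 bytes of the MAC as a fraction in [0, 1).
    frac = int.from_bytes(digest[:8], "big") / 2**64
    return frac * max_delay_s

def pad_with_delay(compute, input_bytes: bytes):
    """Run the real computation, then sleep for the input-keyed delay."""
    result = compute(input_bytes)
    time.sleep(keyed_delay(input_bytes))
    return result
```

Note that, as the text says, this only blinds repeated queries on the same
value; an attacker averaging over many distinct inputs still sees the
underlying computation's timing plus noise that averages to a constant.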
--  -><-
"We already have enough fast, insecure systems." -- Schneier & Ferguson
GPG fingerprint: 50A1 15C5 A9DE 23B9 ED98 C93E 38E9 204A 94C2 641B