Hyper-threaded processors are a nightmare for security, yes :).

You are assuming the target software is collecting data continuously, as fast as it can - which, I agree, simply turns it into the designated victim :). Don't do that - the data rate is high enough that you can sample on demand, and you can afford some delay between samples.
Also make sure your sample-collection code is branch free. It can still be attacked via the cache, but it's a lot harder for the attacker to know exactly where the victim is, and the attack code has to get that exactly right. The usual assumptions apply - the attacker doesn't have root privileges.

Pete



-----owner-openssl-...@openssl.org wrote: -----
To: openssl-dev@openssl.org
From: Andy Polyakov
Sent by: owner-openssl-...@openssl.org
Date: 01/21/2012 12:53AM
Subject: Re: OS-independent entropy source?

> My comments were to clarify why this works 'quite well' on multi-user
> systems even though the underlying source may not be truely random - and
> why it may not be as usable on single user ones.

Attached is the circular cross-correlation vector for two synchronized
threads running on a multi-core, non-hyperthreading x86 processor.
"Synchronized" means that one thread blocks on a semaphore and then
collects data, while the other thread unlocks the semaphore and then
collects data. "Multi-core" means that both threads exercise the same
external memory interface. As mentioned earlier, the high spikes are a
manifestation of the system timer interrupt, nothing to worry about. But
what do we make of the fact that there are areas with effectively
"guaranteed" correlation of 0.02? How does that value translate into
"tangible" terms? Is it acceptable?

Naturally, a single-CPU system can't exhibit such behavior...
