Hi Danny,

Unfortunately, it's impossible to draw any conclusions from your data
point. For example, here in the U.S. I'd actually prefer a longer
interrogation if the alternative is local law enforcement effortlessly
exploiting my device and copying all my data from it.

-Jonathan


On Monday, October 6, 2014 2:53 PM, Danny O'Brien <da...@eff.org> wrote:

 On Mon, Oct 06, 2014 at 05:56:59PM +0100, Eleanor Saitta wrote:
> On 2014.10.06 01.56, Bill Cox wrote:
> > I will have an impact on the code going forward.  Also, I am
> > entirely a pragmatist.  I am an engineer, not a cryptographer, and
> > I build stuff that works in the real world.  Can you explain a
> > deniable crypto-system that fits the real world?
> 
> It's unclear that there is one.  I'd feel far happier recommending a
> (new, continued development, audited, etc.) version of Truecrypt with
> no deniability features at all.  Using the features in such a way that
> you don't leave traces of the container has always been really, really
> difficult -- if you read the docs page on what's required to evade
> forensic detection, it should be pretty clear how unsuitable this
> feature is for regular users.  Yes, some of those might be removable
> with significant developer effort, but I'm not sure why that's worth
> it, given the larger issues.
>

I think one of the challenges here is that, to the extent that deniable
crypto-systems are used and understood in the real world, the switch
from "we will use our ingenious forensic tools to detect your
subterfuge" to "we will beat you up until you tell us the password" is
prompted by Truecrypt's presence and notoriety rather than by any
feature of the software. By that I mean that the one data point I have
comes from talking to activists, who say that if their laptops or
devices are inspected, having Tor and Truecrypt visibly installed is a
signal for further interrogation.

So we're really in a position where hiding the application from casual
inspection is more important than the cryptosystem, because the
cryptosystem is going to be bypassed by rubber-hose cryptanalysis once
it's noticed. Security developers hate this, I think, because hiding an
application's traces on a standard OS is an endless task, with no
guarantee that we haven't left some sort of fingerprint that is
trivially detectable with the right kind of tool. This is one of the
reasons why practical advice seems to be moving towards "have a secure
device which you hide" rather than "use secure software on your visible
everyday device".
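
(To illustrate how little "the right kind of tool" needs to be, here is
a hypothetical heuristic scan. It is a sketch, not any real forensic
product's method, and it assumes only the widely known properties of
Truecrypt containers: no recognizable file signature, a size that is a
multiple of 512 bytes, and contents indistinguishable from random data.)

    # Hypothetical sketch: flag large, 512-byte-aligned files whose
    # contents look like uniform random data.
    import math
    import os
    import sys

    def entropy_per_byte(data: bytes) -> float:
        """Shannon entropy (bits per byte) of a sample of file contents."""
        if not data:
            return 0.0
        counts = [0] * 256
        for b in data:
            counts[b] += 1
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts if c)

    def looks_like_container(path: str, sample_size: int = 1 << 20) -> bool:
        size = os.path.getsize(path)
        # Truecrypt volumes have no header magic and are 512-byte aligned;
        # the 1 MiB minimum size is just an arbitrary cutoff for this sketch.
        if size < (1 << 20) or size % 512 != 0:
            return False
        with open(path, "rb") as f:
            sample = f.read(sample_size)
        return entropy_per_byte(sample) > 7.9  # ~8.0 means "looks random"

    if __name__ == "__main__":
        root = sys.argv[1] if len(sys.argv) > 1 else "."
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    if looks_like_container(path):
                        print("candidate container:", path)
                except OSError:
                    pass

A few dozen lines like that are enough to surface likely candidates
across a whole disk, which is the sense in which "trivially detectable"
is meant above.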

A hidden, cordoned-off device allows us to make a much stronger
assertion about the safety of its contents, and gives us a much clearer
moment at which to say that its contents may have been breached. Under
this design, deniability really
isn't something you implement in software. Deniability comes from
physically hiding the device. There's no deniability *within* Truecrypt
because Truecrypt use itself is already perceived as an indication of
guilt.

d.

> 
> > I think we who are trying to keep TrueCrypt alive could use your
> > advice.
> 
> Happy to chat more.
> 
> E.
> 
> -- 
> Ideas are my favorite toys.
> 
-- 
Liberationtech is public & archives are searchable on Google. Violations of 
list guidelines will get you moderated: 
https://mailman.stanford.edu/mailman/listinfo/liberationtech. Unsubscribe, 
change to digest, or change password by emailing moderator at 
compa...@stanford.edu.
