Speak of the Devil

2006-05-19 Thread Mike Perry
British govt just started pushing for Part III of RIPA citing
terrorism and kiddie porn as major reasons to require people to
disclose encryption keys...

http://arstechnica.com/news.ars/post/20060518-6870.html

Seems we may have a strong ally on our side on this one. International
bankers might not want the local police requiring them to hand over
keys either, though they certainly have enough political influence to
stop investigations before they start I'm sure...

The UK Crypto thread that spawned this article is here:
http://www.chiark.greenend.org.uk/pipermail/ukcrypto/2006-May/080742.html

One can only hope that the Bill of Rights is enough to keep this
bullshit out of the US, but who knows.

-- 
Mike Perry
Mad Computer Scientist
fscked.org evil labs


Did you see this?

2006-05-19 Thread Eric H. Jung
U.K. Government to force handover of encryption keys
http://news.zdnet.co.uk/0,39020330,39269746,00.htm


Re: Speak of the Devil

2006-05-19 Thread Dan Mahoney, System Admin

On Thu, 18 May 2006, Mike Perry wrote:

A few varying thoughts here:

I can't speak for the British government, but if someone came to me and 
said someone is using your SSL-enabled webmail system to traffic kiddie 
porn and felt that somehow the easiest way to sniff their traffic was 
with my private key (as opposed to just asking me to tap their spool 
dir, tar up their homedir, and gladly hand over any information 
associated with them), I'd be more than willing to cooperate.  With 
probable cause.  I know warrants are difficult, but I come from a law 
enforcement family.


Sadly, the truth here is that if someone is using my server, then the 
fedgov HAS to act as if I am in on this, and will likely blow their 
investigation if they contact me -- at least this is how procedural rules 
are set up for them.


I've investigated kiddie porn complaints on my network, and let me say
this in total seriousness -- while we've all seen the Maxim-like,
young-looking models that are just recently 18 (hell, they advertise on
regular cable here in the States)...every once in a while you come across
a site like the ones in question that is so blatant, so disgusting --
where there's no question in your mind that yes, that's thirteen.  Following
that, there's a fit of nausea and a willingness to research some drug or
amount of voltage that can remove the images you've just seen from your
mind.  I'm told the sensation is about ten times worse if you're a parent.


With that said, however...

There's nothing stopping governments from logging the traffic (possibly at 
a higher level, like the upstream level) and then getting a subpoena for 
whatever key was used to encrypt it.


The PROBLEM with this method is that once the length of the warrant has 
expired, 99 percent of people out there DO NOT check CRLs.  I myself am 
guilty of this.  I.e. once the government HAS your key, they've got it for 
the lifetime of your cert -- and while you can certainly retire that cert 
from use, there's no way to prevent the now-compromised cert and key from 
being used creatively for the remainder of the validity period.
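
As a rough illustration of the mechanics (a sketch only -- it assumes the
Python "cryptography" package, and the file names are made up), checking
revocation is an explicit extra step that the relying party has to take,
and most clients never take it:

  # Sketch: revocation only helps if the client actually fetches and
  # checks the CRL.  "server.pem" and "issuer.crl" are hypothetical files.
  from cryptography import x509
  from cryptography.hazmat.backends import default_backend

  with open("server.pem", "rb") as f:
      cert = x509.load_pem_x509_certificate(f.read(), default_backend())
  with open("issuer.crl", "rb") as f:
      crl = x509.load_pem_x509_crl(f.read(), default_backend())

  entry = crl.get_revoked_certificate_by_serial_number(cert.serial_number)
  if entry is not None:
      print("certificate was revoked on", entry.revocation_date)
  else:
      print("this CRL does not list the certificate")

Unless the client does that lookup (and most don't), a key handed over
under warrant stays usable for the rest of the cert's validity period.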


Or am I wrong here?

-Dan



> British govt just started pushing for Part III of RIPA citing
> terrorism and kiddie porn as major reasons to require people to
> disclose encryption keys...
>
> http://arstechnica.com/news.ars/post/20060518-6870.html
>
> Seems we may have a strong ally on our side on this one. International
> bankers might not want the local police requiring them to hand over
> keys either, though they certainly have enough political influence to
> stop investigations before they start I'm sure...
>
> The UK Crypto thread that spawned this article is here:
> http://www.chiark.greenend.org.uk/pipermail/ukcrypto/2006-May/080742.html
>
> One can only hope that the Bill of Rights is enough to keep this
> bullshit out of the US, but who knows.




--

Don't be so depressed dear.

I have no endorphins, what am I supposed to do?

-DM and SK, February 10th, 1999

Dan Mahoney
Techie,  Sysadmin,  WebGeek
Gushi on efnet/undernet IRC
ICQ: 13735144   AIM: LarpGM
Site:  http://www.gushi.org
---



[EMAIL PROTECTED]: [Clips] UK Government to force handover of encryption keys]

2006-05-19 Thread Eugen Leitl

I have no keys, and I must disclose.

- Forwarded message from R.A. Hettinga [EMAIL PROTECTED] -

From: R.A. Hettinga [EMAIL PROTECTED]
Date: Thu, 18 May 2006 14:17:16 -0400
To: [EMAIL PROTECTED]
Subject: [Clips] UK Government to force handover of encryption keys

--- begin forwarded text


  Delivered-To: [EMAIL PROTECTED]
  Delivered-To: [EMAIL PROTECTED]
  Date: Thu, 18 May 2006 14:10:20 -0400
  To: Philodox Clips List [EMAIL PROTECTED]
  From: R.A. Hettinga [EMAIL PROTECTED]
  Subject: [Clips] UK Government to force handover of encryption keys
  Reply-To: [EMAIL PROTECTED]
  Sender: [EMAIL PROTECTED]

  http://www.zdnet.co.uk/print/?TYPE=storyAT=39269746-39020330t-1025c




  Government to force handover of encryption keys

  Tom Espiner

  ZDNet UK

  May 18, 2006, 12:10 BST

  The UK Government is preparing to give the police the authority to force
  organisations and individuals to disclose encryption keys, a move which has
  outraged some security and civil rights experts.

  The powers are contained within Part 3 of the Regulation of Investigatory
  Powers Act (RIPA). RIPA was introduced in 2000, but the government has held
  back from bringing Part 3 into effect. Now, more than five years after the
  original act was passed, the Home Office is seeking to exercise the powers
  within Part Three of RIPA.

  Some security experts are concerned that the plan could criminalise
  innocent people and drive businesses out of the UK. But the Home Office,
  which has just launched a consultation process, says the powers contained
  in Part 3 are needed to combat an increased use of encryption by criminals,
  paedophiles, and terrorists.

  "The use of encryption is... proliferating," Liam Byrne, Home Office
  minister of state, told Parliament last week. "Encryption products are more
  widely available and are integrated as security features in standard
  operating systems, so the Government has concluded that it is now right to
  implement the provisions of Part 3 of RIPA... which is not presently in
  force."

  Part 3 of RIPA gives the police powers to order the disclosure of
  encryption keys, or force suspects to decrypt encrypted data.

  Anyone who refuses to hand over a key to the police would face up to two
  years' imprisonment. Under current anti-terrorism legislation, terrorist
  suspects now face up to five years for withholding keys.

  If Part 3 is passed, financial institutions could be compelled to give up
  the encryption keys they use for banking transactions, experts have warned.



  "The controversy here [lies in] seizing keys, not in forcing people to
  decrypt. The power to seize encryption keys is spooking big business,"
  Cambridge University security expert Richard Clayton told ZDNet UK on
  Wednesday.

  "The notion that international bankers would be wary of bringing master
  keys into the UK if they could be seized as part of legitimate police
  operations, or by a corrupt chief constable, has quite a lot of traction,"
  Clayton added. "With the appropriate paperwork, keys can be seized. If
  you're an international banker you'll plonk your headquarters in Zurich."

  Opponents of the RIP Act have argued that the police could struggle to
  enforce Part 3, as people can argue that they don't possess the key to
  unlock encrypted data in their possession.

  "It is, as ever, almost impossible to prove 'beyond a reasonable doubt'
  that some random-looking data is in fact ciphertext, and then prove that
  the accused actually has the key for it, and that he has refused a proper
  order to divulge it," pointed out encryption expert Peter Fairbrother on
  ukcrypto, a public email discussion list.

  Clayton backed up this point. "The police can say 'We think he's a
  terrorist' or 'We think he's trading in kiddie porn', and the suspect can
  say, 'No, they're love letters, sorry, I've lost the key'. How much
  evidence do you need [to convict]? If you can't decrypt [the data], then by
  definition you don't know what it is," said Clayton.

  The Home Office on Wednesday told ZDNet UK that it would not reach a
  decision about whether Part 3 will be amended until the consultation
  process has been completed.

  "We are in consultation, and [are] looking into proposals on amendments to
  RIPA," said a Home Office spokeswoman. The Home Office is waiting for the
  results of the consultation before making any decisions, she said.

  The Home Office said last week that the focus on key disclosure and forced
  decryption was necessary due to the threat to public safety posed by
  terrorist use of encryption technology.

  Clayton, on the other hand, argues that terrorist cells do not use master
  keys in the same way as governments and businesses.

  Terrorist cells use master keys on a one-to-one basis, rather than using
  them to generate pass keys for a series of communications. With a
  one-to-one key, you may as well just force the terrorist suspect to decrypt
  that communication, or use 

Re: Speak of the Devil

2006-05-19 Thread Eugen Leitl
On Fri, May 19, 2006 at 03:59:46AM -0400, Dan Mahoney, System Admin wrote:

> I can't speak for the British government, but if someone came to me and
> said someone is using your SSL-enabled webmail system to traffic kiddie
> porn and felt that somehow the easiest way to sniff their traffic was

I can't believe you have actually bought into this tripe about
terrorists and pedophiles. Consider it the new Godwin's law:
if someone mentions pedophiles, terrorists and drug traffickers
in order to justify wiretapping, that argument is automatically
null and void.

> with my private key (as opposed to just asking me to tap their spool
> dir, tar up their homedir, and gladly hand over any information
> associated with them), I'd be more than willing to cooperate.  With

Are you running a Tor node? You should not be running a Tor node.

> probable cause.  I know warrants are difficult, but I come from a law
> enforcement family.
>
> Sadly, the truth here is that if someone is using my server, then the
> fedgov HAS to act as if I am in on this, and will likely blow their
> investigation if they contact me -- at least this is how procedural rules
> are set up for them.

So basically I can use bogus pedophile and terrorist charges to
shut down just about anybody? No doubt that's terribly convenient for
some people.
 
> I've investigated kiddie porn complaints on my network, and let me say
> this in total seriousness -- while we've all seen the Maxim-like,
> young-looking models that are just recently 18 (hell, they advertise on
> regular cable here in the States)...every once in a while you come across
> a site like the ones in question that is so blatant, so disgusting --
> where there's no question in your mind that yes, that's thirteen.  Following

What does this have to do with turning over your keys because somebody
claims that children are being violated somewhere?

> that, there's a fit of nausea and a willingness to research some drug or
> amount of voltage that can remove the images you've just seen from your
> mind.  I'm told the sensation is about ten times worse if you're a parent.

So, again, what does moral indignation have to do with cooperating with
people who you *know* would lie and bend the law to their advantage?
 
> With that said, however...
>
> There's nothing stopping governments from logging the traffic (possibly at
> a higher level, like the upstream level) and then getting a subpoena for
> whatever key was used to encrypt it.
>
> The PROBLEM with this method is that once the length of the warrant has
> expired, 99 percent of people out there DO NOT check CRLs.  I myself am
> guilty of this.  I.e. once the government HAS your key, they've got it for
> the lifetime of your cert -- and while you can certainly retire that cert
> from use, there's no way to prevent the now-compromised cert and key from
> being used creatively for the remainder of the validity period.
>
> Or am I wrong here?

Yes, you're being a good German here. Facilitating the totalitarian
takeover by cooperating instead of being difficult.

-- 
Eugen* Leitl  http://leitl.org
__
ICBM: 48.07100, 11.36820            http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE


signature.asc
Description: Digital signature


P2P revisited.

2006-05-19 Thread Watson Ladd

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

If we created a P2P client using Tor that acted as an exit node, we
could get a lot more users, a lot more traffic, and a lot more
capacity, all adding to the anonymity Tor provides.  Any downsides?
I'm not saying Tor should implement a P2P network; I'm saying we would
have two clients: one just Tor, the other a P2P client built on top.


Sincerely,
Watson Ladd
- ---
Those who would give up Essential Liberty to purchase a little  
Temporary Safety deserve neither  Liberty nor Safety.
- -- Benjamin Franklin 



-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.3 (Darwin)

iD8DBQFEbQHXGV+aWVfIlEMRAslQAKCx9UHkcn+i7c2rC8xsg0lUoMPfHgCgj5TY
F06trCjAsBgN8kbRmvc+VNc=
=DbPD
-----END PGP SIGNATURE-----


Re: Speak of the Devil

2006-05-19 Thread Landorin
I agree with your posting except for the following:

Eugen Leitl wrote:


> Yes, you're being a good German here. Facilitating the totalitarian
> takeover by cooperating instead of being difficult.

That was totally inappropriate, and since then I have wondered what
prejudices you base your judgement on... I don't want an answer to
this, since this isn't a flame board and shouldn't turn into one. Just
something for you to think about.
Also, if someone holds a contrary position, one should talk to him in an
objective way and not the way you just did. Otherwise don't be
surprised if people who believe more control leads to more security
won't even listen to you.
As I said, do not respond to this, just think about it.

-- 
Accelerate cancer research with your PC:
http://www.chem.ox.ac.uk/curecancer.html

GPG key ID: 4096R/E9FD5518



Re: P2P revisited.

2006-05-19 Thread Fabian Keil
Watson Ladd [EMAIL PROTECTED] wrote:

> If we created a P2P client using Tor that acted as an exit node, we
> could get a lot more users, a lot more traffic, and a lot more
> capacity, all adding to the anonymity Tor provides.  Any downsides?

While it could motivate some people to run Tor on their servers
and thus add capacity, I believe it's more likely that it
would motivate more people to block as much Tor traffic
as possible, and lead to congestion of the network.

Fabian
-- 
http://www.fabiankeil.de/


signature.asc
Description: PGP signature


Re: Did you see this?

2006-05-19 Thread Steve Crook
On Thu, May 18, 2006 at 07:16:49PM -0700, Eric H. Jung wrote:
> U.K. Government to force handover of encryption keys
> http://news.zdnet.co.uk/0,39020330,39269746,00.htm

Yes, once this is passed, encrypting storage with a passphrase becomes a
pointless exercise in the UK unless you are prepared to spend time at
Her Majesty's pleasure in order to protect your data.

I think the best solution is to run privacy services in a different
jurisdiction from where the operator resides.  For example, my Tor node
is located in Texas and runs from encrypted volumes that I manually
mount from the UK after a reboot.  I don't think the special
agreements between these countries currently stretch to international
demands for passphrases.  No doubt this would rapidly change if the
accusation was related to terrorism or possibly one of the other
horsemen of the infocalypse.
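
For what it's worth, the manual mount can be driven entirely from the UK
end, so the passphrase is typed locally and never stored on the remote box.
A rough sketch (it assumes SSH key authentication and a LUKS volume rather
than TrueCrypt; the host name, device and mount point are invented):

  # Sketch: unlock and mount a remote encrypted volume after a reboot,
  # typing the passphrase locally.  Host, device and mount point are
  # hypothetical; assumes cryptsetup/LUKS and root access on the server.
  import getpass
  import subprocess

  passphrase = getpass.getpass("Volume passphrase: ")
  # --key-file=- makes cryptsetup read the passphrase from stdin
  # (it does not strip a trailing newline, so send the bare string).
  subprocess.run(
      ["ssh", "root@tor-node.example.com",
       "cryptsetup luksOpen --key-file=- /dev/sdb1 torvol"
       " && mount /dev/mapper/torvol /var/lib/tor"],
      input=passphrase.encode(),
      check=True,
  )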

I'd be interested to hear other suggestions for circumventing RIPA.


plausible deniability

2006-05-19 Thread Matej Kovacic
Hi,

> Yes, once this is passed, encrypting storage with a passphrase becomes a
> pointless exercise in the UK unless you are prepared to spend time at
> Her Majesty's pleasure in order to protect your data.

I thought the plausible deniability feature of TrueCrypt was only needed
under repressive regimes like China.

I think I was wrong.

bye, Matej


Re: plausible deniability

2006-05-19 Thread Marko Sihvo

Matej Kovacic wrote:

> Hi,
>
> > Yes, once this is passed, encrypting storage with a passphrase becomes a
> > pointless exercise in the UK unless you are prepared to spend time at
> > Her Majesty's pleasure in order to protect your data.
>
> I thought the plausible deniability feature of TrueCrypt was only needed
> under repressive regimes like China.
>
> I think I was wrong.
>
> bye, Matej


Individual. Welcome to freedom*.

*( crime not included.

Society. I do not take* illegal amphetamines.

*( except on mon, tue, wed, thu, fri, sat, sun

The society pretends to offer just laws. We pretend to obey.


RE: Did you see this?

2006-05-19 Thread Tony
Hi.
 
As RIPA Part 3 is currently written, there seem to be two big holes.
 
1. Destroy the key and retain proof that you destroyed it -- e.g. microwave the
USB key.
 
It seems that the law is only really designed to cope with keys (passphrases)
that you can remember. Therefore, if you have a physical 'key file' and can
destroy it, there doesn't seem to be a penalty for that, if I read it
correctly. You can prove that you no longer possess the key -- and therefore
can't be penalised for refusing to reveal it! (A sketch of this idea follows
below.)
 
2. Keep multiple keys (e.g. a dummy volume).
 
The act specifies that if there is more than one key, you can choose which key 
to give up!
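
To make the first hole concrete, a rough sketch (not legal advice, and not
how any particular disk-encryption product works): keep the key only in a
keyfile on removable media, so that "destroying the key" means destroying
one physical object. It assumes the Python "cryptography" package, and the
paths are invented.

  # Sketch of hole 1: the key exists only as a keyfile on removable media.
  # Destroy the keyfile (e.g. microwave the USB stick) and the data is gone
  # for everyone, including you.  Paths are hypothetical.
  from cryptography.fernet import Fernet

  # One-time setup: the key is written to the USB stick and nowhere else.
  key = Fernet.generate_key()
  with open("/media/usbstick/volume.keyfile", "wb") as f:
      f.write(key)

  # Encrypt with that key; there is no passphrase to remember or disclose.
  with open("data.enc", "wb") as f:
      f.write(Fernet(key).encrypt(b"the actual data"))

  # Later, decryption works only while the keyfile still exists.
  with open("/media/usbstick/volume.keyfile", "rb") as f:
      plaintext = Fernet(f.read()).decrypt(open("data.enc", "rb").read())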
 



From: [EMAIL PROTECTED] on behalf of Steve Crook
Sent: Fri 19/05/2006 12:41
To: tor talk
Subject: Re: Did you see this?



> On Thu, May 18, 2006 at 07:16:49PM -0700, Eric H. Jung wrote:
> > U.K. Government to force handover of encryption keys
> > http://news.zdnet.co.uk/0,39020330,39269746,00.htm
>
> Yes, once this is passed, encrypting storage with a passphrase becomes a
> pointless exercise in the UK unless you are prepared to spend time at
> Her Majesty's pleasure in order to protect your data.
>
> I think the best solution is to run privacy services in a different
> jurisdiction from where the operator resides.  For example, my Tor node
> is located in Texas and runs from encrypted volumes that I manually
> mount from the UK after a reboot.  I don't think the special
> agreements between these countries currently stretch to international
> demands for passphrases.  No doubt this would rapidly change if the
> accusation was related to terrorism or possibly one of the other
> horsemen of the infocalypse.
>
> I'd be interested to hear other suggestions for circumventing RIPA.



Re: Speak of the Devil

2006-05-19 Thread Jason Holt


On Fri, 19 May 2006, Eugen Leitl wrote:

> What does this have to do with turning over your keys because somebody
> claims that children are being violated somewhere?


But, think of the children!  Won't *somebody* think of the children???

-J


RE: Did you see this?

2006-05-19 Thread Tony
I didn't say a false key, I said a dummy key: one that works, but
unlocks only a dummy outer volume -- not all the data within it. There is
no way to tell the inner contents of such a drive apart from random data.
There are several products that can do this. The act specifically says
that if there are multiple keys then you can choose which one to
release.

Destroying a false key and claiming you didn't have the key would be
illegal if you still possessed the real key.
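
A rough way to see why the inner contents can't be told apart from random
data (an illustration only, not a hidden-volume implementation; it assumes
the Python "cryptography" package and uses throwaway keys):

  # Sketch: output of a decent cipher is statistically indistinguishable
  # from the random filler a deniable container uses for "empty" space.
  import collections
  import math
  import os
  from cryptography.hazmat.backends import default_backend
  from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

  def bits_per_byte(blob):
      counts = collections.Counter(blob)
      n = len(blob)
      return -sum(c / n * math.log2(c / n) for c in counts.values())

  filler = os.urandom(1 << 20)   # random "free space"
  enc = Cipher(algorithms.AES(os.urandom(32)), modes.CTR(os.urandom(16)),
               backend=default_backend()).encryptor()
  ciphertext = enc.update(b"\x00" * (1 << 20)) + enc.finalize()

  print("random filler : %.4f bits/byte" % bits_per_byte(filler))
  print("AES-CTR output: %.4f bits/byte" % bits_per_byte(ciphertext))

Both come out at essentially 8 bits per byte, which is why an examiner
can't prove there is an inner volume at all.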


-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of Jonathan D. Proulx
Sent: 19 May 2006 17:28
To: or-talk@freehaven.net
Subject: Re: Did you see this?

On Fri, May 19, 2006 at 03:11:20PM +0100, Tony wrote:

:2. Keep multiple keys (e.g. a dummy volume).
: 
:The act specifies that if there is more than one key, you can choose
which key to give up!

That just means you can revoke the key when they're done.  Giving a
false key is not giving a key.  

You can play whatever games you want (i.e. microwave a different USB
frob while shipping the real key to a trusted associate in a country
without an extradition treaty), but that isn't a loophole in the law
that can be legally exploited; it's a dodge that can land you in heaps
more trouble if you're caught.

-Jon




Threats to anonymity set at and above the application layer; HTTP headers

2006-05-19 Thread Seth David Schoen
It's pretty well understood that anonymity can be lost at higher protocol
layers even when it's well protected at lower layers.

One eye-opening paper on this point is "Can Pseudonymity Really Guarantee
Privacy?" by Rao and Rohatgi (in the Freehaven Anonymity Bibliography):

http://www.usenix.org/publications/library/proceedings/sec2000/full_papers/rao/rao.pdf

This is a philosophically interesting problem; it prompts the question "if
pseudonymity can't guarantee privacy, what _can_?"  (Rao and Rohatgi
remind us that the authors of the Federalist Papers used pseudonyms and
were still identified solely from the evidence of their writing.)

There is also the scary field of timing attacks on users' typing:

http://www.cs.berkeley.edu/~daw/papers/ssh-use01.pdf
http://www.cs.berkeley.edu/~tygar/papers/Keyboard_Acoustic_Emanations_Revisited/ccs.pdf

(The Tygar paper is not really relevant for network surveillance, but
it shows the scariness of statistical methods for figuring out what
users are doing based on seemingly irrelevant information.)

In a sense, there are many privacy-threatening features in and above the
application layer (some of them depending on the nature and latency of a
communication):

* timing of access (what time zone are you in, when do you usually do
something?) -- for communications with non-randomized latency < 1 day

* typing patterns (cf. Cliff Stoll's _Cuckoo's Egg_ and the Song et al. paper)

* typing speed

* language comprehension and selection

* language proficiency

* idiosyncratic language use

* idiosyncratic language errors (cf. Rao and Rohatgi)

* cookies and their equivalents (cf. Martin Pool's meantime, a cookie
equivalent using client-side information that was intended for a
totally different purpose -- cache control; a sketch follows after this list)

* unique browser or other application headers or behavior (distinguishing
MSIE from Firefox from Opera? not just based on User-agent but based on
request patterns, e.g. for inline images, and different interpretations
of HTTP standards and perhaps CSS and JavaScript standards)

* different user-agent versions (including leaked information about the
platform)

* different privoxy versions and configurations
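
To make the cache-control item above concrete, here is a sketch using only
the Python standard library; everything in it is invented for illustration.
The server hands out a random ETag, the browser's cache echoes it back in
If-None-Match on later visits, and the operator gets a persistent identifier
without ever setting a cookie:

  # Sketch: an ETag used as a cookie equivalent (a "cache cookie").
  # Purely illustrative; the port and response body are arbitrary.
  import secrets
  from http.server import BaseHTTPRequestHandler, HTTPServer

  class CacheCookieHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          tag = self.headers.get("If-None-Match")
          if tag:
              # Returning visitor: the cached ETag re-identifies them.
              self.log_message("recognised visitor %s", tag)
              self.send_response(304)
              self.end_headers()
          else:
              tag = '"%s"' % secrets.token_hex(8)
              self.send_response(200)
              self.send_header("ETag", tag)
              self.send_header("Cache-Control", "max-age=31536000")
              self.send_header("Content-Type", "image/gif")
              self.end_headers()
              self.wfile.write(b"GIF89a")   # placeholder body

  if __name__ == "__main__":
      HTTPServer(("", 8080), CacheCookieHandler).serve_forever()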

I'm not sure what to do to mitigate these things.  The Rao paper alone
strongly suggests that providing privacy up to the application layer
will not always make communications unlinkable (and then there is the
problem of insulating what pseudonymous personae are supposed to know
about or not know about, and the likelihood of correlations between
things they mention).

These problems are alluded to on the Tor web site:

   Tor can't solve all anonymity problems. It focuses only on protecting
   the transport of data. You need to use protocol-specific support
   software if you don't want the sites you visit to see your identifying
   information. For example, you can use web proxies such as Privoxy
   while web browsing to block cookies and withhold information about
   your browser type.

   Also, to protect your anonymity, be smart. Don't provide your name
   or other revealing information in web forms. Be aware that, like
   all anonymizing networks that are fast enough for web browsing, Tor
   does not provide protection against end-to-end timing attacks: If
   your attacker can watch the traffic coming out of your computer,
   and also the traffic arriving at your chosen destination, he can
   use statistical analysis to discover that they are part of the same
   circuit.

However, the recommendation to use Privoxy, by itself, is far from
solving the problem of correlations between user and user sessions.
I think a low-hanging target is the uniqueness of HTTP headers sent by
particular users of HTTP and HTTPS over Tor.  Accept-Language, User-Agent,
and a few browser-specific features are likely to reveal locale and OS
and browser version -- sometimes relatively uniquely, as when someone
uses a Linux distribution that ships with a highly specific build of
Firefox -- and this combination may serve to make people linkable or
distinguishable in particular contexts.  Privoxy does _not_, depending on
its configuration, necessarily remove or rewrite all of the potentially
relevant HTTP protocol headers.  Worse, different Privoxy configurations
may actually introduce _new_ headers or behaviors that further serve to
differentiate users from one another.
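
A toy way to see how quickly this partitions users (the sample requests
below are invented):

  # Toy sketch: a handful of "harmless" headers already splits users into
  # small anonymity sets.  The sample data is invented for illustration.
  from collections import Counter

  requests = [
      {"User-Agent": "Mozilla/5.0 (X11; Linux i686; rv:1.8.0.3) Firefox/1.5.0.3",
       "Accept-Language": "en-us,en;q=0.5"},
      {"User-Agent": "Mozilla/5.0 (Windows; U; Windows NT 5.1) Firefox/1.5.0.3",
       "Accept-Language": "de-de,de;q=0.8,en;q=0.5"},
      {"User-Agent": "Mozilla/5.0 (X11; Linux i686; rv:1.8.0.3) Firefox/1.5.0.3",
       "Accept-Language": "en-us,en;q=0.5"},
  ]

  fingerprints = Counter(
      (r.get("User-Agent"), r.get("Accept-Language")) for r in requests
  )
  for fp, size in fingerprints.items():
      print("anonymity set of size %d for %r" % (size, fp))

Stripping or rewriting these headers only helps if every configuration
rewrites them to the *same* values, which is exactly the problem below.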

One example is that some Privoxy configurations insert headers specifically
identifying the user as a Privoxy user and taunting the server operator;
but if some users do this and other users don't, the anonymity set is
chopped up into lots of little bitty anonymity sets.  For instance:

+add-header{X-User-Tracking: sucks}

User tracking does suck, but adding an optional header saying so has the
obvious effect of splitting the anonymity set in some circumstances into
people who send the X-User-Tracking: sucks header and people who don't.
Any variation in practice here is potentially bad for the size of the