Re: [Cryptography] real random numbers

2013-09-15 Thread John Denker
Previously I said we need to speak more carefully about these
things.  Let me start by taking my own advice:

Alas on 09/14/2013 12:29 PM, I wrote:
 a) In the linux random device, /any/ user can mix stuff into the
 driver's pool.  This is a non-privileged operation.  The idea is that
 it can't hurt and it might help.  So far so good.
 b) Contributions of the type just mentioned do *not* increase the
 driver's estimate of the entropy in the pool.  If you want to increase
 the entropy-estimate, you need to issue a privileged ioctl.

 ... step (a) cannot get anybody into trouble.  Step (b) gets you into
 trouble if you claim credit for more entropy than was actually
 contributed.

Actually it's one step more complicated than that.  Step (a)
causes problems if you /underestimate/ the entropy content of
what you contributed, i.e. if you mix it in without claiming
the credit it deserves.  The problem is that the end-user
application will try to read from the RNG and will stall
due to insufficient entropy available.

Step (b) has the opposite problem: You get into trouble if 
you /overestimate/ the entropy of what you have contributed.
This causes insidious security problems, because your allegedly 
random numbers are not as random as you think.
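For concreteness, here is a minimal sketch of the two operations,
written against the Linux <linux/random.h> interface.  The sample
buffer and the 64-bit credit figure are placeholders, not
measurements; in real use the credit must come from a defensible
lower bound on what the source actually delivered.

  /* Sketch only: mixing versus crediting on the Linux random device. */
  #include <fcntl.h>
  #include <linux/random.h>
  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>
  #include <sys/ioctl.h>
  #include <unistd.h>

  int main(void)
  {
      unsigned char sample[32] = {0};   /* placeholder noise-source data */

      int fd = open("/dev/random", O_WRONLY);
      if (fd < 0) { perror("open"); return 1; }

      /* Step (a): mix into the pool, claiming no credit.
         Any user can do this; it can't hurt and it might help. */
      if (write(fd, sample, sizeof sample) < 0)
          perror("write");

      /* Step (b): mix *and* claim credit.  Privileged, and honest
         only if entropy_count is a defensible lower bound in bits. */
      struct rand_pool_info *info = malloc(sizeof *info + sizeof sample);
      if (!info) return 1;
      info->entropy_count = 64;         /* bits claimed -- placeholder */
      info->buf_size = sizeof sample;
      memcpy(info->buf, sample, sizeof sample);
      if (ioctl(fd, RNDADDENTROPY, info) < 0)
          perror("RNDADDENTROPY");

      free(info);
      close(fd);
      return 0;
  }

The code cannot tell you whether 64 is the right number; that is
exactly the part that requires careful measurement.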


On 09/14/2013 03:12 PM, John Kelsey wrote:

 Your first two categories are talking about the distribution of 
 entropy--we assume some unpredictability exists, and we want to 
 quantify it in terms of bits of entropy per bit of output.  That's a 
 useful distinction to make, and as you said, if you can get even a 
 little entropy per bit and know how much you're getting, you can get 
 something very close to ideal random bits out.
 
 Your second two categories are talking about different kinds of 
 sources--completely deterministic, or things that can have
 randomness but don't always.  That leaves out sources that always
 have a particular amount of entropy (or at least are always expected
 to!).

That very much depends on what you mean by expected.
 -- An ill-founded expectation is little more than a wild guess,
  and it is not useful for critical applications.
 ++ OTOH a well-founded statistical expectation value is just
  what we need, and it moves the source firmly out of the
  squish category.

I say again, a squish is not reliably predictable /and/ not
reliably unpredictable.  If you have *any* trustworthy nonzero
lower bound on the entropy content, it's not a squish.

On the other hand, again and again people latch onto something
that is not reliably predictable, call it random, and try
to do something with it without establishing any such lower
bound.  This has led to disaster again and again.

There is an ocean of difference between not reliably predictable
and reliably unpredictable.

 I'd say even the squish category can be useful in two ways:
 
 a.  If you have sensible mechanisms for collecting entropy, they 
 can't hurt and sometimes help.  For example, if you sample an 
 external clock, most of the time, the answer may be deterministic, 
 but once in a while, you may get some actual entropy, in the sense 
 that the clock drift is sufficient that the sampled value could have 
 one of two values, and an attacker can't know which.

However, alas, the good guys don't know how much either, so they 
don't know how much to take credit for.  An underestimate causes the 
RNG to stall, and an overestimate means the output is not as random
as it should be.  I vehemently recommend against risking either of
these failures.
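To make the dilemma concrete, here is a hedged sketch of the kind of
squish harvesting being described: sample a high-resolution timer and
keep the low-order bit.  Nothing in the code tells you how many of
those bits an adversary could predict, which is exactly the problem.

  #include <stdint.h>
  #include <stdio.h>
  #include <time.h>

  /* Collect `rounds` low-order timer bits.  The result is "not
     reliably predictable", but this code provides no lower bound on
     how unpredictable it is: consecutive samples are heavily
     correlated, and the correlations depend on hardware and load in
     ways that have not been characterized here. */
  static uint32_t sample_timer_lsbs(int rounds)
  {
      uint32_t bits = 0;
      struct timespec ts;
      for (int i = 0; i < rounds; i++) {
          clock_gettime(CLOCK_MONOTONIC, &ts);
          bits = (bits << 1) | (uint32_t)(ts.tv_nsec & 1);
      }
      return bits;
  }

  int main(void)
  {
      /* 32 raw bits of squish: fine to mix in with zero credit,
         reckless to credit as 32 bits of entropy. */
      printf("%08x\n", (unsigned)sample_timer_lsbs(32));
      return 0;
  }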

I emphasize that there are two operations that must be considered
carefully:  
 1) Mixing stuff into the driver's pool, and
 2) taking credit for it ... the right amount of credit.

One without the other is strictly amateur hour.

 b.  If you sample enough squishes, you may accumulate a lot of 
 entropy.

You might, or you might not.  In an adversarial situation, this
is begging for trouble.  I vehemently recommend against this.

 Some ring oscillator designs are built like this, hoping to 
 occasionally sample the transition in value on one of the 
 oscillators.

Hope is not an algorithm.

 The idea is that the rest of the behavior of the oscillators might
 possibly be predicted by an attacker, but what value gets read when
 you sample a value that's transitioning between a 0 and a 1 is really
 random, changed by thermal noise.

So quantify the thermal noise already.  It sounds like you are
using the oscillator as a crude digitizer, digitizing the thermal
noise, which is the first step in the right direction.  The next 
step is to come up with a hard lower bound on the entropy density.
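The relevant first-cut quantity is the min-entropy per sample,
H_min = -log2(p_max), where p_max is the probability of the most
likely sample value.  A hedged sketch of the frequency-count version
follows; it is meaningful only if the digitized samples are
independent and identically distributed, which itself has to be
argued from the physics rather than assumed.

  #include <math.h>
  #include <stddef.h>
  #include <stdio.h>

  /* Crude per-sample min-entropy estimate for 8-bit samples:
     H_min = -log2(p_max).  Correlated samples make this an
     overestimate, so derate it heavily before claiming credit. */
  static double min_entropy_per_sample(const unsigned char *s, size_t n)
  {
      unsigned long count[256] = {0};
      unsigned long max = 0;
      for (size_t i = 0; i < n; i++)
          count[s[i]]++;
      for (int v = 0; v < 256; v++)
          if (count[v] > max)
              max = count[v];
      return -log2((double)max / (double)n);
  }

  int main(void)
  {
      /* Placeholder data; in practice feed in raw digitizer output. */
      unsigned char s[4] = {7, 7, 9, 200};
      printf("about %.2f bits per sample\n", min_entropy_per_sample(s, 4));
      return 0;
  }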

OTOH when you plug in the actual numbers, you will probably find 
that the oscillator is incredibly inefficient compared to a 
soundcard.

My main point is, there is a perfectly reasonable formalism for 
analyzing these things, so that hope is not required.

Secondarily, there is a huge industry mass-producing soundcards
at a very low price.  Very often, a soundcard is built into the
mainboard, whether you ask for it or not.  So in 

Re: [Cryptography] real random numbers

2013-09-15 Thread Jerry Leichter
On Sep 14, 2013, at 5:38 PM, Kent Borg wrote:
 Things like clock skew are usually nothing but squish ... not reliably 
 predictable, but also not reliably unpredictable. I'm not interested in 
 squish, and I'm not interested in speculation about things that might be 
 random. 
 
 I see the theoretical being the enemy of the good here.
 
 The term squish is entertaining, but be careful that, as you paint away 
 with your broad brush, you don't dismiss engineering realities that 
 matter.

 And once we have built such vaguely secure systems, why reject entropy 
 sources within those systems, merely because you think they look like 
 squish?  If there is a random component, why toss it out?  You seem to 
 respect using hashing to condition and stretch entropy--though why any 
 existing hash shouldn't also fall to your squish generalization, I don't 
 know.
You've completely missed what Denker was getting at with squish.  Squish 
never applies to a fully characterized, deterministic component like a hash.  
Squish is an unknown unknown:  Data that you don't understand, so you think 
it might be random, but you really can't be sure.  Consider the example he 
responded to, that comparing the clocks on the CPU and on the sound card 
should be usable as a source of randomness.  If you dig in to what should be 
usable means here, it comes down to:  Both clocks show some degree of random 
variation, and the random variation is uncorrelated.  That might be true - or 
it might not:  Perhaps there's some path you haven't thought of through the 
power supply that tends to synchronize the two.  Lack of imagination on the 
analyst's part does not equate to lack of correlation (or other failure modes) 
on the system's part!  (In fact, the world is full of unexpected couplings 
between nominally independent events.  I've debugged and fought failures 
in systems built on the unsupported assumption that things will 
smooth out on average.  They are always unexpected, and can be difficult to 
find after the fact.  And ... people don't seem to learn the lesson:  The next 
system makes the same bad assumptions.)

As Denker said:  Adding squish as a source of confusion in a well implemented 
mixer is at worst harmless - if you want to do it, go ahead.  But adding it as 
a source of an entropy estimate is wrong.  Either you have some way of 
estimating the entropy based on real physical modeling, or you're just making 
things up - and just making things up is not the way to build a secure system.
   -- Jerry




Re: [Cryptography] real random numbers

2013-09-15 Thread ianG

On 15/09/13 00:38 AM, Kent Borg wrote:

On 09/14/2013 03:29 PM, John Denker wrote:



And once we have built such vaguely secure systems, why reject entropy
sources within those systems, merely because you think they look
like squish?  If there is a random component, why toss it out?



He's not tossing it out, he's saying that it is no basis for measurement.

Think of the cryptography worldview -- suppliers of black boxes (message 
digests, ciphers, etc.) to the software world are obsessed with the 
properties of the black box, and want them to be reliable and 
damn near perfect.  No comebacks, no liability.


Meanwhile, in the software world, we think very differently.  We want 
stuff that is good enough, not perfect.  That's because we know that 
systems are so darn complex that the problems are going to occur 
elsewhere -- in other systems that don't have the cryptographic 
obsession, in our own mistakes, or in user issues.


E.g., SHA1 is close to perfect for almost all software needs, but for 
the cryptographers, it isn't good enough any more!  We must have SHA2, 
SHA3, etc.  The difference for most real software is pretty much like 
how many bit angels can dance on a pinhead.


As John is on the supplier side, he needs a measurement that is totally 
reliable and totally accurate.  Squish must therefore be dropped from 
that measurement.


...

You dismiss things like clock skew, but when I start to imagine ways
to defeat interrupt timing as an entropy source, your Johnson noise
source also fails: by the time the adversary has enough information
about what is going on inside the GHz-plus box to infer precise clock
phase, precise interrupt timing, and how fast the CPU responds...they
have also tapped into the code that is counting your Johnson.



Once the adversary has done that, all bets are off.  The adversary can 
now probably count the key bits in use, and is probably at the point 
where they can interfere at the bit level.


Typically, we don't build designs to that threat model; that way lie 
TPMs and other madness.  In risk terms, we accept that risk, the user 
loses, and we move on.




There are a lot of installed machines that can get useful entropy from
existing sources, and it seems you would have the man who is dying of
thirst die, because the water isn't pure enough.



It is a problem.  Those on the supplier side of the divide cannot 
deliver the water unless it is pure enough.  Those on the builder side 
don't need pure water when everything else is so much sewage.  But oh 
well, life goes on.




Certainly, if hardware manufacturers want to put dedicated entropy
sources in machines, I approve, and I am even going to use rdrand as
*part* of my random numbers, but in the meantime, give the poor servers
a sip of entropy.  (And bravo to Linux distributions that overruled the
purist Linux maintainer who thought no entropy was better than poorly
audited entropy; we are a lot more secure because of them.)



Right.  The more the merrier.



iang


Re: [Cryptography] real random numbers

2013-09-15 Thread Kent Borg

On 09/15/2013 10:19 AM, John Kelsey wrote:
But those are pretty critical things, especially (a). You need to know 
whether it is yet safe to generate your high-value keypair. For that, 
you don't need super precise entropy estimates, but you do need at 
least a good first cut entropy estimate--does this input string have 
20 bits of entropy or 120 bits? 


Yes, when I was part of designing a physical RNG product (for use in 
real gambling, for real money) we not only swept up all the 
entropy sources we could, and not only mixed in fixed information such 
as MAC addresses to further make different machines different; our 
manufacturing procedures also included pre-seeding the stored pool with data 
from a Linux computer that had a mouse and keyboard and lots of human input.


We did not try to do entropy accounting, but did worry about having enough.
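A hedged sketch of that kind of mixing is below.  The specific inputs
are placeholders, and OpenSSL's SHA-256 is used purely as a convenient
conditioner: concatenate whatever sources you have, plus fixed
per-machine identifiers, through a hash and use the digest as seed
material.  There is no entropy accounting here at all; assuming the
hash behaves well, the digest is about as unpredictable as the
combined inputs, but no code can tell you how much that is.

  #include <openssl/sha.h>
  #include <stdio.h>
  #include <string.h>

  int main(void)
  {
      /* Placeholder inputs: raw noise samples, a MAC address, and a
         factory pre-seed gathered on a machine with real human input. */
      unsigned char noise[64]   = {0};
      unsigned char mac[6]      = {0x00, 0x16, 0x3e, 0x01, 0x02, 0x03};
      unsigned char preseed[32] = {0};

      SHA256_CTX ctx;
      unsigned char seed[SHA256_DIGEST_LENGTH];

      SHA256_Init(&ctx);
      SHA256_Update(&ctx, noise, sizeof noise);
      SHA256_Update(&ctx, mac, sizeof mac);       /* differentiates machines */
      SHA256_Update(&ctx, preseed, sizeof preseed);
      SHA256_Final(seed, &ctx);

      /* `seed` would then be written into the device's stored pool. */
      for (size_t i = 0; i < sizeof seed; i++)
          printf("%02x", seed[i]);
      printf("\n");
      return 0;
  }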

We also were going way overboard on security thinking, far exceeding 
regulatory requirements for any jurisdiction we looked at.  I don't know 
if it ever shipped to a customer, but we got all the approvals 
necessary so it could have...


I do agree that, though a Linux box might make keys on its first boot, 
it would be better to use it interactively first and only then generate keys.


Again, Ubuntu (at least a desktop install) doesn't include sshd by 
default; you have to decide to install it, and at that point, if there 
is a human setting things up with a keyboard and mouse, there should be 
a lot of entropy.  Ubuntu server installations might be different, and 
I would be very worried about automatic provisioning of server machines 
in bulk.


-kb



Re: [Cryptography] real random numbers

2013-09-15 Thread Kent Borg
John Kelsey wrote:
 I think the big problem with (b) is in quantifying the entropy you get.

Maybe don't.

When Bruce Schneier last put his hand to designing an RNG, he concluded that 
estimating entropy is doomed. I don't think he would object to some coarse 
order-of-magnitude confirmation that there is entropy coming in, but I think 
trying to meter entropy-in against entropy-out will either leave you starved or 
fooled.

-kb


[Cryptography] ADMIN: entropy of randomness discussion is falling...

2013-09-15 Thread Perry E. Metzger
One wants maximum entropy not only from one's RNG but also from one's
discussions about randomness.

Sadly, entropy is measured based on the level of surprise at the
content, and the level of surprise is going down in the current
discussion. As surprise goes to zero, so does interest on the part of
the couple thousand people reading along.

I'd like to ask participants to please:

1) Write compactly but clearly.
2) Avoid repeating themselves.

Perry
-- 
Perry E. Metzger  pe...@piermont.com


Re: [Cryptography] Why prefer symmetric crypto over public key crypto?

2013-09-15 Thread Tony Arcieri
On Thu, Sep 12, 2013 at 1:11 PM, Nico Williams n...@cryptonector.com wrote:

  - Life will look a bit bleak for a while once we get to quantum machine
 cryptopocalypse...


Why? We already have NTRU. We also have Lamport Signatures. djb is working
on McBits. I'd say there are already many options on the table if you want to
build a quantum-proof system.
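For anyone who has not seen one, a Lamport one-time signature is
almost embarrassingly simple.  The sketch below is hypothetical and
leans on OpenSSL's SHA-256 and /dev/urandom; error handling is
minimal, and a key pair must never be used to sign more than one
message.

  #include <openssl/sha.h>
  #include <stdio.h>
  #include <string.h>

  #define HB 32    /* hash bytes */
  #define NB 256   /* message-hash bits; one secret pair per bit */

  typedef struct { unsigned char k[NB][2][HB]; } lamport_key;

  /* Secret key: 2*256 random 32-byte strings.
     Public key: the SHA-256 hash of each secret string. */
  static int keygen(lamport_key *sk, lamport_key *pk)
  {
      FILE *f = fopen("/dev/urandom", "rb");
      if (!f) return 0;
      size_t got = fread(sk, 1, sizeof *sk, f);
      fclose(f);
      if (got != sizeof *sk) return 0;
      for (int i = 0; i < NB; i++)
          for (int b = 0; b < 2; b++)
              SHA256(sk->k[i][b], HB, pk->k[i][b]);
      return 1;
  }

  /* Sign: for bit i of SHA-256(msg), reveal sk[i][bit]. */
  static void lamport_sign(const lamport_key *sk, const unsigned char *msg,
                           size_t len, unsigned char sig[NB][HB])
  {
      unsigned char h[HB];
      SHA256(msg, len, h);
      for (int i = 0; i < NB; i++)
          memcpy(sig[i], sk->k[i][(h[i / 8] >> (i % 8)) & 1], HB);
  }

  /* Verify: hash each revealed string, compare against the public key. */
  static int lamport_verify(const lamport_key *pk, const unsigned char *msg,
                            size_t len, unsigned char sig[NB][HB])
  {
      unsigned char h[HB], d[HB];
      SHA256(msg, len, h);
      for (int i = 0; i < NB; i++) {
          SHA256(sig[i], HB, d);
          if (memcmp(d, pk->k[i][(h[i / 8] >> (i % 8)) & 1], HB) != 0)
              return 0;
      }
      return 1;
  }

  int main(void)
  {
      static lamport_key sk, pk;
      static unsigned char sig[NB][HB];
      const unsigned char msg[] = "one-time message";
      if (!keygen(&sk, &pk)) return 1;
      lamport_sign(&sk, msg, sizeof msg - 1, sig);
      printf("verify: %d\n", lamport_verify(&pk, msg, sizeof msg - 1, sig));
      return 0;
  }

Security rests only on the hash being one-way, which is part of why
hash-based signatures are attractive against quantum adversaries.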

-- 
Tony Arcieri

Re: [Cryptography] Security is a total system problem (was Re: Perfection versus Forward Secrecy)

2013-09-15 Thread Dirk-Willem van Gulik

On 13 Sep 2013, at 21:23, Perry E. Metzger pe...@piermont.com wrote:

 On Fri, 13 Sep 2013 08:08:38 +0200 Eugen Leitl eu...@leitl.org
 wrote:
 Why e.g. SWIFT is not running on one time pads is beyond me.
 
 I strongly suspect that delivering them securely to the vast number
 of endpoints involved and then securing the endpoints as well would
..
 The problem these days is not that something like AES is not good
 enough for our purposes. The problem is that we too often build a

While most documents on SWIFT's move from something very akin to OTP (called 
BKE) seem to no longer be on the internet, the documents:


http://web.archive.org/web/20070218160712/http://www.swift.com/index.cfm?item_id=57203
and

http://web.archive.org/web/20070928013437/http://www.swift.com/index.cfm?item_id=61595

should give you a good introduction, and outline quite clearly what 
organisational issues they were (and to this day still are) in essence trying 
to solve.

I found them quite good reading - with a lot of (often) implicit governance 
requirements which have wider applicability.  And in all fairness - quite a 
good example of an 'open' PKI in that specific setting, if you postulate that 
you trust SWIFT only so-so as a fair/honest broker of information - yet want to 
keep it out of the actual money path.  A separation of roles/duties which some 
of the internet PKIs severely lack.

Dw.


Re: [Cryptography] prism proof email, namespaces, and anonymity

2013-09-15 Thread StealthMonger
John Kelsey crypto@gmail.com writes:

 In the overwhelming majority of cases, I know and want to know the
 people I'm talking with.  I just don't want the contents of those
 conversations or the names of people I'm talking with to be revealed
 to eavesdroppers.  And if I get an email from one of my regular
 correspondents, I'd like to know it came from him, rather than being
 spoofed from someone else.

That's a good description of stealthmail [1].  My only regret is that it
badly needs an update and I don't have time these days to work on it.
But it still works out of the box.  Here's the Debian description:


Package: stealthmail
Architecture: all
Pre-Depends: gnupg
Depends: procmail, esubbf, openssl, dc, libssl0.9.6 | libssl0.9.7,
 fetchmail | kmail, suck, ppp, solid-pop3d, exim | exim4, dpkg (>= 1.10.21),
 grep (>= 2.5), bash (>= 2.05b), ${shlibs:Depends}, ${misc:Depends}
Description: scripts to hide whether you're doing email, or when, or with whom
 Maintain on-going random cover traffic via usenet newsgroup
 alt.anonymous.messages, substituting encrypted live traffic when
 available.  A live message is indistinguishable from a random cover
 message except with the decryption keys.  All potential participants
 send messages to alt.anonymous.messages with rigid periodicity
 uncorrelated with any live traffic, and maintain an uninterrupted
 full feed from alt.anonymous.messages, so that an observer cannot
 determine whether, when, or among whom live communication is
 happening.
 .
 Members of a stealthmail group -- call it OurGroup for purposes
 of this discussion -- are defined by their knowledge of the
 encryption keys created for the group.  With this package installed,
 mail addressed to OurGroup@stealthmail does not go directly to the
 Internet like ordinary mail, but gets encrypted by the OurGroup key,
 given an encrypted subject intelligible only with OurGroup keys, and
 queued to go to alt.anonymous.messages in place of a piece of cover
 traffic at the next scheduled sending time.  Meanwhile, all messages
 appearing on alt.anonymous.messages are downloaded into an incoming
 queue.  A POP3 server runs on the local host.  The mail reader is
 provided with filters so that when it fetches mail from this local
 server, messages having subject lines encrypted for OurGroup (or any
 other stealthmail group of which this host is a member) are decrypted
 by the appropriate key and presented.  Other messages are discarded.


[1] See mailto URL below.


-- 


 -- StealthMonger stealthmon...@nym.mixmin.net
Long, random latency is part of the price of Internet anonymity.

   anonget: Is this anonymous browsing, or what?
   
http://groups.google.ws/group/alt.privacy.anon-server/msg/073f34abb668df33?dmode=sourceoutput=gplain

   stealthmail: Hide whether you're doing email, or when, or with whom.
   mailto:stealthsu...@nym.mixmin.net?subject=send%20index.html


Key: mailto:stealthsu...@nym.mixmin.net?subject=send%20stealthmonger-key


