Re: building a true RNG

2002-08-03 Thread Bart Preneel



On 2 Aug 2002, Paul Crowley wrote:

 I meant to say, another example of a believed one-way function that is
 guaranteed to be able to produce any output is one based on the
 difficulty of discrete log:

 f(x) = g^x mod p

 is bijective if the domain and range are 1..p-1, but finding preimages
 is the discrete log problem.  Of course this doesn't compress.  I
 don't know of any examples which compress and have collision resistance.

Choose a group of prime order.
Choose t random group elements g_i (1 <= i <= t), all different from 1.
Write the input x as x_1 || x_2 || ... || x_t,
and prepend a 1 bit to each x_i, giving z_i.

f(x) = g_1^z_1 . g_2^z_2 . ... . g_t^z_t

For details:
M. Bellare, O. Goldreich, S. Goldwasser,
"Incremental cryptography: the case of hashing and signing," Crypto '94,
after earlier work by D. Chaum, E. van Heijst, B. Pfitzmann,
"Cryptographically strong undeniable signatures, unconditionally secure
for the signer," Crypto '91, and S. Brands.
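The construction above can be sketched in a few lines. The parameters here are toy-sized and purely illustrative: a real instantiation needs a large prime-order group, and the g_i must be verifiably random rather than derived from a fixed seed as they are in this sketch.

```python
# Toy sketch of the multi-exponentiation hash described above.
import random

P = 1019                 # safe prime: P = 2*Q + 1 (illustrative size only)
Q = (P - 1) // 2         # 509, prime; order of the subgroup of squares mod P
B = 4                    # bits per message block (toy size)

def make_generators(t, seed=12345):
    """Derive t fixed elements of the order-Q subgroup, all != 1."""
    rng = random.Random(seed)
    gens = []
    while len(gens) < t:
        g = pow(rng.randrange(2, P), 2, P)   # squaring lands in the subgroup
        if g != 1:
            gens.append(g)
    return gens

def hash_blocks(blocks, gens):
    """f(x) = g_1^z_1 * g_2^z_2 * ... * g_t^z_t mod P,
    where z_i is the B-bit block x_i with a 1 bit prepended."""
    acc = 1
    for x_i, g_i in zip(blocks, gens):
        z_i = (1 << B) | x_i                 # prepend the 1 bit
        acc = (acc * pow(g_i, z_i, P)) % P
    return acc

gens = make_generators(3)
digest = hash_blocks([1, 2, 3], gens)   # t blocks in, one group element out
```

Note how the function compresses: t blocks of B bits map to a single group element, and a collision would yield a nontrivial discrete-log relation among the g_i.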

--Bart


-
The Cryptography Mailing List
Unsubscribe by sending unsubscribe cryptography to [EMAIL PROTECTED]



[SIMSOFT] Protecting Privacy with Translucent Databases

2002-08-03 Thread R. A. Hettinga


--- begin forwarded text


Status: RO
From: Simson L. Garfinkel [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: [SIMSOFT] Protecting Privacy with Translucent Databases
Sender: [EMAIL PROTECTED]
Date: Sat, 3 Aug 2002 08:14:02 -0400

http://www.oreillynet.com/pub/a/network/2002/08/02/simson.html

Protecting Privacy with Translucent Databases

by Simson Garfinkel (http://www.oreillynet.com/pub/au/355), author of
Web Security, Privacy & Commerce, 2nd Edition
(http://www.oreilly.com/catalog/websec2/)
08/02/2002

Last week, officials at Yale University (http://www.yale.edu/) complained to
the FBI that admissions officers from Princeton University
(http://www.princeton.edu/index.shtml) had broken into
a Yale Web site and downloaded admission decisions on 11 students who had
applied to both schools. Princeton responded by suspending its associate
dean of admissions and launching an investigation. That's a good start, but
both colleges should go further, and redesign the way that their databases
treat personal information.

As details surrounding the incident have emerged, it's clear that there's a
lot of blame to go around. Both Yale and Princeton compete vigorously for
the nation's top high school students, and in recent years the competition
has become increasingly aggressive. The schools shower the best students
not just with phone calls and letters, but even with tuition discounts. As
part of that competition, this year Yale unveiled a new Web site designed
to let applicants find out if they had been admitted -- no more waiting for
either that thin rejection letter or the thick admissions packet.

Unfortunately, the security on the Yale Web site was atrocious: all anybody
needed to look up a student's record was that student's name, social
security number (SSN), and date of birth. And it just so happened that the
officials at Princeton had this same information for the most
highly-contested applicants. So in April, when the Web site went live,
Princeton's admissions office sprang to action as well, allegedly
downloading admissions decisions from the Yale Web site on at least 18
separate occasions. The most highly sought-after applicant? President
Bush's niece Lauren Bush, according to an article that appeared in The
Washington Post. (Read about it at
http://www.washingtonpost.com/wp-dyn/articles/A2983-2002Jul25.html and
http://www.washingtonpost.com/wp-dyn/articles/A7815-2002Jul26.html.)

Who's To Blame

Most of the cyber-security professionals I've spoken with have taken a
decidedly blame-the-victim approach with this latest story of Web site
hackery. Assuming that the allegations are true, it's terrible that an
administrator at Princeton would engage in such patently illegal
activities. But what's even worse, they say, is that Yale would deploy a
Web application so poorly conceived and implemented.

To be sure, Yale is not alone in deploying systems with poor security for
personal information. Many banks and credit card companies continue to
treat widely-circulated personal information, like SSNs and birthdays, as
if this information is secret, available only to the bank account holder or
credit card applicant. Clearly it is not, as evidenced by the national
epidemic of identity fraud. But financial organizations have been stymied
in their attempts to find a better means for verifying the identity of
account applicants -- people with whom, by definition, the banks have no
current relationship.

Poor Design Principles At Play


Yale could have designed a better system: it could have asked each
applicant to supply a PIN or a password as part of their application. An
even more secure solution would have been for the university to assign a
password to every applicant and send it back to the high school students
with their confirmation cards. Such an approach would have protected the
process against students who would otherwise use the same password for both
Yale and Princeton.

To provide even better security, Yale and Princeton could have used what's
called a translucent database, a term coined by author and cryptographer
Peter Wayner in his new book by the same title.

A translucent database uses cryptographic methods like hash functions and
public key cryptography to mathematically protect information so that it
cannot be wrongly [...]

Privacy-enhancing uses for TCPA

2002-08-03 Thread AARG!Anonymous

Here are some alternative applications for TCPA/Palladium technology which
could actually promote privacy and freedom.  A few caveats, though: they
do depend on a somewhat idealized view of the architecture.  It may be
that real hardware/software implementations are not sufficiently secure
for some of these purposes, but as systems become better integrated
and more technologically sound, this objection may go away.  And these
applications do assume that the architecture is implemented without secret
backdoors or other intentional flaws, which might be guaranteed through
an open design process and manufacturing inspections.  Despite these
limitations, hopefully these ideas will show that TCPA and Palladium
actually have many more uses than the heavy-handed and control-oriented
ones which have been discussed so far.

To recap, there are basically two technologies involved.  One is secure
attestation.  This allows machines to securely receive a hash of the
software which is running remotely.  It is used in these examples to
know that a trusted client program is running on the remote machine.
The other is secure storage.  This allows programs to encrypt data
in such a way that no other program can decrypt it.

In addition, we assume that programs are able to run unmolested;
that is, that other software and even the user cannot peek into the
program's memory and manipulate it or learn its secrets.  Palladium has
a feature called "trusted space" which is supposed to be some special
memory that is immune from being compromised.  We also assume that
all data sent between computers is encrypted using something like SSL,
with the secret keys being held securely by the client software (hence
unavailable to anyone else, including the users).

The effect of these technologies is that a number of computers across
the net, all running the same client software, can form their own
closed virtual world.  They can exchange and store data of any form,
and no one can get access to it unless the client software permits it.
That means that the user, eavesdroppers, and authorities are unable to
learn the secrets protected by software which uses these TCPA features.
(Note, in the sequel I will just write TCPA when I mean TCPA/Palladium.)

Now for a simple example of what can be done: a distributed poker game.
Of course there are a number of crypto protocols for playing poker on the
net, but they are quite complicated.  Even though they've been around
for almost 20 years, I've never seen game software which uses them.
With TCPA we can do it trivially.

Each person runs the same client software, which fact can be tested
using secure attestation.  The dealer's software randomizes a deck and
passes out the cards to each player.  The cards are just strings like
"ace of spades", or perhaps simple numerical equivalents - nothing fancy.
Of course, the dealer's software learns in this way what cards every
player has.  But the dealer himself (i.e. the human player) doesn't
see any of that, he only sees his own hand.  The software keeps the
information secret from the user.  As each person makes his play, his
software sends simple messages telling what cards he is exposing or
discarding, etc.  At the end each person sends messages showing what
his hand is, according to the rules of poker.

This is a trivial program.  You could do it in one or two pages of code.
And yet, given the TCPA assumptions, it is just as secure as a complex
cryptographically protected version would be that takes ten times as
much code.

Of course, without TCPA such a program would never work.  Someone would
write a cheating client which would tell them what everyone else's cards
were when they were the dealer.  There would be no way that people could
trust each other not to do this.  But TCPA lets people prove to each
other that they are running the legitimate client.
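The dealer logic really is that short. Here is a minimal sketch; everything TCPA-specific is out of scope, since in the scenario above this code would run inside the attested client, the full `hands` structure would live in protected memory, and each player's software would reveal only its own hand.

```python
# Minimal sketch of the dealer logic from the poker example above.
import random

RANKS = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]
SUITS = ["spades", "hearts", "diamonds", "clubs"]

def deal(num_players, cards_per_hand=5):
    """Shuffle a 52-card deck and deal each player a hand of card strings."""
    deck = [f"{rank} of {suit}" for suit in SUITS for rank in RANKS]
    random.SystemRandom().shuffle(deck)   # OS entropy for the shuffle
    return [deck[i * cards_per_hand:(i + 1) * cards_per_hand]
            for i in range(num_players)]

hands = deal(4)   # four players, five cards each
```

Without attestation, nothing stops a player from replacing this code with a version that leaks `hands`; that is exactly the gap the message argues TCPA would close.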

So this is a simple example of how the secure attestation features of
TCPA/Palladium can allow a kind of software which would never work today,
software where people trust each other.  Let's look at another example,
a P2P system with anonymity.

Again, there are many cryptographic systems in the literature for
anonymous communication.  But they tend to be complicated and inefficient.
With TCPA we only need to set up a simple flooding broadcast network.
Let each peer connect to a few other peers.  To prevent traffic
analysis, keep each node-to-node link at a constant traffic level using
dummy padding.  (Recall that each link is encrypted using SSL.)

When someone sends data, it gets sent everywhere via a simple routing
strategy.  The software then makes the received message available to the
local user, if he is the recipient.  Possibly the source of the message
is carried along with it, to help with routing; but this information is
never leaked outside the secure communications part of the software,
and never shown to any users.
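The routing strategy above can be sketched as a flood with a seen-set. This toy in-memory model omits everything the message assumes from TCPA (SSL links, constant-rate padding, memory protection) and shows only the broadcast logic:

```python
# Toy in-memory sketch of the flooding broadcast described above.
class Node:
    def __init__(self, name):
        self.name = name
        self.peers = []        # a few neighbor Nodes
        self.seen = set()      # message ids already forwarded
        self.inbox = []        # messages delivered to the local user

    def receive(self, msg_id, recipient, payload):
        if msg_id in self.seen:            # stop the flood from looping
            return
        self.seen.add(msg_id)
        if recipient == self.name:         # deliver to the local user
            self.inbox.append(payload)
        for peer in self.peers:            # flood onward regardless,
            peer.receive(msg_id, recipient, payload)  # hiding the recipient

# Build a small ring of peers and send one message.
nodes = [Node(n) for n in "ABCD"]
for i, node in enumerate(nodes):
    node.peers = [nodes[(i - 1) % 4], nodes[(i + 1) % 4]]
nodes[0].receive("m1", "C", "hello")
```

Every node handles every message identically, so an outside observer of (padded, encrypted) links learns nothing about who the real recipient was.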

That's all there is to it.  Just send messages with flood broadcasts,
but [...]

Re: Privacy-enhancing uses for TCPA

2002-08-03 Thread Jay Sulzberger



On Sat, 3 Aug 2002, AARG!Anonymous wrote:


 [...]

 Now for a simple example of what can be done: a distributed poker game.
 Of course there are a number of crypto protocols for playing poker on the
 net, but they are quite complicated.  Even though they've been around
 for almost 20 years, I've never seen game software which uses them.
 With TCPA we can do it trivially.

 [...]

No.  Have you included the cost of giving every computer on Earth to the
Englobulators?  If you wish, we can write an implementation of the
wonderful protocols for distributed safer card drawing and we can play our
games of poker.  And we may run our poker room on the hardware and software
we have today, no need for DRM.

Indeed today millions use today's untrammeled hardware and, this is
incredible, Microsoft OSes to conduct their personal banking.  If the
market considers that present systems suffice for this, well, I do not
think that we need surrender our computers to the Englobulators to save
three man-months of programmer time.

ad next moves in the eristic tree:

You: Marginals vs. total time-space integrated costs/benefits!

I: Happy to demonstrate estimates of totals come out for my side.

oo--JS.





Re: [SIMSOFT] Protecting Privacy with Translucent Databases

2002-08-03 Thread David Wagner

R. A. Hettinga wrote:
Protecting Privacy with Translucent Databases

Last week, officials at http://www.yale.edu/Yale University complained to
the FBI that admissions officers from
http://www.princeton.edu/index.shtmlPrinceton University had broken into
a Yale Web site and downloaded admission decisions on 11 students who had
applied to both schools. [...]
Unfortunately, the security on the Yale Web site was atrocious: all anybody
needed to look up a student's record was that student's name, social
security number (SSN), and date of birth. [...]
[ proposes a solution ]


I'm glad commentators are beginning to point out that
more care should be put into protecting personal information.
However, the solution proposed in this article seems to me to
be more complicated than necessary.

I can't find any legitimate reason why colleges should need your
SSN when deciding whether to admit you.  They get away with it because
they can, but that doesn't mean they are right to do so.

It seems to me that a much more privacy-friendly solution would be
to simply refrain from asking for sensitive personal information like
SSN and date of birth -- name and a random unique identifier printed
on the application form ought to suffice.  (If SSN is later needed
for financial aid purposes, it could be requested after the student
decides to matriculate.)

Am I missing anything?




Re: Translucent Databases

2002-08-03 Thread John S. Denker

David Wagner wrote:

 It seems to me that a much more privacy-friendly solution would be
 to simply refrain from asking for sensitive personal information like
 SSN and date of birth -- name and a random unique identifier printed
 on the application form ought to suffice.  (If SSN is later needed
 for financial aid purposes, it could be requested after the student
 decides to matriculate.)
 
 Am I missing anything?

I think the problem is a lot harder than that.

Let me clarify by telling a story:  Once upon a time, Hansel
designed an online-forms system that collected credit-card
info, encrypted it using PGP, and mailed it to Goldylocks
(the secretary) with a backup copy going to Tweedledee.
Despite the fact that Hansel had installed PGP on her
computer and indoctrinated her on how to use it, Goldylocks
was unable to decrypt the info.  So at her request, Tweedledee
decrypted it -- a whole conference's worth of registrations --
and sent it to her in the clear.

In a clear violation of Murphy's law, no harm came of this,
but otherwise it was a worst-case use of cryptology:  just
secure enough to be a nuisance to the authorized users, but
in the long run providing no real protection for the card-
holders.

The sad fact is that most people on this planet cannot get
PGP to work in a way that suits them.  The future of security
depends at least as much on user-interface research as it does
on mathematical cryptology research.

Oh, BTW, a preprinted number on the admissions form doesn't
really do the trick.  Forms are printed on printing presses,
in batches of several thousand, all alike.  After they are
mailed out, the guidance counselor at Podunk South High School
will make copies as needed.  A web-based approach won't work
unless you are making computer-savviness an entrance requirement.




Re: [SIMSOFT] Protecting Privacy with Translucent Databases

2002-08-03 Thread R. A. Hettinga


--- begin forwarded text


Status: RO
Date: Sat, 3 Aug 2002 20:36:04 -0400
To: R. A. Hettinga [EMAIL PROTECTED],
 Digital Bearer Settlement List [EMAIL PROTECTED],
 [EMAIL PROTECTED]
From: Peter Wayner [EMAIL PROTECTED]
Subject: Re: [SIMSOFT] Protecting Privacy with Translucent
 Databases




I'm glad commentators are beginning to point out that
more care should be put into protecting personal information.
However, the solution proposed in this article seems to me to
be more complicated than necessary.

I can't find any legitimate reason why colleges should need your
SSN when deciding whether to admit you.  They get away with it because
they can, but that doesn't mean they are right to do so.

It seems to me that a much more privacy-friendly solution would be
to simply refrain from asking for sensitive personal information like
SSN and date of birth -- name and a random unique identifier printed
on the application form ought to suffice.  (If SSN is later needed
for financial aid purposes, it could be requested after the student
decides to matriculate.)

Am I missing anything?


Yes, a random nonce would be fine in many cases. The hash of the SSN,
the birthday, or a combination, however, is much easier for a person to
remember. The random nonce requires a person to keep a copy. That may
be good practice, but it's not always practical. Hard disks crash.
Buildings burn down. Etc.

Hashing can also be quite flexible. In this case, PU might store
SHA("Yale sux" + ssn) while YU might store SHA("Princeton sux" + ssn) in
their databases. ('+' means concatenation.) The results would be
quite different and the databases couldn't be cross-linked. But if
someone knows their ssn, they can call up the records quickly.

There are many limitations to this approach as there are limitations
in all cryptography, but I think it has a few advantages that are
well worth the few extra cycles for the hash function.  If this
computation is done on the client machine, the results are quite
secure even without SSL protecting the link. This is actually fairly
easy to implement with a Java applet.
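The salted-hash scheme above is a few lines in any language. A sketch in Python, where hashlib's SHA-1 stands in for the generic SHA(), the salt strings follow the message's joking examples, and the SSN is a made-up placeholder:

```python
# Sketch of the per-institution salted lookup keys described above.
import hashlib

def lookup_key(salt, ssn):
    """Store SHA(salt + ssn) in the database instead of the SSN itself."""
    return hashlib.sha1((salt + ssn).encode()).hexdigest()

ssn = "123-45-6789"                              # placeholder SSN
princeton_key = lookup_key("Yale sux", ssn)      # what PU would store
yale_key = lookup_key("Princeton sux", ssn)      # what YU would store

# The two keys share no visible structure, so the databases can't be
# cross-linked, yet anyone who knows the SSN can recompute their own key.
```

One caveat worth noting: SSNs have little entropy, so without a secret salt an attacker can brute-force the hash over all plausible SSNs; the salt here prevents cross-linking, not dictionary attacks.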

--- end forwarded text


-- 
-
R. A. Hettinga mailto: [EMAIL PROTECTED]
The Internet Bearer Underwriting Corporation http://www.ibuc.com/
44 Farquhar Street, Boston, MA 02131 USA
... however it may deserve respect for its usefulness and antiquity,
[predicting the end of the world] has not been found agreeable to
experience. -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
