Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)

2003-12-28 Thread Seth David Schoen
Antonomasia writes:

> From: "Carl Ellison" <[EMAIL PROTECTED]>
> 
> > Some TPM-machines will be owned by people who decide to do what I
> > suggested: install a personal firewall that prevents remote attestation.
> 
> How confident are you this will be possible?  Why do you think the
> remote attestation traffic won't be passed in a widespread service
> like HTTP - or even be steganographic?

The main answer is that the TPM will let you disable attestation, so
you don't even have to use a firewall (although if you have a LAN, you
could have a border firewall that prevented anybody on the LAN from
using attestation within protocols that the firewall was sufficiently
familiar with).

When attestation is used, it likely will be passed in a service like
HTTP, but in a documented way (for example, using a protocol based on
XML-RPC).  There isn't really any security benefit obtained by hiding
the content of the attestation _from the party providing it_!

Certainly attestation can be used as part of a key exchange so that
subsequent communications between local software and a third party are
hidden from the computer owner, but because the attestation must
happen before that key exchange is concluded, you can still examine
and destroy the attestation fields.
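
To make the "examine and destroy" point concrete: a border proxy could
strip any attestation fields it recognizes before forwarding a request.
A minimal sketch in Python, assuming a hypothetical documented XML
encoding for the attestation blob (the element name and wire format
below are invented for illustration, not any actual TCPA encoding):

    import re

    # Hypothetical encoding: attestation carried as a documented
    # XML-RPC-style element inside the HTTP request body.
    ATTESTATION_FIELD = re.compile(
        rb"<tpmAttestation>.*?</tpmAttestation>", re.DOTALL)

    def filter_request_body(body: bytes) -> bytes:
        # Because attestation happens before any attested key exchange
        # concludes, the proxy still sees these fields in the clear
        # and can remove them.
        return ATTESTATION_FIELD.sub(b"", body)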

One problem is that a client could use HTTPS to establish a session
key for a session within which an attestation would be presented.
That might disable your ability to use the border firewall to block
the attestation, but you can still disable it in the TPM on that
machine if you control the machine.

The steganographic thing is implausible because the TPM is a passive
device which can't control other components in order to get them to
signal information.

-- 
Seth David Schoen <[EMAIL PROTECTED]> | Very frankly, I am opposed to people
 http://www.loyalty.org/~schoen/      | being programmed by others.
 http://vitanuova.loyalty.org/        | -- Fred Rogers (1928-2003),
                                      |    464 U.S. 417, 445 (1984)

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Outsourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before

2003-12-28 Thread Anne & Lynn Wheeler
At 02:01 PM 12/23/2003 -0500, Rich Salz wrote:
How many years have you been saying this, now? :)  How do those modern 
online environments achieve end-to-end content integrity and privacy? My 
guess is that they don't; their use of private value-add networks made it 
unnecessary.  If my guess is/was correct, then as more valuable 
transactions (or regulated data) flow over the commodity Internet, then 
those things will become important.  Make sense?  Am I right?
in days before the internet ... there were a lot more lo-tech 
attacks on financial transactions ... and when things like the credit card 
master file got harvested ... it was usually pretty obviously an insider 
job. with the advent of the internet ... not only was it an open, insecure, 
commodity network ... but a lot of the attached systems were never 
designed to operate in what is effectively a hostile environment. because of a lot 
of contributing factors ... there was significant ambiguity when a 
merchant master file got harvested ... as to where the attack originated (insider 
or outsider). minor side thread regarding security proportional to risk 
with regard to attacks on the merchant master file:
http://www.garlic.com/~lynn/2001h.html#61

during the past ten years there have been some number of technologies for 
attempting to compensate for just the transport of the "shared-secret" 
account number in a transaction on an open, hostile network ... aka 
primarily ssl. minor reference with regard to the emerging ssl and the original 
payment gateway:
http://www.garlic.com/~lynn/aadsm5.htm#asrn2
http://www.garlic.com/~lynn/aadsm5.htm#asrn3

there have been a lot of threads about how much fraud SSL actually prevented 
... since the major consumer retail financial related fraud ... both 
non-internet, pre-internet, and internet ... has been bulk harvesting of 
repositories like a merchant master transaction file (for possibly the same 
effort needed to eavesdrop packets in flight and extract a single account number 
... it might be possible to harvest a merchant transaction file with tens 
of thousands of account numbers).

so the x9a10 working group was given the requirement for preserving the 
integrity of the financial infrastructure for all electronic retail 
transactions. To meet that, the x9.59 standard was defined which basically 
requires end-to-end authenticated transactions between the consumer and the 
consumer's financial infrastructure and that account numbers used in 
authenticated transactions can't be used in non-authenticated transactions. 
With strong, end-to-end authentication, it is possible to eavesdrop an x9.59 
transaction, extract the account number and still not be able to execute a 
fraudulent financial transaction. It is also possible to harvest x9.59 
account numbers from merchant transaction files and still not be able to 
execute a fraudulent financial transaction.
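
As a toy sketch of that flow: the transaction is signed end-to-end and 
the consumer's financial institution verifies it against the public key 
on file for the account, so a harvested account number authorizes 
nothing by itself. The field layout and the choice of Ed25519 below are 
illustrative assumptions; x9.59 itself doesn't fix them:

    from cryptography.hazmat.primitives.asymmetric import ed25519

    # consumer side: sign the transaction with the key registered
    # with the consumer's financial institution
    consumer_key = ed25519.Ed25519PrivateKey.generate()
    txn = b"account=1234567890;amount=49.95;merchant=example;seq=17"
    signature = consumer_key.sign(txn)

    # issuer side: public key on file, indexed by account number
    on_file = {b"1234567890": consumer_key.public_key()}

    def authorize(txn: bytes, signature: bytes) -> bool:
        account = dict(f.split(b"=") for f in txn.split(b";"))[b"account"]
        try:
            on_file[account].verify(signature, txn)  # raises if invalid
            return True
        except Exception:
            return False  # an eavesdropped account number alone is useless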

Hiding account numbers has been associated with identity theft, since in an 
environment where the transactions aren't authenticated ... the account 
numbers have to be effectively treated as shared-secrets. The downside is 
that numerous business processes all along the processing chain require 
access to and use of the account number. Just hiding the account number with 
SSL did little to address the major vulnerabilities and threats.  In 
effect, the analysis shows that it is effectively impossible to provide the 
necessary protection for a shared-secret account number, no matter how 
deeply the earth was blanketed with cryptographic technology. The solution 
was to change the business process: require end-to-end strong 
authentication and eliminate the account number as a shared-secret (i.e. 
knowing the account number is not sufficient for performing a fraudulent 
transaction). misc. x9.59 standard refs:
http://www.garlic.com/~lynn/index.html#x959

There were actually a couple of other issues differentiating internet-based 
transactions and the VPN environment. The VPN environment was circuit 
based; it was possible to get service level agreements and to utilize 
technology like modem loop-back diagnostics as part of a bootstrap problem 
determination procedure.  Such an environment has a trouble desk and 
expects to finish first level problem determination in something like 5 
minutes.

One of the last projects my wife and I had done before taking the early out 
(and doing some consulting for the payment gateway and e-commerce stuff) 
was the HA/CMP product ... i.e. high availability/cluster multi-processing.
http://www.garlic.com/~lynn/subtopic.html#hacmp
There is a slight reference in one of the above aadsm5.htm archive postings to
http://www.garlic.com/~lynn/95.html#13
because some of the people in the above meeting had left and joined a 
client/server startup and were responsible for this thing called a commerce 
server ... whom we were then working with on this thing called a payment server 
for this thing that would be called e-commerce.

In any case, packet-based internet not only 

Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Amir Herzberg
Ian proposes below two draft-definitions for non-repudiation - legal and 
technical. Lynn also sent us a bunch of definitions. Let's focus on the 
technical/crypto one for now - after all this is a crypto forum (I agree 
the legal one is also somewhat relevant to this forum).

In my work on secure e-commerce, I use (technical, crypto) definitions of 
non-repudiation, and consider these as critical to many secure e-commerce 
problems/scenarios/requirements/protocols. Having spent considerable time 
and effort on appropriate definitions and analysis (proofs), I was/am a bit 
puzzled and alarmed to find that others in our community seem so vehemently 
against non-repudiation.

Of course, like other technical terms, there can be many variant 
definitions; that is not really a problem (the community will gradually 
focus on a few important and distinct variants). Also it's an unavoidable 
fact of life (imho) that other communities (e.g. legal) use the same term 
with a somewhat different meaning.

So my question is only to people like Ben and Carl who have expressed, if I 
understood correctly, objection to any form of technical, crypto definition 
of non-repudiation. I repeat: do you really object and if so why? What of 
applications/scenarios that seem to require non-repudiation, e.g. certified 
mail, payments, contract signing,...?

Best regards,

Amir Herzberg
Computer Science Department, Bar Ilan University
Lectures: http://www.cs.biu.ac.il/~herzbea/book.html
Homepage: http://amir.herzberg.name
Enclosed: At 21:33 23/12/2003, Ian Grigg wrote:
Amir Herzberg wrote:
>
> Ben, Carl and others,
>
> At 18:23 21/12/2003, Carl Ellison wrote:
>
> > > >and it included non-repudiation which is an unachievable,
> > > nonsense concept.
>
> Any alternative definition or concept to cover what protocol designers
> usually refer to as non-repudiation specifications? For example
> non-repudiation of origin, i.e. the ability of recipient to convince a
> third party that a message was sent (to him) by a particular sender (at
> certain time)?
>
> Or - do you think this is not an important requirement?
> Or what?
I would second this call for some definition!

FWIW, I understand there are two meanings:

   some form of legal inability to deny
   responsibility for an event, and
   cryptographically strong and repeatable
   evidence that a certain piece of data
   was in the presence of a private key at
   some point.
Carl and Ben have rubbished "non-repudiation"
without defining what they mean, making it
rather difficult to respond.
Now, presumably, they mean the first, in
that it is a rather hard problem to take the
cryptographic property of public keys and
then bootstrap that into some form of property
that reliably stands in court.
But, whilst challenging, it is possible to
achieve legal non-repudiability, depending
on your careful use of assumptions.  Whether
that is a sensible thing or a nice one depends
on the circumstances ... (e.g., the game that
banks play with pin codes).
So, as a point of clarification, are we saying
that "non-repudiability" is ONLY the first of
the above meanings?  And if so, what do we call
the second?  Or, what is the definition here?
From where I sit, it is better to term these
as "legal non-repudiability" or "cryptographic
non-repudiability" so as to reduce confusion.
iang
-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Ed Gerck
Yes, the term "non-repudiation" has been badly misused in
old PKIX WG drafts (in spite of warnings by myself and
others) and some crypto works of reference -- usually
by well-intentioned but otherwise misguided people trying
to add "value" to digital certificates.

However, IMO non-repudiation refers to a useful and
essential cryptographic primitive. It does not mean the
affirmation of a truth (which is authentication). It means
the denial of a falsity -- such as:

(1) the ability to prevent the effective denial of an act (in
other words, denying the act becomes a falsity); or

(2) the ability to prevent the denial of the origin or delivery
of transactions.

Note that, except for a boolean system, the affirmation of
a truth is not the same as the denial of a falsity. Hence, the
usefulness of "non-repudiation" as a primitive. Take away
"non-repudiation" and you end up with a lesser "language"
with which to describe security processes.

Cheers,
Ed Gerck

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: example: secure computing kernel needed

2003-12-28 Thread William Arbaugh


I must confess I'm puzzled why you consider strong authentication
the same as remote attestation for the purposes of this analysis.
It seems to me that your note already identifies one key difference:
remote attestation allows the remote computer to determine if they wish
to speak with my machine based on the software running on my machine,
while strong authentication does not allow this.
That is the difference, but my point is that the result with respect to 
the control of your computer is the same. The distant end either 
communicates with you or it doesn't. In authentication, the distant end 
uses your identity to make that decision. In remote attestation, the 
distant end uses your computer's configuration (the computer's identity 
to some degree) to make that same decision.

As a result, remote attestation enables some applications that strong
authentication does not.  For instance, remote attestation enables DRM,
software lock-in, and so on; strong authentication does not.  If you
believe that DRM, software lock-in, and similar effects are undesirable,
then the differences between remote attestation and strong authentication
are probably going to be important to you.

So it seems to me that the difference between authenticating software
configurations vs. authenticating identity is substantial; it affects the
potential impact of the technology.  Do you agree?  Did I miss something?
Did I mis-interpret your remarks?

My statement was that the two are similar to the degree to which the 
distant end has control over your computer. The difference is that in 
remote attestation we are authenticating a system and we have some 
assurance that the system won't deviate from its programming/policy (of 
course all of the code used in these applications will be formally 
verified :-)). In user authentication, we're authenticating a human and 
we have significantly less assurance that the authenticated subject in 
this case (the human) will follow policy. That is why remote 
attestation and authentication produce different side effects enabling 
different applications: the underlying nature of the authenticated 
subject. Not because of a difference in the technology.



P.S. As a second-order effect, there seems to be an additional difference
between remote attestation ("authentication of configurations") and
strong authentication ("authentication of identity").  Remote attestation
provides the ability for "negative attestation" of a configuration:
for instance, imagine a server which verifies not only that I do have
RealAudio software installed, but also that I do not have any Microsoft
Audio software installed.  In contrast, strong authentication does
not allow "negative attestation" of identity: nothing prevents me from
sharing my crypto keys with my best friend, for instance.

Well- biometrics raises some interesting Gattaca issues.  But, I'm not 
going to go there on the list. It is a discussion that is better done 
over a few pints.

So to summarize- I was focusing only on the control issue and noting 
that even though the two technologies enable different applications 
(due to the assurance that we have in how the authenticated subject 
will behave), they are very similar in nature.


-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Outsourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before

2003-12-28 Thread Peter Gutmann
Anne & Lynn Wheeler <[EMAIL PROTECTED]> writes:

>1) x.509 certificates broadcast all over the world attached to every
>transaction were in serious violation of all sorts of privacy issues
>2) certificates were fundamentally designed to address a trust issue in
>offline environments where a modicum of static, stale data was better than
>nothing
>3) offline, certificate oriented static stale processing was a major step
>backward compared to online, timely, dynamic processing.

X.509 certs were designed to solve the problem of authenticating users to the
global X.500 directory.  So they're good at what they were designed for
(solving a problem that doesn't exist [0]), and bad at everything else
(solving any other sort of problem).

Peter.

[0] Actually they're adequate at what they were designed for.  The original
directory authentication work was really just a bunch of suggestions as to
how you'd do it, ranging from passwords through to certs, and a lot of the
cert stuff was more a set of suggestions than any firm guideline.

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


RE: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Carl Ellison
Amir,

my objection is to the word "sender" which, in definitions I've
read, refers to the human being associated with a particular key.  As long
as we refer to a private key with no implication that this in any way incurs
liability for a human being, then I'm happy -- but e-commerce folks are not.

It is important to be able to authenticate a message origin and
verify its integrity - the things that a dsig or MAC give you.  When you use
a public-key dsig, you have the added security advantage that the key
capable of forming that signature does not need to be used to verify it.
This is the original technical meaning of the term we're struggling over.
However, in Diffie and Hellman's original paper (which referred to this as
"undeniable", if I remember correctly), the confusion had already set in.  A
key would never deny or repudiate anything. That's an action by a human
being.  However, the use of public key cryptography does not imply anything
about the human being to whom that key pair was assigned.

So, I would use the terms "authentication" and "integrity
verification" and avoid the term "non-repudiation", since that one refers to
human behavior and invokes liability on human beings.  Since we have no idea
how to make computer systems that capture proof of a human being's behavior
and intentions, we cannot claim to have any evidence that could be
presented in court to show that a particular human being made a particular
commitment, just based on some digital signature.  We can prove that a given
private key (to wit, the one private key corresponding to a public key that
is entered into evidence) formed a signature over some message or file.
However, any attempt to infer more than that is fallacious.

If you want to use cryptography for e-commerce, then IMHO you need a
contract signed on paper, enforced by normal contract law, in which one
party lists the hash of his public key (or the whole public key) and says
that s/he accepts liability for any digitally signed statement that can be
verified with that public key.
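
For concreteness, the "hash of his public key" such a paper contract 
would list can be computed as below; the choice of SHA-256 over the 
DER-encoded SubjectPublicKeyInfo is an assumption for the sketch, since 
no particular encoding is specified above:

    import hashlib
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    spki = key.public_key().public_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    # this hex digest is the value the paper contract would quote
    fingerprint = hashlib.sha256(spki).hexdigest()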

Any attempt to just assume that someone's acceptance of a PK
certificate amounts to that contract is extremely dangerous, and might even
be seen as an attempt to victimize a whole class of consumers.

 - Carl

+--+
|Carl M. Ellison [EMAIL PROTECTED]  http://theworld.com/~cme |
|PGP: 75C5 1814 C3E3 AAA7 3F31  47B9 73F1 7E3C 96E7 2B71   |
+---Officer, arrest that man. He's whistling a copyrighted song.---+ 

> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of Amir Herzberg
> Sent: Tuesday, December 23, 2003 1:18 AM
> To: [EMAIL PROTECTED]
> Subject: Re: Non-repudiation (was RE: The PAIN mnemonic)
> 
> Ben, Carl and others,
> 
> At 18:23 21/12/2003, Carl Ellison wrote:
> 
> > > >and it included non-repudiation which is an unachievable,
> > > nonsense concept.
> 
> Any alternative definition or concept to cover what protocol 
> designers usually refer to as non-repudiation specifications? For example 
> non-repudiation of origin, i.e. the ability of recipient to convince a 
> third party that a message was sent (to him) by a particular sender (at 
> certain time)?
> 
> Or - do you think this is not an important requirement?
> Or what?
> 
> 
> Best regards,
> 
> Amir Herzberg
> Computer Science Department, Bar Ilan University
> Lectures: http://www.cs.biu.ac.il/~herzbea/book.html
> Homepage: http://amir.herzberg.name
> 
> -
> The Cryptography Mailing List
> Unsubscribe by sending "unsubscribe cryptography" to 
> [EMAIL PROTECTED]
> 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


RE: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Carl Ellison
Ian,

re. your two definitions:

> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of Ian Grigg
> Sent: Tuesday, December 23, 2003 11:34 AM
> To: Amir Herzberg; [EMAIL PROTECTED]; Ben Laurie
> Cc: [EMAIL PROTECTED]
> Subject: Re: Non-repudiation (was RE: The PAIN mnemonic)
> 
> FWIW, I understand there are two meanings:
> 
>some form of legal inability to deny
>responsibility for an event, and

This one has no place in either technology or law because we do not know how
to make computer systems that are honest witnesses to a person's behavior
(incapable of being misused, infected by hostile S/W, etc.). 

>cryptographically strong and repeatable
>evidence that a certain piece of data
>was in the presence of a private key at
>some point.
> 

This might apply, as long as the thing that is in the presence of the
private key is a hash value.  However, this is not what I read in the ISO
definitions of "non-repudiation".  Those definitions refer to human beings
and their behavior.


> Carl and Ben have rubbished "non-repudiation"
> without defining what they mean, making it
> rather difficult to respond.
> 
> Now, presumably, they mean the first, in
> that it is a rather hard problem to take the
> cryptographic property of public keys and
> then bootstrap that into some form of property
> that reliably stands in court.
> 
> But, whilst challenging, it is possible to
> achieve legal non-repudiability, depending
> on your careful use of assumptions.  Whether
> that is a sensible thing or a nice one depends
> on the circumstances ... (e.g., the game that
> banks play with pin codes).

I reject assumptions (e.g., that a home user has kept his computer locked
away so that no one else could get to its keyboard and has kept it free of
all hostile software) that are required to map from the cryptographic action
back to the human action.

> 
> So, as a point of clarification, are we saying
> that "non-repudiability" is ONLY the first of
> the above meanings?  And if so, what do we call
> the second?  Or, what is the definition here?

I believe that the standard definitions (e.g., in ISO documents) refer to
the first.  This is certainly what the PKI community refers to.  That
definition is not only wrong, technically, it is a violation of consumer
rights if it were ever to be enforced.  As a cryptographic community we
should make sure that no one in the world still believes such nonsense.

What we call the property of public-key cryptography is an interesting
problem.  Spelled out, the only property here is that we can do the same
kind of MAC we've always done with symmetric keys, only the verifier doesn't
need to know the key capable of making the signature.  This is a security
advantage - and nothing more.

The logical fallacy that starts with Diffie-Hellman is to say:

1. with symmetric key MACs, the verifier needed to know the secret key;
2. therefore, the verifier could forge a MAC;
3. therefore, you could not take a MAC into court to prove a claim against
the other party;
4. with public key signatures, you don't need to know the secret key;
(flawed step) 5. therefore, you can take a PK signature into court.
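
The symmetric/asymmetric contrast in steps 1-4 is a few lines of code; 
note that nothing below says anything about which human invoked the 
private key (keys and messages here are placeholders for the sketch):

    import hashlib
    import hmac
    from cryptography.hazmat.primitives.asymmetric import ed25519

    msg = b"I owe you $100"

    # steps 1-2: a MAC verifier holds the same secret key,
    # so the verifier could equally well have forged the tag
    shared = b"secret shared by both parties"
    tag = hmac.new(shared, msg, hashlib.sha256).digest()
    forgery = hmac.new(shared, b"I owe you $1", hashlib.sha256).digest()

    # step 4: a signature verifier holds only the public key and
    # cannot forge -- a security advantage, and nothing more
    signer = ed25519.Ed25519PrivateKey.generate()
    sig = signer.sign(msg)
    signer.public_key().verify(sig, msg)  # raises InvalidSignature if bad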

All these improper assumptions about the behavior of the keyholder are
back-pedaling to cover up all the other ways that a PK signature could be
made without the express consent of the alleged keyholder.  That is not
appropriate.  We do not have step 5. We should rid our texts of any
reference to that notion - and work with what we do have.  It's good, but
it's not magic.

 - Carl


> 
> From where I sit, it is better to term these
> as "legal non-repudiability" or "cryptographic
> non-repudiability" so as to reduce confusion.

To me, "repudiation" is the action only of a human being (not of a key) and
therefore there is no such thing as "cryptographic non-repudiability".  We
need a different, more precise term for that - and we need to rid our
literature and conversation of any reference to the former - except to
strongly discredit it if/when it ever appears again.

> iang
> 



+--+
|Carl M. Ellison [EMAIL PROTECTED]  http://theworld.com/~cme |
|PGP: 75C5 1814 C3E3 AAA7 3F31  47B9 73F1 7E3C 96E7 2B71   |
+---Officer, arrest that man. He's whistling a copyrighted song.---+ 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


RE: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Carl Ellison
> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of Stefan Kelm
> Sent: Tuesday, December 23, 2003 1:44 AM
> To: [EMAIL PROTECTED]
> Subject: Re: Non-repudiation (was RE: The PAIN mnemonic)

> Ah. That's why they're trying to rename the corresponding keyUsage bit
> to "contentCommitment" then:
> 
>   http://www.pki-page.info/download/N12599.doc
> 
> :-)
> 
> Cheers,
> 
>   Stefan.

Maybe, but that page defines it as:

--

contentCommitment: for verifying digital signatures which are intended to
signal that the signer is committing to the content being signed. The
precise level of commitment, e.g. "with the intent to be bound" may be
signaled by additional methods, e.g. certificate policy.

Since a content commitment signing is considered to be a digitally signed
transaction, the digitalSignature bit need not be set in the certificate. If
it is set, it does not affect the level of commitment the signer has endowed
in the signed content.

Note that it is not incorrect to refer to this keyUsage bit using the
identifier nonRepudiation. However, the use of this identifier has been
deprecated. Regardless of the identifier used, the semantics of this bit are
as specified in this standard.

--

Which still refers to the "signer" having an "intent to be bound".  One
cannot bind a key to anything, legally, so the signer here must be a human or
organization rather than a key.  It is that unjustifiable linkage from the
actions of a key to the actions of one or more humans that needs to be
eradicated from the literature.
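
For what it's worth, the renaming has since propagated into tooling: in 
the pyca/cryptography X.509 API, for example, the bit surfaces as 
content_commitment, with the old nonRepudiation identifier deprecated. 
A minimal sketch of constructing the extension:

    from cryptography import x509

    # the bit formerly identified as nonRepudiation
    usage = x509.KeyUsage(
        digital_signature=True,
        content_commitment=True,   # ex-nonRepudiation
        key_encipherment=False, data_encipherment=False,
        key_agreement=False, key_cert_sign=False, crl_sign=False,
        encipher_only=False, decipher_only=False,
    )
    assert usage.content_commitment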

 - Carl


+--+
|Carl M. Ellison [EMAIL PROTECTED]  http://theworld.com/~cme |
|PGP: 75C5 1814 C3E3 AAA7 3F31  47B9 73F1 7E3C 96E7 2B71   |
+---Officer, arrest that man. He's whistling a copyrighted song.---+ 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


RE: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Amir Herzberg
At 04:20 25/12/2003, Carl Ellison wrote:
...
If you want to use cryptography for e-commerce, then IMHO you need a
contract signed on paper, enforced by normal contract law, in which one
party lists the hash of his public key (or the whole public key) and says
that s/he accepts liability for any digitally signed statement that can be
verified with that public key.
Of course! I fully agree; in fact the first phase in the `trusted delivery 
layer` protocols I'm working on is exactly that - ensuring that the parties 
(using some external method) agreed on the keys and the resulting 
liability. But when I define the specifications, I use `non-repudiation` 
terms for some of the requirements. For example, the intuitive phrasing of 
the Non-Repudiation of Origin (NRO) requirement is: if any party outputs 
evidence evid s.t. valid(agreement, evid, sender, dest, message, 
time-interval, NRO), then either the sender is corrupted or the sender 
originated the message to the destination dest during the indicated 
time-interval. Notice of course that the sender here is an entity in the 
protocol, not the human being `behind` it. Also notice this is only an 
intuitive description, not the formal specification.
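
Read operationally, the valid() predicate is signature verification over 
the tuple the evidence is supposed to bind, under the key fixed by the 
out-of-band agreement. A toy rendering (the tuple encoding and the use 
of Ed25519 are assumptions of the sketch, not the formal definition):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def valid(agreement, evid, sender, dest, message, interval) -> bool:
        # agreement: sender -> public key, fixed out-of-band
        # (e.g. by the paper contract Carl describes)
        pubkey = agreement[sender]
        claim = repr((sender, dest, message, interval)).encode()
        try:
            pubkey.verify(evid, claim)   # NRO evidence checks out
            return True
        except InvalidSignature:
            return False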

> Best regards,
>
> Amir Herzberg
> Computer Science Department, Bar Ilan University
> Lectures: http://www.cs.biu.ac.il/~herzbea/book.html
> Homepage: http://amir.herzberg.name
>
> -
> The Cryptography Mailing List
> Unsubscribe by sending "unsubscribe cryptography" to
> [EMAIL PROTECTED]
>
-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Outsourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before

2003-12-28 Thread Anne & Lynn Wheeler
At 02:29 PM 12/25/2003 +1300, Peter Gutmann wrote:
X.509 certs were designed to solve the problem of authenticating users to the
global X.500 directory.  So they're good at what they were designed for
(solving a problem that doesn't exist [0]), and bad at everything else
(solving any other sort of problem).
disclaimer: I never actually worked on either the X.500 or X.509 standards 
... however, I do remember an acm sigmod meeting circa '90 where somebody 
did characterize x.500 as a bunch of networking engineers trying to 
re-invent 1960s database technology. minor past refs:
http://www.garlic.com/~lynn/2002g.html#24 Why did OSI fail compared with 
TCP-IP?
http://www.garlic.com/~lynn/2002g.html#28 Why did OSI fail compared with 
TCP-IP?
http://www.garlic.com/~lynn/aepay10.htm#77 Invisible Ink, E-signatures slow 
to broadly catch on (addenda)
http://www.garlic.com/~lynn/aadsm13.htm#7 OCSP and LDAP

also (not knowing about the original intent of x.509) ... the PKI 
infrastructures I saw in the early to mid 90s ... had x.509 identity 
certificates that appeared to be populated with stale, static (and possibly 
a subset) of information from a database entry ... targeted for use by 
relying parties in lieu of the relying parties actually being able to 
contact the real database (contained some piece of an x.500 directory entry 
that a relying-party could presumably use if they didn't have direct access 
to the x.500 directory).

the relying-party-only certificates of the mid to late 90s appeared to be much 
more of something that would authenticate an entity to an operational 
service ... having thrown out nearly all of the information that might be 
found in a database (especially anything that might possibly represent a 
privacy and/or liability issue). However, relying-party-only certificates 
could still be shown to be redundant and superfluous ... aka if i'm sending 
a digitally signed transaction containing an account number (or other 
database indexing value) to a relying party having the database ... then 
appending any kind of certificate that contains a small subset of the 
complete information from the database entry (including any public key or 
authentication material) is redundant and superfluous.

the IETF OCSP standards work seems to be all about a real-time protocol 
that a relying party can use to check with a (LDAP?) database about whether 
the information that might be in a specific certificate can still be relied 
on. It has some of the flavor of a distributed filesystem/database cache 
entry invalidation protocol. All of the CRL and OCSP stuff isn't about 
using the certificate for authenticating to an x.500 directory ... but 
whether the stale, static copy of information in the certificate is still good.

one of the PKI related efforts from the mid-90s specified adding a digital 
signature and a relying-party-only certificate to an iso8583 oriented 
financial transaction. It turns out that the typical iso8583 financial 
transaction eventually gets packaged as something like 60-80 bytes ... 
while the typically implemented relying-party-only certificate for this 
effort was between 4k bytes and 12k bytes. In this case, not only was the 
relying-party-only certificate redundant and superfluous, it also 
represented a two-orders-of-magnitude payload bloat.
--
Anne & Lynn Wheeler   http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm
 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


RE: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Carl Ellison
Amir,

I am glad to see that you are treating this seriously.

It is always possible to use the term "non-repudiation" for some
legitimately defined thing - but I personally would prefer not to use the
term because it has been tarred by over a decade of misuse (implying some
presumption of liability on the part of a human being as a result of the
behavior of a cryptographic key).

I wish you luck with your protocols and look forward to seeing them.

 - Carl


+--+
|Carl M. Ellison [EMAIL PROTECTED]  http://theworld.com/~cme |
|PGP: 75C5 1814 C3E3 AAA7 3F31  47B9 73F1 7E3C 96E7 2B71   |
+---Officer, arrest that man. He's whistling a copyrighted song.---+ 

> -Original Message-
> From: Amir Herzberg [mailto:[EMAIL PROTECTED] 
> Sent: Thursday, December 25, 2003 2:47 AM
> To: Carl Ellison; [EMAIL PROTECTED]
> Subject: RE: Non-repudiation (was RE: The PAIN mnemonic)
> 
> At 04:20 25/12/2003, Carl Ellison wrote:
> ...
> > If you want to use cryptography for e-commerce, then IMHO you need a
> > contract signed on paper, enforced by normal contract law, in which one
> > party lists the hash of his public key (or the whole public key) and says
> > that s/he accepts liability for any digitally signed statement that can be
> > verified with that public key.
> 
> Of course! I fully agree; in fact the first phase in the `trusted delivery 
> layer` protocols I'm working on is exactly that - ensuring that the parties 
> (using some external method) agreed on the keys and the resulting 
> liability. But when I define the specifications, I use `non-repudiation` 
> terms for some of the requirements. For example, the intuitive phrasing of 
> the Non-Repudiation of Origin (NRO) requirement is: if any party outputs 
> evidence evid s.t. valid(agreement, evid, sender, dest, message, 
> time-interval, NRO), then either the sender is corrupted or the sender 
> originated the message to the destination dest during the indicated 
> time-interval. Notice of course that the sender here is an entity in the 
> protocol, not the human being `behind` it. Also notice this is only an 
> intuitive description, not the formal specification.
> 
> > > Best regards,
> > >
> > > Amir Herzberg
> > > Computer Science Department, Bar Ilan University
> > > Lectures: http://www.cs.biu.ac.il/~herzbea/book.html
> > > Homepage: http://amir.herzberg.name
> > >
> > > 
> -
> > > The Cryptography Mailing List
> > > Unsubscribe by sending "unsubscribe cryptography" to
> > > [EMAIL PROTECTED]
> > >
> 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Broken Machine Politics

2003-12-28 Thread R. A. Hettinga




Wired 12.01: January 2004

Broken Machine Politics

Introducing the User-Friendly, Error-Free, Tamper-Proof Voting Machine of
the Future!
(WARNING: Satisfaction not guaranteed if used before 2006.)

By Paul O'Donnell

On a cool afternoon last February, five politicians gathered in the heart
of Silicon Valley for a meeting of the Santa Clara County Board of
Supervisors. Their task: to replace the county's antiquated punch card
voting system with $20 million worth of touchscreen computers.

Executives and lobbyists from three different voting-tech vendors were
scheduled to present their wares, but the outcome was practically
predetermined. Supervisors on the board's finance committee had already
anointed a winner: Sequoia Voting Systems, based 35 miles north in Oakland.
It was all over but the voting.

And then the computer scientists showed up: Peter Neumann, principal
computer scientist at R&D firm SRI; Barbara Simons, past president of the
Association for Computing Machinery; and Stanford computer science
professor David Dill. They had been fidgeting in the front of the room
through three hours of what Dill would later call "garbage." Finally, they
stood up and, one by one, made their case.

Voting, they explained, is too important to leave up to computers - at
least, these types of computers. They're vulnerable to malfunction and
mischief that could go undetected. Where they'd already been adopted, the
devices - known in the industry as DREs, short for direct recording
electronic - had experienced glitches that could have called into question
entire elections. And they keep no paper record, no backup. "We said, 'Slow
down. You've got a problem,'" recalls Neumann.

It felt odd - computer scientists inveighing against their own technology
in the tone of geniuses lecturing undergraduates. They had been lobbying
for months, and now "it was like they were making a last stand at Santa
Clara," says one person who was at the meeting. The supervisors listened
politely. "But the board didn't seem to see what it had to do with
anything," says Liz Kniss, a supervisor who shared the concerns raised by
the scientists.

In the end, Kniss and her colleagues voted 3 to 2 to award the contract.
The last stand had failed - almost. At the final moment, the supes insisted
that Sequoia be ready to produce DREs with a paper backup, should the
county ever ask for them. It seemed like a sop to the geeks, but months
later it would prove to be the smartest thing the board did that afternoon.

After Florida and the chaos of the 2000 presidential election, the nation's
voting masters vowed: Never again. Never again would an election be
jeopardized because the mechanics failed, and never again would parsing a
winner be left to human discretion. Officials have scrambled to update
voting equipment, creating a weird, three-pointed confluence of interests:
civil servants, suits, and geeks.

Thanks to Florida, local governments find themselves sitting on piles of
fix-it money - millions from city and county coffers and $3.9 billion from
Congress, thanks to the Help America Vote Act of 2002. The companies that
make voting equipment are rushing to produce machines; at the same time,
big players like Diebold, with almost $2 billion in revenue last year, are
touting transparent, efficient, and chad-free elections. Meanwhile, some of
the nation's elite computer experts and election watchdogs are
hyperventilating. They see a fumbled opportunity - instead of using the
tech to make democracy secure and accurate for the first time, we're
building an electoral infrastructure with more holes than a punch card
ballot. This future is getting hashed out not in Washington (the Feds don't
run elections) but in the nooks and crannies of American politics, like
that Silicon Valley board meeting. "Every year there are legislative
proposals that make election administrators' eyes roll," says Warren
Slocum, chief elections officer for San Mateo County, just south of San
Francisco. "The voting registrar's life has become wildly complex."

That's ironic, because electronic voting is supposed to make elections
easier. The systems themselves are as simple to use as an ATM, and
overvotes - one of the problems in Florida - are impossible. You can't
select a second candidate without deselecting the first. The interface
notes skipped races or ballot questions. With the addition of a simple
numeric keypad and headphones, the visually impaired can vote independently.

Electoral officials get their own set of benefits. For example, some
precincts in Southern California print ballots in Spanish, Vietnamese,
Korean, and Tagalog, among other languages; registrars must guess how many
of each to print before election day. And printed ballots often show
candidates who have dropped out (such as Arianna Huffington in the
California recall). By contrast, touchscreens can be quickly reprogrammed
with new languages and

stego in the wild: bomb-making CDs

2003-12-28 Thread John Denker
] Thursday 25 December 2003, 17:13 Makka Time, 14:13 GMT
]
] Saudis swoop on DIY bomb guide
] 
] Authorities in the kingdom have arrested five people after
] raiding computer shops selling compact disks containing
] hidden bomb-making instructions, a local newspaper reported
] on Thursday.
] 
] Police were questioning four owners of computer shops in the
] southern Jazan region and a fifth person believed to have
] supplied the CDs to the shops, Al-Watan newspaper said.
] 
] Officials were not immediately available for comment.
] 
] The daily said some of the shop owners might not have known
] about the bomb-making tutorial files hidden on the CDs. Only
] someone with technical knowledge would be able to find the
] files.

That was quoted from:
http://english.aljazeera.net/NR/exeres/C8061E36-E4E5-4EB5-A103-19DCF838E835.htm
and the same story, almost verbatim, was carried by Reuters.

Comments:
 1) This is not entirely unprecedented.  Al Qaeda for years has
been hiding recruitment and training footage in the middle
of otherwise-innocuous video tape cassettes.  
 2) OTOH using a commercial distribution channel bespeaks a 
certain boldness ... somebody is "thinking big".  
 3) Also: as a rule, anything you do with computers generates 
more headlines than doing the same thing with lower-tech methods.
This is significant to terrorists, who are always looking for
headlines.  Conversely it is significant to us, who have much
to lose when our not-so-fearless leaders over-react.
 4) One wonders how many CDs were distributed before the operation
was terminated.
 5) I wonder how the authorities found out about it.
 6) The article speaks of technical skill ... I wonder how 
much technical skill was required.  Probably not much.
 7) Did it rely entirely on security-by-obscurity, or was there 
crypto involved also?
(The latter is possible;  whatever leak told the authorities
where to look could also have told them the passphrase...
but the article didn't mention crypto.)

I suspect there is a lot more to this story...

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Microsoft aims to make spammers pay

2003-12-28 Thread R. A. Hettinga


The BBC

Friday, 26 December, 2003, 03:29 GMT

Microsoft aims to make spammers pay
By Jo Twist
BBC News Online technology reporter


Despite efforts to stem the billions of spam e-mails flooding inboxes,
unwanted messages are still turning e-mail into a quagmire of misery.
Spammers send out tens of millions of e-mails to unsuspecting computer
users every day, employing a myriad of methods to ensure their pills, loans
and "requests for our lord" pleas fox e-mail filters.

Some are even turning to prose and poetry to fool the technological
safeguards people put in place.

But a group of researchers at Microsoft think they may have come up with a
solution that could, at least, slow down and deter the spammers.

The development has been called the Penny Black project, because it works
on the idea that revolutionised the British postage system in the 1830s -
that senders of mail should have to pay for it, not whoever is on the
receiving end.

Stamp of approval

"The basic idea is that we are trying to shift the equation to make it
possible and necessary for a sender to 'pay' for e-mail," explained Ted
Wobber of the Microsoft Research group (MSR).

The payment is not made in the currency of money, but in the memory and the
computer power required to work out cryptographic puzzles.

"For any piece of e-mail I send, it will take a small amount computing
power of about 10 to 20 seconds."

" For this scheme to work, it would want to be something all mail agents
would want to do "
Ted Wobber, MSR


"If I don't know you, I have to prove to you that I have spent a little bit
of time in resources to send you that e-mail.

"When you see that proof, you treat that message with more priority."

Once senders have proved they have solved the required "puzzle", they can
be added to a "safe list" of senders.

It means the spammer's machine is slowed down, but legitimate e-mailers do
not notice any delays.

Mr Wobber and his group calculated that if there are 80,000 seconds in a
day, a computational "price" of a 10-second levy would mean spammers would
only be able to send about 8,000 messages a day, at most.

"Spammers are sending tens of millions of e-mails, so if they had to do
that with all the messages, they would have to invest heavily in machines."

As a result of this extra investment, spamming would become less profitable
because costs would skyrocket in order to send as many e-mails.

All this clever puzzle-solving is done without the recipient of the e-mail
being affected.

Bogging them down

The idea was originally formulated to use CPU cycles by team member
Cynthia Dwork in 1992.

But they soon realised it was better to use memory latency - the time it
takes for the computer's processor to get information from its memory chip
- than CPU power.
That way, it does not matter how old or new a computer is because the
system does not rely on processor chip speeds, which can improve at rapid
rates.

A cryptographic puzzle that is simple enough not to bog down the processor
too much, but that requires information to be accessed from memory, levels
the difference between older and newer computers.
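
To make the pricing-function idea concrete, here is a hashcash-style 
CPU-bound sketch in Python in the spirit of Dwork and Naor's original 
scheme; the memory-bound variant the MSR group favours would replace the 
repeated hashing with lookups into a table too large for the processor 
cache. The difficulty parameter and stamp format are illustrative 
assumptions, not the actual Penny Black design:

    import hashlib
    import itertools

    def mint(recipient: str, bits: int = 20) -> str:
        # sender pays: search for a counter whose SHA-1 digest has
        # `bits` leading zero bits (~2**bits hashes on average)
        for counter in itertools.count():
            stamp = f"{recipient}:{counter}"
            digest = hashlib.sha1(stamp.encode()).digest()
            if int.from_bytes(digest, "big") >> (160 - bits) == 0:
                return stamp

    def verify(stamp: str, recipient: str, bits: int = 20) -> bool:
        # recipient pays almost nothing: one hash to check
        if not stamp.startswith(recipient + ":"):
            return False
        digest = hashlib.sha1(stamp.encode()).digest()
        return int.from_bytes(digest, "big") >> (160 - bits) == 0

At roughly 10 seconds of minting per message, the article's arithmetic 
follows directly: 80,000 seconds in a day buys a spammer only about 
8,000 stamps.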

It all sounds like a good idea, said Paul Wood, chief analyst at e-mail
security firm MessageLabs.

"One of the fundamental problems with spam is that it costs nothing to
send, but has associated costs for the recipient which include loss of
bandwidth, problems with usage, and lost productivity," he said.

"Microsoft's idea is to shift this cost burden from the recipient to the
sender, which in itself seems like a reasonable sentiment."

But, he said, for such a scheme to be all-encompassing, there would have to
be some provision for open standards, so that it is not proprietary to
Microsoft.

Work for all

MSR is in talks with various people to put the system into a useful
anti-spam product.

It could easily be built into e-mail software like Outlook, e-mail servers
or web browsers, said Mr Wobber.

"For this scheme to work, it would want to be something all mail agents
would want to do," explained Mr Wobber.

And because it is the receiver who sets the puzzle requirement, spammers
will not have any advantage by using non-Microsoft products.

It is certainly not going to stop all spam for good, admitted Mr Wobber.

"I don't think any one spam scheme is a panacea, we have to use a wide
variety of schemes to be successful in stopping spam."

"Spam is probably going to get worse before it gets better, and I really
hope it does not get to a point that it deters people using e-mail."


Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Ian Grigg
Carl Ellison wrote:

> > From where I sit, it is better to term these
> > as "legal non-repudiability" or "cryptographic
> > non-repudiability" so as to reduce confusion.
> 
> To me, "repudiation" is the action only of a human being (not of a key) and
> therefore there is no such thing as "cryptographic non-repudiability".


Ah.  Now I understand.  The verb is wrong, as it
necessarily implies the act of the human who is
accused of the act.  (And, thus, my claim that it
is possible, was also wrong.)

Whereas the cryptographic property implies no such
thing, and a cryptographic actor can only affirm
or not, not repudiate.  I.e., it's a meaningless
term.


> We
> need a different, more precise term for that -


Would "irrefutable" be a better term?  Or non-
refutability, if one desires to preserve the N?

The advantage of this verb is that it has no
actor involved, and evidence can be refuted on
its own merits, as it were.

As a test, if one were to replace repudiate
with refute in the ISO definition, would it
then stand?


> and we need to rid our
> literature and conversation of any reference to the former - except to
> strongly discredit it if/when it ever appears again.

I think more is needed.  A better definition is
required, as absence is too easy to ignore.  People
and courts will use what they have available, so it
is necessary to do more; indeed it is necessary to
actively replace that term with another.

Generally, the way the legal people work is to
create simple "tests".  Such as:

  A Document was signed by a private key if:

  1. The signature is verifiable by the public key,
  2. the public key is paired with the private key,
  3. the signature is over a cryptographically strong
 message digest,
  4. the Message Digest was over the Document.

Now, this would lead to a definition of irrefutable
evidence.  How such evidence would be used would be
of course dependent on the circumstances;  it then
becomes a further challenge to tie a human's action
to that act / event.
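
Ian's four-part test translates almost directly into a verification 
routine; this sketch instantiates "cryptographically strong message 
digest" with SHA-256 and uses RSA-PSS purely as an example algorithm:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    def signed_by(document: bytes, signature: bytes,
                  public_key: rsa.RSAPublicKey) -> bool:
        # verify() hashes the Document (test 4) with SHA-256 (test 3)
        # and checks the signature against the public key (test 1);
        # success is evidence the paired private key made it (test 2)
        try:
            public_key.verify(
                signature, document,
                padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                            salt_length=padding.PSS.MAX_LENGTH),
                hashes.SHA256(),
            )
            return True
        except InvalidSignature:
            return False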



iang


PS: Doing a bit of googling, I found the ISO definition
to be something like:

http://lists.w3.org/Archives/Public/w3c-ietf-xmldsig/1999OctDec/0149.html
>> >... The ISO
>> >10181-4 document (called non repudiation Framework) starts with:
>> >"The goal of the non-repudiation service is to collect, maintain,
>> >make available and validate irrefutable evidence concerning a
>> >claimed event or action in order to solve disputes about the
>> >occurrence of the event or action".

But, the actual standard costs money (!?) so it is
not surprising that it is the subject of much
controversy :)

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Outsourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before

2003-12-28 Thread Peter Gutmann
Anne & Lynn Wheeler <[EMAIL PROTECTED]> writes:

>the IETF OCSP standards work seems to be all about a real-time protocol that
>a relying party can use to check with a (LDAP?) database about whether the
>information that might be in a specific certificate can still be relied on.
>It has some of the flavor of a distributed filesystem/database cache entry
>invalidation protocol. All of the CRL and OCSP stuff isn't about using the
>certificate for authenticating to an x.500 directory  but whether the
>stale, static copy of information in the certificate is still good.

That's my big gripe with OCSP, it's compromised in almost every way in order
to make it completely bug-compatible with CRLs.  It's really mostly an online
CRL query protocol rather than any kind of status protocol (in other words a
responder can give you an, uhh, "live" response from a week-old CRL via OCSP).
A recent update to the protocol even removes the use of nonces, to make replay
attacks possible.
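
For reference, the nonce mechanism whose removal Peter objects to looks 
like this in a present-day toolkit (pyca/cryptography's OCSP request 
builder; loading of the certificate and issuer objects is elided):

    import os
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes
    from cryptography.x509 import ocsp

    def build_request(cert, issuer):
        # a fresh nonce binds the response to this request,
        # which is exactly the replay protection being dropped
        builder = ocsp.OCSPRequestBuilder()
        builder = builder.add_certificate(cert, issuer, hashes.SHA1())
        builder = builder.add_extension(
            x509.OCSPNonce(os.urandom(16)), critical=False)
        return builder.build()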

Peter.

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Ousourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before

2003-12-28 Thread Anne & Lynn Wheeler
At 02:07 AM 12/28/2003 +1300, Peter Gutmann wrote:
That's my big gripe with OCSP, it's compromised in almost every way in order
to make it completely bug-compatible with CRLs.  It's really mostly an online
CRL query protocol rather than any kind of status protocol (in other words a
responder can give you an, uhh, "live" response from a week-old CRL via OCSP).
A recent update to the protocol even removes the use of nonces, to make replay
attacks possible.
in general, distributed cache/filesystem cache consistency algorithms 
aren't about trust or trust propagation but integrity and consistency.

I had done the initial distributed lock manager for ha/cmp. misc. past posts:
http://www.garlic.com/~lynn/2001.html#40 Disk drive behavior
http://www.garlic.com/~lynn/2001c.html#66 KI-10 vs. IBM at Rutgers
http://www.garlic.com/~lynn/2001e.html#2 Block oriented I/O over IP
http://www.garlic.com/~lynn/2001j.html#47 OT - Internet Explorer V6.0
http://www.garlic.com/~lynn/2001k.html#5 OT - Internet Explorer V6.0
http://www.garlic.com/~lynn/2002e.html#67 Blade architectures
http://www.garlic.com/~lynn/2002f.html#1 Blade architectures
http://www.garlic.com/~lynn/2002k.html#8 Avoiding JCL Space Abends
http://www.garlic.com/~lynn/2003i.html#70 A few Z990 Gee-Wiz stats
the issue with certificates as cache entries ... is that they are purely r/o, 
static entries ... and the cache consistency protocols (either CRLs or 
OCSP) are purely with respect to whether the information is still fresh or 
not. however, I still contend that the primary design point for these 
deployed certificates is to allow relying-parties to perform offline 
operations when they wouldn't nominally have access to the real data (from 
which the certificate is derived).

the issue with CRLs is that they are an electronic version of the paper 
booklets of invalid numbers in the credit card industry before online 
transactions. the issue is that the switch to a real online paradigm in the 
credit card industry in the '70s pretty much obsoleted the need for offline 
credentials (they retained the same form factor but added the magstripe for 
online transactions) and any infrastructure support for the offline paradigm 
(like CRLs). OCSP appears to acquire all the infrastructure costs of doing 
an online transaction while retaining all the disadvantages of the CRL paradigm 
... i.e. undergo the costs of doing an actual online transaction w/o having 
any of the advantages of actually having done an online transaction. a 
trivial example is that there is none of the benefits of aggregation (credit 
limit, fraud use patterns, etc) that come with having a real online 
transaction.

the market niche for certificates is still the offline world (which is 
rapidly disappearing) or extremely low value operations that don't 
justify the expense of an online transaction. The issue in the latter is 
two-fold: 1) online transaction related costs continue to rapidly decline 
and 2) for low/no value operations it is difficult to justify the cost and 
complexity of a PKI infrastructure.
--
Anne & Lynn Wheeler   http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm
 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: I don't know PAIN...

2003-12-28 Thread Ben Laurie
Raymond Lillard wrote:

Ben Laurie wrote:

Ian Grigg wrote:

What is the source of the acronym PAIN?
Lynn said:
... A security taxonomy, PAIN:
* privacy (aka things like encryption)
* authentication (origin)
* integrity (contents)
* non-repudiation


I.e., its provenance?

Google shows only a few hits, indicating
it is not widespread.


Probably because non-repudiation is a stupid idea: 
http://www.apache-ssl.org/tech-legal.pdf.


OK, I'm a mere country mouse when it comes to cryptography,
so be kind.
:-)

I have read most of the above paper on non-repudiation and
noticed on p3 the following footnote:
"Note that there is no theoretical reason that it should be
possible to figure out the public key given the private key,
either, but it so happens that it is generally possible to
do so"
So what's this "generally possible" business about?
Well, AFAIK it's always possible, but I was hedging my bets :-) I can 
imagine a system where both public and private keys are generated from 
some other stuff which is then discarded.
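
For the common systems the derivation is immediate: an RSA private key 
carries the modulus and public exponent alongside the private exponent, 
and an EC private scalar d determines the public point Q = dG. A sketch 
using pyca/cryptography (as an illustration; the footnote itself names 
no particular system):

    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    rsa_priv = rsa.generate_private_key(public_exponent=65537,
                                        key_size=2048)
    rsa_pub = rsa_priv.public_key()   # (n, e) travel with the private key

    ec_priv = ec.generate_private_key(ec.SECP256R1())
    ec_pub = ec_priv.public_key()     # Q = d*G, one scalar multiplication

The hedge above would need a scheme where the material linking the two 
halves is discarded at generation time; neither of the cases shown 
qualifies.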

A few references will do.
If you want the gory details, I recommend the Handbook of Applied 
Cryptography by Menezes et al., _not_ the Schneier brick. Warning: 
pretty technical.

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/
"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff
-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Identity Based Encryption

2003-12-28 Thread Tim Dierks
At 03:54 PM 12/23/2003, Al wrote:
I have had a look at Identity Based Encryption but I have not been able to
find out whether there are any protecting patents.
US Patent Application 20030081785, May 1, 2003: http://tinyurl.com/3fo8e

 - Tim

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Ben Laurie
Ian Grigg wrote:
Carl and Ben have rubbished "non-repudiation"
without defining what they mean, making it
rather difficult to respond.
I define it quite carefully in my paper, which I pointed to.

Now, presumably, they mean the first, in
that it is a rather hard problem to take the
cryptographic property of public keys and
then bootstrap that into some form of property
that reliably stands in court.
But, whilst challenging, it is possible to
achieve legal non-repudiability, depending
on your careful use of assumptions.  Whether
that is a sensible thing or a nice depends
on the circumstances ... (e.g., the game that
banks play with pin codes).
Actually, it's very easy to achieve legal non-repudiability. You pass a 
law saying that whatever-it-is is non-repudiable. I also cite an example 
of this in my paper (electronic VAT returns are non-repudiable, IIRC).

So, as a point of clarification, are we saying
that "non-repudiability" is ONLY the first of
the above meanings?  And if so, what do we call
the second?  Or, what is the definition here?
From where I sit, it is better to term these
as "legal non-repudiability" or "cryptographic
non-repudiability" so as to reduce confusion.
Read my paper (it was co-authored with a lawyer, so I believe we've got 
both the crypto and legal versions covered).

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/
"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff
-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Identity Based Encryption

2003-12-28 Thread Dan Riley
"Al" <[EMAIL PROTECTED]> writes:
> I have had a look at Identity Based Encryption but I have not been able to
> find out whether there are any protecting patents.

Patents pending.

sci.crypt thread from early July, "Boneh/Franklin IBE patent":

http://groups.google.com/groups?threadm=85c09e0.030704.53bc05bf%40posting.google.com

USPTO application Boneh, et al.:

http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-bool.html&r=1&f=G&l=50&co1=AND&d=PG01&s1=Boneh.IN.&OS=IN/Boneh&RS=IN/Boneh

"Worldwide" WIPO patent application, Boneh & Franklin:

http://l2.espacenet.com/espacenet/viewer?PN=WO03017559&CY=ep&LG=en&DB=EPD

-dan

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Ian Grigg
Ben Laurie wrote:
> 
> Ian Grigg wrote:
> > Carl and Ben have rubbished "non-repudiation"
> > without defining what they mean, making it
> > rather difficult to respond.
> 
> I define it quite carefully in my paper, which I pointed to.


Ah.  I did read your paper, but deferred any comment
on it, in part because I didn't understand what its
draft/publication status was.


Ben Laurie said:
> Probably because non-repudiation is a stupid idea:
> http://www.apache-ssl.org/tech-legal.pdf.


You didn't state which of the two definitions
you were rubbishing, so I shall respond to both!



Let's take the first definition - your "technical
definition" (2.7):

  "Non-repudiation", in its technical sense, is a property of a communications
  system such that the system attributes the sending of a message to a person
  if, but only if, he did in fact send it, and records a person as having received
  a message if, but only if, he did in fact receive it. If such systems exist at all,
  they are very rare.

  Non-repudiability is often claimed to be a property of electronic signatures of
  the kind described above. This claim is unintelligible if "non-repudiation" is
  used in its correct technical sense, and in fact represents an attempt to confer a
  bogus technical respectability on the purely commercial assertion that the owners
  of private keys should be made responsible for their use, whoever in fact uses
  them.

Some comments.

1. This definition seems to be only one of many
out there [1].  The use of the term "correct technical
sense" would then be meaningless, as well as brave,
without some supporting references.  Although it does
suffice to ground the use within the paper.

2. The definition is muddied by including the attack
inside the definition.  The attack on the definition would
fit better in section 6, "Is 'non-repudiation' a useful
concept?"

3. Nothing in either definition 2.7 or the relevant
part of section 6 tells us why the claim is "unintelligible".

To find this, we have to go back to Carl's comment
which gets to the nub of the legal and literal meaning
of the term:

"To me, "repudiation" is the action only of a human being (not of a key)..."

Repudiation can be performed only by a human [2].  A key cannot
repudiate, nor can a system of technical capabilities [3].
(Imagine here, a debate on how to tie the human to the
key.)

That is, it is an agency problem, and unless it is
clearly cast in those terms, for which there exists a
strong literature, no strong foundation can be laid
for any conclusions [4].



4. The discussion resigns itself to being somewhat
dismissive, by leaving open the possibility that
there are alternatives.  There is a name for this
fallacy (asserting the general while showing only
the specific), but I forget it.

In the first para of 2.7, it states that "If such systems
exist at all, they are very rare."  Thus, allowing
for existence.  Yet in the second para, one context
is left as "unintelligible."  In section 6, again,
"most discussions ... are more confusing than helpful."

This hole is created, IMHO, by the absence of Carl's
killer argument in 3. above.  Only once it is possible
to move on from the fallacy embodied in the term
repudiation itself, is it possible to start considering
what is "good" and useful about the irrefutability (or
otherwise) of a digital signature [5].

I.e., throwing out the bathwater is a fine and regular
thing to do.  Let's now start looking for the baby.



> > But, whilst challenging, it is possible to
> > achieve legal non-repudiability, depending
> > on your careful use of assumptions.  Whether
> > that is a sensible thing or a nice depends
> > on the circumstances ... (e.g., the game that
> > banks play with pin codes).
> 
> Actually, its very easy to achieve legal non-repudiability. You pass a
> law saying that whatever-it-is is non-repudiable. I also cite an example
> of this in my paper (electronic VAT returns are non-repudiable, IIRC).

Which brings us to your second definition, again,
in 2.7:

To lawyers, non-repudiation was not a technical legal term before techies gave
it to them. Legally it refers to a rule which defines circumstances in which a
person is treated for legal purposes as having sent a message, whether in fact
he did or not, or is treated as having received a message, whether in fact he
did or not. Its legal meaning is thus almost exactly the opposite of its technical
meaning.


I am not sure that I'd agree that the legal
fraternity thinks in the terms outlined in the
second sentence.  I'd be surprised if the legal
fraternity said any more than "what you are
trying to say is perhaps best seen by these
sorts of rules..."

Much of law already duplicates what is implied
above, anyway, which makes one wonder (a) what
is the difference between the above and the
rules of evidence and presumption, etc, etc
and (b) why did the legal fraternity adopt
the techies' term with such abandon that they
did

Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Richard Johnson
On Sun, Dec 21, 2003 at 09:45:54AM -0700, Anne & Lynn Wheeler wrote:
> note, however, when I did reference PAIN as (one possible) security 
> taxonomy  i tended to skip over the term non-repudiation and primarily 
> made references to privacy, authentication, and integrity.


In my experience, the terminology has more often been "confidentiality,
integrity, and authentication".  Call it CIA if you need an acronym easy
to memorize, if only due to its ironic similarity with that for the name of
a certain US government agency. :-)


Richard

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Norwegian DVD Hacker Acquitted on Piracy Charges

2003-12-28 Thread Steve Schear
At 12:49 PM 12/22/2003, R. A. Hettinga wrote:
In 2001, the 2nd U.S. Circuit Court of Appeals in New York said postings of
the encryption program violated the 1998 federal Digital Millennium
Copyright Act, which prohibits the circumvention of copy controls along
with discussions on how to do so.
I think that should have read "... decryption program..."

steve 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Microsoft publicly announces Penny Black PoW postage project

2003-12-28 Thread Steve Schear


http://news.bbc.co.uk/2/hi/technology/3324883.stm

Adam Back is part of this team, I think.

Similar approach to Camram/hashcash.  Memory-based approaches have been 
discussed.  Why hasn't Camram explored them?

steve

BTW, the Penny Black stamp was only used briefly.  It was the Penny Red which 
was used for decades. 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Microsoft publicly announces Penny Black PoW postage project

2003-12-28 Thread David Honig
At 09:13 AM 12/26/03 -0800, Steve Schear wrote:
>http://news.bbc.co.uk/2/hi/technology/3324883.stm
>

>>Mr Wobber and his group calculated that if there are 80,000
seconds in a day, a computational "price" of a 10-second levy
would mean spammers would only be able to send about 8,000
messages a day, at most. 

"Spammers are sending tens of millions of e-mails, so if they had
to do that with all the messages, they would have to invest
heavily in machines." <<


Replace "invest" with "trojan" and remind Mr. W. that he
works for the major facilitator of trojaned machines.







-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Microsoft publicly announces Penny Black PoW postage project

2003-12-28 Thread Adam Back
I did work at Microsoft for about a year after leaving ZKS, but I quit
a month or so ago (working for another startup again).

But for accuracy, while I was at Microsoft I was not part of the
Microsoft research/academic team that worked on Penny Black, though I
did exchange a few emails related to that project, hashcash, etc.
with the researchers.

I thought the memory-bound approaches discussed on CAMRAM before were
along the lines of hash functions with an artificially large code
footprint, as a way to impose the need for memory.  

Arnold Reinhold's HEKS [1] (Hash Extended Key Stretcher) key stretching
algorithm is related also.  HEKS aims to make hardware attacks on key
stretching more costly: both by increasing the memory footprint
required to efficiently compute it, and by requiring operations that
are more expensive in silicon (32 bit multiplies, floating point is
another suggestion he makes).

The relationship to hashcash is that you could simply use HEKS in place of
SHA1 to get the desired complexity and hence the silicon cost increase.

"The main design goal of this algorithm is to make massively parallel
key search machines it as expensive as possible by requiring many
32-bit multiplies and large amounts of memory."

I think I also recall discussing with Peter Gutmann the idea of using
more complex hash functions (composed of existing hash functions for
security) to increase the cost of hardware attacks.


The innovation in the papers referred to by the Penny Black project
was the notion of building a cost function that is limited by memory
bandwidth rather than CPU speed.  In other words, unlike hashcash
(which is CPU bound and has minimal working memory and code footprint)
or a notional hashcash built on HEKS or a similar system (which is
supposed to take memory and generally expensive operations to build in
silicon), the two candidate memory-bound functions are designed to be
computationally cheap but to require a lot of random access memory
utilization, in a way which frustrates time-space trade-offs (reducing
space consumption by using a faster CPU).  They then argue that this
is desirable because there is less discrepancy in memory latency
between high end systems and low end systems than there is discrepancy
in CPU power.

The second memory-bound paper [3] (by Dwork, Goldberg and Naor) finds a
flaw in the first memory-bound function paper [2] (by Abadi, Burrows,
Manasse, and Wobber) which admits a time-space trade-off; it proposes
an improved memory-bound function and also, in the conclusion, suggests
that memory-bound functions may be more vulnerable to hardware attack
than computationally bound functions.  Their argument on that latter
point is that the hardware attack is an economic attack, and it may be
that memory-bound functions are more vulnerable because, in their view,
you could build cheaper hardware more effectively: the most basic 8-bit
CPU with a slow clock rate could marshal enough fast memory to undercut
the cost of general purpose CPUs by a larger margin than custom
hardware optimized for a hashcash/computationally bound function.

I'm not sure if their conclusion is right, but I'm not really
qualified -- it's a complex silicon optimization / hardware
acceleration type question.
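
To make the CPU-bound / memory-bound contrast concrete, here is a toy
Python sketch.  The constructions and parameters are mine for
illustration; they are not the published functions:

    # CPU-bound vs memory-bound cost functions, toy versions.
    import hashlib, os

    def cpu_bound_stamp(challenge: bytes, bits: int = 20) -> int:
        # Hashcash-style: grind nonces until SHA-1(challenge||nonce) has
        # `bits` leading zero bits.  CPU-bound, near-zero memory footprint.
        target = 1 << (160 - bits)
        nonce = 0
        while True:
            h = hashlib.sha1(challenge + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(h, "big") < target:
                return nonce
            nonce += 1

    def memory_bound_walk(challenge: bytes, table_mb: int = 8,
                          steps: int = 1 << 20) -> int:
        # Memory-bound flavour: a chain of dependent random reads into a
        # table sized to exceed cache, so memory latency dominates.  (A
        # toy; the published functions frustrate time-space trade-offs
        # far more carefully than this.)
        size = table_mb << 20
        table = os.urandom(size)
        idx = int.from_bytes(hashlib.sha1(challenge).digest(), "big") % size
        acc = 0
        for _ in range(steps):
            acc = (acc * 31 + table[idx]) & 0xFFFFFFFF
            idx = (idx * 2654435761 + acc) % size   # next read depends on last
        return acc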

Adam

[1] http://world.std.com/~reinhold/HEKSproposal.html

[2] Abadi, Burrows, Manasse and Wobber "Moderately Hard, Memory-bound
Functions", Proceedings of the 10th Annual Network and Distributed
System Security Symposium, February 2003

http://research.microsoft.com/research/sv/PennyBlack/demo/memory-final-ndss.pdf

[3] Dwork, Goldberg, and Naor, "On Memory-Bound Functions for Fighting
Spam", Proceedings of the 23rd Annual International Cryptology
Conference (CRYPTO 2003), August 2003.

http://research.microsoft.com/research/sv/PennyBlack/demo/lbdgn.pdf


On Fri, Dec 26, 2003 at 09:13:23AM -0800, Steve Schear wrote:
> http://news.bbc.co.uk/2/hi/technology/3324883.stm
> 
> Adam Back is part of this team, I think.
> 
> Similar approach to Camram/hashcash.  Memory-based approaches have been 
> discussed.  Why hasn't Camram explored them?
> 
> steve

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Microsoft publicly announces Penny Black PoW postage project

2003-12-28 Thread Ben Laurie
Steve Schear wrote:

http://news.bbc.co.uk/2/hi/technology/3324883.stm

Adam Back is part of this team, I think.

Similar approach to Camram/hashcash.  Memory-based approaches have been 
discussed.  Why hasn't Camram explored them?
They were only invented recently, and indeed, I've been planning to 
introduce them to the camram arena. I wonder if they're being discussed 
as a result of the pub conversation I had recently with a Microsoft 
person on this very subject?

One major advantage of memory-based proof-of-work over hashcash is that 
the variation between machines is much smaller (estimated to be a factor 
of 4 from slowest to fastest PCs, for example).

BTW, for those who don't know, SpamAssassin now supports hashcash.

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/
"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff
-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


CIA - the cryptographer's intelligent aid?

2003-12-28 Thread Ian Grigg
Richard Johnson wrote:
> 
> On Sun, Dec 21, 2003 at 09:45:54AM -0700, Anne & Lynn Wheeler wrote:
> > note, however, when I did reference PAIN as (one possible) security
> > taxonomy  i tended to skip over the term non-repudiation and primarily
> > made references to privacy, authentication, and integrity.
> 
> In my experience, the terminology has more often been "confidentiality,
> integrity, and authentication".  Call it CIA if you need an acronym easy
> to memorize, if only due to its ironic similarity with that for the name of
> a certain US government agency. :-)


I would agree that CIA reigns supreme.  It's easy to
remember, and easy to teach.  It covers the basic
crypto techniques, those that we are sure about and
can be crafted simply with primitives.

CIA doesn't overreach itself.  CAIN, by introducing
non-repudiation, brings in a complex multilayer
function that leads people down the wrong track.

PAIN is worse, as it introduces Privacy instead of
Confidentiality.  The former is a higher level term
that implies application requirements, arguably, not
a crypto term at all.  At least with Confidentiality
it is possible to focus on packets and connections
and events as being confidential at some point in
time; but with Privacy, we are launched out of basic
crypto and protocols into the realm of applications.

iang

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Microsoft publicly announces Penny Black PoW postage project

2003-12-28 Thread Jim Gillogly
Steve Schear wrote:

http://news.bbc.co.uk/2/hi/technology/3324883.stm
Rather than proving that you've wasted a significant amount of computing
resources, wouldn't it be preferable to prove that you've contributed
the same amount of power to a useful compute-bound project, such as
NFSNET.org or GIMPS or [EMAIL PROTECTED] or [EMAIL PROTECTED]?  It would require some
change for the qualifying cycle charities to issue "stamps": the spammer
or sender would send the cycle charity a hash of the message and the
completed assignment, and the charity would return a signature for the
message.  The receiver would pay only to verify the signature, including
the date and whatever headers are necessary for personalization.  Perhaps
something easy could deal with replay, but even without this addition
it's easier to deal with identical spams than random ones -- I suppose
one's mailer could keep track of the last few hashes it's accepted if
this turns out to be an issue.
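
A sketch of that flow, with Ed25519 standing in for whatever signature
scheme a real cycle charity would deploy (all names are hypothetical and
the work-unit verification is stubbed):

    import hashlib
    from datetime import date
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    charity_key = Ed25519PrivateKey.generate()   # held by the cycle charity
    charity_pub = charity_key.public_key()       # published; known to receivers

    def issue_stamp(msg_hash: bytes, completed_work: bytes) -> bytes:
        # Charity side: check the submitted work unit (stubbed), then sign
        # message hash + date, binding the stamp to one message and day.
        assert completed_work                    # stand-in for real verification
        return charity_key.sign(msg_hash + date.today().isoformat().encode())

    def accept_mail(message: bytes, stamp: bytes) -> bool:
        # Receiver side: one cheap signature verification per message.
        msg_hash = hashlib.sha256(message).digest()
        try:
            charity_pub.verify(stamp, msg_hash + date.today().isoformat().encode())
            return True
        except InvalidSignature:
            return False

    msg = b"not spam, honest"
    stamp = issue_stamp(hashlib.sha256(msg).digest(), b"completed-work-unit")
    print(accept_mail(msg, stamp))               # True
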
It's more work for a cycle sink to become a recognized mail certifier,
but worth it -- in exchange for the signature mechanisms they get more
(potentially a lot more) contributors.
--
Jim Gillogly
-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Repudiating non-repudiation

2003-12-28 Thread Ian Grigg
In response to Ed and Amir,

I have to agree with Carl here and stress that the
issue is not that the definition is bad or whatever,
but the word is simply out of place.  Repudiation is
an act of a human being.  So is the denial of that
or any other act, to take a word from Ed's 1st definition.

We can actually learn a lot more from the legal world
here, in how they solve this dilemma.  Apologies in
advance, as what follows is my untrained understanding,
derived from a legal case I was involved with in
recent years [1].  It is an attempt to show why the
use of the word "repudiation" will never help us and
will always hinder us.



The (civil) courts resolve disputes.  They do *not*
make contracts right, or tell wrong-doers to do the
right thing, as is commonly thought.

Dispute resolution by definition starts out with a
dispute, of course.  That dispute, for sake of argument,
is generally grounded in a denial, or a repudiation.

One party - a person - repudiates a contract or a
bill or a something.

So, one might think that it would be in the courts'
interest to reduce the number of repudiations.  Quite
the reverse - the courts bend over backwards, sideways,
and tie themselves in knots to permit and encourage
repudiations.  In general, the rule is that anyone
can file *anything* into a court.

The notion of "non-repudiation" is thus anathema to
the courts.  From a legal point of view, we, the
crypto community, will never make headway if we use
this term [2].  What terms we should use, I suggest
below, but to see that, we need to get the whole
process of the courts in focus.



Courts encourage repudiations so as to encourage
all the claims to get placed in front of the forum
[3].  The full process that is then used to resolve
the dispute is:

   1. filing of claims, a.k.a. "pleadings".
   2. presentation of evidence
   3. application of law to the evidence
   4. a reasoned ruling on 1 is delivered based on 2,3

Now, here's where cryptographers have made the
mistake that has led us astray.  In the mind of a
cryptographer, a statement is useless if it cannot
be proven beyond a shadow of doubt.

The courts don't operate that way - and neither does
real life.  In this, it is the cryptographers that
are the outsiders [4].

What the courts do is to encourage the presentation
of all evidence, even the "bad" stuff.  (That's what
hearings are, the presentation of evidence.)

Then, the law is applied - and this means that each
piece of evidence is measured and filtered and
rated.  It is mulled over, tested, probed, and
brought into relationship with all the other pieces
of evidence.

Unlike no-risk cryptography, there isn't such a
thing as bad evidence.  There is, instead, strong
evidence and weak evidence.  There is stuff that
is hard to ignore, and stuff that doesn't add
much. But, even the stuff that adds little is not
discriminated against, at least in the early phases.



And this is where the cryptography field can help:
a digital signature, prima facie, is just another
piece of evidence.  In the initial presentation of
evidence, it is neither weak nor strong.

It is certainly not "non-repudiable."  What it is
is another input to be processed.  The digsig is
as good as all the others, first off.  Later on,
it might become stronger or weaker, depending.

We, cryptographers, help by assisting in the
process of determining the strength of the
evidence.  We can do it in, I think, three ways:



Firstly, the emphasis should switch from the notion
of non-repudiation to the strength of evidence.  A
digital signature is evidence - our job as crypto
guys is to improve the strength of that evidence,
with an eye to the economic cost of that strength,
of course.

Secondly, any piece of evidence will, we know, be
scrutinised by the courts, and assessed for its
strength.  So, we can help the process of dispute
resolution by clearly laying out the assumptions
and tests that can be applied.  In advance.  In
as accessible a form as we know how.

For example, a simple test might be that a
receipt is signed validly if:

   a. the receipt has a valid hash,
   b. that hash is signed by a private key,
   c. the signature is verified by a public
  key, paired with that private key

Now, as cryptographers, we can see problems,
which we can present as caveats, beyond the
strict statement that the receipt has a valid
signature from the signing key:

   d. the public key has been presented by
  the signing party (person) as valid
  for the purpose of receipts
   e. the signing party has not lost the
  private key
   f. the signature was made based on best
  and honest intents...

That's where it gets murky.  But, the proper
place to deal with these murky issues is in
the courts.  We can't solve those issues in
the code, and we shouldn't try.  What we should
do is instead surface all the assumptions we
make, and list out the areas where further
care is needed.
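
For instance, tests a-c above reduce to a few lines of code, and the
caveats d-f are precisely what the code cannot see.  A sketch using
textbook RSA with classic toy parameters (no padding, purely
illustrative):

    # Tests (a)-(c) as executable checks.  n, e, d are the classic
    # textbook values (p=61, q=53); nothing here is production-grade.
    import hashlib

    n, e, d = 3233, 17, 2753

    def sign_receipt(receipt: bytes) -> int:
        h = int.from_bytes(hashlib.sha256(receipt).digest(), "big") % n
        return pow(h, d, n)            # (b) hash signed by the private key

    def verify_receipt(receipt: bytes, sig: int) -> bool:
        h = int.from_bytes(hashlib.sha256(receipt).digest(), "big") % n  # (a)
        return pow(sig, e, n) == h     # (c) checks under the paired public key

    r = b"received: 100 units, 2003-12-28"
    print(verify_receipt(r, sign_receipt(r)))   # True; says nothing about d-f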

Thirdly, we can create protocols that bear
in mind the co

Re: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Anne & Lynn Wheeler
At 01:34 AM 12/24/2003 -0800, Ed Gerck wrote:
However, IMO non-repudiation refers to a useful and
essential cryptographic primitive. It does not mean the
affirmation of a truth (which is authentication). It means
the denial of a falsity -- such as:
(1) the ability to prevent the effective denial of an act (in
other words, denying the act becomes a falsity); or
(2) the ability to prevent the denial of the origin or delivery
of transactions.
so another way of looking at it ... is that somebody repudiates, refutes, 
and/or disavows ... typically after the fact.

non-repudiation would be those things that would support countering claims 
of repudiation, refuting, and/or disavowing.

authentication is typically demonstrating that an entity is allowed to do 
something. authentication can include having a passphrase that is known by 
everybody in the organization. knowing the passphrase is sufficient to 
authenticate that somebody is allowed to do something. however, if somebody 
refutes that they had done something  showing that they knew the 
passphrase (known by everybody in the organization) isn't sufficient to 
counter the repudiation claim.
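
a toy illustration of the difference (python sketch, names invented):

    # An organization-wide shared secret authenticates "somebody in the
    # organization" but cannot attribute an act to one person.
    import hmac, hashlib

    ORG_PASSPHRASE = b"known to everybody in the organization"

    def org_authenticator(action: bytes) -> bytes:
        # any member can compute this tag, so it supports authentication
        # but gives no support for countering a repudiation claim
        return hmac.new(ORG_PASSPHRASE, action, hashlib.sha256).digest()

    print(org_authenticator(b"transfer 100 units").hex())
    # contrast: a private key known to one and only one person would let
    # a verifier tie the action to that specific person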

an infrastructure that requires a unique passphrase for every person would 
help counter repudiation claims

public/private asymmetric cryptography systems where the infrastructure 
requires that a single person only has access to a particular private key 
would help counter repudiation claims. In that sense a public/private 
key system can be seen as addressing both privacy and non-repudiation 
issues.  the policies governing the private key in an 
asymmetric cryptography infrastructure can influence whether it 
pertains just to privacy and authentication and/or whether it can also be 
used to counter repudiation claims.
while making sure that one & only one person has knowledge of a specific 
private key in no way impacts the asymmetric cryptography operations 
...  the process can be used to counter repudiation claims.

while repudiation tends to be a human act  it is entirely possible to 
have infrastructure and organizational implementation features that support 
countering claims of repudiation when they occur.

say dozens of people know (the same) vault combination 
(authentication) ... which doesn't do anything to counter a particular 
person's claim that they didn't enter the vault;
however, video surveillance and door badge access logs could be considered 
as part of a security taxonomy for countering repudiation claims.
--
Anne & Lynn Wheeler http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm
 

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Microsoft publicly announces Penny Black PoW postage project

2003-12-28 Thread Adam Back
Oh yes forgot one comment:

One down-side of memory bound is that it is memory bound.  That is to
say it will be allocated some amount of memory, chosen to be enough
that a high end machine should not have that much cache: think multiple
MB, maybe 8MB, 16MB or whatever.  (Not sure what the max L2 cache is on
high end servers.)

And what the algorithm will do is make random accesses to that memory
as fast as it can.

So effectively it will play badly with other applications -- it will
tend to increase the likelihood of swapping, decrease the memory
available for other applications, etc.  You could think of the
performance implications as a bit like pulling 8MB of RAM, or whatever
the chosen value is.

hashcash / computationally bound functions on the other hand have a
tiny footprint and CPU consumption by hashcash can be throttled to
avoid noticeable impact on other applications.
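
a sketch of that throttling (burst size and duty cycle invented):

    # Compute a hashcash-style stamp at roughly a 25% CPU duty cycle:
    # hash in short bursts, then sleep long enough to hit the target.
    import hashlib, time

    def throttled_hashcash(challenge: bytes, bits: int = 20,
                           burst: int = 20000, duty: float = 0.25) -> int:
        target = 1 << (160 - bits)
        nonce = 0
        while True:
            t0 = time.monotonic()
            for _ in range(burst):
                h = hashlib.sha1(challenge + nonce.to_bytes(8, "big")).digest()
                if int.from_bytes(h, "big") < target:
                    return nonce
                nonce += 1
            busy = time.monotonic() - t0
            time.sleep(busy * (1 - duty) / duty)   # idle to hit the duty cycle

The same trick applied to a memory-bound walk would still leave the
whole table resident in memory while sleeping, which is the down-side
described above.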

Adam

On Fri, Dec 26, 2003 at 09:37:18PM -0500, Adam Back wrote:
> I did work at Microsoft for about a year after leaving ZKS, but I quit
> a month or so ago (working for another startup again).
> [...]

Re: stego in the wild: bomb-making CDs

2003-12-28 Thread Peter Gutmann
John Denker <[EMAIL PROTECTED]> writes:

>] Thursday 25 December 2003, 17:13 Makka Time, 14:13 GMT
>]
>] Saudis swoop on DIY bomb guide
>
>[...]
>
>I suspect there is a lot more to this story..

The story could apply to any one of hundreds (thousands?) of hacker/warez CDs
available off-the-shelf in the US.  Heck, it could apply to the Encyclopedia 
Britannica CD edition.  So I'd pick:

> 3) Also: as a rule, anything you do with computers generates
>   more headlines than doing the same thing with lower-tech methods.

because:

>] Saudis swoop on DIY bomb guide

sounds a lot better than:

>] Saudis swoop on Britannica vendors

Peter.

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


RE: Non-repudiation (was RE: The PAIN mnemonic)

2003-12-28 Thread Peter Gutmann
"Carl Ellison" <[EMAIL PROTECTED]> writes:

>>Ah. That's why they're trying to rename the corresponding keyUsage bit
>>to "contentCommitment" then:
>
>Maybe, but that page defines it as:
>
>contentCommitment: for verifying digital signatures which are intended to
>signal that the signer is committing to the content being signed. The
>precise level of commitment, e.g. "with the intent to be bound" may be
>signaled by additional methods, e.g. certificate policy.

This refers to the second (and IMHO more sensible) use of the X.509
nonRepudiation bit, which uses digitalSignature for short-term signing (e.g.
user authentication) and nonRepudiation for long-term signing (e.g. signing
a document).  The other definition uses digitalSignature for everything,
and nonRepudiation as an additional service on top of digitalSignature.  The
problem with that definition is that no two people in the X.509 world can
agree on what nonRepudiation actually signifies.  The best suggestion I've
seen for the nonRepudiation bit is that CAs should set it to random values
to disabuse users of the notion that it has any meaning.  For the
"additional-service" definition of nonRepudiation, the X.509 Style Guide 
says:

  Although everyone has their own interpretation, a good practical definition 
  is "Nonrepudiation is anything which fails to go away when you stop 
  believing in it".  Put another way, if you can convince a user that it isn't 
  worth trying to repudiate a signature then you have nonrepudiation.  This 
  can take the form of having them sign a legal agreement saying they won't 
  try to repudiate any of their signatures, giving them a smart card and 
  convincing them that it's so secure that any attempt to repudiate a 
  signature generated with it would be futile, threatening to kill their kids, 
  or any other method which has the desired effect.  One advantage (for 
  vendors) is that you can advertise just about anything as providing 
  nonrepudiation, since there's sure to be some definition which matches 
  whatever it is you're doing (there are "nonrepudiation" schemes in use today 
  which employ a MAC using a secret shared between the signer and the verifier, 
  which must be relying on a particularly creative definition of 
  nonrepudiation).
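
For concreteness, the two conventions map onto the keyUsage bits roughly
like this (bit names are from the standard; the flag encoding and the
"profile" framing are only a sketch):

    # keyUsage positions: digitalSignature(0), nonRepudiation(1) -- the
    # latter being the bit some drafts rename contentCommitment.
    DIGITAL_SIGNATURE = 1 << 0
    NON_REPUDIATION   = 1 << 1

    # Convention 1: digitalSignature for everything; nonRepudiation is an
    # additional service layered on top for document signing.
    auth_cert_1 = DIGITAL_SIGNATURE
    sign_cert_1 = DIGITAL_SIGNATURE | NON_REPUDIATION

    # Convention 2 (the arguably more sensible split): short-term signing
    # (e.g. user authentication) vs long-term signing (e.g. documents).
    auth_cert_2 = DIGITAL_SIGNATURE
    sign_cert_2 = NON_REPUDIATION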

Peter.

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]