Re: Seth Schoen's Hard to Verify Signatures

2004-09-10 Thread "Hal Finney"
Hi, David - I had thought about that short-validity-proof but I don't
believe it works.  It works for the signer but not for the verifier.

> This signature scheme has an interesting property: once you've done
> the lengthy computation, you can quickly prove to someone else that
> the signature is valid.  In particular, there is a short and easy-to-check
> proof that the signature is valid, based on listing x, x^2, x^4, x^16,
> x^256, x^65536, ..., x^e and giving a zero knowledge proof that your
> list is correct.  The details can be found here (see esp. Section 3):
>   D. Boneh, M. Naor, "Timed Commitments", CRYPTO 2000.
>   http://crypto.stanford.edu/~dabo/abstracts/timedcommit.html
> The signer can also create a proof like this if he wants.  Note that
> Boneh and Naor consider a somewhat similar but not equivalent notion of
> timed signatures, which may also be of interest.

The timeline proof is based on the standard ZK proof of two equal
discrete logs.  In this case we have (x, x^(2^(2^n))) as one pair and
(x^(2^(2^n)), x^(2^(2^(n+1)))) as the other.  In each case the 2nd number
is the first number raised to the 2^(2^n) power.

The standard ZK proof for dual discrete logs works as follows.
We have y=g^u and z=h^u and want to prove that we know such a u.
We choose a random k and commit with v=g^k and w=h^k.  We then create
a challenge c by hashing various relevant values; at a minimum
c = hash(v,w).  Then we create a response r=c*u+k.  The verification is
g^r==y^c*v and h^r==z^c*w.
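A toy sketch of that equal-discrete-log proof, with an illustrative small safe-prime group (the parameters and variable names here are my choices, not from the thread):

```python
import hashlib
import secrets

# Toy group: p = 2q+1 with q prime; g and h generate the order-q subgroup.
# These tiny parameters are for illustration only.
p, q = 23, 11
g, h = 2, 3

u = secrets.randbelow(q - 1) + 1          # secret exponent
y, z = pow(g, u, p), pow(h, u, p)         # claim: log_g(y) == log_h(z)

# Prover: commit, derive the challenge by hashing, respond.
k = secrets.randbelow(q)
v, w = pow(g, k, p), pow(h, k, p)
c = int.from_bytes(hashlib.sha256(f"{v},{w}".encode()).digest(), "big") % q
r = (k + c * u) % q                       # response r = c*u + k (mod q)

# Verifier: g^r == y^c * v and h^r == z^c * w.
assert pow(g, r, p) == pow(y, c, p) * v % p
assert pow(h, r, p) == pow(z, c, p) * w % p
print("equal-dlog proof verifies")
```

Here the prover reduces r mod the known group order q; Hal's point is that in the timeline setting the order is hidden behind phi, so no such reduction is available to a verifier.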

This works fine for the signer, who knows phi and can reduce all the
exponents mod phi.  But for the verifier, who does not know phi, the
values are too big.  u in this case is 2^(2^n) from my example above,
where 2^n will be the number of squarings the verifier must do, no doubt
a billion or more likely billions of billions, as we get to the last
step in the timeline.  And further, r will be of the same order as u.
Then running the verification requires computing g^r and h^r, which
will take approximately 2^n squarings.  But that is how many squarings
it took to compute these timeline elements in the first place.  So the
2nd verifier cannot verify the ZK proof created by the first verifier
in much less time than the first verifier took to compute the values
the slow way.

I think the paper you pointed to by Boneh and Naor actually has a
different approach which works better.  The recipient gets a value
which, once he does the long work of decryption, will turn into a valid
signature.  And the timeline provides a ZK proof that it will work out
that way.  Then you'd want to make the ZK proof be designated verifier
so that the recipient can't show it around.  This way the recipient
is assured that if he does the work he will get the sig; and once he
gets the sig he can show it; but until he does the work he doesn't have
anything he can show.

I haven't read the paper for a few years so I'm not sure I'm remembering
it right; these extensions might be in a subsequent paper.  I think in
particular the ability to recover a vanilla signature was follow-on work.
But the basic idea is in the Boneh and Naor paper.

Hal

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Seth Schoen's Hard to Verify Signatures

2004-09-10 Thread David Wagner
> Hi, David - I had thought about that short-validity-proof but I don't
> believe it works.  It works for the signer but not for the verifier.

Ahh, yes, I see.  You're absolutely right.  I was too quick to jump
to conclusions, but your argument convinces me.  I withdraw my claim
about the verifier being able to convince others of the validity of
the signature -- sorry about that.  Thanks.



Re: Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread "Hal Finney"
Hi, Adam - Yes, that's interesting.  Seth Schoen's posting and
subsequent blog entries do compare his goals with hashcash and similar
stamp minting systems; where hashcash wants to make minting expensive
and verification easy, Seth's HTV signatures aim to make signing easy
and verifying expensive.

> I think maybe you have observed an additional simplification.  In my
> case the sender chooses x randomly (actually the hash output of a random
> value and a resource string), and computes y = x^{x^w} mod n as the work
> function (expensive operation); and z = x^w mod phi(n), y =?  x^z mod
> n as the cheap operation (verification).
>
> I think your approach could be applied on the encryption side too
> resulting in simpler, faster verification.  Instead it would be:
>
> x is random, compute y = x^{2^t+1} mod n; verify x =? y^d mod n

The main advantage here I think is that d can be precomputed.  However you
could do the same by using y = x^{2^w} instead of x^{x^w}.  Then you could
precompute z = 2^w mod phi and you would have a single exponentiation
to verify just like in my scheme.  The RSW time-lock-puzzle paper does
it this way, they use 2^w as the exponent where w is the work factor.
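A small numeric sketch of that shortcut, with toy primes of my choosing: knowing phi collapses the w squarings into one reduced exponentiation.

```python
import math

# Toy RSA-style modulus; real use needs large primes.
p, q = 10007, 10009
n, phi = p * q, (p - 1) * (q - 1)
w = 1000                         # work factor: number of squarings
x = 5

# Slow path (no phi): w sequential modular squarings computes x^(2^w) mod n.
y_slow = x
for _ in range(w):
    y_slow = y_slow * y_slow % n

# Fast path (phi known): precompute z = 2^w mod phi, then one exponentiation.
z = pow(2, w, phi)
y_fast = pow(x, z, n)

assert math.gcd(x, n) == 1       # Euler's theorem needs gcd(x, n) = 1
assert y_slow == y_fast
print("fast and slow paths agree")
```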

Hal



Re: Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread Adam Back
Hi

I proposed a related algorithm based on time-lock puzzles as a step
towards non-parallelizable, fixed-minting-cost stamps in section 6.1
of [1], also Dingledine et al observe the same in [2].

The non-parallelizable minting function is in fact the reverse: sender
encrypts (expensively) and the verifier encrypts again (but more
cheaply) and compares, but I think the relationship is quite analogous
to the symmetry between RSA encryption and RSA signatures.

I think maybe you have observed an additional simplification.  In my
case the sender chooses x randomly (actually the hash output of a random
value and a resource string), and computes y = x^{x^w} mod n as the work
function (expensive operation); and z = x^w mod phi(n), y =?  x^z mod
n as the cheap operation (verification).

I think your approach could be applied on the encryption side too
resulting in simpler, faster verification.  Instead it would be:

x is random, compute y = x^{2^t+1} mod n; verify x =? y^d mod n
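In toy Python form (parameters mine, and I bump t until the reduced exponent is invertible mod phi, mirroring the coprimality caveat in Hal's original post), that simplified minting check looks like:

```python
import math

# Toy primes of my choosing; real stamps need a large modulus.
# pow(e, -1, phi) needs Python 3.8+.
p, q = 10007, 10009
n, phi = p * q, (p - 1) * (q - 1)

t = 1000
# The verifier's d must satisfy (2^t + 1)*d == 1 (mod phi), so the reduced
# exponent has to be coprime to phi; for this sketch, bump t until it is.
e_red = (pow(2, t, phi) + 1) % phi
while math.gcd(e_red, phi) != 1:
    t += 1
    e_red = (pow(2, t, phi) + 1) % phi
d = pow(e_red, -1, phi)

# Minter (does not know phi): t sequential squarings plus one multiply
# computes y = x^(2^t + 1) mod n the slow way.
x = 12345
assert math.gcd(x, n) == 1
y = x
for _ in range(t):
    y = y * y % n
y = y * x % n

# Verifier (knows d): a single exponentiation, x =? y^d mod n.
assert pow(y, d, n) == x
print("stamp verifies cheaply")
```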

I'll add a note about that when I get around to updating it next.

Adam

[1] Hashcash - Amortizable Publicly Auditable Cost-Functions
http://www.hashcash.org/papers/amortizable.pdf

[2] Andy Oram, editor.  Peer-to-Peer: Harnessing the Power of
Disruptive Technologies. O'Reilly and Associates, 2001.  Chapter 16
also available as
http://freehaven.net/doc/oreilly/accountability-ch16.html.

On Wed, Sep 08, 2004 at 11:48:02AM -0700, "Hal Finney" wrote:
> Seth Schoen of the EFF proposed an interesting cryptographic primitive
> called a "hard to verify signature" in his blog at
>
> An alternative is based on the paper, "Time-lock puzzles and
> timed release Crypto", by Rivest, Shamir and Wagner, from 1996,
> [...]
> Choose the number of modular squarings, t, that you want the
> verifier to have to perform.  [...] you will sign your value using
> an RSA key whose exponent e = 2^t + 1.  The way you sign, even
> using such a large e, is to compute phi = (p-1)*(q-1) and to compute
> e' = e mod phi, which can be done using about 30 squarings of 2 mod
> phi.  You then compute the secret exponent d as the multiplicative
> inverse mod phi of e', in the standard way that is done for RSA
> keys.  Using this method you can sign about as quickly as for
> regular RSA keys.



Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread David Wagner
Hal Finney wrote:
>[...] "hard to verify signature" [...]
>Choose the number of modular squarings, t, that you want the verifier
>to have to perform.  Suppose you choose t = 1 billion.  Now you will
>sign your value using an RSA key whose exponent e = 2^t + 1.
>The way you sign, even using such a large e, is to compute
>phi = (p-1)*(q-1) and to compute e' = e mod phi, which can be done using
>about 30 squarings of 2 mod phi.  You then compute the secret exponent
>d as the multiplicative inverse mod phi of e', in the standard way that
>is done for RSA keys.  Using this method you can sign about as quickly
>as for regular RSA keys.
>
>However, the verifier has a much harder problem. [...] To verify [...]
>will require t = 1 billion modular squaring operations.

This signature scheme has an interesting property: once you've done
the lengthy computation, you can quickly prove to someone else that
the signature is valid.  In particular, there is a short and easy-to-check
proof that the signature is valid, based on listing x, x^2, x^4, x^16,
x^256, x^65536, ..., x^e and giving a zero knowledge proof that your
list is correct.  The details can be found here (see esp. Section 3):
  D. Boneh, M. Naor, "Timed Commitments", CRYPTO 2000.
  http://crypto.stanford.edu/~dabo/abstracts/timedcommit.html
The signer can also create a proof like this if he wants.  Note that
Boneh and Naor consider a somewhat similar but not equivalent notion of
timed signatures, which may also be of interest.
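The list can be built with the same repeated squaring that produces the final value; a sketch (the modulus and names are mine):

```python
# Build the Boneh-Naor style list x, x^2, x^4, x^16, x^256, ..., i.e.
# a_i = x^(2^(2^i)) mod n: each element is the previous one squared 2^i
# times.  Toy modulus of my choosing; real use needs a large RSA modulus.
def timeline(x, n, k):
    chain = [x, pow(x, 2, n)]            # x, then a_0 = x^2
    for i in range(k):
        a = chain[-1]
        for _ in range(2 ** i):          # a_{i+1} = a_i ^ (2^(2^i))
            a = a * a % n
        chain.append(a)
    return chain

n = 101 * 103
c = timeline(5, n, 4)
# The last element equals x^(2^(2^k)) computed directly.
assert c[-1] == pow(5, 2 ** (2 ** 4), n)
print("timeline chain consistent")
```

Each adjacent pair in the chain is then accompanied by a proof of equal discrete logs, which is the part that breaks down for the verifier as Hal describes.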



Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread "Hal Finney"
Seth Schoen of the EFF proposed an interesting cryptographic primitive
called a "hard to verify signature" in his blog at
http://vitanuova.loyalty.org/weblog/nb.cgi/view/vitanuova/2004/09/02 .
The idea is to have a signature which is fast to make but slow to verify,
with the verification speed under the signer's control.  He proposes
that this could be useful with trusted computing to discourage certain
objectionable applications.

The method Seth describes is to include a random value in the signature
but not to include it in the message.  He shows a sample signature
with 3 decimal digits hidden.  The only way to verify it is to try all
possibilities for the random values.  By controlling how much data is
hidden in this way, the signer can control how long it will take to
verify the signature.
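A toy model of that idea, using a hash in place of a real signature (this construction is my illustration of the principle, not Seth's actual scheme):

```python
import hashlib
import secrets

# Tag a message together with 3 hidden decimal digits.
def tag(message, hidden):
    return hashlib.sha256(f"{message}|{hidden:03d}".encode()).hexdigest()

# Signer: fast -- pick the hidden digits and omit them from the output.
message = "hello"
hidden = secrets.randbelow(1000)
sig = tag(message, hidden)

# Verifier: slow -- brute-force all 10^3 possibilities for the digits.
found = next(h for h in range(1000) if tag(message, h) == sig)
assert tag(message, found) == sig
print("recovered hidden digits by search")
```

Verification cost grows as 10^n in the number of hidden digits, which is how the signer tunes the difficulty.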

This idea is nice and simple, but one disadvantage is that it is
probabilistic.  It works on average, but occasionally someone might
choose an n digit value which happens to be low (assuming the values are
chosen at random).  Then they don't get as much protection.  They could
fix this by eliminating too-low values, but then verifiers might exploit
that by doing low values last in their testing.

Another problem is that this method is inherently parallelizable, so
that someone with N computers could solve it N times faster, by having
each computer test a subset of the values.

An alternative is based on the paper, "Time-lock puzzles and
timed release Crypto", by Rivest, Shamir and Wagner, from 1996,
http://theory.lcs.mit.edu/~rivest/RivestShamirWagner-timelock.pdf or .ps.
They are looking more at the problem of encrypting data such that it can
be decrypted only after a chosen amount of computing time, but many of
their techniques are applicable to signatures.

The first solution they consider is essentially the same as Seth's,
doing an encryption where n bits of the encryption key are unknown, and
letting people search for the decryption key.  They identify the problems
I noted above (which I stole from their paper).  They also point out BTW
that this concept was used by Ralph Merkle in his paper which basically
foreshadowed the invention of public key cryptography.  Merkle had to
fight for years to get his paper published, otherwise he would be known
as the father of the field rather than just a pioneer or co-inventor.

The next method they describe can be put into signature terms as follows.
Choose the number of modular squarings, t, that you want the verifier
to have to perform.  Suppose you choose t = 1 billion.  Now you will
sign your value using an RSA key whose exponent e = 2^t + 1.  (You also
need to make sure that this value is relatively prime to p-1 and q-1,
but that is easily arranged, for example by letting p and q be strong
primes.)  The way you sign, even using such a large e, is to compute
phi = (p-1)*(q-1) and to compute e' = e mod phi, which can be done using
about 30 squarings of 2 mod phi.  You then compute the secret exponent
d as the multiplicative inverse mod phi of e', in the standard way that
is done for RSA keys.  Using this method you can sign about as quickly
as for regular RSA keys.

However, the verifier has a much harder problem.  He does not know phi,
hence cannot reduce e.  To verify, he has to raise the signature to
the e power as in any RSA signature, which for the exponent I described
will require t = 1 billion modular squaring operations.  This will be a
slow process, and the signer can control it by changing the size of t,
without changing his own work factor materially.
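Both halves of the scheme fit in a short sketch, with toy primes and a small t of my choosing (`pow(..., -1, phi)` needs Python 3.8+):

```python
import math

# Toy primes; a real signature needs a large RSA modulus.
p, q = 10007, 10009
n, phi = p * q, (p - 1) * (q - 1)

t = 1000                             # verifier's work factor
# e = 2^t + 1 must be coprime to p-1 and q-1 (the strong-prime caveat in
# the text); for this sketch, just bump t until the reduction qualifies.
e_red = (pow(2, t, phi) + 1) % phi
while math.gcd(e_red, phi) != 1:
    t += 1
    e_red = (pow(2, t, phi) + 1) % phi

# Signer (knows phi): reduce e mod phi cheaply, invert, sign as usual.
d = pow(e_red, -1, phi)              # modular inverse of e' mod phi
m = 424242
assert math.gcd(m, n) == 1
sig = pow(m, d, n)                   # signing is one ordinary exponentiation

# Verifier (no phi): forced to perform t sequential modular squarings.
check = sig
for _ in range(t):
    check = check * check % n
check = check * sig % n              # sig^(2^t) * sig = sig^(2^t + 1)
assert check == m
print("verified after", t, "squarings")
```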

The authors also point out that modular squaring is an intrinsically
sequential process which cannot benefit from applying multiple computers.
So the only speed variations relevant will be those between individual
computers.

Another idea I had for a use of hard to verify signatures would be if
you published something anonymously but did not want to be known as
the author of it until far in the future, perhaps after your death.
Then you could create a HTV signature on it, perhaps not identifying
the key, just the signature value.  Only in the future when computing
power is cheap would it be possible to try verifying the signature under
different keys and see which one worked.

Hal Finney
