Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread Hal Finney
Seth Schoen of the EFF proposed an interesting cryptographic primitive
called a hard to verify signature in his blog at
http://vitanuova.loyalty.org/weblog/nb.cgi/view/vitanuova/2004/09/02 .
The idea is to have a signature which is fast to make but slow to verify,
with the verification speed under the signer's control.  He proposes
that this could be useful with trusted computing to discourage certain
objectionable applications.

The method Seth describes is to include a random value in the signature
but not to include it in the message.  He shows a sample signature
with 3 decimal digits hidden.  The only way to verify it is to try all
possibilities for the random values.  By controlling how much data is
hidden in this way, the signer can control how long it will take to
verify the signature.
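As a concrete toy sketch of this scheme in Python (tiny RSA primes and
SHA-256 hashing are my own assumptions for the sake of a runnable
example; this is not Seth's code): the signer signs the hash of the
message concatenated with a hidden 3-digit value, and the verifier has
to try all 1000 possibilities.

import hashlib
import random

p, q = 10007, 10009                      # toy RSA key, illustration only
n = p * q
phi = (p - 1) * (q - 1)
e = 65537
d = pow(e, -1, phi)                      # modular inverse (Python 3.8+)

def h(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign_htv(message: bytes, digits: int = 3) -> int:
    # Sign message plus a hidden random value; publish only the signature.
    r = random.randrange(10 ** digits)   # hidden value, never revealed
    return pow(h(message + str(r).encode()), d, n)   # fast for the signer

def verify_htv(message: bytes, sig: int, digits: int = 3) -> bool:
    # The verifier must brute-force the hidden value: 10**digits tries at worst.
    recovered = pow(sig, e, n)
    return any(h(message + str(r).encode()) == recovered
               for r in range(10 ** digits))

assert verify_htv(b"hello", sign_htv(b"hello"))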

This idea is nice and simple, but one disadvantage is that it is
probabilistic.  It works on average, but occasionally someone might
choose an n-digit value which happens to be low (assuming the values are
chosen at random).  Then they don't get as much protection.  They could
fix this by eliminating too-low values, but then verifiers might exploit
that by testing the low values last.

Another problem is that this method is inherently parallelizable, so
that someone with N computers could solve it N times faster, by having
each computer test a subset of the values.

An alternative is based on the paper "Time-lock puzzles and
timed-release Crypto" by Rivest, Shamir and Wagner, from 1996,
http://theory.lcs.mit.edu/~rivest/RivestShamirWagner-timelock.pdf or .ps.
They are looking more at the problem of encrypting data such that it can
be decrypted only after a chosen amount of computing time, but many of
their techniques are applicable to signatures.

The first solution they consider is essentially the same as Seth's,
doing an encryption where n bits of the encryption key are unknown, and
letting people search for the decryption key.  They identify the problems
I noted above (which I stole from their paper).  They also point out BTW
that this concept was used by Ralph Merkle in his paper, which basically
foreshadowed the invention of public-key cryptography.  Merkle had to
fight for years to get his paper published; otherwise he would be known
as the father of the field rather than just a pioneer or co-inventor.

The next method they describe can be put into signature terms as follows.
Choose the number of modular squarings, t, that you want the verifier
to have to perform.  Suppose you choose t = 1 billion.  Now you will
sign your value using an RSA key whose exponent e = 2^t + 1.  (You also
need to make sure that this value is relatively prime to p-1 and q-1,
but that is easily arranged, for example by letting p and q be strong
primes.)  The way you sign, even using such a large e, is to compute
phi = (p-1)*(q-1) and to compute e' = e mod phi, which can be done using
about 30 squarings of 2 mod phi.  You then compute the secret exponent
d as the multiplicative inverse mod phi of e', in the standard way that
is done for RSA keys.  Using this method you can sign about as quickly
as for regular RSA keys.
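In Python, the signer's side might look like the following toy sketch
(tiny primes and a much smaller t than a billion, purely for
illustration; a real key would use large strong primes):

import hashlib

p, q = 10007, 10009                      # toy primes, illustration only
n = p * q
phi = (p - 1) * (q - 1)

t = 10**6                                # squarings the verifier must perform
e_reduced = (pow(2, t, phi) + 1) % phi   # e = 2^t + 1 reduced mod phi: ~log2(t) squarings
d = pow(e_reduced, -1, phi)              # secret signing exponent
                                         # (these toy parameters give gcd(e, phi) = 1)

def sign(message: bytes) -> int:
    m = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(m, d, n)                  # as fast as ordinary RSA signing

sig = sign(b"hello")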

However, the verifier has a much harder problem.  He does not know phi,
hence cannot reduce e.  To verify, he has to raise the signature to
the e power as in any RSA signature, which for the exponent I described
will require t = 1 billion modular squaring operations.  This will be a
slow process, and the signer can control it by changing the size of t,
without changing his own work factor materially.
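A matching sketch of the verifier's side (same toy parameters,
regenerated here so the snippet stands alone): not knowing phi, the
verifier cannot reduce the exponent and must do the t squarings one by
one.

import hashlib

p, q = 10007, 10009                      # the verifier would see only n
n, phi = p * q, (p - 1) * (q - 1)
t = 10**6
d = pow((pow(2, t, phi) + 1) % phi, -1, phi)

def sign(message: bytes) -> int:
    m = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(m, d, n)

def slow_verify(message: bytes, sig: int) -> bool:
    m = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    x = sig
    for _ in range(t):                   # t sequential modular squarings
        x = (x * x) % n
    return (x * sig) % n == m            # does sig^(2^t + 1) equal H(message)?

assert slow_verify(b"hello", sign(b"hello"))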

The authors also point out that modular squaring is an intrinsically
sequential process which cannot benefit from applying multiple computers.
So the only speed variations relevant will be those between individual
computers.

Another idea I had for a use of hard to verify signatures would be if
you published something anonymously but did not want to be known as
the author of it until far in the future, perhaps after your death.
Then you could create a HTV signature on it, perhaps not identifying
the key, just the signature value.  Only in the future when computing
power is cheap would it be possible to try verifying the signature under
different keys and see which one worked.

Hal Finney



Re: Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread Major Variola (ret)
At 11:48 AM 9/8/04 -0700, Hal Finney wrote:
 Seth Schoen of the EFF proposed an interesting cryptographic primitive
 called a hard to verify signature in his blog at
 http://vitanuova.loyalty.org/weblog/nb.cgi/view/vitanuova/2004/09/02 .
 The idea is to have a signature which is fast to make but slow to verify,
 with the verification speed under the signer's control.  He proposes
 that this could be useful with trusted computing to discourage certain
 objectionable applications.

 The method Seth describes is to include a random value in the signature
 but not to include it in the message.  He shows a sample signature
 with 3 decimal digits hidden.  The only way to verify it is to try all
 possibilities for the random values.  By controlling how much data is
 hidden in this way, the signer can control how long it will take to
 verify the signature.

This could be called a salt-free algorithm :-)   Basically it's like the
problem that a salted-password cracker has to solve when the salt has
to be guessed.

As far as a modexp() solution, I suggest this, which is as far as I can
tell different from what you reference:

In an RSA cryptosystem the public exponent is typically low, often
3 or 65537 (for efficiency reasons only a few bits are set; the other
constraint is that your message, raised to that power, wraps in your
modulus, which makes 65537 a little better).  The private exponent
is big.

Therefore, traditional encryption is fast, and decryption is slow;
the reverse is that signing is slow and verifying a signature is fast.
This can be used to achieve Seth's required "fast to make, slow to
verify".  To achieve the required "user-controllable" property, the
user gets to set the number of bits in the modulus.  One might have
to use extraordinarily long moduli (making 4K bits look puny), depending
on the time-scale of slow and fast, but so what, primes are free :-)
and might even be re-used.
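One way to read this proposal as a concrete sketch (my reading, with
tiny primes for illustration; not Variola's code): keep the small
exponent secret for fast signing, give verifiers the large inverse
exponent, and use the modulus size as the slowness knob.  Note that a
small secret exponent is risky in practice (see the caveat later in the
thread).

import hashlib

p, q = 10007, 10009                      # a real scheme would use a huge modulus
n, phi = p * q, (p - 1) * (q - 1)

d_sign = 65537                           # small secret signing exponent (fast)
e_verify = pow(d_sign, -1, phi)          # large public verification exponent (slow)

def H(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    return pow(H(msg), d_sign, n)

def verify(msg: bytes, sig: int) -> bool:
    return pow(sig, e_verify, n) == H(msg)

assert verify(b"hello", sign(b"hello"))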

If this passes group-muster, pass it on.






Re: Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread Adam Back
Hi

I proposed a related algorithm based on time-lock puzzles as a step
towards non-parallelizable, fixed-minting-cost stamps in section 6.1
of [1]; Dingledine et al. observe the same in [2].

The non-parallelizable minting function is in fact the reverse: sender
encrypts (expensively) and the verifier encrypts again (but more
cheaply) and compares, but I think the relationship is quite analogous
to the symmetry between RSA encryption and RSA signatures.

I think maybe you have observed an additional simplification.  In my
case the sender chooses x randomly (actually as the hash of a random
value and the resource string) and computes y = x^{x^w} mod n as the
work function (the expensive operation); the cheap operation
(verification) is z = x^w mod phi(n), y =? x^z mod n.

I think your approach could be applied on the encryption side too,
resulting in simpler, faster verification.  Instead it would be:

x is random, compute y = x^{2^t+1} mod n; verify x =? y^d mod n
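A toy sketch of that simplified work function (small illustrative
parameters; the real scheme derives x from a hash of a random value and
the resource string rather than picking it directly):

import secrets

p, q = 10007, 10009                      # verifier's trapdoor: the factors of n
n, phi = p * q, (p - 1) * (q - 1)
t = 10**5

def mint(x: int) -> int:
    # Expensive: t sequential squarings; no shortcut without phi(n).
    y = x
    for _ in range(t):
        y = (y * y) % n
    return (y * x) % n                   # y = x^(2^t + 1) mod n

def verify(x: int, y: int) -> bool:
    # Cheap: invert the exponent modulo phi(n) and undo the exponentiation.
    d = pow((pow(2, t, phi) + 1) % phi, -1, phi)
    return pow(y, d, n) == x % n

x = secrets.randbelow(n - 2) + 2
assert verify(x, mint(x))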

I'll add a note about that when I get around to updating it next.

Adam

[1] Hashcash - Amortizable Publicly Auditable Cost-Functions
http://www.hashcash.org/papers/amortizable.pdf

[2] Andy Oram, editor.  Peer-to-Peer: Harnessing the Power of
Disruptive Technologies. O'Reilly and Associates, 2001.  Chapter 16
also available as
http://freehaven.net/doc/oreilly/accountability-ch16.html.

On Wed, Sep 08, 2004 at 11:48:02AM -0700, Hal Finney wrote:
 Seth Schoen of the EFF proposed an interesting cryptographic primitive
 called a hard to verify signature in his blog at

 An alternative is based on the paper, Time-lock puzzles and
 timed release Crypto, by Rivest, Shamir and Wagner, from 1996,
 [...]
 Choose the number of modular squarings, t, that you want the
 verifier to have to perform.  [...] you will sign your value using
 an RSA key whose exponent e = 2^t + 1.   The way you sign, even
 using such a large e, is to compute phi = (p-1)*(q-1) and to compute
 e' = e mod phi, which can be done using about 30 squarings of 2 mod
 phi.  You then compute the secret exponent d as the multiplicative
 inverse mod phi of e', in the standard way that is done for RSA
 keys.  Using this method you can sign about as quickly as for
 regular RSA keys.



Re: Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread John Kelsey
From: "Hal Finney" [EMAIL PROTECTED]
Sent: Sep 8, 2004 2:48 PM
To: [EMAIL PROTECTED]
Subject: Seth Schoen's Hard to Verify Signatures

 The method Seth describes is to include a random value in the signature
 but not to include it in the message.  He shows a sample signature
 with 3 decimal digits hidden.  The only way to verify it is to try all
 possibilities for the random values.  By controlling how much data is
 hidden in this way, the signer can control how long it will take to
 verify the signature.

I've seen this described in a paper by Abadi, Lomas and Needham as an
alternative to a high iteration count for password hashing.
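A minimal sketch of that idea as I understand it (an illustration of
the general approach, not the paper's exact construction): hash the
password together with a short random value that is never stored, so
every verification, legitimate or hostile, has to search for it.

import hashlib
import os
import secrets

HIDDEN_BITS = 16                         # work-factor knob: 2**16 hashes per check at worst

def enroll(password: bytes) -> tuple[bytes, bytes]:
    salt = os.urandom(16)                            # stored as usual
    hidden = secrets.randbits(HIDDEN_BITS)           # NOT stored anywhere
    digest = hashlib.sha256(salt + password +
                            hidden.to_bytes((HIDDEN_BITS + 7) // 8, "big")).digest()
    return salt, digest

def check(password: bytes, salt: bytes, digest: bytes) -> bool:
    for hidden in range(1 << HIDDEN_BITS):           # brute-force the missing value
        if hashlib.sha256(salt + password +
                          hidden.to_bytes((HIDDEN_BITS + 7) // 8, "big")).digest() == digest:
            return True
    return False

salt, digest = enroll(b"correct horse")
assert check(b"correct horse", salt, digest)
assert not check(b"wrong guess", salt, digest)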


--John Kelsey




Re: Seth Schoen's Hard to Verify Signatures

2004-09-08 Thread Jack Lloyd
On Wed, Sep 08, 2004 at 12:44:39PM -0700, Major Variola (ret) wrote:
[...]
 In an RSA cryptosystem the public exponent is typically low, often
 3 or 65537 (for efficiency reasons only a few bits are set; the other
 constraint is that your message, raised to that power, wraps in your
 modulus, which makes 65537 a little better).  The private exponent
 is big.
 
 Therefore, traditional encryption is fast, and decryption is slow;
 the reverse is that signing is slow, verifying a signature is fast.
 This can be used to achieve Seth's required fast to make, slow
 to verify.  To achieve the required user-controllable, the user
 gets to set the number of bits in the modulus.  One might have
 to use extraordinarily long moduli (making 4Kbits look puny), depending
 on the time-scale of slow and fast, but so what, primes are free :-)
 and might even be re-used.
 
 If this passes group-muster pass it on..

The private exponent can't be too short: at less than about a third the
size of the modulus you start running into problems [*], which, with the
sizes you're suggesting (you would need, what, a 100K+ bit key to do
this?), would make signature generation pretty slow too.

Easier to do standard RSA and then encrypt the whole thing with a 64-
or 80-bit symmetric key.
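A quick sketch of that approach (deliberately shrunken key space and a
SHA-256-based stand-in for the cipher so it runs in a moment; the real
thing would use a proper 64- or 80-bit symmetric cipher key):

import hashlib
import secrets

p, q = 10007, 10009                      # toy RSA key, illustration only
n, phi = p * q, (p - 1) * (q - 1)
e = 65537
d = pow(e, -1, phi)

KEY_BITS = 16                            # 64 or 80 in the real proposal
SIG_LEN = (n.bit_length() + 7) // 8

def H(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def keystream(key: int) -> bytes:
    return hashlib.sha256(key.to_bytes(8, "big")).digest()[:SIG_LEN]

def sign_htv(msg: bytes) -> bytes:
    sig = pow(H(msg), d, n).to_bytes(SIG_LEN, "big")
    key = secrets.randbits(KEY_BITS)                 # discarded after signing
    return bytes(a ^ b for a, b in zip(sig, keystream(key)))

def verify_htv(msg: bytes, blob: bytes) -> bool:
    target = H(msg)
    for key in range(1 << KEY_BITS):                 # brute-force the symmetric key
        sig = bytes(a ^ b for a, b in zip(blob, keystream(key)))
        if pow(int.from_bytes(sig, "big"), e, n) == target:
            return True
    return False

assert verify_htv(b"hello", sign_htv(b"hello"))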

[*] http://crypto.stanford.edu/~dabo/papers/RSA-survey.pdf

-Jack