Cryptography-Digest Digest #55, Volume #13   Tue, 31 Oct 00 07:13:01 EST

Contents:
  Re: RSA Multiprime (Francois Grieu)
  Re: XOR based key exchange protocol - flawed? (Mike Connell)
  Legal requirements for CCTV watermarking (Andrew Cogger)
  Re: Searching for a good PRNG (Tim Tyler)
  Re: Searching for a good PRNG (Tim Tyler)
  Re: Legal requirements for CCTV watermarking (Volker Hetzer)
  3-dimensional Playfair? (Juergen Nieveler)
  Re: Padding scheme? (Tim Tyler)
  Re: BEST BIJECTIVE RIJNDAEL YET? (Tim Tyler)
  Re: Newbie about Rijndael ("mac")
  Re: DATA PADDING FOR ENCRYPTION (Tim Tyler)



From: Francois Grieu [EMAIL PROTECTED]
Subject: Re: RSA Multiprime
Date: Tue, 31 Oct 2000 11:20:23 +0100

[EMAIL PROTECTED] (Scott Contini) wrote:
 The only thing more ridiculous than Compaq patenting this is
 RSA Security licensing the patent.

Agreed, if true. I have not seen the patent claims, and do not
know the details of the cross-licensing agreements between Compaq
and RSA Security. I hope scientists still have some influence at
RSA Security (Bob, are you listening?). I bet the net flow of cash
from Compaq to RSA Security will remain non-negative.


I do feel it would be ridiculous to apply in 1999-2000 for a patent
claiming asymmetric cryptography based on modular exponentiation
modulo a composite formed of 3 or more primes, with use of the
Chinese Remainder Theorem to perform the exponentiation. For
example, back in 1994, Solaic (a French Smart Card manufacturer,
now merged with Schlumberger) proposed the use of this technique
in a bid for the French "CPS" (a Smart Card for health
practitioners, that can digitally sign). This was seen as a
solution for implementing 768 bit secret-key RSA operations on the
ST16F48 IC (though with execution times around 30 seconds) to
circumvent the late availability of the ST16CF54 with
cryptoprocessor. Professor Jean-Jacques Quisquater actually proposed
the use of multiple primes, and I studied the implementation on an
8 bit processor. I still have the code fragments, and memos in
electronic form.
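The technique under discussion can be sketched as follows (a toy
illustration with tiny made-up primes and no padding, purely to show
the mechanics; this is not the Solaic implementation):

```python
# Toy sketch of multi-prime RSA decryption via the Chinese Remainder
# Theorem: exponentiate modulo each prime with a reduced exponent,
# then recombine.  Insecure parameters, for illustration only.
from math import prod

def crt_decrypt(c, primes, d, n):
    """Decrypt c = m^e mod n using one small exponentiation per prime."""
    residues = []
    for p in primes:
        # Fermat: the exponent can be reduced mod p-1 for each factor
        residues.append(pow(c, d % (p - 1), p))
    # Standard CRT recombination of the per-prime residues
    m = 0
    for p, r in zip(primes, residues):
        q = n // p                       # product of the other primes
        m = (m + r * q * pow(q, -1, p)) % n
    return m

primes = [1009, 2003, 3001]              # three small illustrative primes
n = prod(primes)
phi = prod(p - 1 for p in primes)
e = 65537
d = pow(e, -1, phi)

msg = 123456
c = pow(msg, e, n)
assert crt_decrypt(c, primes, d, n) == msg == pow(c, d, n)
```

The speedup comes from each modular exponentiation working with
numbers a third the size of n; with more primes, each piece shrinks
further, which is what made it attractive on an 8 bit processor.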


 While (multiple primes) gives some speedups, it opens up the
 (RSA) algorithm to possible new attacks: if a faster special
 purpose algorithm than the elliptic curve method is invented,
 then multi-prime RSA easily could become insecure.

Well, GNFS and even MPQS are faster than ECM for practical purposes,
and all three are equally efficient against two-prime and
multi-prime RSA. The product of 2 random 288 bit primes is just
as hard to factor as the product of 3 random 192 bit primes, and this
situation has not evolved in the last 20 years. Yes, it is
conceivable this could change, but it is also conceivable, and
IMHO more likely, that some other breakthrough will be made on
factorisation or the discrete log problem over elliptic curves.
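One way to make this comparison precise (standard heuristic running
times in L-notation, my addition, not from the post): GNFS depends
only on the size of the modulus N, while ECM depends chiefly on the
smallest prime factor p:

```latex
% Heuristic costs: both splits of a fixed-size N face the same GNFS
% cost, so multi-prime only matters if ECM on the smallest prime
% ever becomes competitive.
T_{\mathrm{GNFS}}(N) = \exp\!\Big( \big( (64/9)^{1/3} + o(1) \big)
                        (\ln N)^{1/3} (\ln\ln N)^{2/3} \Big)
\qquad
T_{\mathrm{ECM}}(p)  = \exp\!\Big( \big( \sqrt{2} + o(1) \big)
                        \sqrt{\ln p \,\ln\ln p} \Big)
```

For a fixed 576 bit N, the GNFS cost is identical whether N splits as
two 288 bit primes or three 192 bit primes; the multi-prime variant
only weakens if ECM against a 192 bit factor ever undercuts that.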


 (ECC enables) somewhat efficient operation on 8-bit processors
 without a coprocessor.  If you're concerned about the speed of
 private key operations, my recommendation is to use ECC (for
 security concerns, use a randomly generated curve)

Do you have any firm reference for ECC on 8-bit processors without a
coprocessor? Certicom once proposed this on the 68HC05, but it
was apparently retracted. I do not know the reason, and still
wonder if attacks have been found (on the special field used, or
by power/timing attacks).


In summary, I think multiple primes is a useful idea, giving a
sizable (not dramatic) speed increase; but not a patentable one.


   Francois Grieu

--

From: Mike Connell [EMAIL PROTECTED]
Subject: Re: XOR based key exchange protocol - flawed?
Date: 31 Oct 2000 11:40:28 +0100

David Schwartz [EMAIL PROTECTED] writes:

 Mike Connell wrote:
 
 Your second case doesn't work. The MITM can create any number of
   anonymous personas.
  
  Sure, they can.
  
   So he can make you think that he is the person who
   gave you all those stock tips even though he isn't.
  
  Now you've lost me. How? In order to do that, they must present the
  same public key as the 'real' guy has. For that public key to be
  useful, they must compute the secret key that goes with it. Isn't that
  hard?
 
   Not at all, because he can trivially create any number of public keys
 and private keys.
  

OK, here is my public key 0xfefefefecabba9efefe.123456. I am A. My 
brother 'B' knows that my public key is 0xfefe 

The MITM can create as many pairs as he wants. In order to impersonate 
me to B, he must have my secret key. Why? Because in step 4 when B
sends me Xb encrypted with 0xfefe... only I can decrypt it.

This works because B knows my public key. If he didn't, and got a
message that said "I am A, and my key is 0x6660666...", he wouldn't
(shouldn't!) believe it.
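The argument can be sketched with textbook RSA (toy primes and
numbers of my own choosing, no padding; insecure, purely to show why
freshly minted key pairs do not help the MITM):

```python
# B encrypts Xb under A's *known* public key (step 4).  A MITM can
# generate any number of key pairs, but none of them decrypts a
# ciphertext made for A's key.
def keygen(p, q, e=65537):
    n, phi = p * q, (p - 1) * (q - 1)
    return (n, e), (n, pow(e, -1, phi))      # (public, private)

A_pub, A_priv = keygen(104729, 1000003)      # A's pair; B already knows A_pub
M_pub, M_priv = keygen(104723, 999983)       # MITM mints his own pair freely

Xb = 424242
ct = pow(Xb, A_pub[1], A_pub[0])             # B -> A, under A's public key

assert pow(ct, A_priv[1], A_priv[0]) == Xb   # A recovers Xb
assert pow(ct, M_priv[1], M_priv[0]) != Xb   # MITM's own key yields garbage
```

The security rests entirely on B having the right public key in
advance; the MITM's freedom to create personas never gives him A's
private exponent.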

The protocol doesn't attempt to convince the users that a given public 
key is that of a given real-world identity, only that t


Cryptography-Digest Digest #55, Volume #10   Mon, 16 Aug 99 11:13:07 EDT

Contents:
  Re: CRYPTO DESIGN MY VIEW (SCOTT19U.ZIP_GUY)
  Re: Do I have a problem with semantics? (Nicol So)
  Something For Cryptanalysts to Do
  Re: NIST AES FInalists are
  Re: decryption verification methods (Paul Crowley)
  Re: NIST AES FInalists are
  Re: White page on Thermal noise ([EMAIL PROTECTED])
  Re: IDEA in AES (Anssi Bragge)
  compress then encrypt? (Tom St Denis)
  Re: NIST AES FInalists are (Patrick Juola)
  Re: Statistical occurrence of letters in english (Patrick Juola)
  Re: New encryption algorithm (Patrick Juola)
  Re: Q.  a hash of a hash ... (Anton Stiglic)



From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: CRYPTO DESIGN MY VIEW
Date: Mon, 16 Aug 1999 07:02:19 GMT

In article [EMAIL PROTECTED], "Douglas A. Gwyn" [EMAIL PROTECTED] wrote:
"SCOTT19U.ZIP_GUY" wrote:
  By the way do you know of any other compression that has these
 properties.

I haven't researched this, but I do know there are a few different
lossless compression schemes.  (Can't use lossy ones if every bit
of data needs to be conveyed without distortion.)  Lempel-Ziv has
been popular for several years; possibly one could use a large
"word size" (dictionary) and prime it with an agreed-upon text
before feeding the message text to it.
   I think that since I was talking about compressing messages it
was obvious that I was talking about lossless compression. There
are many such methods. I have been looking most recently
at using the BWT method, and thought I was close to getting it to
work, but my requirement is that every file, even a wrong one, should
be a valid compressed file, so that no information in the structure
can be used by an attacker to tell whether it is a valid file.

I occasionally get e-mail about data compression conferences,
but after I forward it to more-interested parties I delete it.
Probably a Web search would turn up something helpful.

  The search of the web turns up nothing useful on this subject.
It appears PGP used a freeware routine, standard in
the product, called ZIP. Though it compresses text very well,
I don't think it meets my requirement that all files act
like they are a compressed file, so that an attacker can get no
information from it. I have no idea what current PGP uses,
only the old DOS versions. 
 I think the main reason certain compression routines are picked for
use with encryption is their speed and the amount they compress;
very little thought seems to be given to the fact that many can give
clues, due to their structure, that would help an attacker.
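A concrete illustration of such structural clues (my example, using
zlib as a stand-in for the ZIP routine): almost no random byte string
is a valid zlib stream, so validity itself tells an attacker whether
a trial decryption was correct.

```python
# A zlib/ZIP stream carries a header and checksum, so random bytes
# almost never decompress cleanly.  That yes/no signal is exactly the
# structural leak being complained about above.
import os
import zlib

def looks_like_zlib(data: bytes) -> bool:
    try:
        zlib.decompress(data)
        return True
    except zlib.error:
        return False

real = zlib.compress(b"attack at dawn" * 10)
assert looks_like_zlib(real)             # a genuine stream is accepted

junk = b"\x00" + os.urandom(63)          # a "wrong decryption": bad header
assert not looks_like_zlib(junk)         # rejected -- structure leaks
```

A compressor meeting the requirement above would instead accept every
byte string as some valid compressed file, giving the attacker no
such test.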



David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS

--

From: Nicol So [EMAIL PROTECTED]
Subject: Re: Do I have a problem with semantics?
Date: Sun, 15 Aug 1999 10:37:29 -0400

rosi wrote:
 
Concrete and specific: "Secret Key Agreement by Public
 Discussion", which I will refer to as Paper here. (C.f. a post
 from David Molnar on July 17, 1999)
 
 ...
 
1. To me some assumptions are left out from Paper and I likely do
 not understand the term 'provably secure'. I also seem to get the
 impression that the system Paper illustrates is referred to as
 'unconditionally secure'. (Can some help tell exactly which is
 being characterized with?)

I visited the website David Molnar pointed to and took a cursory look at
the pages, but I don't want to say I have read the paper.

Generally, provable security is what it says--that a cipher satisfies
some precisely defined notion of security can be demonstrated with a
rigorous mathematical proof.  Often (but not always) results about
provably secure ciphers take the form "cipher C is secure (in the sense
of S) if the well-known conjecture A is true".  Assuming the proof is
correct, a result like that is rigorous--no new discoveries will
invalidate it.  However, if the well-known conjecture turns out to be
false, the result is only *vacuously* true.

Instead of focussing on what "provably secure" means independent of the
context, it would be more fruitful to try to understand what an author
is trying to say in a particular context.

BTW, what assumptions do you think have been left out in the paper?

2. (a more specific extension of 1) 'unconditionally secure' seem
 to refer to no-better-chance-than-half even with unlimited computing
 power. As the state of the art stands, this seems to be a very weak
 security.
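For reference, the standard Shannon-style definition usually meant by
the term (my addition, not from either poster): a cipher is perfectly
(unconditionally) secret when the ciphertext gives the adversary no
information about the message, regardless of computing power:

```latex
% Perfect secrecy (Shannon): for every message m and every ciphertext
% c with \Pr[C = c] > 0,
\Pr[M = m \mid C = c] = \Pr[M = m]
% For a one-bit message this means the adversary's guess succeeds with
% probability exactly 1/2 -- the "no better chance than half" above.
```

Note this is a statement about what the adversary learns, not a weak
guarantee: no amount of computation improves on it.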

It seems that you misunderstood the definition of "unconditional secrecy".

(Security is a multi-faceted notion, of which secrecy is one aspect.  I
use the term "unconditional secrecy" to emphasize th