Cryptography-Digest Digest #698, Volume #12      Sun, 17 Sep 00 09:13:00 EDT

Contents:
  Re: RC4: Tradeoff key/initialization vector size? (Guy Macon)
  Re: RC4: Tradeoff key/initialization vector size? (Paul Rubin)
  Dangers of using same public key for encryption and signatures? (Paul Rubin)
  Re: Lossless compression defeats watermarks (Niklas Frykholm)
  Re: Capability of memorizing passwords (Chris Rutter)
  Re: RC4: Tradeoff key/initialization vector size? (Guy Macon)
  Re: Dangers of using same public key for encryption and signatures? ("Brian Gladman")
  Re: Double Encryption Illegal? (Mok-Kong Shen)
  On secret Huffman compression (Mok-Kong Shen)
  Re: Double Encryption Illegal? (Mok-Kong Shen)
  Re: Tying Up Loose Ends - Correction (Mok-Kong Shen)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: RC4: Tradeoff key/initialization vector size?
Date: 17 Sep 2000 11:35:34 GMT

Tom St Denis wrote:
>
>  David Crick <[EMAIL PROTECTED]> wrote:
>>
>> From the CipherSaber[-1] documentation (http://ciphersaber.gurus.com)
>>
>>   The user key is a text string, rather than a hex value, because
>>   humans are more likely to be able to memorize a text string with
>>   sufficient entropy. To leave room for the initialization vector,
>>   the length of the user key must be less than 246 bytes. To insure
>>   adequate mixing of the initialization vector and user key, we
>>   recommend you select a user key of 54 bytes or less.
>
>I would strongly recommend against using ASCII text as the key for
>RC4.  You should really hash it first.
>

I believe that the implementation of RC4 described on the web
page [ http://ciphersaber.gurus.com ] is secure without any
such hashing.  Ciphersaber has withstood a lot of analysis
and attacks so far.

The reason I reference Ciphersaber instead of RC4 is that the
Ciphersaber implementation of RC4 (ARCFOUR, really - none of
us has proof that what we are looking at is really RC4) fixes
a standard set of decisions concerning such mundane details as
whether the key is ASCII, how big the initialization vector
should be, etc., and those decisions have withstood a lot of
scrutiny.
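
For concreteness, here is the whole scheme as I read it off that
page, sketched in Python (the function names are mine; the page
itself is what's normative):

    import os

    def rc4_stream(key, n):
        # RC4 key schedule (KSA): a single pass, as in CipherSaber-1.
        S = list(range(256))
        j = 0
        for i in range(256):
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        # RC4 output generation (PRGA).
        out = bytearray()
        i = j = 0
        for _ in range(n):
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            out.append(S[(S[i] + S[j]) % 256])
        return bytes(out)

    def ciphersaber_encrypt(passphrase, plaintext):
        iv = os.urandom(10)               # ten-byte IV, per the web page
        key = passphrase.encode() + iv    # ASCII user key with IV appended
        stream = rc4_stream(key, len(plaintext))
        return iv + bytes(p ^ k for p, k in zip(plaintext, stream))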


------------------------------

From: Paul Rubin <[EMAIL PROTECTED]>
Subject: Re: RC4: Tradeoff key/initialization vector size?
Date: 17 Sep 2000 04:48:21 -0700

[EMAIL PROTECTED] (Guy Macon) writes:
> I believe that the implementation of RC4 described on the web
> page [ http://ciphersaber.gurus.com ] is secure without any
> such hashing.  Ciphersaber has withstood a lot of analysis
> and attacks so far.

Are you kidding?

------------------------------

From: Paul Rubin <[EMAIL PROTECTED]>
Subject: Dangers of using same public key for encryption and signatures?
Date: 17 Sep 2000 04:56:46 -0700

Current practice seems to prefer using two separate keys, though some
systems (PGP 2.x, and effectively SSL) use the same public key for
both encryption and authentication.  I have an application where space
for keys is quite scarce.  I'd like to use the same key (point on
elliptic curve) for both encryption and signing (El-Gamal / ECDSA).
What kind of trouble am I asking for, aside from the "FBI attack"
(they make you turn over your decryption key so they can read
something, and that means they can also sign your name to stuff)?

Also, how long do my keys need to be to satisfy the paranoids in this
crowd?  Assume I'm using some constant (shared) curve over GF(p) for
some large p.  Is 140 bits enough?  How about 170?  Robert Harley has
been breaking ECDL over GF(2^n) for n=112 or so, IIRC.  But those are
easier than GF(p) curves.
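
For what it's worth, my own back-of-the-envelope (a Python sketch;
the cost formula is the standard one for the generic Pollard rho
attack on a group of order n):

    import math

    def rho_work_bits(curve_bits):
        # Expected Pollard rho cost on a group of order n ~ 2^curve_bits:
        # about sqrt(pi*n/4) point operations (using the negation map;
        # sqrt(pi*n/2) without it).
        n = 2.0 ** curve_bits
        return math.log2(math.sqrt(math.pi * n / 4))

    for bits in (112, 140, 170):
        print("%d bits: ~2^%.0f curve operations" % (bits, rho_work_bits(bits)))
    # -> roughly 2^56, 2^70 and 2^85 respectively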

thanks

------------------------------

From: [EMAIL PROTECTED] (Niklas Frykholm)
Subject: Re: Lossless compression defeats watermarks
Date: 15 Sep 2000 07:34:39 GMT

In article <8ps1ov$rt4$[EMAIL PROTECTED]>, Matthew Skala wrote:
>It seems to me that this should be obvious, but my impression is that most
>people don't quite realize it, so I'd just like to point it out:
>
>If a watermarking scheme works perfectly (in the sense of being
>imperceptible by humans) and a lossy compression scheme works perfectly
>(in the sense of maximizing compression without harming perceptual
>quality) then compressing and decompressing a signal will have the effect
>of removing the watermark.

That's perfectly true, and I think it's recognized now by (some of the)
people in the watermarking business. (Anyone else getting the feeling
that the people who do watermarking are more often Image Processing than
Security experts?) 

See for example "A review of watermarking principles and practices" by
Miller, Cox, Linnartz & Kalker (it's available online; just search on
Google), pages 6-7, which mention exactly this problem. A watermark has
to change the perceived content; the hope is that the change is so
small that it will not be noticed.
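
A toy way to see the tension (my own illustration, not from the
paper): hide a watermark in the least significant bits of some
samples, then apply an idealized lossy step that quantizes away
detail assumed to be imperceptible.

    # Watermark bits hidden in the LSBs of 8-bit samples; an idealized
    # lossy codec quantizes to multiples of 4 ("imperceptible" detail
    # is discarded), which erases the mark.
    samples = [52, 55, 61, 66, 70, 61, 64, 73]
    mark    = [1, 0, 1, 1, 0, 0, 1, 0]

    marked = [(v & ~1) | b for v, b in zip(samples, mark)]
    lossy  = [round(v / 4) * 4 for v in marked]
    print([v & 1 for v in lossy])      # all zeros: the watermark is gone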

// Niklas

------------------------------

From: Chris Rutter <[EMAIL PROTECTED]>
Subject: Re: Capability of memorizing passwords
Date: Sun, 17 Sep 2000 00:04:59 +0100

Benjamin Goldberg <[EMAIL PROTECTED]> wrote:

> --
> ... perfection has been reached not when there is nothing left to
> add, but when there is nothing left to take away. (from RFC 1925)

But, more importantly, it is from Antoine de Saint-Exupery.  The
quote's usual English form is

        "Perfection is achieved, not when there is nothing more
         to add, but when there is nothing left to take away."

In fact, all the English variants I can find are so uniform that
perhaps he stated it in English.

(Spot the person who has nothing cryptographic to contribute and so
has to make pedantic remarks instead.)

c.

------------------------------

From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: RC4: Tradeoff key/initialization vector size?
Date: 17 Sep 2000 12:23:39 GMT

Paul Rubin wrote:
>
>[EMAIL PROTECTED] (Guy Macon) writes:
>> I believe that the implementation of RC4 described on the web
>> page [ http://ciphersaber.gurus.com ] is secure without any
>> such hashing.  Ciphersaber has withstood a lot of analysis
>> and attacks so far.
>
>Are you kidding?

Why would you think that I am kidding?  If you know of a weakness
in Ciphersaber that would allow an attacker to read an encrypted
message, I would like to know about it.

Of course, in the larger sense, all schemes have strengths and
weaknesses.  If you need a public key system, a shared secret
key system won't do.


------------------------------

From: "Brian Gladman" <[EMAIL PROTECTED]>
Subject: Re: Dangers of using same public key for encryption and signatures?
Date: Sun, 17 Sep 2000 13:26:59 +0100


"Paul Rubin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Current practice seems to prefer using two separate keys, though some
> systems (PGP 2.x, and effectively SSL) use the same public key for
> both encryption and authentication.  I have an application where space
> for keys is quite scarce.  I'd like to use the same key (point on
> elliptic curve) for both encryption and signing (El-Gamal / ECDSA).
> What kind of trouble am I asking for, aside from the "FBI attack"
> (they make you turn over your decryption key so they can read
> something, and that means they can also sign your name to stuff)?
>
> Also, how long do my keys need to be to satisfy the paranoids in this
> crowd?  Assume I'm using some constant (shared) curve over GF(p) for
> some large p.  Is 140 bits enough?  How about 170?  Robert Harley has
> been breaking ECDL over GF(2^n) for n=112 or so, IIRC.  But those are
> easier than GF(p) curves.
>
> thanks

In the UK keys used for signature only are not subject to Government Access
to Keys (GAK). But keys that perform both signature and encryption functions
can be seized under warrant by a number of UK authorities, and there
is no requirement that you be under suspicion in order for your keys
to be seized.

These powers were recently introduced in the Regulation of Investigatory
Powers Act 2000. Strictly speaking they have not yet come into force,
but they will do so shortly, once the authorities are deemed ready.

Of course this is the UK equivalent of your 'FBI attack'. Basically this
significantly reduces trust in signature keys.

    Brian Gladman

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.databases.oracle
Subject: Re: Double Encryption Illegal?
Date: Sun, 17 Sep 2000 15:09:06 +0200



wtshaw wrote:
> 
> <[EMAIL PROTECTED]> wrote:
> ...
> > You meant it should be triple, like 3-DES??
> 
> When a person uses 3-DES, they are single encrypting with 3-DES. An
> algorithm can be made of any combination of steps.  When two or more
> pieces are combined, the result is one piece. Consider that such a
> request, regulation, standard, whim, or pipe dream to limit so-called
> double encryption is a fog to confuse wherever possible; ambiguity shows
> dualism of purpose.

Ah, I understand. By your definition there is never 
any multiple encryption: a superencipherment is 
simply a single (big) encipherment, and there is 
(presumably, in your view) no need to mention that 
the whole is made of certain, in general different, 
components. I don't share your viewpoint, because 
the components can be, and in fact commonly are, 
used and evaluated singly. It is the art of 
combination that is of interest in a multiple 
encryption. We need to know (to emphasize) what the 
components are and how they are combined.

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: On secret Huffman compression
Date: Sun, 17 Sep 2000 15:08:59 +0200


A Huffman tree for compression is built from the
frequency distribution in the well-known manner.
We assume that the opponent can build the same tree.
Now we modify the coding as follows, so that the
opponent cannot decompress to obtain the 
original message: 

Use a secret key as the seed of a PRNG. At each non-terminal 
node of the given Huffman tree, use a pseudo-random number 
to determine whether the two branches are to be flipped,
i.e. whether their markings of 0/1 are to be exchanged.
Use the modified tree to do the compression.

We note that, to handle the byte/word boundary issue 
of the output file, one can include an end-of-file 
symbol (with the least frequency) in the Huffman tree 
and, after outputting that symbol, fill with random 
bits to the desired byte/word boundary.
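
A rough Python sketch of the branch flipping described above
(random.Random merely stands in for a proper keyed PRNG, and the
resulting code table is printed instead of the actual bit output):

    import heapq, random

    def build_tree(freqs):
        # Standard Huffman construction: repeatedly merge the two
        # rarest nodes; leaves are symbols, internal nodes are pairs.
        heap = [(f, i, sym) for i, (sym, f) in enumerate(freqs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            f1, _, a = heapq.heappop(heap)
            f2, _, b = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, count, (a, b)))
            count += 1
        return heap[0][2]

    def keyed_codes(tree, key):
        # Flip the 0/1 labels at each internal node according to a
        # key-seeded PRNG.  (random.Random is only a stand-in; a real
        # design would want a cryptographic PRNG here.)
        rng = random.Random(key)
        codes = {}
        def walk(node, prefix):
            if not isinstance(node, tuple):      # leaf: a symbol
                codes[node] = prefix
                return
            flip = rng.getrandbits(1)
            walk(node[0], prefix + str(flip))
            walk(node[1], prefix + str(1 - flip))
        walk(tree, "")
        return codes

    freqs = {'e': 12, 't': 9, 'a': 8, 'o': 7, '\x00': 1}  # '\x00' = end-of-file
    print(keyed_codes(build_tree(freqs), key="my secret"))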

M. K. Shen
================================
http://home.t-online.de/home/mok-kong.shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.databases.oracle
Subject: Re: Double Encryption Illegal?
Date: Sun, 17 Sep 2000 15:09:12 +0200



Tom St Denis wrote:
> 
>   [EMAIL PROTECTED] (Paul Schlyter) wrote:

> > So you're claiming that triple-DES is no more secure than
> > single-DES ???
> 
> Read my message.  Geez.  I said "double" encryption is not the way to
> go about adding security.

Could you be more explicit and explain why? Are you
saying that superencipherment is always nonsense?
Is 2-DES not better than DES?
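
As far as I know, the standard objection is the meet-in-the-middle
attack, which caps 2-DES at roughly 2^57 DES operations instead of
2^112; but that is still more than single DES. A toy Python sketch
of the idea (a made-up 16-bit cipher stands in for DES):

    from collections import defaultdict

    M = 1 << 16                        # toy key/block space: 16 bits

    def toy_enc(k, x):                 # a made-up 16-bit cipher, NOT DES
        return ((x ^ k) + k) % M

    def toy_dec(k, y):
        return ((y - k) % M) ^ k

    def mitm(pairs):
        # Tabulate E_k1(p) for every k1, then match against D_k2(c):
        # about 2*2^16 cipher calls instead of the naive 2^32.
        p, c = pairs[0]
        table = defaultdict(list)
        for k1 in range(M):
            table[toy_enc(k1, p)].append(k1)
        for k2 in range(M):
            for k1 in table.get(toy_dec(k2, c), ()):
                # Weed out false matches with the other known pairs.
                if all(toy_enc(k2, toy_enc(k1, q)) == d for q, d in pairs[1:]):
                    return k1, k2

    k1, k2 = 12345, 54321
    pairs = [(p, toy_enc(k2, toy_enc(k1, p))) for p in (4242, 1717, 2929)]
    print(mitm(pairs))                 # recovers (12345, 54321)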

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Tying Up Loose Ends - Correction
Date: Sun, 17 Sep 2000 15:08:52 +0200



"SCOTT19U.ZIP_GUY" wrote:
> 
>    One: if you use a random key, what is the chance that the
> decompression will even hit the stop code?  If it does not,
> then you obviously can reject that candidate.  Of course, if you
> want to just say a few lines and pad to a million bytes, then
> you may hit the stop code.  But where does one get these nice
> random bytes?  The only safe way to use a stop code is for short
> messages with lots of variable-length random padding.  But to pad
> just a couple of bytes means that when one is testing a random key,
> if the stop code is hit, it must be in some fixed range, which is
> not very likely to occur.  Don't forget that with laws like RIP,
> if one uses compression and encryption one may have to come up
> with a key to stay out of jail.  If only one key decrypts and
> uncompresses, you could be in trouble if you can't remember it,
> or if you can and it's stuff you don't want big brother to stick
> his nose into.

First of all, the Huffman codes can be made secret, so
that the analyst doesn't know how to properly decompress.
See the thread 'On secret Huffman compression' that I
just posted to the group.

Secondly, when we talk about the security of a block
algorithm, don't we normally have in mind cases where
the plaintext input is readable English? If that case
is safe, we don't actually need any non-secret
compression. In my view, non-secret compression serves
mainly to save bandwidth, though it can also contribute
some (albeit difficult to evaluate) additional security,
at the cost of doing the compression. Note that in
practice not everybody uses compression (up to now,
only non-secret compression) in conjunction with
encryption. Anyway, I would personally concentrate my
effort on making sure that the encryption algorithm
proper is strong, rather than relying on any support
from non-secret compression.

M. K. Shen

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
