Cryptography-Digest Digest #73, Volume #13        Thu, 2 Nov 00 08:13:00 EST

Contents:
  Re: BENNY AND THE MTB? (Tim Tyler)
  Re: Crypto Export Restrictions (CiPHER)
  Re: ECC choice of field and basis (Nigel Smart)
  Re: On introducing non-interoperability (Mok-Kong Shen)
  Re: Give it up? (Mok-Kong Shen)
  Re: Crypto Export Restrictions ("Dmitry Softman")
  Re: Give it up? (Richard Heathfield)
  index of coincidence of Spanish/Turkish ([EMAIL PROTECTED])
  Re: Give it up? (Tom St Denis)
  Re: ECC choice of field and basis (Scott Contini)
  Re: Give it up? (Tom St Denis)

----------------------------------------------------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: BENNY AND THE MTB?
Reply-To: [EMAIL PROTECTED]
Date: Thu, 2 Nov 2000 10:11:12 GMT

[EMAIL PROTECTED] wrote:
: "Tim Tyler" <[EMAIL PROTECTED]> wrote:

:> Matt's code takes a message, compresses it, maps the result to
:> a 128-bit granular file, encrypts it, and maps the result to
:> an 8-bit granular file.

: Actually that clarified the issue [...]

Ah, good - please ignore my last message, then.

: Correct me if I'm wrong, but what matt has done is taken a file,
: compressed it, encrypted it, and (de) compressed it (I'm a little hazy
: on whether the last step should be considered compression or
: decompression).

AFAIK, the last step doesn't really do much in the way of compression
or expansion.  It doesn't attempt to compress the statistically random
cyphertext.  However it does result in slightly shorter files - simply
because the granularity of the files is decreased.

In other words, if you must label this last step as compression or
decompression, the former is likely to be more appropriate.

: I was wrong, this can result in even a 1-bit ciphertext, so an 8-bit
: ciphertext is clearly possible.

The map is *designed* to go to 8-bit files.  You could go to a finer
granularity, but at the expense of not producing a neat output file that
can be stored in a typical filing system.
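
To make the "slightly shorter" point concrete, here is one purely
illustrative way a granularity-changing bijection can be built (a small
Python sketch, *not* Matt's actual code): rank each string in shortlex
order within its own granularity, then write the rank back out at the
finer granularity.

    def rank(bits, gran):
        # Shortlex rank of a bit string whose length is a multiple of gran.
        assert len(bits) % gran == 0
        blocks = len(bits) // gran
        shorter = sum(2 ** (j * gran) for j in range(blocks))
        return shorter + (int(bits, 2) if bits else 0)

    def unrank(r, gran):
        # Inverse of rank(): the r-th string (shortlex) at this granularity.
        blocks = 0
        while r >= 2 ** (blocks * gran):
            r -= 2 ** (blocks * gran)
            blocks += 1
        return format(r, "0%db" % (blocks * gran)) if blocks else ""

    def regrain(bits, src_gran=128, dst_gran=8):
        # Bijection from src_gran-granular strings to dst_gran-granular ones.
        return unrank(rank(bits, src_gran), dst_gran)

Because there are more short strings at 8-bit granularity than at 128-bit
granularity, this mapping never lengthens its input and occasionally
shortens it - e.g. regrain("0" * 128) is the single byte "00000000".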

: However I would still consider it proper to call it a 128-bit
: ciphertext, as that should be close to the average length [...]

Hmm.  Cyphertexts can be any number of bytes long.  Multiples of 128 bits
are not especially common.

: (there will be tiny biases in the ciphertext which can be compressed
: further [...]

That would be surprising.  Usually attempts to compress cyphertext result
in slight expansion - not slight compression.

: [...] and there will be encoding in the compression that adds a tiny
: amount of space). [...]

Don't say this - you'll only rub David Scott up the wrong way ;-)
-- 
__________  Lotus Artificial Life  http://alife.co.uk/  [EMAIL PROTECTED]
 |im |yler  The Mandala Centre   http://mandala.co.uk/  Free gift.

------------------------------

From: CiPHER <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto,talk.politics.misc,alt.freespeech
Subject: Re: Crypto Export Restrictions
Date: Thu, 02 Nov 2000 10:13:03 GMT

In article <[EMAIL PROTECTED]>,
  Anthony Stephen Szopa <[EMAIL PROTECTED]> wrote:

> Anyone who would make such a claim and not support it is clearly a
> nasty person.

*waggles fingers* OoooOOOooo! 'Nasty person'! lol

--
Marcus
---
[ www.cybergoth.cjb.net ] [ alt.gothic.cybergoth ]


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Nigel Smart <[EMAIL PROTECTED]>
Subject: Re: ECC choice of field and basis
Date: Thu, 2 Nov 2000 10:39:49 GMT

Scott Contini wrote:
> 
> In article <Mo%L5.12208$[EMAIL PROTECTED]>,
> Michael Scott <[EMAIL PROTECTED]> wrote:
> >
> ><[EMAIL PROTECTED]> wrote in message news:8tpumv$hd9$[EMAIL PROTECTED]...
> >> Hello,
> >>.....
> >> 1) What are the advantages and disadvantages of choosing GF(2^m) or
> >> GF(p) (and why not GF(p^m) in general)?
> >>
> >
> >Good questions. Generally, and rather surprisingly, GF(p) is significantly
> >faster in software, not pushed by any commercial interest, much less subject
> >to patents. GF(2^m) is faster in special hardware, certain interests are
> >pushing it hard, and is more likely to involve patents. Some restricted
> >variants of GF(2^m) curves allow fast implementations, but some of these
> >have been found to be weak (giving the whole field a bad name).
> >
> 
> My personal experience is that  GF(p)  and  GF(2^m)  are about the same
> speed: depending on the operation (public key/private key) and some
> other factors which I should not discuss, you may get one faster than
> the other but the times (for me) have always been within a factor of 2
> of each other.  BTW my comparisons were done on a Pentium pro where the
> GF(p)  implementation had assembly code, but the  GF(2^m)  implementation
> did not since we were unable to improve on the compiler's optimisation
> for this case.
> 

I would agree: the field ops in GF(p) are faster, but this is compensated
by faster curve ops in GF(2^m).

They are both as good as each other in terms of security as well.
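
To make the field-op/curve-op distinction concrete, here is a minimal
sketch of affine point addition on y^2 = x^3 + a*x + b over GF(p)
(illustrative Python with toy parameters, not any particular library's
code): every curve operation reduces to a handful of field multiplications
plus one field inversion, and it is the relative cost of those field
operations that differs between the GF(p) and GF(2^m) cases.

    def ec_add(P, Q, a, p):
        # Add two affine points; None stands for the point at infinity.
        if P is None:
            return Q
        if Q is None:
            return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None                                  # P + (-P) = infinity
        if P == Q:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
        x3 = (lam * lam - x1 - x2) % p
        y3 = (lam * (x1 - x3) - y1) % p
        return (x3, y3)

    # e.g. on y^2 = x^3 + 2x + 3 over GF(97): ec_add((3, 6), (3, 6), 2, 97) == (80, 10)
    # (pow(x, -1, p) needs Python 3.8+ for the modular inverse)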

Nigel

-- 
Dr Nigel P. Smart                  | Phone: +44 (0)117 954 5163
Computer Science Department,       | Fax:   +44 (0)117 954 5208
Woodland Road,                     | Email: [EMAIL PROTECTED]
University of Bristol, BS8 1UB, UK | URL:  
http://www.cs.bris.ac.uk/~nigel/

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On introducing non-interoperability
Date: Thu, 02 Nov 2000 12:39:10 +0100



wtshaw wrote:
> 
> Mok-Kong Shen<[EMAIL PROTECTED]> wrote:
> 
> >
> > I realize that I forgot to mention in this thread the most
> > simple (minimal 'invasive') way of modifying the
> > keyscheduling. I described that quite a time back in the
> > group, namely a (secret) permutation of the round keys,
> > i.e. using the i-th round key for the j-th round etc.

> 
> If an algorithm sets given round keys or other such things which might be
> changed, clear distinction is to be made as to what the keyspace becomes,
> including all changables/variables in the new algorithm's keystructure.
> 
> However, I consider that all strange values drawn from a little black bag
> somewhere should be indicated as to size, a form of keyspace.

The simple modification of the keyscheduling as given above 
evidently cannot affect the keyspace, since the round keys 
only change their positions, not their values. 
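
A minimal sketch of what is meant (illustrative Python only, independent
of any particular cipher): the key schedule itself is untouched, and a
secret permutation merely decides which round key is used in which round.

    def permute_round_keys(round_keys, perm):
        # Use round key perm[i] in round i; perm is the secret permutation.
        assert sorted(perm) == list(range(len(round_keys)))
        return [round_keys[perm[i]] for i in range(len(round_keys))]

    # e.g. permute_round_keys([k0, k1, k2, k3], [2, 0, 3, 1]) == [k2, k0, k3, k1]

The set of round-key values is the same before and after, which is why the
keyspace is unaffected.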

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Give it up?
Date: Thu, 02 Nov 2000 12:38:48 +0100



Tom St Denis wrote:
> 
> 
> The word "codec" is used quite often in compression groups because it
> refers to "COmpression DECompression engine".  The same applies to
> "COding DECoding" as well...

However you should consider that not everybody subscribes
to the other groups and the same shorthand may have different
meaning in different disciplines. (I saw e.g. DES with
an entirely different meaning elsewhere.)

> 
> I don't see how compressing a message can make it any more random (i.e.
> less redundant) than it already is.

Compression does concentrate the entropy and so is useful
in crypto. It lets the encryption be done on a shorter
string and hence is at least more efficient. I don't know, 
because of poor knowledge, how to properly describe the 
benefits of compression. But let's take a contrived example 
where one inserts a zero byte between every pair of bytes of 
a given sequence. That 'expanded' sequence would certainly 
be considered undesirable for encryption purposes, I suppose.
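
(For concreteness, a tiny Python sketch of that contrived expansion:

    def expand_with_zero_bytes(data):
        # Insert a zero byte between every pair of bytes of the input.
        out = bytearray()
        for i, b in enumerate(data):
            out.append(b)
            if i != len(data) - 1:
                out.append(0)
        return bytes(out)

    # expand_with_zero_bytes(b"abc") == b"a\x00b\x00c"

It nearly doubles the length without adding any information, so half of
what gets encrypted is known to the attacker in advance - the opposite of
what compression does.)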
 
> Unless their attacks break the cipher when the input is redundant ASCII
> there is no security advantage to using contrived inefficient codecs
> instead of something good like bzip or deflate.

As far as I understand, proponents of 1-1 claim that
certain bit sequences (because of the ending bits) can be
shown to be impossible as the result of compressing any
given plaintext, since decompressing them to get the
presumed plaintexts and compressing again would not lead
back to the same sequences. Hence this could be utilized
to check whether a key is correct (after processing the
whole file and doing the decompression and compression).
I currently have no intention to discuss further with
people the benefit of 1-1, the method of obtaining 1-1, or
the possibility of avoiding the requirement of 1-1, but you
may like to do so.

M. K. Shen

------------------------------

From: "Dmitry Softman" <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto,talk.politics.misc,alt.hacker
Subject: Re: Crypto Export Restrictions
Date: Thu, 2 Nov 2000 17:41:55 +0600

Hello,

"Richard Heathfield" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> [Bit of a massive crosspost, isn't it? alt.freespeech snipped - my news
> server doesn't like it - I wonder why not? :-) ]
>
> David Schwartz wrote:
> >
> > Anthony Stephen Szopa wrote:
> > >
> [snip]
> > > I then read the latest BXA info restricting the export of such
> > > encryption software.
> >
> >         Have you even _read_ the current policy? It places practically no
> > restrictions on crypto export to Austria, Australia, Belgium, Canada,
> > Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary,
> > Ireland, Italy, Japan, Luxembourg, Netherlands, New Zealand, Norway,
> > Poland, Portugal, Spain, Sweden, Switzerland, or the United Kingdom. In
> > any event, your algorithms look suspect, so it's just as well that
> > you're not peddling them as suitable for encryption.
>
> Is the US Government aware that its crypto export policy is a
> laughing-stock across the world?
>
> a) Some non-US people are capable of inventing cryptographic techniques
> ***outside*** the USA (gasp!) - I think I'm right in saying RSA falls
> into this category (although my being mistaken on this specific example
> would not disprove the general point). But even if this were false:
> b) The beans spilled out of the can some years ago. Source code in
> electronic form is widely accessible all over the world, for most USA
> cryptographic techniques. But even if this were false:
> c) Source code in /printed/ form is also widely available, and my
> understanding is that the USA's First Amendment was (rightly)
> instrumental in allowing American cryptographic techniques to be
> disseminated in this way. Does the US Government think that terrorists
> are incapable of typing, or of hiring (or coercing) a programmer to type
> for them? The books are on the shelves, all over the world. But even if
> this were false:

> d) What makes them think we want their crypto anyway? If they won't let
> us use theirs, we (and I use this word in the general sense - I myself
> am no cryptographer!) are perfectly capable of developing our own, thank
> you. (It has just occurred to me that this is the same point I was
> making in (a) above. Oh well, that's life...)

You are perfectly right here: it takes about a week to write a fast
implementation of the DES or GOST ciphers, test it and deploy it. And it
may be done by a college student or even someone younger. You do not have
to be a cryptographer in order to code a few lines of quite simple
permutations. Also note that the USA's own DES standard is published on
the FIPS site; anyone can take it and implement it in a matter of days.
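
As a rough illustration (a Python sketch only; the real work is the round
function F and the key schedule, which come straight from the published
standards), the Feistel skeleton shared by DES and GOST really is just a
few lines:

    def feistel_encrypt(left, right, round_keys, F):
        # Generic Feistel network over two half-blocks (as integers).
        for k in round_keys:
            left, right = right, left ^ F(right, k)
        return left, right

    def feistel_decrypt(left, right, round_keys, F):
        # Run the rounds backwards to invert.
        for k in reversed(round_keys):
            left, right = right ^ F(left, k), left
        return left, right

    # Round-trips for any F and key list ks:
    # feistel_decrypt(*feistel_encrypt(1, 2, ks, F), ks, F) == (1, 2)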

WBR,

Anonymous




------------------------------

Date: Thu, 02 Nov 2000 12:04:06 +0000
From: Richard Heathfield <[EMAIL PROTECTED]>
Subject: Re: Give it up?

Tom St Denis wrote:
> 
<snip>
> 
> Unless their attacks break the cipher when the input is redundant ASCII
> there is no security advantage to using contrived inefficient codecs
> instead of something good like bzip or deflate.


Tom - you probably already know this, but I'll say it anyway:

#include "iamnotacryptographer.h"

Right, now that we understand each other, I have some difficulty with
using compression to enhance encryption, which you could perhaps explain
to me...

Firstly, we can agree, I think that (a) the compression is part of the
algorithm, and (b) we must assume that Eve knows the algorithm.

Now, let's take a specific example: PKZIP. If Z is the compression
function, then C = E(Z(P)).

Since PKZIP'd headers begin with the two octets 0x504B ("PK" in ASCII),
we have a known plaintext attack against E(), do we not? I know two
octets isn't much, but it's the /first/ two octets, which probably helps
somewhat.
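
(To check the point - a quick sketch using Python's zipfile module, on the
assumption that the plaintext really is produced by a stock PKZIP-style
tool:

    import io
    import zipfile

    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("message.txt", "attack at dawn")

    print(buf.getvalue()[:4])        # b'PK\x03\x04'

Every such archive starts with the local-file-header signature
"PK\x03\x04", so those leading octets come for free to the attacker.)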

Do you see my difficulty?

(It could be argued, perhaps, that one could prevent this attack by
using a 'headerless' algorithm, such as RLE.)


-- 
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R Answers: http://users.powernet.co.uk/eton/kandr2/index.html

------------------------------

From: [EMAIL PROTECTED]
Subject: index of coincidence of Spanish/Turkish
Date: Thu, 02 Nov 2000 12:03:46 GMT

Do you know the IC of Spanish and Turkish?
Are my values correct?
german IC=0.0762
english IC=0.0658
french IC=0.0778
thanx
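
(For anyone checking, the usual calculation is IC = sum_i n_i*(n_i - 1) /
(N*(N - 1)), where n_i is the count of the i-th letter and N is the total
number of letters in the sample.  A small Python sketch, assuming a plain
text sample in the language of interest:

    from collections import Counter

    def index_of_coincidence(text):
        letters = [c for c in text.lower() if c.isalpha()]
        n = len(letters)
        counts = Counter(letters)
        return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

The commonly quoted English figure is around 0.066-0.067, so the English
value above is in the right ballpark.)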


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Give it up?
Date: Thu, 02 Nov 2000 12:06:55 GMT

In article <[EMAIL PROTECTED]>,
  Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
>
>
> Tom St Denis wrote:
> >
> >
> > The word "codec" is used quite often in compression groups because
it
> > refers to "COmpression DECompression engine".  Somies to "COding
> > DECoding" as well...
>
> However you should consider that not everybody subscribes
> to the other groups and the same shorthand may have different
> meaning in different disciplines. (I saw e.g. DES with
> an entirely different meaning elsewhere.)

Well if you want to talk about crypto/compression you have to know the
lingo.

> >
> > I don't see how compressing a message can make it any more random (i.e.
> > less redundant) than it already is.
>
> Compression does concentrate the entropy and so is useful
> in crypto. It lets the encryption be done on a shorter
> string and hence is at least more efficient. I don't know,
> because of poor knowledge, how to properly describe the
> benefits of compression. But let's take a contrived example
> where one inserts a zero byte between every pair of bytes of
> a given sequence. That 'expanded' sequence would certainly
> be considered undesirable for encryption purposes, I suppose.

A good test for a block cipher is this: given a ciphertext block C (let's
say this is an AES cipher) and 127 bits of P (the plaintext), you cannot
guess the last bit faster than brute force without the key.

If your block cipher cannot stand up to this test it's not secure.  If
it does, then your original point is moot.  Sure, a good codec will
increase the efficiency of the system, but if your message is not known
to the attacker, compression will not make it any easier/harder.

> > Unless their attacks break the cipher when the input is redundant ASCII
> > there is no security advantage to using contrived inefficient codecs
> > instead of something good like bzip or deflate.
>
> As far as I understand, proponents of 1-1 claim that
> certain bit sequences (because of the ending bits) can be
> shown to be impossible as the result of compressing any
> given plaintext, since decompressing them to get the
> presumed plaintexts and compressing again would not lead
> back to the same sequences. Hence this could be utilized
> to check whether a key is correct (after processing the
> whole file and doing the decompression and compression).
> I currently have no intention to discuss further with
> people the benefit of 1-1, the method of obtaining 1-1, or
> the possibility of avoiding the requirement of 1-1, but you
> may like to do so.

That argument (and my spelling sometimes) is laughable.  So what if
you can tell the right key from the wrong one?  A program that could
take up to 2^128 steps (i.e. searching the keyspace) is infeasible
anyway.  And not only that, I can break a 1-1 scheme with brute force
anyway (it's the nature of finite information coding).
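
Here's a toy sketch of why (hypothetical 16-bit key so the loop actually
finishes, and a made-up keystream that is of course not secure): even a
perfect "is this the right key?" test just turns the attack into a walk
over the whole keyspace, which at 128 bits buys you nothing.

    import hashlib

    def toy_encrypt(key, plaintext):
        # Toy XOR keystream derived from the key; NOT a real cipher.
        stream = hashlib.sha256(key).digest()
        return bytes(p ^ stream[i % 32] for i, p in enumerate(plaintext))

    def brute_force(ciphertext, recognizer):
        # Try every key; the recognizer says when the decryption looks right.
        for k in range(2 ** 16):                 # 2^16 here, 2^128 in reality
            key = k.to_bytes(2, "big")
            if recognizer(toy_encrypt(key, ciphertext)):   # XOR is its own inverse
                return key
        return None

    secret = (4242).to_bytes(2, "big")
    ct = toy_encrypt(secret, b"hello world")
    print(brute_force(ct, lambda pt: pt.startswith(b"hello")))   # recovers the key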

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (Scott Contini)
Subject: Re: ECC choice of field and basis
Date: 2 Nov 2000 12:25:45 GMT

In article <[EMAIL PROTECTED]>,
Nigel Smart  <[EMAIL PROTECTED]> wrote:
>Scott Contini wrote:
>> 
>> In article <Mo%L5.12208$[EMAIL PROTECTED]>,
>> Michael Scott <[EMAIL PROTECTED]> wrote:
>> >
>> ><[EMAIL PROTECTED]> wrote in message news:8tpumv$hd9$[EMAIL PROTECTED]...
>> >> Hello,
>> >>.....
>> >> 1) What are the advantages and disadvantages of choosing GF(2^m) or
>> >> GF(p) (and why not GF(p^m) in general)?
>> >>
>> >
>> >Good questions. Generally, and rather surprisingly, GF(p) is significantly
>> >faster in software, not pushed by any commercial interest, much less subject
>> >to patents. GF(2^m) is faster in special hardware, certain interests are
>> >pushing it hard, and is more likely to involve patents. Some restricted
>> >variants of GF(2^m) curves allow fast implementations, but some of these
>> >have been found to be weak (giving the whole field a bad name).
>> >
>> 
>> My personal experience is that  GF(p)  and  GF(2^m)  are about the same
>> speed: depending on the operation (public key/private key) and some
>> other factors which I should not discuss, you may get one faster than
>> the other but the times (for me) have always been within a factor of 2
>> of each other.  BTW my comparisons were done on a Pentium pro where the
>> GF(p)  implementation had assembly code, but the  GF(2^m)  implementation
>> did not since we were unable to improve on the compiler's optimisation
>> for this case.
>> 
>
>I would agree: the field ops in GF(p) are faster, but this is compensated
>by faster curve ops in GF(2^m).
>
>They are both as good as each other in terms of security as well.
>
>Nigel
>

I forgot to say that the timings I previously gave were for approx 160-bit
orders.

Scott



------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Give it up?
Date: Thu, 02 Nov 2000 12:50:41 GMT

In article <[EMAIL PROTECTED]>,
  Richard Heathfield <[EMAIL PROTECTED]> wrote:
> Tom St Denis wrote:
> >
> <snip>
> >
> > Unless their attacks break the cipher when the input is redundant ASCII
> > there is no security advantage to using contrived inefficient codecs
> > instead of something good like bzip or deflate.
>
> Tom - you probably already know this, but I'll say it anyway:
>
> #include "iamnotacryptographer.h"
>
> Right, now that we understand each other, I have some difficulty with
> using compression to enhance encryption, which you could perhaps explain
> to me...
>
> Firstly, we can agree, I think that (a) the compression is part of the
> algorithm, and (b) we must assume that Eve knows the algorithm.
>
> Now, let's take a specific example: PKZIP. If Z is the compression
> function, then C = E(Z(P)).
>
> Since PKZIP'd headers begin with the two octets 0x504B ("PK" in ASCII),
> we have a known plaintext attack against E(), do we not? I know two
> octets isn't much, but it's the /first/ two octets, which probably helps
> somewhat.
>
> Do you see my difficulty?
>
> (It could be argued, perhaps, that one could prevent this attack by
> using a 'headerless' algorithm, such as RLE.)


So what?  I could give you a terabyte of known plaintext and you
couldn't guess the key or plaintext (assuming the known plaintext and
the original plaintext message are not the same (or a subset)).

Even knowing two bytes of one block doesn't give you a single complete
known-plaintext block (assuming the block size is larger than 16 bits).

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
