Cryptography-Digest Digest #534, Volume #9       Wed, 12 May 99 11:13:03 EDT

Contents:
  Re: TwoDeck solution (but it ain't pretty) ([EMAIL PROTECTED])
  Re: Crypto export limits ruled unconstitutional (Mok-Kong Shen)
  Re: TwoDeck solution (but it ain't pretty) ([EMAIL PROTECTED])
  Re: True Randomness & The Law Of Large Numbers (R. Knauer)
  Re: A challenge for you ! ("Douglas A. Gwyn")
  Re: A simple challenge for Tomstdenis ("Ulrich Kuehn")
  Re: Crypto export limits ruled unconstitutional (Mok-Kong Shen)
  Searching for algorithm overview ("Jonas Krantz")
  Re: True Randomness & The Law Of Large Numbers ("Douglas A. Gwyn")
  Re: Shamir's TWINKLE Factoring Machine (Bob Silverman)
  Re: AES ([EMAIL PROTECTED])
  Newbie:  Encrypting short strings ("Steve K")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED]
Subject: Re: TwoDeck solution (but it ain't pretty)
Date: Wed, 12 May 1999 11:20:51 GMT


> --------------------
>  LadyCow, stop this cunttalk and take a shower - you stink.
>

Although I agree, please keep the discussion civil.  I (like others)
want to keep it cool about these punk @#! losers...

Tom
--
PGP public keys.  SPARE key is for daily work, WORK key is for
published work.  The spare is at
'http://members.tripod.com/~tomstdenis/key_s.pgp'.  Work key is at
'http://members.tripod.com/~tomstdenis/key.pgp'.  Try SPARE first!


--== Sent via Deja.com http://www.deja.com/ ==--
---Share what you know. Learn what you don't.---

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: Crypto export limits ruled unconstitutional
Date: Wed, 12 May 1999 14:30:09 +0200

Johnny Bravo wrote:
> 

>   There is a program that can convert C code directly to English
> readable text and back again.  Here is a sample of the code for the

Can you provide a pointer to that? I am interested in how the
translation gets done.

M. K. Shen

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: TwoDeck solution (but it ain't pretty)
Date: Wed, 12 May 1999 11:18:58 GMT


> As someone has observed, your website is about the BATTLE data
> compression contest - there are no pointers on it to a page with
> pointers to the paper and stuff. (We can't see your directory, so we
> need links from HTML...)
>
> You could put a link labelled "About Me" on the bottom of the page
> which would link to a page saying "Hi, I'm Tom St. Denis", and then
> mentioning your other interests, including "Cryptography" - and on
> that include a link to your paper. That way you avoid mentioning
> cryptography on the BATTLE page itself if you don't want to.

True, that's why I posted links though.  I will be making a website for
the cryptography stuff this weekend (if not tonight :) ).

Sorry about the confusion; the paper (in its current form) is at

http://members.tripod.com/~tomstdenis/TwoDeck.ps

And the rough sketch for the sieving crack is at

http://members.tripod.com/~tomstdenis/solution.ps

Thanks for your time,
Tom
--
PGP public keys.  SPARE key is for daily work, WORK key is for
published work.  The spare is at
'http://members.tripod.com/~tomstdenis/key_s.pgp'.  Work key is at
'http://members.tripod.com/~tomstdenis/key.pgp'.  Try SPARE first!


--== Sent via Deja.com http://www.deja.com/ ==--
---Share what you know. Learn what you don't.---

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Wed, 12 May 1999 12:41:29 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 12 May 1999 06:07:42 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
wrote:

>> Then why does Feller claim that it is fundamentally incorrect to infer
>> the properties of random number generation from the time average of a
>> single sequence?

>Who cares why he says that, it's not relevant.

Are you saying that Feller does not know what he is talking about?

>It's hard to be sure what you mean by terms like "1-bit bias".

There should be no difficulty in understanding that term at all. It is
quite commonly used in discussions about randomness. It is one of the
different kinds of group bias.

>The Monobit Test checks the first moment of the distribution,

I think you need to demonstrate that formally.

>If John Jones makes only $25,000/year, then there is evidence that he
>isn't a very good salesman, and you should consider not using him to
>peddle your product.

It depends on whether that is his consistent earnings level. One
annual earnings report is not sufficient to characterize his overall
performance over his productive years.

If you want to formulate a monobit-type test that takes a large number
of samples into account, then I would agree that it might have merit.
But just one test on one small sample is not sufficient.
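
For concreteness, here is a minimal sketch (purely illustrative, in C)
of the kind of single-sample test I am objecting to; the acceptance
interval is the one used by FIPS 140-1:

/* Monobit Test sketch: count the ones in a single 20,000-bit sample
 * and accept iff the count lies strictly between 9,654 and 10,346
 * (the FIPS 140-1 interval).  One number from one small sample. */
#include <stdio.h>

int monobit(const unsigned char buf[2500])  /* 2500 bytes = 20,000 bits */
{
    long ones = 0;
    int i, b;

    for (i = 0; i < 2500; i++)
        for (b = 0; b < 8; b++)
            ones += (buf[i] >> b) & 1;
    return ones > 9654 && ones < 10346;
}

int main(void)
{
    unsigned char sample[2500] = {0};  /* an all-zero sample clearly fails */

    printf("pass = %d\n", monobit(sample));
    return 0;
}

One such number tells you about this one sample, not about the
generator that produced it.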

>But a better analogy would be:  John Jones
>normally makes around $100,000/year, but suddenly his sales plummet
>to $25,000/year.  Should you suspect that he has developed a problem?
>How long are you going to retain him as a sales representative if his
>performance drops to that level and stays there?

I would not assume he is a bad salesman until he turned in several
poor performances. One bad year does not make a poor salesman.

>> ... How can Jone's poor earnings be a
>> reflection on the typical earnings of the vast majority of salesmen?

>Of course, it isn't -- it reflects only on Jones.

Then one "abnormal" sequence cannot reflect on the TRNG's performance
either.

>Due to practical considerations, it's as large a sample as we can get
>before we have to make a decision.

Nonsense. You can buffer and test - memory and disk space are cheap.

>Sorry, we can't DO that.

Sure you can. The time you lose fooling around with false alarms is
more than the time to generate a few million bits and test them
properly.

>We cannot wait until 200,000,000 stuck
>key bits have been used to encrypt vital information.

There are other tests like the Long Run Test for determining stuck
keys.

>As with *any* statistic, the data is "boiled down" to one number that
>summarizes something about the distribution.

That is not necessarily true. If you had an actual distribution of
such numbers you could analyze it, which would tell you a lot more
about the process than one simplistic test could ever do.

> What is your problem?

The use of simplistic tests on one small sample.

>The *reason*
>the methodology works is because the computed statistic is essentially
>an indicator of membership in a *class* of samples, and those classes
>have different relative sizes, thus differing likelihoods.

That is an expression of the central fallacy here. You believe that
one sample such as 20,000 bits taken by itself can be representative
of a complete distribution of samples. That is fallacious, as Feller
and others have pointed out.

>You cannot *know* anything with *certainty*, no matter *how* much you
>sample nor what tests you perform.  So it is not a matter of certainty,
>but rather of *likelihood*.

I have used the term "reasonable certainty". Likelihood is a fiction
from probability theory.

>Consider an industrial plant, e.g. a nuclear power facility.  There
>are continual tests and monitoring, with alarms raised when unlikely
>(on the assumption of correct operation) conditions are detected.
>Even though some of the alarms are false, would you argue that they
>should be dispensed with altogether?

The alarms have to be based on something in the real world and not
some fallacious notion about what should be happening.

Just because a TRNG *should* output sequences with small bias does not
mean that any given sequence must be of low bias.

My objection is that you cannot infer the properties of the TRNG from
one sequence, which is what the Monobit Test is attempting to do.

Bob Knauer

"There is much to be said in favour of modern journalism. By giving us the opinions
of the uneducated, it keeps us in touch with the ignorance of the community."
-- Oscar Wilde


------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: A challenge for you !
Date: Fri, 07 May 1999 06:29:58 GMT

wtshaw wrote:
> Strangely, he is probably right, not the tag, but the previous statement;
> proper lengths provided, ciphertext should be decodable to anything with
> the appropriate key. Otherwise, some plaintext-ciphertext pairs would be
> excluded from being valid, a weakness in any algorithm.

Such an argument works only for a one-time-pad system, where the key
is as big as the message.  The vast majority of actual cryptosystems
use a key that is much smaller than the message being encrypted.

Ciphertext-only cryptanalysis not only is possible, it is quite
feasible for a large number of cryptosystems.  And there is rarely
much ambiguity in the decipherment produced by the cryptanalysis.

You guys need to get out of your armchairs and go crack a few systems.

------------------------------

From: "Ulrich Kuehn" <[EMAIL PROTECTED]>
Subject: Re: A simple challenge for Tomstdenis
Date: 12 May 1999 15:13:42 +0200

[EMAIL PROTECTED] writes:

> Apply either linear or differential cryptanalysis to this algorithm, oh
> person who uses these terms so frequently to other people:
> All quantities are 32-bit, unsigned. + is addition mod 2^32, ^ is XOR
> 
> It's an 8-round Feistel network where f(a,b) is
> (a+b)^(a*b)
> 
How is the * operation defined? A problem could be that -- assuming
the usual multiplication mod 2^32 -- a*0 = 0*b = 0.

> The round key for round "i" is:
> RK_i = (K[0] + i*0x12345678) + (K[1] + i*0x87654321)
> 
So RK_i is (K[0]+K[1]+i*0x99999999), assuming that * here means
repeated addition, i.e. i*x is x added to itself i times.
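
A small sketch making the collapse explicit (in C, assuming the usual
unsigned wraparound to emulate mod 2^32):

/* The two per-round constants collapse into one, so every round key
 * is K[0]+K[1]+i*0x99999999 -- the schedule depends only on the sum
 * K[0]+K[1], never on K[0] and K[1] separately. */
#include <stdio.h>

int main(void)
{
    unsigned long K0 = 0xDEADBEEFUL, K1 = 0x01234567UL;  /* arbitrary key */
    unsigned long i, rk, alt;

    for (i = 0; i < 8; i++) {
        rk  = ((K0 + i*0x12345678UL) + (K1 + i*0x87654321UL)) & 0xFFFFFFFFUL;
        alt = (K0 + K1 + i*0x99999999UL) & 0xFFFFFFFFUL;
        printf("round %lu: %08lX == %08lX\n", i, rk, alt);
    }
    return 0;
}

In other words, the effective key space of the schedule is at most 2^32.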

Ciao,
Ulrich
-- 
Ulrich Kuehn ------------------ [EMAIL PROTECTED]
                    [EMAIL PROTECTED]
        http://wwwmath.uni-muenster.de/~kuehn/



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: Crypto export limits ruled unconstitutional
Date: Wed, 12 May 1999 14:24:29 +0200

Mike Eisler wrote:
> 
> The purpose of export control regulations is to limit
> proliferation of the use of crypto software.

But France had even banned domestic use of strong crypto, and later
found it wasn't particularly clever to do that. I personally can't
think of a genuine reason for the bureaucrats to attempt the
(impossible) task of suppressing the development of crypto, other than
to create some items of work to justify their own payrolls. The science
of cryptology is not the monopoly of a few people or a few nations.
Control cannot work. If you have good ideas, you can just as well
develop your crypto software out in the wild.

> 
> Compiling large software packages into binary form is
> straightforward for a skilled programmer, but time
> consuming at best, and non-trivial at worst. The problem is
> that the source for a computer program is rarely in one monolithic
> file, and so compiling each source file and combining them into a
> single binary takes work. Certainly, programs like the UNIX "make"
> command make this easier, but in any case there is a lot of work to
> go from source code to a shrink-wrap binary package on
> a CD or a download.

The kernel, where the crypto algorithm in the proper sense resides,
is in any case small, and it is this kernel that regulation efforts
target. Software packages in general tend to be large, often because
of the provision of good user interfaces. For a commercial product one
certainly has to invest a sizable amount of resources. But at least
for a 'lean' version that is not required to have optimal speed or
user comfort, implementing the known crypto algorithms can't be a very
big job. There are plenty of much more complex jobs in software
engineering that get done all over the world.

M. K. Shen

------------------------------

From: "Jonas Krantz" <[EMAIL PROTECTED]>
Subject: Searching for algorithm overview
Date: Wed, 12 May 1999 15:48:29 +0200

Hello
I am a student at the Royal Institute of Technology in Stockholm, and I am
examining possible ways to secure a wireless network with internet access
from an ISP's point of view. I have been looking at some IPsec solutions
which use different kinds of algorithms for encryption and authentication.
I have looked at many different algorithms, but I find it hard to evaluate
them. What are their strengths, and what are their weaknesses? Which ones
are better, and which ones have been cracked?
Does anyone know if there is an overview of the different algorithms on
the net, where they are compared with each other?

Thanks

Jonas Krantz
[EMAIL PROTECTED]




------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Fri, 07 May 1999 06:37:21 GMT

"R. Knauer" wrote:
> On Thu, 06 May 1999 04:14:20 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
> wrote:
> >       3)  We observe the actual output and find that it does
> >           not have those statistical characteristics.
> You did not take enough samples to make that determination. Claiming
> that you took 20,000 single bit samples is fallacious.

No, it isn't!  You missed the point of "proof by contradiction".
During the course of such a proof, one is *entitled to take the
initial assumption for granted*, even though the net result of
the proof will eventually be to disprove the assumption.  That
assumption in this case was that the TRNG instance is producing
output that *does* have those statistical-independence properties.
So for purposes of the test, each of the 20,000 bits can indeed
be treated as an independent sample.  If the outcome of the test
shows that they weren't independent samples, *that does not
invalidate the overall argument*.  That's the beauty of proof by
contradiction.
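
To put numbers on it: under that assumption, the ones-count S in
n = 20,000 independent unbiased bits has

    E[S]   = n/2 = 10,000
    Var[S] = n/4 =  5,000,   so   sd(S) = sqrt(5000) =~ 70.7

A count even a few hundred away from 10,000 is thus many standard
deviations out -- astronomically unlikely *if* the assumption holds.
That improbability is the contradiction the test exploits.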

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Shamir's TWINKLE Factoring Machine
Date: Wed, 12 May 1999 13:44:06 GMT

In article <7gqg42$klk$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> [EMAIL PROTECTED] (Bruce Schneier) writes:
> >The important thing to note is that this new machine does not affect
> >the matrix step at all.  And this step explodes in complexity for
> >large factoring problems; its complexity grows much faster than the
> >complexity of the sieving step.  And it's not just the time, it's the
> >memory requirements.  With a 1024-bit number, for example, the matrix
>step requires something like ten terabytes of memory: not off-line
> >storage, but real computer memory.  No one has a clue how to solve
> >that kind of problem.

For factoring this is quite right.  For discrete-logs it will be
3 orders of magnitude larger,  i.e.  10 PetaBytes.

>
> Uh, give it time.
>
> Right now I admin some 8 gig RAM machines.  It's been maybe 15
> years since 8 meg machines were this expensive (okay, I was 14 in
> 1985, sue me if I'm off by a few years).  So presumably in 10-15
> years we'll have 8 terabytes of RAM.

Only if there is a need for machines with that much RAM, but I agree
it will probably be technically feasible.

However, everyone still ignores the fact that one must SOLVE this
matrix.  Suppose we had the world's fastest CRAY today, with enough
memory.  It would still take "about" 60 million WEEKS to solve the
matrix.  And we do not as yet have techniques to solve such matrices
in parallel.  Processor-to-processor latency will be a MAJOR
bottleneck even if bandwidth is sufficient.  And if one looks at a
shared-memory multiprocessor instead of message passing, then
bus-contention becomes a major problem.


>
> And I'm sure the NSA could build a machine with 8 terabytes of RAM
> right now.  Certainly within a few years.

Dynamic RAM would be too slow. One needs 8-10 Tbytes of fast static
RAM (i.e. 1-2 nsec register<->memory latencies).  Is there even that
much fast static RAM in existence today?  And yes, it is probably
technically feasible. But what would it cost?

>  And would it be theoretically
> possible to tune the algorithm so that it could be cache- and
> swap-friendly?

The block Lanczos algorithm can be tuned to be cache friendly. But for
a matrix of this size we are talking about a very big cache. I'd have
to do the arithmetic to estimate how big.  Certainly in the Gbyte range.


>
> >This technique works just as well for discrete-logarithm public-key
> >algorithms (Diffie-Hellman, ElGamal, DSA, etc.) as it does for RSA.
> >(Although it is worth noting that the matrix problem is harder for
> >discrete-log problems than it is for factoring.)

MUCH harder.  Instead of working mod 2,  one must solve the matrix
modulo the order of the group or field.  Furthermore, storage goes
way up.  Instead of each matrix element being a single bit, it is now
the size of the field.  (log_2 p  for GF(p))
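
To put rough numbers on that (my own back-of-the-envelope, using the
figures above): mod 2 each nonzero entry costs 1 bit, while mod p it
costs about log_2 p bits, so for a 1024-bit p the same sparse matrix
needs roughly 1024 times the storage:

    10 Tbytes  x  ~1000  =  10 Pbytes

which is where the "3 orders of magnitude" figure above comes from.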



>  The technique does
> >not apply to elliptic-curve-based algorithms, as we don't know how to
> >use the sieving-based algorithms to solve elliptic-curve problems.
>
> Could there be mathematical advances which make the matrix problem
> for the discrete-log problem equivalent to the factoring one?

Not for sieve based algorithms as we know them.

> Does the NSA
> know how to apply this technique to elliptic curve systems?

No one does.


--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


--== Sent via Deja.com http://www.deja.com/ ==--
---Share what you know. Learn what you don't.---

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: AES
Date: Wed, 12 May 1999 13:54:04 GMT

In article <[EMAIL PROTECTED]>,
  "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> Bruce Schneier wrote:
> > Multiple encryption is generally a good idea.  The only reason you
> > don't see it widely used in practice is that using N ciphers cuts
> > the performance by a factor of N (more or less).
>
> Without necessarily gaining proportionally in security.
> If you have a good algorithm, it would seem to be better
> to put the extra cycles into using that algorithm with a
> longer key (for example).

I understand that adding more rounds to a cipher increases its
resistance against known attacks exponentially. Combining ciphers
with different types of rounds should at the very least increase
resistance against known attacks exponentially too. In fact, a
theoretical attack against one cipher may not work at all against
the other, and therefore not against the combination of the two. So I
think that multiple encryption increases security more than
proportionally.

Now, instead of increasing the "depth" of a cipher you propose to
increase its "width", i.e. the key size. I wonder how
cost-effective this is. Certainly it exponentially increases
resistance to exhaustive key search but this is the most primitive
type of attack. It seems to me that if the cipher has some kind of
structural flaw then increasing its key size may not significantly
increase its security. Basically all ciphers can be described as a
set of Boolean equations with as many unknowns as there are bits
in the key. The difficulty of solving sets of equations depends
more on the type of equations than on the number of unknowns.
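
As a toy sketch of what I mean by combining ciphers (the two "ciphers"
below are throwaway illustrative rounds I made up, not real designs;
only the composition matters):

/* Superencryption sketch: compose two structurally different toy
 * 32-bit ciphers under independent keys, c2(c1(m, k1), k2).  Both toy
 * rounds are invertible (rotate/XOR and rotate/add), so decryption
 * exists; an attack tailored to c1's structure need not carry over
 * to the cascade. */
#include <stdio.h>

#define MASK 0xFFFFFFFFUL
#define ROTL(x,r) ((((x) << (r)) | ((x) >> (32 - (r)))) & MASK)

unsigned long c1(unsigned long m, unsigned long k)  /* XOR-style rounds */
{
    int i;
    for (i = 0; i < 8; i++)
        m = (ROTL(m, 5) ^ (k + (unsigned long)i)) & MASK;
    return m;
}

unsigned long c2(unsigned long m, unsigned long k)  /* additive rounds */
{
    int i, r = (int)(k % 31) + 1;      /* key-dependent rotation amount */
    for (i = 0; i < 8; i++)
        m = (ROTL(m, r) + k) & MASK;
    return m;
}

int main(void)
{
    unsigned long k1 = 0xA5A5A5A5UL, k2 = 0x3C3C3C3CUL; /* independent keys */
    printf("cascade: %08lX\n", c2(c1(0x00001234UL, k1), k2));
    return 0;
}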


--== Sent via Deja.com http://www.deja.com/ ==--
---Share what you know. Learn what you don't.---

------------------------------

From: "Steve K" <[EMAIL PROTECTED]>
Subject: Newbie:  Encrypting short strings
Date: Wed, 12 May 1999 10:57:23 -0400

Hey all,

I am looking for an algorithm that will be suitable for encrypting (and
decrypting) short strings around 3 to 10 characters in length.  I would like
the encryption function to produce a string of alphanumerics of maybe 20
characters, with the requirement that similar inputs produce wildly
different outputs (for example, an input of 34532 would produce a
ciphertext very different from that of 34533).

The application here is to obfuscate database ID numbers that will be
exposed on a web page.  I am trying to prevent users from tweaking these
ID numbers by changing a single digit in order to get access to records
they shouldn't see.
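
To make the requirement concrete, here is a toy sketch (in C) of the
sort of thing I have in mind -- a keyed 4-round Feistel permutation of
the 32-bit ID, printed in hex.  The round function and keys below are
placeholders of my own, not a vetted design:

/* Toy sketch: keyed Feistel permutation of a 32-bit ID.  Because the
 * construction is a permutation, distinct IDs map to distinct outputs
 * and the mapping can be inverted server-side; adjacent IDs should
 * come out looking unrelated.  NOT a vetted cipher. */
#include <stdio.h>

static const unsigned long secret[4] = {  /* server-side secret round keys */
    0x9E3779B9UL, 0x7F4A7C15UL, 0xF39CC060UL, 0x85EBCA6BUL
};

unsigned long obfuscate(unsigned long id)
{
    unsigned long l = (id >> 16) & 0xFFFF, r = id & 0xFFFF, f, t;
    int i;

    for (i = 0; i < 4; i++) {
        f = ((r * 0x9E37UL + secret[i]) >> 7) & 0xFFFF; /* round function */
        t = l ^ f;                  /* standard Feistel swap: invertible */
        l = r;
        r = t;
    }
    return (l << 16) | r;
}

int main(void)
{
    printf("%08lX\n", obfuscate(34532UL));  /* adjacent inputs ... */
    printf("%08lX\n", obfuscate(34533UL));  /* ... unrelated outputs */
    return 0;
}

(Decryption runs the rounds in reverse order; whether something this
small is strong enough is exactly my question.)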

Any help would be appreciated.  I have the "Applied Cryptography" book, so
if there are any relevant chapters in the book, please let me know.

Thanks!
-steve




------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
