Cryptography-Digest Digest #653, Volume #13       Wed, 7 Feb 01 20:13:01 EST

Contents:
  Re: Phillo's alg is faster than index calculus ([EMAIL PROTECTED])
  Re: Some basic questions (Terry Ritter)
  Re: relative key strength private vs public key ("Joseph Ashwood")
  Re: Encrypting Predictable Files [now on AONTs] ("Joseph Ashwood")
  Re: CipherText: modification // correction ("Prichard, Chuck")
  Re: Some basic questions ("Joseph Ashwood")
  Re: Encrypting Predictable Files ("Douglas A. Gwyn")
  Re: CipherText patent still pending ("Douglas A. Gwyn")
  Re: On combining permutations and substitutions in encryption ("Douglas A. Gwyn")
  Re: Redundancy in algorithms ("Douglas A. Gwyn")
  Re: RSA, discrete log Both not secure... ("Douglas A. Gwyn")
  Re: Different cipher type ("Douglas A. Gwyn")
  Re: Phillo's alg is faster than index calculus (Tom St Denis)
  Re: CipherText: modification // correction (Tom St Denis)
  Re: Phillo's alg is faster than index calculus ("Douglas A. Gwyn")
  Re: relative key strength private vs public key ("Douglas A. Gwyn")
  Re: Bleichenbacher finds bug in DSA RNG (Wei Dai)
  Re: CipherText: modification // correction ("Prichard, Chuck")
  Zipf's law (Help) ("Kurt Fleißig")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Phillo's alg is faster than index calculus
Date: Wed, 07 Feb 2001 23:01:52 GMT

All my posts are a gift to the world. I don't benefit from them.

So I'll let researchers and paid scientists meticulously work out the
details.





Sent via Deja.com
http://www.deja.com/

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Some basic questions
Date: Wed, 07 Feb 2001 23:19:10 GMT


On Wed, 07 Feb 2001 20:16:32 GMT, in
<A2ig6.389469$[EMAIL PROTECTED]>, in sci.crypt
"Geoff Blair" <[EMAIL PROTECTED]> wrote:

>OK. I'm a newbie. With that said, I will ask my questions and run for cover
>in case I get flamed for asking them in such a respected newsgroup.
>
>    1. Can anyone point me to some source material about the very basics of
>cryptography?

   http://www.io.com/~ritter/LEARNING.HTM

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: relative key strength private vs public key
Date: Wed, 7 Feb 2001 14:58:48 -0800

Well, you've already had some of the interesting responses that you will get
from this.

Basically there are two common theories on this; for the sake of naming (and
habit), call them the Johnson and Silverman methods. Each of these methods
has its uses, and each has its problems.

The Johnson method takes into account only the time required; I call it the
Johnson method because the individual best known around here for pushing it
is Don Johnson. It works very nicely under the assumption that the primary
advances in the associated algorithms are more likely to reduce their memory
requirements than to replace the algorithms themselves. You can see the
basic results of this examination in the post by Don Johnson in this thread.

The basic list for the Johnson method is
(http://www.certicom.com/research/download/ECCFut.zip), all sizes in bits:

Symmetric | RSA n | DSA p | DSA q | ECC n
       56 |   512 |   512 |   112 |   112
       80 |  1024 |  1024 |   160 |   161
      112 |  2048 |  2048 |   224 |   224
      128 |  3072 |  3072 |   256 |   256
      192 |  7680 |  7680 |   384 |   384
      256 | 15360 | 15360 |   512 |   512


The Silverman method is named after its primary proponent, Bob Silverman
(I'm fairly sure that, given time, at least one response from him will
appear in this thread). This method captures the state of the art in a
monetary comparison. It works by setting a price per unit of performance and
a price per megabyte of RAM, the two commodities of interest to these
algorithms. From this the price to break a public key in a given time is
approximated, and prices are then compared. This method is useful if your
basic assumption is that memory requirements will not fall faster than the
price/performance ratio.

The basic table for the Silverman method is
(http://www.rsasecurity.com/rsalabs/bulletins/bulletin13.html), in bits:

Symmetric | ECC | RSA
       56 | 112 |  430
       80 | 160 |  760
       96 | 192 | 1020
      128 | 256 | 1620

As you can see, at the low end of the scale the values are for most purposes
identical (430 vs 512, a basically worthless difference). It is only when
you get to the high-security mechanisms that the difference becomes very
noticeable: 1620 vs 3072. What you need to decide is which of these models
best suits your needs, or whether perhaps neither of them is really
suitable. For example, I know my PGP key only needs to be 1024 bits for my
purposes; however, as a matter of future-proofing, and because I can afford
the extra time, I use a 4096-bit key. That is a usability consideration: I
can use it, there's no reason not to, and there are potential reasons to.
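
In case anyone wants to play with the two tables, here is a quick Perl
sketch (the hash names are mine; the numbers are just the tables above
transcribed):

my %johnson   = (56 => 512, 80 => 1024, 112 => 2048,
                 128 => 3072, 192 => 7680, 256 => 15360);
my %silverman = (56 => 430, 80 => 760, 96 => 1020, 128 => 1620);
for my $sym (sort { $a <=> $b } keys %johnson) {
    # RSA sizes each method considers equivalent to a $sym-bit symmetric key
    my $s = exists $silverman{$sym} ? $silverman{$sym} : "n/a";
    printf "%3d-bit symmetric: RSA %5d (Johnson) vs %s (Silverman)\n",
           $sym, $johnson{$sym}, $s;
}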
                            Joe




------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Encrypting Predictable Files [now on AONTs]
Date: Wed, 7 Feb 2001 15:12:58 -0800

I know I said I wouldn't respond; however, the topic has changed
substantially enough that I think something more can, and should, be said
by someone.

<[EMAIL PROTECTED]> wrote in message news:95sdqg$5jr$[EMAIL PROTECTED]...
>   Benjamin Goldberg <[EMAIL PROTECTED]> wrote:
[both talked about AONT in more detail than I went into]

I think both definitions are perfectly acceptable for defining an AONT,
because the name by itself is underspecified. There are deterministic AONT
algorithms (things like DS's concept of F(F'(F(x))) = F(x) regardless of how
many times it is run), and there are non-deterministic AONT algorithms, of
which OAEP is a very good example. Each has its uses, just as deterministic
and non-deterministic encryption algorithms each have their place. In
general, non-deterministic algorithms will be more secure, if for no other
reason than that there is more entropy in the design. It is also worth
noting that any deterministic AONT can be made into a non-deterministic AONT
by padding the pre-transform data with random bits. Arguing over which is
"the" AONT is worthless; it might be more helpful for everyone involved to
refer to them as DAONT or NDAONT, keeping AONT for the whole category. I
think this solution would allow the various degrees of specificity that
everyone is discussing.
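
For concreteness, here is a toy Perl sketch of Rivest's package transform,
the classic NDAONT (SHA-256 stands in for the block cipher, and rand()
stands in for a real RNG, so this is an illustration, not production code):

use Digest::SHA qw(sha256);

sub aont_package {
    my ($msg) = @_;
    my $K   = pack("C32", map { int rand 256 } 1 .. 32);  # random inner key
    my @m   = unpack("(a32)*", $msg);                     # 32-byte blocks
    my $out = "";
    my $acc = "\0" x 32;
    for my $i (0 .. $#m) {
        # mask each block with a PRF of the inner key and the block index
        my $pad = substr(sha256($K . pack("N", $i)), 0, length $m[$i]);
        my $c   = $m[$i] ^ $pad;
        $acc   ^= sha256($c . pack("N", $i));
        $out   .= $c;
    }
    # the final block hides K under a digest of every ciphertext block,
    # so nothing can be unmasked until all the blocks are present
    return $out . ($K ^ $acc);
}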
                        Joe





------------------------------

From: "Prichard, Chuck" <[EMAIL PROTECTED]>
Subject: Re: CipherText: modification // correction
Date: Wed, 07 Feb 2001 23:28:15 GMT

Actually, the intended new line should read:

$self->{S_KEY} .= $self->{MASK}[ord($key) - 31] + 31;

and NOT:

$self->{S_KEY} .= $self->{MASK}[ord($key) - 31];

as previously stated.

Sorry.

Charles Prichard (amateur cryptologist)
http://greentv.crosswalkcentral.com -PWS
http://members.nbci.com/chuck_prichard/



------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Some basic questions
Date: Wed, 7 Feb 2001 15:27:44 -0800

>     1. Can anyone point me to some source material about the very basics
> of cryptography?

That depends on what kind of basics you want. Some people only want to know
what the algorithms are, others want to know all the attacks, some want to
learn how to attack, etc. However, two good starting places are Applied
Cryptography (Schneier) and the Handbook of Applied Cryptography (Menezes
et al., http://www.cacr.math.uwaterloo.ca/hac/).

>     2. I have been programming using several languages for years now and I
> was wondering about a code word I keep hearing about but have never seen
> explained. What is XOR and what does it do?

It is eXclusive-OR, an operation that takes two bits as input and outputs
one bit: the output is 1 if exactly one of the two input bits is 1, and 0
otherwise.
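
A quick Perl illustration (the strings are arbitrary), showing both the
truth table and the property that makes XOR so popular in ciphers: applying
the same key twice recovers the plaintext.

# truth table
printf "%d XOR %d = %d\n", $_->[0], $_->[1], $_->[0] ^ $_->[1]
    for [0,0], [0,1], [1,0], [1,1];

# the stream-cipher use: C = P xor K, and C xor K = P again
my ($p, $k) = ("ATTACK", "XYZZYX");
my $c = $p ^ $k;              # bitwise XOR of the two byte strings
print(($c ^ $k), "\n");       # prints "ATTACK"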

>     3. What is the average key length for the best security systems online
> right now?

That is a constant debate, not because figuring out the average key length
is hard, but because figuring out the best security systems is hard.
However, a good place to start would be:

Public-key algorithms
RSA 1024+ bits
DH/DSA 1024+ bits
ECC DH/DSA 160+ bits

Symmetric algorithms
3DES 112/168 bits
Rijndael 128/192/256 bits
ARC4 128+ bits

If you stick to those you shouldn't have any problems with cryptographic
security. Some companies, though, will require things above and beyond
this; for example, many banks require the option of 2048-bit public keys
(quite a waste for ECC, but that's a different matter).
                        Joe



------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Encrypting Predictable Files
Date: Wed, 7 Feb 2001 22:38:02 GMT

Matt Timmermans wrote:
> If you have ciphertext, but _no_ knowledge of the plaintext, then attack is
> impossible if the transformation from PT to CT is bijective.  All possible
> plaintexts are plausible, and so you have no way to check any guess at the
> key.

Well, that's a good tack to take, but "bijective" is misleading.
If the key has M bits and the ciphertext and plaintext have N > M
bits, as is normally the case, then not all 2^N plaintexts are
possible; only 2^M of them are.  Thus knowledge of the general
system itself contributes useful information about the plaintext.
It's a "partial crack" that might or might not be useful, depending
on the application one has for the partially recovered plaintext
(in this case a subset of order 2^M of the 2^N-sized domain, with
every element of the subset having likelihood 2^-M and the other
2^N - 2^M plaintexts having likelihood 0).  Perhaps another way of
putting it is that true bijectiveness requires key size equal to
message size.
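
A toy numeric illustration in Perl (my example, not part of the argument
above): a 2-bit key XORed, repeated, onto a 4-bit message.  Given one
ciphertext, only 2^M = 4 of the 2^N = 16 plaintexts survive:

my ($M, $N) = (2, 4);
my $c = 0b1011;                    # some observed ciphertext
my %possible;
for my $k (0 .. (1 << $M) - 1) {
    my $stream = ($k << 2) | $k;   # repeat the 2-bit key to fill 4 bits
    $possible{ $c ^ $stream } = 1; # decrypt under this key guess
}
printf "%d of %d plaintexts remain possible\n",
       scalar(keys %possible), 1 << $N;      # prints "4 of 16"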

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: CipherText patent still pending
Date: Wed, 7 Feb 2001 22:52:43 GMT

wtshaw wrote:
> In article <95loff$9sn$[EMAIL PROTECTED]>, Bryan Olson
> <[EMAIL PROTECTED]> wrote:
> > Terry Ritter wrote:
> > > Generating new ciphering structures is *important* in the sense that
> > > everyone needs to understand just how little is known of this area,
> > > and the extent to which what we think we know, we don't.
> > Only analysis of ciphers does that.   Though we do need
> > ciphers in order to do analysis, we're already way, way
> > overstocked.
> Which means that you feel most comfortable in a small and dependable world
> that tends not to challenge you too much?  OK, no real slam here intended,
> but anyone's frustration about the growth of information and knowledge
> should not hamper those that wish to press on.

There is a genuine real-world problem in developing *knowledge*
about the characteristics of things when new things are introduced
at a rate greater than one can keep up with.  To make rational
decisions one *must* have relevant information.  To pick the
first analogy that occurred to me: suppose that instead of careful
research and development of pharmaceuticals, the drug companies
just generated new compounds at random and made them available
for purchase.  Some of them might be just what one needs to
treat a given medical condition, but that wouldn't matter, since
there would be no way to tell those apart from the ones with
horrible consequences.

I think both Ritter and Olson are partly right; analysis
produces information about properties of classes of systems,
but Ritter was talking about information of a different kind,
essentially psycho-social awareness of the state of the
cryptologic art as such.  This is something I also harp on
occasionally, when it looks like people are forming some
notion of the degree of protection afforded by some system
without having anything remotely resembling a proof thereof.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: On combining permutations and substitutions in encryption
Date: Wed, 7 Feb 2001 23:03:50 GMT

Benjamin Goldberg wrote:
> Wow.  So if it can be proved that P=NP, then ALL ciphers which run in
> polynomial time lose most of their security against known plaintext.

No: that concerns a limit on the asymptotic complexity
as the key and message size jointly become arbitrarily large,
which is not at all the same as the practical degree of security
of any given instance against actual cryptanalysis (known PT or not).

The main "practical" implication of P<NP would be that one
could expand the parameters of one's NP-based cryptosystem as the
enemy's capabilities increase, more economically than the enemy
can keep up.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Redundancy in algorithms
Date: Wed, 7 Feb 2001 23:29:35 GMT

[EMAIL PROTECTED] wrote:
> Some algorithms are described as being non-redundant. What does that
> mean?

Impossible to tell, out of context.
An algorithm as such has no need of redundancy.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: RSA, discrete log Both not secure...
Date: Wed, 7 Feb 2001 23:26:24 GMT

[EMAIL PROTECTED] wrote:
> PS. The Filipino's alg works. I tried it myself. I guess he's probably
> not the first one to break RSA. The previous ones just aren't exposed.

Lots of algorithms "work" in the sense that they produce correct
results (eventually) and are practical for small test cases.
The real issue is to find algorithms that work *and* are fast
enough when applied to problems of the size actually used in
real cryptosystems.
I have been independently (before hearing about this fellow)
thinking about (nonmodular) factoring via the binary
representation, and I suspect that many other people also have
tackled similar problems with some such approach.
P.S. No useful result yet!

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Different cipher type
Date: Wed, 7 Feb 2001 23:31:38 GMT

Michael Brown wrote:
> I mentioned this a while ago, but never really followed up on it. The idea
> was to have a cipher where instead of each bit in a key representing a
> number (bad description, but it's hard to cover DES and RSA keys under one
> word :), it represented an instruction.

Before spending much more time in this direction, read what Knuth
had to say about his youthful attempt at a "super-random" generator
(the Art of Computer Programming, Vol. 2: Seminumerical Algorithms).

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Phillo's alg is faster than index calculus
Date: Wed, 07 Feb 2001 23:54:12 GMT

In article <95sk4q$bl8$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> All my posts are a gift to the world. I don't benefit from them.
>
> So I'll let researchers and paid scientists meticulously work out the
> details.

Don't advertise a method you're not willing to back up with proof.  That's
like me saying "there's no god" and I will let the theologians work out the
details.

Tom


Sent via Deja.com
http://www.deja.com/

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: CipherText: modification // correction
Date: Wed, 07 Feb 2001 23:55:08 GMT

In article <jSkg6.1888$[EMAIL PROTECTED]>,
  "Prichard, Chuck" <[EMAIL PROTECTED]> wrote:
> Actually the intended new line should read:
>
> $self->{S_KEY} .= $self->{MASK}[ord($key) - 31] + 31;
>
> and NOT:
>
> $self->{S_KEY} .= $self->{MASK}[ord($key) - 31] ;
>
> As stated.
>
> Sorry.

The fact that nobody cares doesn't hinder your postings.  That's good.

Tom


Sent via Deja.com
http://www.deja.com/

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Phillo's alg is faster than index calculus
Date: Wed, 7 Feb 2001 23:42:16 GMT

[EMAIL PROTECTED] wrote:
> One of the things I have in mind is doing a lot of precomputation, e.g.
> precompute 110110+11011=1010001.  So every time we see ...1110 at the end,
> we apply 1010001.

Yes, good idea.  Generally speaking, if you can precompute a bunch
of templates that can be quickly tested in what would be an inner
loop of an algorithm, you can often short-cut one or more of the
next loops.  It is for reasons like this that the exact analysis of
one particular implementation of a method does not tell the whole
story.
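
A generic Perl illustration of the precomputation idea (my own example,
unrelated to the algorithm under discussion): replacing a per-bit inner
loop with one table lookup per byte.

# build the table once: popcount(n) = popcount(n >> 1) + (low bit of n)
my @popcount = (0) x 256;
$popcount[$_] = $popcount[$_ >> 1] + ($_ & 1) for 1 .. 255;

my $buf  = "any long string of bytes";
my $ones = 0;
$ones += $popcount[$_] for unpack "C*", $buf;   # one lookup per byte
print "$ones one-bits\n";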

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: relative key strength private vs public key
Date: Wed, 7 Feb 2001 23:45:53 GMT

[EMAIL PROTECTED] wrote:
>   The facts are nobody really knows what sizes to use.

Although we do know a range of sizes *not* to use.

------------------------------

From: Wei Dai <[EMAIL PROTECTED]>
Subject: Re: Bleichenbacher finds bug in DSA RNG
Date: Wed, 7 Feb 2001 16:20:57 -0800

In article <[EMAIL PROTECTED]>,
djohn37050@aol.com says...
> It is also interesting that the "chink" that the attack exploits found in DSA
> as written up in FIPS 186 or X9.30 does NOT apply to ECDSA as written up in
> X9.62.

Still, everyone who implements ECDSA or any other ElGamal-type
signature scheme should double-check that the per-signature secret
exponent k is selected uniformly between 0 and q. There's an
interesting discussion about this on the coderpunks mailing list, where
it turns out that most crypto libraries have the problem even though
many of them don't use the exact method described in FIPS 186. So far,
it seems that OpenSSL, CryptoLib, CryptLib, and PureTLS are affected,
while Crypto++ and GnuPG are not.
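
A sketch of one safe approach, rejection sampling, instead of reducing a
fixed-width random value mod q (toy Perl with native integers and rand()
only; a real q is 160 bits, so an actual implementation needs a bignum
package and a cryptographic RNG):

sub random_k {
    my ($q) = @_;
    my $bits = length sprintf "%b", $q;   # bit length of q
    while (1) {
        my $k = 0;
        $k = ($k << 1) | int rand 2 for 1 .. $bits;
        # retry out-of-range draws rather than reducing mod q,
        # which would bias the low end of the range
        return $k if $k >= 1 && $k < $q;
    }
}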

-- 
http://cryptopp.com - free C++ cryptography and compression library

------------------------------

From: "Prichard, Chuck" <[EMAIL PROTECTED]>
Subject: Re: CipherText: modification // correction
Date: Thu, 08 Feb 2001 00:35:22 GMT

The previous mistake prompts an investigation into whether the key domain
can be expanded from 32 to 64 values, using the shifted second cipher.

This would double the key domain, improving message integrity without
affecting the suitability of the algorithm's output for 7-bit transmission.

When this has been properly implemented and verified, new information will
be posted to properly reflect the relative strength of CipherText,
comparing its key strength with that of other algorithms.

64^8  = 2.815e14 (combinations using 8 values from a domain of 64)

64^12 = 4.722e21 (12 values)

64^16 = 7.923e28 (16 values)
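
In Perl, a one-liner confirms the arithmetic:

printf "64^%d = %.3e\n", $_, 64 ** $_ for 8, 12, 16;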

This will be a subtle but significant change in the implementation that
greatly improves the algorithm's integrity.

 Charles Prichard (amateur cryptologist)
http://greentv.crosswalkcentral.com -PWS
http://members.nbci.com/chuck_prichard/




------------------------------

From: "Kurt Flei�ig" <[EMAIL PROTECTED]>
Subject: Zipf's law (Help)
Date: Thu, 08 Feb 2001 00:46:48 GMT

Sorry,

I know only the law's formula, but I read that it was analyzed by B.
Mandelbrot and C. H. Muller, and that it is connected with Shannon's
probable-word theory. If anybody knows the exact commentary and analysis
sources, please post the references to my e-mail box.
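
For reference, the Zipf-Mandelbrot form of the law gives the frequency of
the rank-r word as f(r) = C / (r + b)^a; a quick Perl sketch with purely
illustrative constants:

my ($C, $shift, $exp) = (0.1, 2.7, 1.0);   # C, b, a: illustrative, not fitted
printf "rank %2d: predicted frequency %.4f\n", $_, $C / ($_ + $shift) ** $exp
    for 1 .. 10;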
Thanks in advance.
KF






------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
