Cryptography-Digest Digest #502, Volume #13      Fri, 19 Jan 01 20:13:00 EST

Contents:
  Re: Dynamic Transposition Revisited (long) (Terry Ritter)
  crypto cracking screen savers ([EMAIL PROTECTED])
  Re: Dynamic Transposition Revisited (long) (Terry Ritter)
  Re: Kooks (was: NSA and Linux Security) (digiboy | marcus)
  Re: Comparison of ECDLP vs. DLP (Roger Schlafly)
  Question about security of Oracle get_hash_value (psyclops)
  Re: 32 bit message digest ("Joseph Ashwood")
  Re: Comparison of ECDLP vs. DLP (David Wagner)
  Re: RE-X-POST Quick-and-Dirty demonstration of twarting US6006328 ("Kasper Pedersen")
  Re: Comparison of ECDLP vs. DLP (Splaat23)
  Re: Question about security of Oracle get_hash_value ("Joseph Ashwood")
  Re: Kooks (was: NSA and Linux Security) (Greggy)
  Re: Kooks (was: NSA and Linux Security) (Greggy)
  Re: Comparison of ECDLP vs. DLP (Roger Schlafly)
  Re: Dynamic Transposition Revisited (long) (John Savard)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Dynamic Transposition Revisited (long)
Date: Fri, 19 Jan 2001 22:05:48 GMT


On Fri, 19 Jan 2001 21:53:33 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt Benjamin Goldberg
<[EMAIL PROTECTED]> wrote:

>Mok-Kong Shen wrote:
>> 
>> Benjamin Goldberg wrote:
>> >
>> > John A. Malley wrote:
>> 
>> > > May I paraphrase the description of block balancing to make sure I
>> > > understand the mechanism as envisioned? And please, correct me if
>> > > I got this wrong.
>> > >
>> > > Given plaintext P,
>> > >
>> > > 1) divvy P into bytes as P[1], P[2], P[3] ... P[n],
>> > >
>> > > 2) build up (one at a time) blocks of size k bytes,  B[1], B[2],
>> > > B[3] ... B[m]  where m < n, and sequential plaintext bytes are
>> > > assigned to a given block B[i] where B[i] is the concatenation of
>> > > a few plaintext bytes, followed by a special byte that has 0s and
>> > > 1s in it, followed by bytes of all zeros or all ones - like
>> > >
>> > >  P[1] | P[2] | ... | P[L] | a block of 1s and 0s | 00000000 |
>> > > 11111111 | ... 00000000 = B[i]
>> > >
>> > > or
>> > >
>> > >  P[1] | P[2] | ... | P[L] | a block of 1s and 0s | "0" | "255" |
>> > > ... "0"  = B[i]
>> > >
>> > > Is this an accurate description of the proposed bit balancing?
>> >
>> > Almost.  It's more like
>> > if( P[1..L] has more 0s than 1s ) {
>> > P[1] | P[2] | ... | P[L] | XXXXXXX | 00000000 | 00000000 = B[i]
>> > Where XXXXXXXX is some number of 1s and 0s.
>> > } else {
>> > P[1] | P[2] | ... | P[L] | XXXXXXX | 11111111 | 11111111 = B[i]
>> > Where XXXXXXXX is some number of 0s and 1s.
>> > }
>> [snip]
>> 
>> I am certainly confused. What if, say, the block size is
>> 4 bytes and one has (1) two bytes and (2) three bytes of
>> information which are all 0's? Thanks.
>
>A 4 byte block is really small.  If you have two bytes of 0x00s, then
>one byte goes into the first block, one balance byte 0x00 is added, and
>two balance bytes 0xFFFF are added.  The next 0x00 byte goes into the
>next block.  With 3 0x00s it's similar.
>
>A 5 byte block is better.  Two 0x00 bytes go in, then a 0x0F byte, then
>0xFFFF.  With three 0x00s, two bytes go into the first block, then one
>into the next.
>
>Fortunately, we won't be likely to be using 4 byte blocks, but much
>larger, perhaps 15 or 16 bytes.

When I think of Dynamic Transposition, I think of 512-byte / 4096-bit
blocks.  This stuff scales almost linearly, and 512 bytes is just a
small buffer.  

I suspect that if we run the numbers, some modest minimum size --
perhaps 128 bits / 16 bytes -- is required for serious strength.  We
can scale it down to look at, of course, but we can't expect strength
there as well.  
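For concreteness, the byte-wise balancing scheme sketched in the quoted
discussion above might look like this in Python.  This is a hypothetical
illustration only; the marker-byte layout (data, then one byte mixing 0s
and 1s, then all-ones and all-zeros filler) is an assumption taken from
the thread, not anyone's exact algorithm:

```python
def balance_block(data: bytes, block_size: int) -> bytes:
    """Pad `data` into a block_size-byte block with equal 1- and 0-bits.

    Assumed layout: data bytes, then one 'marker' byte containing both
    0s and 1s, then all-ones (0xFF) and all-zeros (0x00) filler bytes.
    """
    ones = sum(bin(b).count("1") for b in data)
    need = 4 * block_size - ones        # 1-bits the padding must supply
    pad = block_size - len(data) - 1    # filler bytes after the marker
    full = min(need // 8, pad)          # how many 0xFF filler bytes
    marker_ones = need - 8 * full       # 1..7 one-bits left for the marker
    if not 0 < marker_ones < 8:         # marker must mix 0s and 1s
        raise ValueError("put fewer data bytes in this block")
    marker = (1 << marker_ones) - 1     # e.g. 4 ones -> 0b00001111
    return data + bytes([marker]) + b"\xff" * full + b"\x00" * (pad - full)

blk = balance_block(b"AB", 8)
print(blk.hex())  # 41420fffffff0000 -- 32 one-bits, 32 zero-bits
```

The receiver strips padding by scanning back over the run of identical
bytes until it hits the marker, which is why the marker must contain
both 0s and 1s.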

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED]
Subject: crypto cracking screen savers
Date: Fri, 19 Jan 2001 22:03:46 GMT

Greetings:

I have heard about some projects for cracking crypto algorithms via
distributed processing in which they farm out bits and pieces to the
mass public in the form of a screen saver and use computer idle time to
crunch away. Are there any such projects currently running? Where would
one go to register and get a screen saver?

Thanks!


Sent via Deja.com
http://www.deja.com/

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Dynamic Transposition Revisited (long)
Date: Fri, 19 Jan 2001 22:18:40 GMT


On Fri, 19 Jan 2001 12:53:08 -0600, in
<[EMAIL PROTECTED]>, in sci.crypt Mike Rosing
<[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>> 
>>               Dynamic Transposition Revisited
>> 
>>                        Terry Ritter
>> 
>>                         2001 Jan 18
>> 
>> ABSTRACT
>> 
>> A novel approach to block ciphering is remembered for
>> relevance to modern ciphering issues of strength, analysis
>> and proof.  A rare example of a serious transposition
>> cipher, it is also an unusual example of the simultaneous
>> use of block and stream techniques.
>[...]
>
>Thanks Terry, that was nice writing.  I agree that double transposition
>is useful, otherwise one could accumulate lots of plaintext-ciphertext
>pairs and begin to attack the RNG.  Doubling the transpositions squares
>the number of possible states, so brute force is probably easier.

Thanks, but unfortunately it seems the author (that would be me) was
unable to make his point.  Maybe I can try again:


The Point

The main lesson is not about double-transposition, but balance.  If
all we do is double-shuffle, the ciphertext still contains only the
symbols present in the plaintext, and that leaks information.  By
balancing the block, we eliminate that leak.  

If we accept that a properly-hidden key of a few hundred bits is "long
enough," Dynamic Transposition should be considered stronger in
practice than a one-time-pad.  Why?  Because we generally cannot prove
that the OTP sequence is unpredictable, while Dynamic Transposition
hides predictability behind several closed doors.  

Main strength is provided by the vast number of permutations -- and
thus keys -- which produce the exact same ciphertext, thus cutting off
attack on the RNG before we get to the double-shuffling issue.  Also
note that because many keys produce the same result, in some sense,
the keying is not "efficient," which here is an advantage.  


The Number of Identical Results

The numbers scale factorially, and I normally think of having a
4096-bit block size (that is only 512 bytes, a small buffer).  But
suppose we have a tiny block of just 128 bits, so N = 128:

We have 64 1's and 64 0's, so many permutations will produce
exactly the same result value.  How many?  Well, imagine scanning
across any particular ciphertext block: the first symbol we get can
have come from the 64 different positions of that symbol in the
original ciphertext.  The next time we find that same symbol, it can
have come from any position other than that already used.  Clearly,
there are 64 * 63 * ... * 2 or (N/2)! possibilities for the 1's, and
the same number for the 0's.

(We can evaluate huge factorials on my combinatorics page:

   http://www.io.com/~ritter/JAVASCRP/PERMCOMB.HTM#Factorials

Typically I use the base-2 log value plus 1, which is the number of
bits in the result value.)

The number of different permutations which produce identical results
(with a bit-balanced block) is (N/2)! * (N/2)!.  64! is a value of 296
bits, and in log form we multiply by adding, so the result is 592 bits
long.  Exactly one permutation from among this number is correct, and
only that one could provide exact entry into the shuffling sequence.

All this assumes the opponents know both the plaintext and the
ciphertext for the block.  Given known-plaintext, there is no problem
in finding permutations which satisfy the transformation.  The problem
is that there are 2**592 of these easy-to-find alternatives, and there
is no easy test to find which is correct.  
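The counting argument above is easy to check numerically; a small
Python sketch of the (N/2)! * (N/2)! figure for the 128-bit example:

```python
import math

N = 128                                  # tiny block: 64 ones, 64 zeros
perms_ones = math.factorial(N // 2)      # ways the 1-bits can be mapped
same_result = perms_ones ** 2            # times the same count for the 0-bits

print(round(math.log2(perms_ones)))      # ~296 bits for 64!
print(round(math.log2(same_result)))     # ~592-bit space of identical results
```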

Only *after* the correct permutation is found does double-shuffling
come into play as it further hides the shuffling sequence.


Summary

The reason Dynamic Transposition is strong is that opponents cannot
identify the correct permutation from among all possible permutations
which produce the exact same ciphertext from the exact same plaintext.

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: digiboy | marcus <[EMAIL PROTECTED]>
Subject: Re: Kooks (was: NSA and Linux Security)
Date: Fri, 19 Jan 2001 22:11:00 GMT

In article <949vuu$qh4$[EMAIL PROTECTED]>,
  Greggy <[EMAIL PROTECTED]> wrote:
> What have you done?

More than you know. I have explained quite clearly on many occasions
why what you see is not necessarily what occurs.

--
[ marcus ] [ http://www.cybergoth.cjb.net ]
[ ---- http://www.ninjakitten.net/digiboy ]



------------------------------

From: Roger Schlafly <[EMAIL PROTECTED]>
Subject: Re: Comparison of ECDLP vs. DLP
Date: Fri, 19 Jan 2001 14:20:47 -0800

Splaat23 wrote:
>  I can't imagine that it's very easy to generate a key
> that looks good but is not, but I'm now very curious as to how this
> attack could be pulled off in the real world.

It is trivial to generate a key that looks good but isn't. Just
use a lousy RNG. That was a problem with Netscape RSA keys, until
it was exposed. 

> Also, do you have any references to RSA PKV verification protocols?

There are none that provide any nontrivial assurances. That's
why you haven't seen the papers.

------------------------------

From: psyclops <[EMAIL PROTECTED]>
Subject: Question about security of Oracle get_hash_value
Date: Fri, 19 Jan 2001 23:01:56 GMT

Hello,

I am somewhat new to crypto and do not have a pure mathematical
background, so please bear with me while I ask what is probably a silly
question...

I am designing a user authentication system that will run as PL/SQL
functions within Oracle 8i.  I want to store the usernames hashed in
the db so that the user list cannot be stolen by hackers.

I found an MD5 PL/SQL function @
http://ulc199.residence.gatech.edu/keith/plsql/fast_md5.txt (thanks,
Keith!) - however, it is computationally intensive and takes about a
second to perform a single hash on an Ultra 5.  As an alternative, I
looked at the Oracle built in hash function,
DBMS_UTILITY.GET_HASH_VALUE.

The DBMS_UTILITY.GET_HASH_VALUE function allows a maximum 31 bit
output, so is inherently insecure, and would probably result in
collisions.  Additionally, I cannot find any analysis of this function
anywhere (has anyone done an analysis, or does anyone know any info on
this function?).

So here is my question: if I were to split a 20 digit username into
four separate 5-digit values, hash each substring, then multiply the
four values together to get a ~124 bit result, will this provide a
higher level of security / collision-resistance than simply hashing the
full 20-digit username?

It seems to me that multiplying the hashes like this would decrease the
likelihood of collisions, as the output space is bigger.  Like I said,
I don't have a strong math background, so this is realistically just a
hunch, as I don't really have a grip on how the product of hashes
affects probabilities...

Any help & ideas gratefully received!

cheers,

nick.

____________________________________________________________
Nick Donaldson                  mailto:[EMAIL PROTECTED]
Bit Wrangler Extraordinaire!    http://www.psyclops.com/
510.531.8722 voice              510.593.5638 mobile



------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: 32 bit message digest
Date: Fri, 19 Jan 2001 15:10:24 -0800

> 32 bytes of entropy from a
> rather short phrase

Can't be done. It's one of the things about entropy, a deterministic process
cannot create entropy. What you can expect to have happen if you use
something like SHA-256 (I would recommend against your double MD5 hash) is
that the result will have a small amount of entropy diffused throughout the
32-bytes, it will not have 32-bytes worth of entropy. Sometimes this is
called apparent entropy, or other phrases for marketing purposes, but the
fact remains that a deterministic process cannot take a low entropy input
and create a high entropy output. The hand-waving version of the proof is to
define entropy as the difficulty of finding an input to a process that
duplicates the output, over all input/output pairs. Given that the process
under attack is the same process that was used to generate the output, you
must only determine the input to the process; since the input contains only
a small amount of entropy, the search can proceed against that entropy. This is not to say
that it will not resist attack, but instead means that you should recommend
to people to use strong passphrases, and the fairly well universally
recognised method to build those passphrases is DiceWare
(http://world.std.com/~reinhold/diceware.html); it's cheap (actually free if
you have at least one die), it's easy, and it's good.
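The point about low-entropy inputs can be made concrete: no matter how
strong the hash, an attacker only has to search the input space.  A
small Python sketch, assuming a 4-digit PIN as the weak passphrase:

```python
import hashlib

pin = "7294"                        # hypothetical weak 'passphrase'
target = hashlib.sha256(pin.encode()).hexdigest()

# The attacker reruns the same deterministic process over the whole
# 10**4 input space; the 256-bit output size is irrelevant.
found = next(p for p in (f"{i:04d}" for i in range(10000))
             if hashlib.sha256(p.encode()).hexdigest() == target)
print(found)  # 7294
```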
                    Joe



------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Comparison of ECDLP vs. DLP
Date: 19 Jan 2001 23:16:04 GMT
Reply-To: [EMAIL PROTECTED] (David Wagner)

Yes, but there is a very important caveat:
proof of strength is impossible to achieve, in general.
I can generate a truly wonderful key, and then publish my private key.
No test, no zero-knowledge proof, no use of elliptic curve cryptography,
is ever going to detect this attack.

------------------------------

From: "Kasper Pedersen" <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp,comp.security.pgp.discuss,alt.binaries.cracks
Subject: Re: RE-X-POST Quick-and-Dirty demonstration of twarting US6006328
Date: Fri, 19 Jan 2001 23:33:19 +0100


"Thomas J. Boschloo" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
> You do know 386 code and you have nice debuggers, don't you? I always
> thought very highly of you with your PGP Disk patches, chosen keyID
> utilities and things like that. Maybe you could rip the 386 MS-DOS code
> of Screen Thief <http://www.villa.nildram.co.uk/download.htm>. It
> somehow redirects IRQ 1 (the keyboard interrupt signal line) to another
> interrupt and makes it call the interrupt 9 Chris traps in his code (by
> peeking into the interrupt table at offset 0:24 length 4). Chris would
> be too stupid to notice the difference and he has no knowledge of how to
> thwart this. This would be the ultimate keylogger that *works under DOS*.
>
Off topic to s.c, but..

The way to redirect irq0..7 is to reprogram the first PIC to another
position (PIC1 is default at 08h), say 78h. Then you can perform snooping no
matter what. Also, ICW2 is readonly, so it's hard to detect. I used it in a
game trainer circa 1989.

/Kasper



------------------------------

From: Splaat23 <[EMAIL PROTECTED]>
Subject: Re: Comparison of ECDLP vs. DLP
Date: Fri, 19 Jan 2001 23:17:51 GMT

I meant an attack specific to RSA. Not even the DH PKV will show that a
private key was generated with a lousy RNG.

In article <[EMAIL PROTECTED]>,
  Roger Schlafly <[EMAIL PROTECTED]> wrote:
> Splaat23 wrote:
> >  I can't imagine that it's very easy to generate a key
> > that looks good but is not, but I'm now very curious as to how this
> > attack could be pulled off in the real world.
>
> It is trivial to generate a key that looks good but isn't. Just
> use a lousy RNG. That was a problem with Netscape RSA keys, until
> it was exposed.
>
> > Also, do you have any references to RSA PKV verification protocols?
>
> There are none that provide any nontrivial assurances. That's
> why you haven't seen the papers.
>



------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Question about security of Oracle get_hash_value
Date: Fri, 19 Jan 2001 15:21:45 -0800


"psyclops" <[EMAIL PROTECTED]> wrote in message
news:94ah10$at9$[EMAIL PROTECTED]...
[snip superfluous information]
> The DBMS_UTILITY.GET_HASH_VALUE function allows a maximum 31 bit
> output, so is inherently insecure, and would probably result in
> collisions.  Additionally, I cannot find any analysis of this function
> anywhere (has anyone done an analysis, or does anyone know any info on
> this function?).
>
> So here is my question: if I were to split a 20 digit username into
> four separate 5-digit values, hash each substring, then multiply the
> four values together to get a ~124 bit result, will this provide a
> higher level of security / collision-resistance than simply hashing the
> full 20-digit username?

> It seems to me that multiplying the hashes like this would decrease the
> likelihood of collisions, as the output space is bigger.  Like I said,
> I don't have a strong math background, so this is realistically just a
> hunch, as I don't really have a grip on how the product of hashes
> affects probabilities...

The problem is that you will be storing it as separate 31-bit values, and
each of these values can be attacked separately, so your security will
actually be log2(2^31 * 4), or 33 bits. What I would recommend instead is to
find a fast version of MD5 or, better, SHA-1, SHA-256, SHA-384, SHA-512,
Tiger-192, etc., and store the user's passphrase hash in the database. SHA-1
is exceptionally fast on fairly well everything, and MD5 is even faster
(http://www.eskimo.com/~weidai/benchmarks.html for details). I think a big
part of the problem is that you are trying to use SQL to perform
computation; since this will be accessed via compiled code anyway, I'd
recommend doing the hash in a programming language instead of a data access
language.
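For illustration only (the original setting is PL/SQL, and the function
name here is hypothetical), hashing the full value once in Python's
hashlib looks like this; collision resistance then comes from the whole
160-bit output rather than from stitching together 31-bit pieces:

```python
import hashlib

def username_digest(username: str) -> str:
    # One hash over the whole value, stored e.g. as a 40-char hex string.
    return hashlib.sha1(username.encode("utf-8")).hexdigest()

print(username_digest("abc"))  # a9993e364706816aba3e25717850c26c9cd0d89d
```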
                        Joe




------------------------------

From: Greggy <[EMAIL PROTECTED]>
Subject: Re: Kooks (was: NSA and Linux Security)
Date: Sat, 20 Jan 2001 00:11:53 GMT

In article <[EMAIL PROTECTED]>,
  Benjamin Goldberg <[EMAIL PROTECTED]> wrote:
> tennish wrote:
> And indeed, they were inexcusably sloppy in publishing it, but:
>
>         In the late eighteenth and early nineteenth centuries,
>         there was frequent confusion about whether proposed
>         amendments had become part of the Constitution. "At that time
>         no legal procedure existed to control the communication of
>         action by States to the Federal Government. . . . Uncertainty
>         as to the status of [TONA] continued for eight years." The
>         Eleventh Amendment became effective on February 7, 1795, but
>         was not acknowledged by President John Adams as being in
>         effect until January 8, 1798. Similarly, President Thomas
>         Jefferson's Secretary of State, James Madison, did not declare
>         the Twelfth Amendment in effect until more than three months
>         after it became part of the Constitution. Even in 1845, the
>         editors of United States Statutes at Large were unsure exactly
>         when the Eleventh and Twelfth Amendments had been ratified.
>
> --thirdamendment.com
>
> Greggy wrote:
> [snip]
> > Entire legislatures published the 13th amendment without people
> > within those bodies crying foul.
> >
> > To assume that they were all fools is beyond any credible argument
> > you can put forth, but apparently that is what you would have us
> > believe.
>
> Not necessarily fools, but misinformed.

Is this a joke?  Read your history of those men.


>
> > To imagine that each of these legislatures were conspiring to place
> > into law an improperly ratified amendment is also incredible.
>
> Who says that they *knew* that it was improperly ratified?
>
> > You would have us believe that those who knew the truth intimately
> > would have stood by and said nothing - and many were present when
> > these votes were taking place.
>
> How do you know that people who knew that it hadn't been properly
> ratified were present when the votes were taking place?
>
> > Their actions show us that they knew TONA was ratified and was law.
>
> Their actions show us that they believed TONA was ratified and was
> law.

Your statement implies you know better than they did.


>
> [snip]
> > You would have us believe that those involved in publishing
> > Virginia's state law books were not certain, raised no question or
> > objections, and yet published anyway.  A quick history lesson on who
> > those publishers were would clear that up quickly, but you don't do
> > your research - you just attack.
>
> Publishers are not lawmakers.  Their actions do not make laws.

Look up the history of the publishers and those in charge and you will
see what I mean.


>
> > You would have us believe they knew little and were confused or
> > misled.  As Richard Green shows in his essay, this is an assertion
> > that defies all the knowledge we have of these men.
>
> Perhaps they knew much, but not about the proposed 13th amendment.

They lived during that time.  And yet you seem to think you know better?

>
> No man is all knowing, or infallible.  Knowledge tends to flow from
> few to many.  Supposing each of these legislatures gained their
> knowledge from a small number of people, and that small number of
> people were mistaken or misinformed, then what happened could happen.

And I have said so?  Where?  All I said is that there are way too many
of them that were in agreement for you or anyone else to claim they did
not know what was going on around them.

You would have us believe that your hindsight (really far-back
hindsight) is better than their living it out themselves.

--
13th amendment to the US Constitution:
  If any citizen of the United States shall accept, claim, receive,
  or retain any title of nobility or honour, or shall, without the
  consent of Congress, accept and retain any present, pension, office,
  or emolument of any kind whatever, from any emperor, king, prince,
  or foreign power, such person shall cease to be a citizen of the
  United States, and shall be incapable of holding any office of
  trust or profit under them, or either of them.



------------------------------

From: Greggy <[EMAIL PROTECTED]>
Subject: Re: Kooks (was: NSA and Linux Security)
Date: Sat, 20 Jan 2001 00:12:56 GMT

In article <94ae1d$89p$[EMAIL PROTECTED]>,
  digiboy | marcus <[EMAIL PROTECTED]> wrote:
> In article <949vuu$qh4$[EMAIL PROTECTED]>,
>   Greggy <[EMAIL PROTECTED]> wrote:
> > What have you done?
>
> More than you know. I have explained quite clearly on many occasions
> why what you see is not necessarily what occurs.


Since you saw fit not to answer in ANY specific terms, let me ask
again...

I have stated the obvious and cited material which cites US law.

What have you done?


--
13th amendment to the US Constitution:
  If any citizen of the United States shall accept, claim, receive,
  or retain any title of nobility or honour, or shall, without the
  consent of Congress, accept and retain any present, pension, office,
  or emolument of any kind whatever, from any emperor, king, prince,
  or foreign power, such person shall cease to be a citizen of the
  United States, and shall be incapable of holding any office of
  trust or profit under them, or either of them.



------------------------------

From: Roger Schlafly <[EMAIL PROTECTED]>
Subject: Re: Comparison of ECDLP vs. DLP
Date: Fri, 19 Jan 2001 16:32:16 -0800

David Wagner wrote:
> Yes, but there is a very important caveat:
> proof of strength is impossible to achieve, in general.
> I can generate a truly wonderful key, and then publish my private key.
> No test, no zero-knowledge proof, no use of elliptic curve cryptography,
> is ever going to detect this attack.

That is correct. About all they could do is pass a law against
publishing your private key, and punish you for violating
that law.

But then there is the slightly more subtle tactic of choosing
a low-entropy private key. E.g., you could use some old Netscape
software. There is no way to reliably test for these weak keys,
and no way to know whether one was deliberately made weak.

Because of this, I cannot see any point to testing for weak keys,
except in the DH case where a non-key poses a risk to the sender's
private key.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Dynamic Transposition Revisited (long)
Date: Sat, 20 Jan 2001 00:07:26 GMT

On Fri, 19 Jan 2001 22:18:40 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote,
in part:

>The main lesson is not about double-transposition, but balance.  If
>all we do is double-shuffle, the ciphertext still contains only the
>symbols present in the plaintext, and that leaks information.  By
>balancing the block, we eliminate that leak.  

I missed that too on my first reading; a reply to my initial post by
Benjamin Goldberg got me to look more carefully and see that.

You are indeed correct: if one processes plaintexts which have equal
numbers of 0 and 1 bits because of processing, and one produces all
possible transpositions with, in a sense, 'equal probability', one can
do as well with transposition as the one-time-pad does with
substitution.

This is important to note theoretically. I suspect, though, that
people won't see it as terribly attractive or convenient in practice.
The ICE block cipher already makes a more limited use of transposition
in a way that more readily 'fits in' with conventional practice, so
transposition isn't totally neglected.

John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
