Cryptography-Digest Digest #169, Volume #14      Tue, 17 Apr 01 16:13:01 EDT

Contents:
  Re: Reusing A One Time Pad (SCOTT19U.ZIP_GUY)
  Re: Reusing A One Time Pad (SCOTT19U.ZIP_GUY)
  Re: Reusing A One Time Pad ("Tom St Denis")
  Re: Reusing A One Time Pad ("Tom St Denis")
  Re: Note on combining PRNGs with the method of Wichmann and Hill (Mok-Kong Shen)
  Re: Reusing A One Time Pad (SCOTT19U.ZIP_GUY)
  Re: Reusing A One Time Pad ("Tom St Denis")
  Re: LFSR Security (Ian Goldberg)
  Re: Reusing A One Time Pad ("Joseph Ashwood")
  ansi x9.23 / iso 10126 (Fabian Kaiser)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Reusing A One Time Pad
Date: 17 Apr 2001 18:14:29 GMT

[EMAIL PROTECTED] (Tom St Denis) wrote in
<Td%C6.31344$[EMAIL PROTECTED]>: 

>You have yet to prove that bijective compression is any more secure.  In
>fact it's not.  Let's demonstrate.  I get a message from you and I want
>to try decrypting it.  Say you use a 64-bit RC5 key or something...  I
>feel it's about money so amongst the ciphertexts I decrypt I get
>

   What are you trying to show?
First of all, if I were sending text I would be using my
conditional adaptive Huffman compressor, so that only
characters in the set "A to Z", "0 to 9", and space appear
(actually I would use underscore for space, if I used space at all).

>AKSJHDKH2309SLDFJSDFHSFJKGHW
>DFGJSHSKFHKJ2340OWE7FKJFSD
>THE MONEY IS IN MY POCKET
>345U3RWHERFRLSJFHKJGFYSUKJFTHSJK
>DFGJLHGJKHGDFJKGHDFKJGDGJDFLK349

    I see that with the general bijective compressor you get only
cases like those above, which contain characters from that set. How
nice. Yes, if you know the message was about money, you can tell
which one was the input message. And in this example, even if you
didn't know it was about money, you could guess which one it was.

    Let's suppose instead that I compressed not with a bijective
compressor, but with one that compresses just as well yet is not
bijective. Now many cases drop out, because most of the test
encryption keys lead to files that could not have come out of the
non-bijective compressor, leaving only:

DFGJSHSKF J2340OWE7FKJFSD
THE MONEY IS IN MY POCKET

Gee, there are fewer cases to look at, because the filter effect
caused by the non-bijective compressor gives me extra information
for eliminating candidate files. With the bijective compressor you
don't get this free filter to toss out cases that can't exist.

 Yes, Tom, which one is easier to look at and guess a solution from?
But you know all this; we go over and over it all the time. What do
you have that is new? Or are you just trying to confuse new people?
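
A minimal sketch, in Python, of the filtering idea described above, for
readers new to the thread. The repeating single-byte XOR "cipher", the
use of zlib as a stand-in for a non-bijective compressor, and the
identity map as a stand-in for a bijective one are all illustrative
assumptions, not the actual software being argued about; the point is
only that failed decompressions prune the key space in one case and not
in the other.

import zlib

def xor_crypt(data: bytes, key: int) -> bytes:
    # Toy cipher: XOR every byte with one 8-bit key, so the whole key
    # space can be enumerated.  Illustration only.
    return bytes(b ^ key for b in data)

message = b"THE MONEY IS IN MY POCKET"
true_key = 0x5A

# One sender compresses with zlib (non-bijective); the other uses a
# "bijective" encoding (here simply the identity, since every byte
# string decodes to something).
ct_nonbij = xor_crypt(zlib.compress(message), true_key)
ct_bij = xor_crypt(message, true_key)

survivors_nonbij = []
survivors_bij = []
for key in range(256):
    candidate = xor_crypt(ct_nonbij, key)
    try:
        zlib.decompress(candidate)      # most wrong keys fail right here,
        survivors_nonbij.append(key)    # so the key space gets filtered
    except zlib.error:
        pass
    survivors_bij.append(key)           # every key survives the bijective side

print(len(survivors_nonbij), "keys survive the non-bijective filter")
print(len(survivors_bij), "keys survive the bijective decoder (no free filter)")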


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Reusing A One Time Pad
Date: 17 Apr 2001 18:33:17 GMT

[EMAIL PROTECTED] (Tom St Denis) wrote in
<hA%C6.31591$[EMAIL PROTECTED]>: 

>
>"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>>     I can argue with your statements about the "super evil deflate"
>> being better. Compression is a lose-lose situation. It is well known
>> that for certain classes of files Huffman would be optimal. I don't
>> think that can be said of deflate for any class of files.
>
>There are no classes of files where Huffman is optimal.  If the file has
>a bias towards a certain symbol, more than likely it's part of some pattern.
>

   Actually, I need go no further.

<snip>

David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Reusing A One Time Pad
Date: Tue, 17 Apr 2001 18:45:44 GMT


"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> [EMAIL PROTECTED] (Tom St Denis) wrote in
> <Td%C6.31344$[EMAIL PROTECTED]>:
>
> >You have yet to prove that bijective compression is any more secure.  In
> >fact it's not.  Let's demonstrate.  I get a message from you and I want
> >to try decrypting it.  Say you use a 64-bit RC5 key or something...  I
> >feel it's about money so amongst the ciphertexts I decrypt I get
> >
>
>    What are you trying to show?
> First of all, if I were sending text I would be using my
> conditional adaptive Huffman compressor, so that only
> characters in the set "A to Z", "0 to 9", and space appear
> (actually I would use underscore for space, if I used space at all).
>
> >AKSJHDKH2309SLDFJSDFHSFJKGHW
> >DFGJSHSKFHKJ2340OWE7FKJFSD
> >THE MONEY IS IN MY POCKET
> >345U3RWHERFRLSJFHKJGFYSUKJFTHSJK
> >DFGJLHGJKHGDFJKGHDFKJGDGJDFLK349
>
>     I see that with the general bijective compressor you get only
> cases like those above, which contain characters from that set. How
> nice. Yes, if you know the message was about money, you can tell
> which one was the input message. And in this example, even if you
> didn't know it was about money, you could guess which one it was.
>
>     Let's suppose instead that I compressed not with a bijective
> compressor, but with one that compresses just as well yet is not
> bijective. Now many cases drop out, because most of the test
> encryption keys lead to files that could not have come out of the
> non-bijective compressor, leaving only:
>
> DFGJSHSKF J2340OWE7FKJFSD
> THE MONEY IS IN MY POCKET
>
> Gee, there are fewer cases to look at, because the filter effect
> caused by the non-bijective compressor gives me extra information
> for eliminating candidate files. With the bijective compressor you
> don't get this free filter to toss out cases that can't exist.
>
>  Yes, Tom, which one is easier to look at and guess a solution from?
> But you know all this; we go over and over it all the time. What do
> you have that is new? Or are you just trying to confuse new people?

You miss a crucial point, Dave.  If there are 2^256 keys it doesn't matter
how fast you can filter them out.  My point was that I **could** filter them
out (albeit perhaps more slowly) even when bijective encoders are used.  You
haven't prevented anything, just delayed it.

However, I would rather just increase my key size and use a cipher resilient
to known attacks than use ad hoc security measures.  On top of that, I will
benefit from using less bandwidth by using a decent codec.

Tom



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Reusing A One Time Pad
Date: Tue, 17 Apr 2001 18:46:51 GMT


"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> [EMAIL PROTECTED] (Tom St Denis) wrote in
> <hA%C6.31591$[EMAIL PROTECTED]>:
>
> >
> >"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
> >news:[EMAIL PROTECTED]...
> >>     I can argue with your statements about the "super evil deflate"
> >> being better. Compression is a lose-lose situation. It is well known
> >> that for certain classes of files Huffman would be optimal. I don't
> >> think that can be said of deflate for any class of files.
> >
> >There are no classes of files where Huffman is optimal.  If the file has
> >a bias towards a certain symbol, more than likely it's part of some pattern.
> >
>
>    Actually, I need go no further.

Other than in contrived cases.  I could make a 2-bit arithmetic-encoded file
too.  Does that represent a real message?  Nope, it's purely a contrived one.

For text, hands down, an LZ method will win 99 times out of 100 (as compared
to a Huffman coder; sure, LZP or BZIP will beat LZ77 ...)

Tom



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: sci.crypt.random-numbers
Subject: Re: Note on combining PRNGs with the method of Wichmann and Hill
Date: Tue, 17 Apr 2001 21:02:43 +0200



Brian Gladman wrote:
> 
> "Mok-Kong Shen" <[EMAIL PROTECTED]> wrote:
> 
> > Brian Gladman wrote:
> > >
> > > "Mok-Kong Shen" <[EMAIL PROTECTED]> wrote:
> >
> > > If you read my earlier posts you will see that I have already said that
> > > adding two generators with multipliers of 1.0 and taking the result 'mod
> > > 1.0' will be uniform.  This was never in doubt.  The issue is not this
> > > but rather that of what happens when two such generators are added with
> > > multipliers that are both close to, but different from, 1.0.
> >
> > Well, I suppose it is evident that as the multipliers
> > approach 1.0, the result will also approach that for
> > the case with multipliers exactly equal to 1.0. This
> > is from simple 'continuity' considerations. Thus your
> > previous claim that using small deviations from 1.0 will
> > lead to very large non-uniformness clearly cannot hold.
> 
> The phrase 'very large non-uniformness' is entirely your own
> misrepresentation of what I said.  I used the phrase 'very non-uniform', by
> which I meant that the non uniformity would be obvious. In other words, the
> results would deviate so far from uniformity that the differences would be
> evident from simple inspection without the need for more advanced
> statistical tests to find them. And this is exactly what the data below
> shows.

Sorry for my incompetence in the English language. I am quite
sure, though, that many non-native speakers would interpret your
phrase 'very non-uniform' in the sense that I understood it.

> 
> > > For example, here is the result of 10,000,000 trials for each of
> > > 3 PRNGs, two uniform generators, A and B, and a third C, which is
> > > (0.9 * A + 1.1 * B) mod 1.0:
> > >
> > >   range            gen A      gen B      gen C
> > > [0.0:0.1) ->    999970   999545   958872
> > > [0.1:0.2) ->    999207 1002146 1009123
> > > [0.2:0.3) ->    998672   999233 1009761
> > > [0.3:0.4) ->  1000023   999415 1009993
> > > [0.4:0.5) ->  1000984   999239 1009305
> > > [0.5:0.6) ->  1001546 1000377 1010543
> > > [0.6:0.7) ->    998803   998860 1010898
> > > [0.7:0.8) ->  1000290   999234 1008644
> > > [0.8:0.9) ->  1001218 1000259 1011547
> > > [0.9:1.0) ->    999287 1001692   961314
> > >
> > > Notice that the first and last intervals for the combined generator (C)
> > > are significantly less populated than the other eight - these two
> > > intervals hold only about 0.96 times the frequency expected from a
> > > uniform generator, while the other eight hold about 1.01 times it.
> >
> > You have to study uniformity with standard statistical
> > methods. I suppose that the chi-square test is useful
> > for that. There must be a sufficient sample size in
> > order to be able to obtain reasonable results. Small
> > sample size cannot provide useful data in the sense of
> > statistics. (The FIPS tests, for example, need quite a
> > bit of data, even though someone in the group considered
> > the amounts to be too small.)
> 
> It is not necessary to use statistical tests to see that generator C's data
> above is not uniformly distributed.  I am confident that, after 10,000,000
> tests, the probability of getting the above distribution by chance is very
> small.
> 
> I am confident in this result without the need for a formal test but you are
> free to run one if you want to prove that my confidence is misplaced.

Again, sorry. I am not a statistician. I can trust the methods
stated in most statistical textbooks, but I can't trust others.
The chi-square test is also given in Knuth vol. 2; the other
relevant method is the Kolmogorov-Smirnov test. For the issue at
hand you have to use some good real-life pseudo-random variables
and do the test in the manner recommended by the textbooks (in
particular with respect to sample size) and compare the results,
once with all factors equal to 1.0 and once with the deviated
values, repeating the experiments an appropriate number of times
if necessary. (This is not meant to belittle the value of your
method of testing uniformity. I would certainly be happy to be
among the first people to use it, if it were published in an
article in a journal of statistics. Until then, I can only accept
results from methods currently known in the literature,
unfortunately.)
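
As a quick illustration of both the experiment and the chi-square test
under discussion, here is a small Python sketch. The use of Python's
built-in generator and a trial count of 1,000,000 are arbitrary choices
for illustration; this is not the code either of you ran.

import random

TRIALS = 1_000_000
bins_a, bins_b, bins_c = [0] * 10, [0] * 10, [0] * 10
for _ in range(TRIALS):
    a, b = random.random(), random.random()
    c = (0.9 * a + 1.1 * b) % 1.0       # the combined generator C
    bins_a[int(a * 10)] += 1
    bins_b[int(b * 10)] += 1
    bins_c[int(c * 10)] += 1

# Pearson chi-square statistic against the uniform expectation.  With ten
# bins there are 9 degrees of freedom, so for a uniform source a value much
# above about 21.7 would be surprising at the 1% level.
expected = TRIALS / 10
for name, bins in (("A", bins_a), ("B", bins_b), ("C", bins_c)):
    chi2 = sum((n - expected) ** 2 / expected for n in bins)
    print(name, bins, "chi-square = %.1f" % chi2)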

M. K. Shen

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Reusing A One Time Pad
Date: 17 Apr 2001 18:56:04 GMT

[EMAIL PROTECTED] (Tom St Denis) wrote in
<sb0D6.31925$[EMAIL PROTECTED]>: 

>
>"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> [EMAIL PROTECTED] (Tom St Denis) wrote in
>> <Td%C6.31344$[EMAIL PROTECTED]>:
>>
>> >You have yet to prove that bijective compression is any more secure. 
>> >In fact it's not.  Let's demonstrate.  I get a message from you and I
>> >want to try decrypting it.  Say you use a 64-bit RC5 key or
>> >something...  I feel it's about money so amongst the ciphertexts I
>> >decrypt I get 
>> >
>>
>>    What are you trying to show?
>> First of all, if I were sending text I would be using my
>> conditional adaptive Huffman compressor, so that only
>> characters in the set "A to Z", "0 to 9", and space appear
>> (actually I would use underscore for space, if I used space at all).
>>
>> >AKSJHDKH2309SLDFJSDFHSFJKGHW
>> >DFGJSHSKFHKJ2340OWE7FKJFSD
>> >THE MONEY IS IN MY POCKET
>> >345U3RWHERFRLSJFHKJGFYSUKJFTHSJK
>> >DFGJLHGJKHGDFJKGHDFKJGDGJDFLK349
>>
>>     I see that with the general bijective compressor you get only
>> cases like those above, which contain characters from that set. How
>> nice. Yes, if you know the message was about money, you can tell
>> which one was the input message. And in this example, even if you
>> didn't know it was about money, you could guess which one it was.
>>
>>     Let's suppose instead that I compressed not with a bijective
>> compressor, but with one that compresses just as well yet is not
>> bijective. Now many cases drop out, because most of the test
>> encryption keys lead to files that could not have come out of the
>> non-bijective compressor, leaving only:
>>
>> DFGJSHSKF J2340OWE7FKJFSD
>> THE MONEY IS IN MY POCKET
>>
>> Gee, there are fewer cases to look at, because the filter effect
>> caused by the non-bijective compressor gives me extra information
>> for eliminating candidate files. With the bijective compressor you
>> don't get this free filter to toss out cases that can't exist.
>>
>>  Yes, Tom, which one is easier to look at and guess a solution from?
>> But you know all this; we go over and over it all the time. What do
>> you have that is new? Or are you just trying to confuse new people?
>
>You miss a crucial point, Dave.  If there are 2^256 keys it doesn't
>matter how fast you can filter them out.  My point was that I **could**
>filter them out (albeit perhaps more slowly) even when bijective encoders
>are used.  You haven't prevented anything, just delayed it.

   Tom, you miss the point. The idea is not that anyone actually
checks every key; it is to show that the information is there for the
attacker. But you never seem to get past that point. No, a dumb blind
search of what's left is seldom if ever the only way to attack. The
point of the exercise is to show how much information is there so that
one could mount an attack.

>
>However, I would rather just increase my key size and use a cipher
>resilient to known attacks than use ad hoc security measures.  On top
>of that, I will benefit from using less bandwidth by using a decent codec.

    Then feel free to do so. I would rather have it all, and would put
more trust in a system that looked at all possible weaknesses, something
you seem afraid to do.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Reusing A One Time Pad
Date: Tue, 17 Apr 2001 19:17:13 GMT


"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>    Tom, you miss the point. The idea is not that anyone actually
> checks every key; it is to show that the information is there for the
> attacker. But you never seem to get past that point. No, a dumb blind
> search of what's left is seldom if ever the only way to attack. The
> point of the exercise is to show how much information is there so that
> one could mount an attack.

Unless the cipher is bad you can't simply say that keys XYZ, WRT and ABC will
lead to bad plaintexts without first trying each key.  I don't get what you
are trying to say.  Searches like the DES and RC5 challenges try every key
and use a set of primitive heuristics to flag possible keys.  If you could
predict the plaintext without actually running the cipher, then the cipher is
weak and your argument is moot.

> >
> >However, I would rather just increase my key size and use a cipher
> >resilient to known attacks than use ad hoc security measures.  On top
> >of that, I will benefit from using less bandwidth by using a decent codec.
>
>     Then feel free to do so. I would rather have it all, and would put
> more trust in a system that looked at all possible weaknesses, something
> you seem afraid to do.

Hmm, but your above comment makes no sense.  I would rather trust my
neurosurgeon to get my brain right, my cardiologist to get my heart right,
etc.  You seem to think one thing can do it all (i.e. that one codec is both
(a) secure and (b) a good codec, compression-ratio-wise).

Tom



------------------------------

From: [EMAIL PROTECTED] (Ian Goldberg)
Crossposted-To: sci.crypt.random-numbers
Subject: Re: LFSR Security
Date: 17 Apr 2001 19:32:02 GMT

In article <[EMAIL PROTECTED]>,
Trevor L. Jackson, III <[EMAIL PROTECTED]> wrote:
>I've failed to show the single bit connectivity of the optimal
>intermediate machines, and you'd found a counterexample.  Fairly
>conclusive evidence that the single-bit approach to maintaining the
>invariant is inadequate.
>
>Did you find your example by analysis or by search?

A bit of each.  Consider the linear complexity profile of a string
(as defined in HAC).  It can only increase when L <= N/2, so we start
off with 0 0 0 0 1 in order to make it jump to L=5, and be stuck there
for at least the next 5 bits.  So I took 4 bits to play with,
and ended with a ?.  That way, for sure, L=5 after the first ?.
Now I want to end up with at least 4 different values of L at the
end (7,8,9,10 as it turns out), for the "trap" to work, so I need to
arrange to crawl up the N/2 line, and to cause "jumps" at just the right
times.  Trying out a few different values for the play bits turned up
an appropriate solution.  It took me maybe a little under an hour with
a whiteboard, a pad, and a program that runs BM.
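
For anyone who wants to play with this, here is a small Python sketch of
the linear complexity profile computation via Berlekamp-Massey over
GF(2). It is a toy implementation written for illustration, not the
program mentioned above, and the trailing "play" bits in the example are
arbitrary.

def linear_complexity_profile(bits):
    # Berlekamp-Massey over GF(2), returning L after each bit is
    # processed (essentially HAC Algorithm 6.30, keeping the profile).
    n = len(bits)
    c, b = [0] * (n + 1), [0] * (n + 1)  # connection polynomials C(x), B(x)
    c[0] = b[0] = 1
    L, m = 0, -1
    profile = []
    for i in range(n):
        # Discrepancy: does the current length-L LFSR predict bit i?
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            for j in range(n + 1 - (i - m)):
                c[i - m + j] ^= b[j]     # C(x) <- C(x) + x^(i-m) * B(x)
            if 2 * L <= i:               # L can only grow while L <= N/2
                L, m, b = i + 1 - L, i, t
        profile.append(L)
    return profile

# The prefix 0 0 0 0 1 forces an immediate jump to L = 5, as described
# above; the last five bits are arbitrary play bits.
print(linear_complexity_profile([0, 0, 0, 0, 1, 1, 0, 1, 1, 0]))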

   - Ian

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Reusing A One Time Pad
Date: Tue, 17 Apr 2001 12:38:04 -0700

I was using a very coarse measure, based on the degree of assurance that
it can or cannot be broken. Absolute assurance that it can be broken in a
minimal amount of time (approximating 0, but not true zero, for obvious
reasons) by any adversary scored 0, and requiring infinite resources in at
least one dimension scored 1. I guess I shouldn't have said it is precisely
0, more that it approaches 0. Like I said, it's a very coarse method, and
lacking in many ways (most importantly in concreteness of definition); I
simply used it because it was convenient and it gave a number that worked
very well for saying "don't do that".
                                            Joe
"Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Joseph Ashwood wrote:
> > The moment you reuse *any* portion of the pad, the security
> > immediately falls to precisely 0. That's a simple fact of life
>
> I'm curious how you're measuring "security".
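
For readers new to the thread, a minimal sketch of why any reuse is so
damaging: XORing two ciphertexts that share pad bytes cancels the pad
entirely and leaves the XOR of the two plaintexts, which a guessed crib
can then start to unravel. The messages and the crib below are made up
for illustration.

import os

def otp(msg: bytes, pad: bytes) -> bytes:
    return bytes(m ^ p for m, p in zip(msg, pad))

pad = os.urandom(32)                    # the same pad bytes used twice
c1 = otp(b"ATTACK AT DAWN", pad)
c2 = otp(b"RETREAT AT ONCE", pad)

# The pad cancels out completely: c1 XOR c2 == p1 XOR p2.
combined = bytes(x ^ y for x, y in zip(c1, c2))

# Drag a guessed crib across the combined stream; wherever it lines up
# with one plaintext, readable fragments of the other plaintext appear.
crib = b" AT "
for i in range(len(combined) - len(crib) + 1):
    print(i, bytes(x ^ y for x, y in zip(combined[i:], crib)))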



------------------------------

Date: Tue, 17 Apr 2001 23:48:08 +0200
From: Fabian Kaiser <[EMAIL PROTECTED]>
Subject: ansi x9.23 / iso 10126

Does anyone know anything about padding with ANSI X9.23 (or ISO 10126,
which should be the same)?

Thanks in advance, Fabian
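
A short sketch of the two paddings as I understand them; they are very
close but not quite identical. Both append filler bytes until the length
is a multiple of the block size, with the final byte holding the number
of bytes added; ANSI X9.23 uses zero fillers while ISO 10126 uses random
ones. The 8-byte block size below is just an example.

import os

def pad_x923(data: bytes, block: int = 8) -> bytes:
    n = block - (len(data) % block)          # always 1..block padding bytes
    return data + bytes(n - 1) + bytes([n])  # zero filler, count in last byte

def pad_iso10126(data: bytes, block: int = 8) -> bytes:
    n = block - (len(data) % block)
    return data + os.urandom(n - 1) + bytes([n])  # random filler, count last

def unpad(padded: bytes) -> bytes:
    return padded[:-padded[-1]]              # both schemes unpad the same way

print(pad_x923(b"ABCDE").hex())              # -> 4142434445000003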

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
