Cryptography-Digest Digest #497, Volume #10       Tue, 2 Nov 99 20:13:02 EST

Contents:
  Re: Doesn't Bruce Schneier practice what he preaches? (Anton Stiglic)
  Re: Scientific Progress and the NSA (was: Bruce Schneier's Crypto     Comments...) 
(SCOTT19U.ZIP_GUY)
  Your Opinions on Quantum Cryptography ([EMAIL PROTECTED])
  Re: basic question ([EMAIL PROTECTED])
  Re: Re: Re: Re: Compression: A ? for David Scott (CoyoteRed)
  Re: Compression: A ? for David Scott (Tim Tyler)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column 
([EMAIL PROTECTED])

----------------------------------------------------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Crossposted-To: alt.security.scramdisk
Subject: Re: Doesn't Bruce Schneier practice what he preaches?
Date: Tue, 02 Nov 1999 16:50:34 -0500

"Thomas J. Boschloo" wrote:

> If a flaw were found in the program, it would be very bad press
> for Counterpane. So it is probably secure.
>
> Not releasing the source of the product just decreases the chance of
> such an unfortunate incident happening :)

Yes, that is perfect!  Perfect for the vendors!  Hide the security faults,
don't let the customers know about them, make them think what they use is
safe when in fact it has holes.  If the source were open, people might spot
a break and report it, and the vendor could then build a much better
product.  Take Windows as a counter-example: why do you think we have to
pay something like $100 a version, when every version keeps on having bugs?
They always hide the source, so they can't get any help.  They don't want
to; they like it that way.  They get richer and richer with every version,
while the customer keeps on paying more and more.



------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Scientific Progress and the NSA (was: Bruce Schneier's Crypto     
Comments...)
Date: Tue, 02 Nov 1999 23:07:45 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Doug Stell) 
wrote:
>On Tue, 02 Nov 1999 10:07:55 -0800, J Shane Culpepper
><[EMAIL PROTECTED]> wrote:
>
>Shane.
>
>>Could you let me know which Number Theory book you are referring to?  I'm
> always
>>interested in getting my hands on good texts....
>
>The math text is the following. However, I doubt that you can get a
>copy unless you know somebody and have a working relationship with the
>agency. It isn't classified and doesn't contain any hints about any
>specific algorithm, but they do like to keep its distribution under
>some level of control.
>
>    NSA Technical Literature Series Monograph No. 25, "Number
>    Theory for Cryptology: A Textbook for MA-550" (3rd Edition,
>    January 1991)
>
>I can't say that it is a well-written text, but it does appear to
>address and limit itself to the topics of interest to cryptographers.
>It is also great for putting yourself to sleep at night.
>
>The text was recommended to me by a military person who participates
>in this newsgroup. So, I decided to ask the NSA representative on my
>project for a favor. The "favor" they asked in return is that I try to
>let people know that the NSA is not the evil organization some people
>think it is or some movies portray it as being. Seriously, they are
>good people, faced with important and difficult tasks.
>
>doug
>

  Well, I guess that proves the NSA is only for the good of the
people. I guess I can sleep well at night knowing that you have
set the world straight by broadcasting the truth about the NSA.
  But what are these important and difficult tasks? They seem
to be able to find people like you who are very capable of spreading
whatever lies they want.
  If you think they would tell you the truth, then you don't know
much about the US government. It is very, very crooked, which I
suppose it must be, given the police state we seem to be headed for.



David A. Scott
--

SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
                    
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm

Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm

Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm

**NOTE EMAIL address is for SPAMMERS***

------------------------------

From: [EMAIL PROTECTED]
Subject: Your Opinions on Quantum Cryptography
Date: Tue, 02 Nov 1999 22:18:09 GMT

Dear All,

I am preparing a short paper on Quantum Cryptography. I would be most
grateful if you could give your opinion/thought/knowledge on the
following points:

1. Is there a need for Quantum Cryptography?
2. Will Quantum Cryptography reach a phase where it can be implemented
over long distances successfully?
3. Will Quantum Cryptography become a necessity against increasingly
advanced crypto attacks?

Please add any other thoughts you have on this topic.

Also, any helpful website addresses would be appreciated.

Best Wishes
VivS


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: basic question
Date: Tue, 02 Nov 1999 22:23:05 GMT

fungus wrote:
> Muharem Hrnjadovic wrote:
> > [...] I would like to check whether the password supplied
> > by the user is the valid one [...]
>
> Encrypt the password using itself as a key and put the result in the
> file. To check a password, use it to decrypt the start of the file
> and see if the same password pops out.
>
> If you don't need an *exact* check you could store a hash of the
> password in the file and compare this with a hash of the password
> being used for decryption.

Encrypting the password under itself is not an *exact* check.
For example if the cipher is periodic XOR with the
password, all passwords of the same length will
pass the check.
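Bryan's observation is easy to check.  A minimal Python sketch (the
periodic-XOR "cipher" and the 7-character passwords here are purely
illustrative):

```python
# Bryan's point: with a periodic-XOR "cipher", encrypting the password
# under itself always yields all-zero bytes, so ANY password of the
# same length passes the stored-value check.
def xor_encrypt(data, key):
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

stored = xor_encrypt(b"hunter2", b"hunter2")   # what the file would hold
attempt = xor_encrypt(b"abcdefg", b"abcdefg")  # a wrong 7-char password
assert stored == attempt == b"\x00" * 7        # the check falsely passes
```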

--Bryan



------------------------------

From: [EMAIL PROTECTED] (CoyoteRed)
Subject: Re: Re: Re: Re: Compression: A ? for David Scott
Date: Tue, 02 Nov 1999 22:46:14 GMT
Reply-To: this news group unless otherwise instructed!

On Tue, 02 Nov 1999 22:11:56 GMT, [EMAIL PROTECTED]
(SCOTT19U.ZIP_GUY) wrote:

>>Also, how well does this resultant random file from a bad key compress
>>with a different compression routine?  Could an angle of attack be
>>formed by comparing your o-o-o re-compressed data with data compressed
>>by a n-o-o-o?
>     I am not sure just what you mean here.
>>
>>Because, by nature, your decompressed file will compress back into the
>>same file.  But, if the resultant decompressed file can't be
>>compressed by a n-o-o-o then this tells the attacker something.
>
>   I think the assumption is all files compress ( well change) with a 
>compressor so just what do you mean.


Okay, say an attacker attempts a key.  He gets a file X.

Now, with n-o-o-o compression, the check Comp(DeComp(X))=X will
fail.  This is a serious flaw in our encryption scheme.
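This failure mode shows up with any conventional compressed format.  A
Python sketch, using zlib/DEFLATE as a stand-in for an n-o-o-o routine
(the 64-byte size and trial count are arbitrary):

```python
import os
import zlib

# Random bytes stand in for the result of decrypting with a wrong key.
# A conventional (n-o-o-o) format such as zlib/DEFLATE rejects almost
# every random input outright, so the attacker's round-trip test fails
# immediately: a wrong key is recognizable on sight.
trials, failures = 100, 0
for _ in range(trials):
    x = os.urandom(64)
    try:
        zlib.decompress(x)
    except zlib.error:
        failures += 1
print(f"{failures}/{trials} random inputs rejected as invalid streams")
```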

So we switch to an o-o-o for encryption to foil this angle of attack.

So, when the attacker attempts a key and applies the
Comp(DeComp(X))=X check, it will always come back true.  So far, so good.

Next he looks at the size of the resultant file, and it has expanded
out as if he had decompressed /valid/ data.  He looks at it, but it's
still gibberish.

Now, we know that this gibberish will compress back down well with
/your/ routine, but how well does it compress with a standard
routine like ZIP?

If he takes the attempted file that has been decompressed and
compresses it with your o-o-o AND with ZIP, he gets two different
files: an o-o-o'ed file and a ZIP'ed file.  If he compares these two
for size and efficiency of compression, will he see a difference?

Most importantly, are these differences consistent with other failed
keys AND the proper key?

I'm not a programmer, but I would be interested in the following
experiment:

Take a text file (copy a message from this NG) and a random data file
of equal length.

Compress both with your o-o-o and PKZIP.

Make up two more data files the same size as both o-o-o compressed
files (these will represent a decrypt with an invalid key and
presumably will be of different lengths).

Decompress both of the second random files with your o-o-o.  These
will now represent a decompression of two files from an invalid key.

So, we have:

-Our two original files TEXT1 and RANDOM1

-Our four compressed files TEXT1.ooo, TEXT1.zip, RANDOM1.ooo, and
RANDOM1.zip

-Our two random files RTEXT1.ooo and RRANDOM1.ooo, which represent
invalid key attempts and are the size of TEXT1.ooo and RANDOM1.ooo

-Our two decompressed files DeRTEXT1.ooo and DeRRANDOM1.ooo


Now compress both DeRTEXT1.ooo and DeRRANDOM1.ooo with PKZIP to get
DeRTEXT1.ooo.zip and DeRRANDOM1.ooo.zip

How do *1 and DeR*1.ooo compare?  (Comparisons of the original file
and the decompression of a file with an invalid key.)

What about DeR*.ooo.zip and *.ooo.zip? (Comparing a ZIP compression of
a o-o-o decompressed file, one with a valid key and the other
without.) 

 (We'd have to run it with several random files to determine if there
is any usable bias)

The conclusions of this little experiment will answer: can this be
used as an attack against an o-o-o compression-based encryption scheme?

It'll be a little more complex than Comp(DeComp(X))=X to be sure!
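No o-o-o compressor ships with standard tools, but the shape of the
experiment can be sketched in Python with the identity function
standing in for the o-o-o routine (it is trivially one-on-one, though
it compresses nothing); the sample text and sizes are made up:

```python
import os
import zlib

# identity() stands in for the o-o-o routine: every byte string is a
# valid "compressed" file, so any invalid-key attempt "decompresses".
def identity(data):
    return data

text1 = b"The quick brown fox jumps over the lazy dog. " * 40
text1_ooo = identity(text1)            # TEXT1.ooo
text1_zip = zlib.compress(text1)       # TEXT1.zip

# An invalid-key attempt: random bytes the size of TEXT1.ooo, which
# the o-o-o stand-in duly "decompresses" (RTEXT1.ooo -> DeRTEXT1.ooo).
der_text1 = identity(os.urandom(len(text1_ooo)))

# Re-compress both plaintexts with DEFLATE and compare ratios.
ratio_valid = len(text1_zip) / len(text1)
ratio_invalid = len(zlib.compress(der_text1)) / len(der_text1)
print(f"valid-key plaintext zips to {ratio_valid:.0%} of its size")
print(f"invalid-key 'plaintext' zips to {ratio_invalid:.0%} of its size")
# The gap between the two ratios is the bias asked about: genuine
# plaintext compresses well under ZIP, failed-key output does not.
```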

-- 
CoyoteRed
CoyoteRed <at> bigfoot <dot> com
http://go.to/CoyoteRed
PGP key ID: 0xA60C12D1 at ldap://certserver.pgp.com


------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Compression: A ? for David Scott
Reply-To: [EMAIL PROTECTED]
Date: Tue, 2 Nov 1999 22:27:26 GMT

Tom <[EMAIL PROTECTED]> wrote:
: On Sun, 31 Oct 1999 11:34:34 GMT, Tim Tyler <[EMAIL PROTECTED]> wrote:
:>Tom <[EMAIL PROTECTED]> wrote:
:>: On Sat, 30 Oct 1999 17:29:55 GMT, Tim Tyler <[EMAIL PROTECTED]> wrote:
:>:>Tom <[EMAIL PROTECTED]> wrote:

:>:>: Measuring by compression ratio is objective.  Your definition of
:>:>: adding information sounds subjective. [...]
:>:>
:>:>No - it could be measured precisely using entropic measures on both
:>:>pre- and post- compressed texts.
:>
:>: Has this been done?
:>
:>I doubt it.  Measuring entropy accurately is very difficult in general.
:>
:>One-on-one compressors have only just been invented.  I don't know if
:>people have been very interested in quantifying how bad non-one-on-one
:>compressors are before now.

: If "one-on-one" (symmetric compression) was better at reducing
: patterning, the files would be smaller

Indeed.  When I said "bad" I was referring to giving away information to
attackers, not *just* file size.

:  The objective, quantitative measurement of compression algorithms
: is compression ratio.

Not for compression before encryption it's not.

Compression ratio is *one* factor.  Whether the compressor systematically
adds information to the file, is another, largely independent factor.

[snip]

:>Certainly I'm more interested in building compressors which avoid adding
:>clues of their own to files than I am in measuring /exactly/ how shafted
:>ordinary compressors are on this front ;-)

: Or if they are at all!

They are.  They add information.  It has been demonstrated for a number of
compressors.  In fact, no other one-on-one compressor has been found yet.

How severe the problem is, and how easy it is for analysts to make use of
the information has not been so widely studied, AFAIK.

: If the point is to reduce patterning, why not address that [...]

The point is to increase the entropy per bit.  You *can't* increase the
entropy of the whole file, without injecting "real" randomness.

: I can certainly understand the rationale for looking at compression
: without added patterning, but what does symmetry have to do with it?

It's a condition which prevents the compressor from adding information of
its own making to the file, when the one-on-one property is present.

:>: Everything I've seen has been subjective, based on viewing the file, or
:>: anecdotal; and even these examples weren't specific as to whether the
:>: patterning was confined to the header and footer or not.
:>
:>Perhaps you need to look again at the earlier posts.  David has discussed
:>ordinary Huffman compression on a number of occasions.  This is now the
:>third time I have mentioned that LZ compression schemes typically map both
:>"3X + 5X" and "4X + 4X" and "8X" to "XXXXXXXX".

: But that's an example of a pattern in the input, not the output.
: We're not talking about the input.

?

These patterns are in the input to the decompressor.

They illustrate clearly that the non-one-on-one property can be
distributed throughout the file - and not confined to "headers".
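A toy illustration of the same many-to-one mapping (a bare run-length
decoder, not real LZ):

```python
# A bare run-length decoder: a (count, char) token expands to
# char * count.  Three distinct "compressed" inputs decode to the same
# output, so the decoder is many-to-one -- and a compressor that only
# ever emits one of the three forms has added structure of its own,
# distributed through the body of the file rather than in a header.
def decode(tokens):
    return "".join(ch * n for n, ch in tokens)

a = decode([(3, "X"), (5, "X")])   # "3X + 5X"
b = decode([(4, "X"), (4, "X")])   # "4X + 4X"
c = decode([(8, "X")])             # "8X"
assert a == b == c == "XXXXXXXX"
```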

:>:>: If we try to compress a file compressed by pkzip - a "bad" compressor
:>:>: by your definition, we'll find it doesn't compress much, if any.
:>:>: This could be taken as an objective measurement of a lack of
:>:>: information added, overall.
:>:>
:>:>No it could not.  The same incompressibility would result if you appended
:>:>32MB of hardware-generated random numbers to the file.
:>
:>: Actually that wouldn't, because although the random data appended
:>: wouldn't compress, the first part of the file would, and it would
:>: reduce the size of the file.
:>
:>...but you've *already* said the first part of the file has been
:>compressed by PKZIP and - in your own words - "we'll find it doesn't
:>compress much, if any"!
:
: Random data isn't compressible, nor is pkzip output generally
: compressible.  Both inputs are an example of low patterning.  I'm not
: suggesting that ALL patterns are eliminated by compression, only that
: the compressor with the highest compression ratio should have the
: largest reduction of patterning.  This should be true even if counting
: the information added by compression.

Information added by the compressor may be qualitatively different from
redundant information in the plaintext that has not been squeezed out.

In particular information added may be essentially the same in every file,
independent of the plaintext of the messages.

You should be able to see why we are emphasising "added information" over
"redundancy that has failed to be squeezed from the messages" from this.

:>:>: A few bytes in the header?  Sure.  Patterns throughout the file?  Doesn't
:>:>: seem likely.
:>:>
:>:>You don't seem to grasp how severely compression preceding encryption
:>:>differs from ordinary compression:
:>
:>: As far as the compression goes, it doesn't differ at all.  
:>
:>: What this sounds like is a presentation of a new form of encryption,
:>: presented in a way to side step analysis of the strength of the cipher.
:>
:>Analysis of the strength of the cypher would be fine.  It's *cracking* the
:>cypher that might cause problems.
:
: I'm wondering if this scheme is being proposed so that someone
: couldn't tell if a decrypted file was, in fact, decrypted correctly,
: as the compressed file would always decrypt to "something".

This is one of a number of plus points of the scheme.  It is hardly the
primary motivation, though.

: If so, there are a handful of reasons why this won't work.

?

: First, if the compressed file has patterns, especially if the patterns
: are specific to the compressor, it'll be recognizable as probably plaintext.

?

: Second, decompressing the file isn't difficult at all, and would
: quickly establish if the decryption were correct or not.

?

If the compression is good enough, /all/ decompressed messages will
look plausible.

We may not have the technology to do this for text, but for some
formats, compression more closely approaches this ideal.

: Finally, it seems of interest only for a brute force attack, and
: this shouldn't be a concern with a decent algorithm and key size.

I agree that brute-force attacks should not get too much attention.

However, there will be cases where the attack can be used.

Imagine you have extracted 102 bits of the key from agent orange before he
commits suicide.

Suddenly a brute-force attack on the remainder becomes plausible.

There are other circumstances in analysis where the ability to rule out
particular keys is useful.

:>: I'll grant you that poor compression can increase patterning, and provide
:>: for known plaintext attacks, but again, unless random data or a keyed
:>: system is added, the compression will still result in some form of
:>: known plaintext attack being possible.
:>
:>This appears to be a questionable notion.  In the case where compression
:>reduces the size of the message to the size of the key, the entire system
:>reduces to a OTP.

: Even if it's an OTP, it still allows a known plaintext attack.

Yes, /if/ the keys have been generated by a less-than perfectly random
process, such an attack might work.

:>Perhaps.  However it depends on the volume of information concerned
:>in each case, and various other factors.
:>
:>Note that the compressor may add the /same/ type of regularity to all the
:>files it treats, even random files where an analytic attack based on
:>patterns in the data would not normally be feasible.
:>
:>You need to quantify your problems before claiming one is smaller than
:>another.  With qualitatively different security problems - such as the
:>ones faced here - this can be difficult.
:>
:>Fortunately - in principle - both problems can be eliminated.
:>
:>However "perfect" compressors are rare beasts - but fortunately there
:>is at least one compressor that has completely eliminated the other
:>problem.
:
: Having a compressor that doesn't have a defined header format or CRC's
: is certainly an advantage, but I don't see how that has anything to do
: with the concept of "one-on-one".

It doesn't necessarily.  Did anyone claim otherwise?

: That the compression is non-symmetrical or non-unique doesn't equate
: with added data.

Yes it does.  Sorry.

:>There are probably patterns in the compressed file remaining from
:>inadequate compression - but that has nothing to do with whether or not
:>there are additional patterns generated by the compressor there as well.
:
: I would agree - there are two issues.  I'd contend that the patterns
: resulting from inadequate compression would tend to be lower with
: compression algorithms of higher compression ratio, and I again
: haven't seen examples of patterns added by compression, again
: excepting housekeeping information, which of course is a problem.

What counts as "housekeeping information"?

/Any/ systematically-added information represents a potential security
problem.

:>:>No.  Most compression programs are specifically designed not to do this.
:>
:>: I'll agree with programs, but not algorithms, as the programs add the
:>: CRC information you've mentioned.  This I'll also agree is a problem,
:>: especially if there are checksums scattered within the file.
:>
:>Algorithms with no concern for error recovery or detection may not be
:>"designed to do this".  A number of them do /still/ scatter their own data
:>carelessly through the file, though.  See the LZW family, for example.

: But how much is added patterning, and is this more than they gain by
: having a higher compression ratio?

Added patterning is different from failure to compress as well as is
possible.  Different types of attack result.

You can't very usefully say one is "more" than the other - unless
one of them is zero.

[snip rest]
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Where there's a will, I want to be in it.

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Tue, 02 Nov 1999 23:13:47 GMT

In article <[EMAIL PROTECTED]>,
  "Trevor Jackson, III" <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:

>>In another message (see:
>>http://www.deja.com/threadmsg_ct.xp?AN=540914195" ) I suggested to
>>send a list of only *one* ciphersuite, i.e. not to send a list at
>>all. If I send a message containing my information, it should be I
>>who defines how to encrypt it - not the other party in some way.
>>Security should not be negotiable.
>
>Why do you care?  If you choose a set of ciphers in which you trust you
>*are* defining how to encrypt your information.

I see your point, but this wouldn't work with email for example. Here
the choice of cipher must be made by me alone.

>If you are stuck at 3DES and the other person insists upon AES_1, will
>you prefer to not communicate rather than communicate with a cipher
>other than your favorite?

Well, if I trust 3DES and don't trust AES1, then I should be able to
encrypt my messages with 3DES only. Communication will always be
possible if, as I suggested, there exists a public repository of
ciphersuites from which executable code or source code can be
downloaded. This repository must be super-certified; it can include
recommendations by standards organizations, governments, or
cryptologists; it must include code in Java but can also include code
optimized for different operating environments, etc.

The big advantage here is simplicity and flexibility. I don't have to
define a complicated ciphersuite-negotiating protocol today; in the
future this protocol itself may prove to have been a bad choice.
Having to change a standard protocol that negotiates the ciphersuite
is almost as costly as changing a standard ciphersuite. By appending
to the data a pointer to the method that must be used to process it, I
buy insurance for the long run. We still have a critical component,
though: the certification of the ciphersuites.
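A minimal Python sketch of the "pointer to the method" idea (the suite
name, the JSON framing, and the in-memory registry standing in for the
certified repository are all assumptions for illustration, not a
proposed standard):

```python
import json

# REGISTRY stands in for the certified public repository of suites.
REGISTRY = {"3DES-EDE3-CBC": "handler for 3DES in EDE3-CBC mode"}

def wrap(suite_id, ciphertext_hex):
    # The sender alone picks the suite and tags the message with it.
    return json.dumps({"suite": suite_id, "data": ciphertext_hex})

def unwrap(blob):
    # The receiver looks the suite up rather than negotiating one.
    msg = json.loads(blob)
    if msg["suite"] not in REGISTRY:
        raise ValueError("unknown suite: fetch it from the repository")
    return msg["suite"], msg["data"]

suite, data = unwrap(wrap("3DES-EDE3-CBC", "deadbeef"))
assert (suite, data) == ("3DES-EDE3-CBC", "deadbeef")
```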

> Do you also have a preference for particular keys?

Actually I might: several ciphers have a handful of "weak" keys; I may
be paranoid and want to use a key exchange protocol that checks for
them.



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
