Cryptography-Digest Digest #107

2000-06-26 Thread Digestifier

Cryptography-Digest Digest #107, Volume #12  Mon, 26 Jun 00 04:13:01 EDT

Contents:
  Re: newbieish question (tomstd)
  where start for engine computer encryption system ([EMAIL PROTECTED])
  Early Draft of the TC5 paper available (tomstd)
  Re: Quantum computing (Bill Unruh)
  Re: Public key algorithm conversion - does it possible? (acoola)
  Re: libdes: des_SPtrans ([EMAIL PROTECTED])
  Re: DES 64 bit OFB test vectors (Hideo Shimizu)
  Re: "And the survey says" ("Scott Fluhrer")
  Re: DES and questions ("Scott Fluhrer")
  Re: Variability of chaining modes of block ciphers (Mok-Kong Shen)
  Re: Variability of chaining modes of block ciphers (Mok-Kong Shen)
  Re: How Uncertain? (Mok-Kong Shen)
  Re: Announce: Catacomb 2.0.0pre2 now available (Mark Wooding)



Subject: Re: newbieish question
From: tomstd [EMAIL PROTECTED]
Date: Sun, 25 Jun 2000 20:35:49 -0700

Benjamin Goldberg [EMAIL PROTECTED] wrote:
In a recent post I put up, I said that a function I was looking for
should have the following two properties:

1) For any fixed key, if any bit changes in the input, approximately
half of the bits in the output should change, and

More specifically, half the *round key* bits should change, not
the plaintext/ciphertext bits.  See below.

2) For any fixed input, if any bit changes in the key, approximately
half of the bits in the output should change.

I was wondering... are these properties always considered important in
cryptographic functions?

No, they are not requirements in themselves; they are mainly a
consequence of a secure block cipher.  For example, if there is a
linear correlation between input bits and key bits, then obviously
flipping a bit will not change half the bits with even probability
(hence it's linear!).

As for your comment about the master key: it's half the round-key bits
that should change, which in turn would cause half of the ciphertext
bits to change...

Tom


Got questions?  Get answers over the phone at Keen.com.
Up to 100 minutes free!
http://www.keen.com


--

From: [EMAIL PROTECTED]
Subject: where start for engine computer encryption system
Date: Mon, 26 Jun 2000 04:20:01 GMT

I want to take in data off of my car's engine computer, encrypt it,
store it, and only allow me and a "GM" mechanic to get the information
off.  I think we should each be able to get the information without
the other's permission.  Where should I start looking to design such a
system?  What sort of pieces do I need?  What can I buy that would fit
into this puzzle?

I think I would need some sort of device to get the data from the
engine computer (I have that) and then another device to encrypt it.
Would I have to already have a PKI system in place?  Please let me know
what/where I should be looking.

This is a home project I am undertaking.  Fun huh?!  Hope I get through
the first week.  I wanted the best info to start, so I came here.


Many thanks.
--Matt


Sent via Deja.com http://www.deja.com/
Before you buy.

--

Subject: Early Draft of the TC5 paper available
From: tomstd [EMAIL PROTECTED]
Date: Sun, 25 Jun 2000 21:59:15 -0700

My few hours of work are available on my website (see below).  I
have started some basic cryptanalysis, but it is by no means complete.
I would like some help working on this cipher, so anyone with a
few hours to spare, please let me know.

Thanks,
Tom
--
http://www.geocities.com/tomstdenis/



--

From: [EMAIL PROTECTED] (Bill Unruh)
Subject: Re: Quantum computing
Date: 26 Jun 2000 05:37:36 GMT

In [EMAIL PROTECTED] "Douglas A. Gwyn" [EMAIL PROTECTED] writes:

Bill Unruh wrote:
 Uh, no. Error correction itself requires incredibly low-noise qubits. The
 raw error rate must be less than about 10^-4 for one even to dream
 of error correction.

Wrong again.

Interesting argument. I work in quantum computing. What is the basis for
your statement?

--

From: acoola [EMAIL PROTECTED]
Subject: Re: Public key algorithm conversion - does it possible?
Date: Mon, 26 Jun 2000 05:27:07 GMT

In article [EMAIL PROTECTED],
  [EMAIL PROTECTED] (Mark Wooding) wrote:

 It's trivial to derive an ElGamal public key from the private key.  If
 the prime modulus is p, the generator is g, and the private key is x,
 you compute the public key as g^x mod p.

Maybe I'm mistaken, but it seems to me that it's not trivial for a
cryptanalyst to get g.

 ElGamal Encryption
Public Key:
  p  prime (can be shared among a group of users)
  g < p  (can be shared among a group of users)
  y = g^x mod p
Private Key:
  x < p
Encrypting:
  k  chosen at random, relatively prime to p - 1
  a (ciphertext) = g^k mod p
  b (ciphertext) = y^k * M mod p
Decrypting:
  M (plaintext) = b / a^x mod p

The cryptanalyst knows only a, b, M, x and p.
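For concreteness, the scheme just listed can be exercised with toy numbers. (A sketch only: the modulus, keys and message below are tiny made-up values; real parameters must be hundreds of digits.)

```python
# Toy ElGamal encryption/decryption (not secure; illustration only).
p = 467            # prime modulus (shared)
g = 2              # generator (shared)
x = 127            # private key, x < p
y = pow(g, x, p)   # public key y = g^x mod p

M = 100            # plaintext
k = 213            # random per-message, relatively prime to p - 1

a = pow(g, k, p)               # a = g^k mod p
b = (pow(y, k, p) * M) % p     # b = y^k * M mod p

# Decrypt: M = b / a^x mod p, i.e. multiply by the modular inverse of a^x.
M2 = (b * pow(pow(a, x, p), -1, p)) % p
assert M2 == M
```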

 What's the difference between 


Cryptography-Digest Digest #108, Volume #12  Mon, 26 Jun 00 08:13:00 EDT

Contents:
  Re: Public key algorithm conversion - does it possible? (Mark Wooding)
  On a notation issue of Feistel ciphers (Mok-Kong Shen)
  RPK ([EMAIL PROTECTED])
  Re: MD5 Expansion (Mark Wooding)
  Re: TEA-wmlscript question (dexMilano)
  Re: How Uncertain? (Runu Knips)
  Re: DES 64 bit OFB test vectors (Jack Spencer)
  Re: security problem with Win 2000 Encryption File System (Sébastien SAUVAGE)
  Re: DES 64 bit OFB test vectors (Jack Spencer)
  Re: Quantum computing (Rob Warnock)
  Re: XOR versur MOD (Mark Wooding)
  Re: DES and questions (Gerard Tel)
  Re: DES 64 bit OFB test vectors (Mark Wooding)
  Re: TEA-wmlscript question (Mark Wooding)
  Re: Variability of chaining modes of block ciphers (Mark Wooding)
  Re: Variability of chaining modes of block ciphers (Mark Wooding)
  Re: On a notation issue of Feistel ciphers (tomstd)
  Re: RPK (tomstd)
  Key agreement in GSM phones (Gerard Tel)
  Re: Algo's with no easy attacks? (Runu Knips)
  Re: Quantum computing (Runu Knips)
  Re: Idea or 3DES (jungle)
  Has anyone got / read: "The CRC Handbook of Combinatorial Designs" ("Sam Simpson")



From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: Public key algorithm conversion - does it possible?
Date: 26 Jun 2000 08:14:43 GMT

acoola [EMAIL PROTECTED] wrote:

 Maybe I'm mistaken, but it seems to me that it's not trivial for a
 cryptanalyst to get g.

You're right, it's not trivial.  However, it's also irrelevant.  *Any*
generator will do!

I'll restate the various parameters for your system, in more general
terms, and switching around the `public' and `private' labels:

  Possibly shared:
    A cyclic group G, with order q.
  Public key:
    An integer 1 < x < q.
  Private key:
    A generator g of the group G, and the element y = g^x.

  `Signature':
    A pair (a, b) = (g^k, M y^k)
  `Verification':
    b / a^x

The adversary chooses any generator g' of the group G.  This is easy
enough to do.  He computes y' = g'^x and uses the pair (g', y') as his
private key.
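The substitution is mechanical; a toy sketch with small made-up numbers shows an adversary picking his own generator and producing a pair that still `verifies' under the public x:

```python
# Toy demonstration that *any* generator yields a consistent key pair.
p = 467              # small prime; the group is Z_p^* of order p - 1
x = 127              # the public exponent

g2 = pow(2, 3, p)    # adversary's own choice of generator g'
y2 = pow(g2, x, p)   # his matching y' = g'^x

# A `signature' (a, b) = (g'^k, M * y'^k) built from the forged pair...
M, k = 100, 213
a, b = pow(g2, k, p), (M * pow(y2, k, p)) % p

# ...passes `verification' b / a^x, recovering M exactly as a genuine
# pair would, so the secrecy of the original g buys nothing.
assert (b * pow(pow(a, x, p), -1, p)) % p == M
```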

  What's the difference between what you want to do and a digital
  signature?
 
 I'd like to unite the properties of digital signatures and encryption,
 and I want to find out whether it's possible to do both at the same time.

Investigate signature algorithms with message recovery, e.g., RSA.

-- [mdw]

--

From: Mok-Kong Shen [EMAIL PROTECTED]
Subject: On a notation issue of Feistel ciphers
Date: Mon, 26 Jun 2000 10:50:24 +0200


Feistel ciphers having two equal halves, e.g. DES, are
commonly described as follows for the i_th round:

 L_i = R_(i-1)

 R_i = L_(i-1) + F(K_i,R_(i-1))

If one combines two rounds into one combination ('big round'
for short in the following) and denotes the two halves of
the input to the big round with L and R, the two round keys
with K_1 and K_2 and the two halves of the output from the
big round with L' and R', one obtains

 L' = L + F(K_1,R)

 R' = R + F(K_2,L')

With this formulation one clearly sees the nature of the
iteration process involved when successive rounds are
performed, while in the original formulation this is
obscured a little bit by the 'swapping' of the two halves
of the block.
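The big-round formulation can be checked directly in code. A sketch only: XOR plays the role of '+', and F below is an arbitrary placeholder mixing function, not any particular cipher's round function.

```python
def F(k, half):
    """Placeholder round function: any deterministic 32-bit mixing will do."""
    return (half * 0x2545F491 + k) & 0xFFFFFFFF

def big_round(L, R, k1, k2):
    """One 'big round' = two Feistel rounds with the swap folded away."""
    L ^= F(k1, R)        # L' = L + F(K_1, R)
    R ^= F(k2, L)        # R' = R + F(K_2, L')
    return L, R

def big_round_inv(L, R, k1, k2):
    """Invert by undoing the two half-updates in reverse order."""
    R ^= F(k2, L)
    L ^= F(k1, R)
    return L, R

# Round-trip check with made-up keys and block halves.
keys = [(11, 22), (33, 44), (55, 66)]
L, R = 0x01234567, 0x89ABCDEF
for k1, k2 in keys:
    L, R = big_round(L, R, k1, k2)
for k1, k2 in reversed(keys):
    L, R = big_round_inv(L, R, k1, k2)
assert (L, R) == (0x01234567, 0x89ABCDEF)
```

Note that invertibility never depends on F itself being invertible, which is the point of the Feistel construction.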

It may be interesting that using big rounds enables one to
simplify formulation for the (at least theoretically
conceivable, though for practical purpose presumably not
advantageous) case where the block is divided into, say,
three equal parts instead of two. Denoting the three parts
with U, V and W and the three round keys with K_1, K_2
and K_3, we have

 U' = U + F(K_1,V)

 V' = V + F(K_2,W)

 W' = W + F(K_3,U')

M. K. Shen
=
http://home.t-online.de/home/mok-kong.shen


--

From: [EMAIL PROTECTED]
Subject: RPK
Date: Mon, 26 Jun 2000 08:38:16 GMT

This public key cryptosystem seems to fit audio and video
applications very well.
Does anyone know of "real" applications that use it?
Is anyone working on trying to break it? (Just to know how robust this
system is and how reasonable it is to use it.)
Thank you



--

From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: MD5 Expansion
Date: 26 Jun 2000 09:08:29 GMT

David A. Wagner [EMAIL PROTECTED] wrote:

 David Hopwood already posted one attack, so this is probably
 irrelevant, but here's another, just in case you're interested.

I think this works with the `fixed' version.  I'll give up at this
point.

-- [mdw]

--

From: dexMilano [EMAIL PROTECTED]
Subject: Re: TEA-wmlscript question
Date: Mon, 26 Jun 2000 09:13:29 GMT

I love you guys.

I'll make the test and let you know.

Just a question: how can I calculate a golden number (what is the
theory)?  Do you have some reference on the web?
Thanks
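TEA's "golden number" constant is delta = 0x9E3779B9, which is just the golden ratio phi = (1 + sqrt(5))/2 scaled to a 32-bit word; a sketch of the calculation (this reproduces the standard TEA constant, nothing specific to the wmlscript port):

```python
from math import sqrt

# delta = floor(2^32 / phi), where phi is the golden ratio.
phi = (1 + sqrt(5)) / 2           # ~1.6180339887
delta = int(2 ** 32 / phi)

assert delta == 0x9E3779B9        # TEA's published round constant
```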

dex

In article 


Cryptography-Digest Digest #109, Volume #12  Mon, 26 Jun 00 12:13:01 EDT

Contents:
  Re: software protection schemes (Mark Wooding)
  Re: software protection schemes (Runu Knips)
  Re: DES and questions (Eric Young)
  SSL Encryption via Diffe-Hellman or DSA? (Jeffrey Parker)
  Des breaking service ? ("Erik Olssen")
  DES Weakness ? ("Erik Olssen")
  Re: CRC-64 and 128 - Generator Polynomials? (Mack)
  Re: CRC-64 and 128 - Generator Polynomials? (Mack)
  Re: Comments/analysis requested ([EMAIL PROTECTED])
  Re: Encryption on missing hard-drives ("Tony T. Warnock")
  Re: DES Weakness ? (Mark Wooding)
  Re: Quantum computing ([EMAIL PROTECTED])
  Re: Idea or 3DES (dexMilano)
  Re: Idea or 3DES (dexMilano)
  Re: TEA-wmlscript question (dexMilano)
  Re: Weight of Digital Signatures ("Trevor L. Jackson, III")
  After the FIPS140-1 randomness tests for DOS (command line)... ("Sam Simpson")
  Re: TEA-wmlscript question (Mark Wooding)
  Re: After the FIPS140-1 randomness tests for DOS (command line)... (Mark Wooding)
  Re: newbieish question ("Douglas A. Gwyn")
  Re: How Uncertain? ("Douglas A. Gwyn")
  Re: Quantum computing ("Douglas A. Gwyn")
  Re: DES Weakness ? (Pascal JUNOD)
  Re: Idea or 3DES (Simon Johnson)
  Re: Key agreement in GSM phones (David A. Wagner)
  Re: DES Weakness ? (Mark Wooding)
  Surrendering Keys, I think not. (Simon Johnson)



From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: software protection schemes
Date: 26 Jun 2000 12:39:00 GMT

lament [EMAIL PROTECTED] wrote:
 In article g4u35.30$[EMAIL PROTECTED],
 [EMAIL PROTECTED] says...
 
  Almost any software protection scheme running on the PC suffers from
  this same vulnerability.  If the attacker finds the magic bit that
  finally authenticates the user, the software protection will not
  work.
 
 Big "If."

Actually, not a terribly large `if'.  In fact, it's an `if' which looks
a lot more like a `when'.

 Given that billions of instructions transpire, how do you know which
 one(s) are the one?

By analysing the code and using intelligence?

  Any scheme that is deliberately and perversely complex is likely to
  be buggy and slow and still subject to attack by automated
  tools. Dongle protection schemes fall into this category.
 
 "...deliberately and perversely complex" seems to be a necessary
 condition of encryption.

This perception is false.  RC4, RC5 and Blowfish are extremely simple to
think about and implement.  Public key systems are an even better
demonstration of the falseness of your perception.  Diffie-Hellman key
exchange, to pick one example, entirely lacks complexity, deliberate,
perverse or otherwise.

 Is all encryption code "buggy and slow"?

No.  Not at all.

  Most dongle protection schemes have been broken.

 The CAD industry uses dongles. 

I fail to see the relevance of this statement.  The fact that dongles
are still used implies that marketing is more effective than fact, and
this isn't news.

  Good, reliable software protection in the PC environment is a myth.
 
 I guess this depends on what you mean by "good," "reliable" and "a myth." 

Replace `myth' with `fiction'.  Choose your own definitions of `good' and
`reliable'.

-- [mdw]

--

Date: Mon, 26 Jun 2000 14:37:41 +0200
From: Runu Knips [EMAIL PROTECTED]
Subject: Re: software protection schemes

lament wrote:
 In article g4u35.30$[EMAIL PROTECTED], [EMAIL PROTECTED] says...
  [...]  If the attacker finds the magic bit that finally
  authenticates the user, the software protection will not
  work. Usually it is very easy to find this bit. [...]

 Given that billions of instructions transpire, how do you
 know which one(s) are the one?

Ever heard of "debuggers"?  It's not that hard to find the
position where such checks happen, even if you have no
debugging information.  Just track what the app is doing,
and with some searching you'll find those places.

You have to be able to read and write assembly, of course.

--

From: Eric Young [EMAIL PROTECTED]
Subject: Re: DES and questions
Date: Mon, 26 Jun 2000 12:50:17 GMT

rick2 wrote:
 
 I have two more questions about implementing DES in a program. (Feel
 free to Dope-Slap(TM) me if these are too close to outright idiotic.)
 
 1. It turns out that 3-DES is kind of slow, so any speed-up tricks
 would be worth it. So, not very brilliant, but what about 2-DES?
 Would that have an encryption strength of 128 (or should it be 112)
 bits? Geez, that seems pretty strong, no? Especially if people have
 been using only two different keys for 3 rounds of DES, why not just
 do the two with two keys?

I wonder how one does a 'Dope-Slap(TM)' via news :-).

Regarding 2-key vs 3-key: it makes no difference to encryption performance,
only key setup, and DES has a very good key-setup vs encrypt ratio
(unlike Blowfish).  The cost to re-key 3-DES is probably about the same as
the cost to encrypt 24 


Cryptography-Digest Digest #110, Volume #12  Mon, 26 Jun 00 14:13:00 EDT

Contents:
  Re: Compression and known plaintext in brute force analysis   (James Felling)
  Re: Idea or 3DES (Mark Wooding)
  Re: Compression & Encryption in FISHYLAND (James Felling)
  Re: Idea or 3DES (Simon Johnson)
  Re: TEA-wmlscript question ("Douglas A. Gwyn")
  Re: How Uncertain? (James Felling)
  Re: How Uncertain? (Future Beacon)
  Re: where start for engine computer encryption system (Mike Rosing)
  Re: Compression & Encryption in FISHYLAND (SCOTT19U.ZIP_GUY)
  Re: Encryption on missing hard-drives ("CrakMan")
  Re: TEA-wmlscript question (dexMilano)
  TEA question (dexMilano)



From: James Felling [EMAIL PROTECTED]
Subject: Re: Compression and known plaintext in brute force analysis  
Date: Mon, 26 Jun 2000 11:12:59 -0500



Tim Tyler wrote:

 James Felling [EMAIL PROTECTED] wrote:
 : Joseph Ashwood wrote:

 : [...] with compression a slightly different result can occur: we can end up
 : with deflation down to significantly smaller sizes. A quick test with
 : winzip on a large text file gave a size reduction of 65%; more usefully for
 : this, that means the compression resulted in roughly 3 plaintext bytes per
 : compressed byte. Moving outward this leads to 3 blocks per block, so with a
 : (semi)known plaintext, under compression, it is entirely possible that the
 : unicity distance could actually be reduced.

 : Ok, I think I see what you are claiming: that given n encrypted blocks and a
 : plaintext of known character, with compression you may actually need fewer
 : cyphertext blocks than without compression.  I believe this is a possible
 : condition.  OTOH I believe you are missing something here.

 : Given we have a plaintext P composed of blocks p(i), and after compression
 : we have a compressed plaintext C composed of blocks c(i), it may be
 : possible to recognise the code with fewer c(i)'s; however, it would still
 : require more p(i) blocks.

 I wondered if this was what was intended.

 However, it still doesn't seem to make any sense.

 If the compressor actually compresses, then a larger proportion of
 cyphertexts will result in expansion to valid plaintexts (for any given
 length of cyphertext) - so you'll typically need a larger cyphertext (as
 well as a larger original plaintext) to reduce the number of possible
 plaintexts to 1.

I am in agreement with your position; I was just pointing out that even in the
worst case, where compression does NOT buy you more efficient usage of space,
it may still have the effect of increasing the number of plaintext blocks
required to reach unicity.

In my opinion a good compression method can have the effect of increasing the
unicity distance in two ways: 1) by expanding the space of possible inputs to
the cypher in comparison to the raw plaintext (this won't always happen, as it
is possible to have a space of possible inputs in raw plaintext that is wider
than the space of all compressed files, but it is a safe bet most of the time);
and 2) as I suggest above, by making an equal amount of cyphertext go further
in relation to plaintext.
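As a rough sanity check on the text-compression figures being discussed, a general-purpose compressor can be run on a sample. A sketch only: the ratio depends entirely on the input, and the made-up sample below is repetitive, so it compresses far better than typical prose would.

```python
import zlib

# Made-up, deliberately repetitive sample text; real prose compresses less.
text = (b"The quick brown fox jumps over the lazy dog. "
        b"Pack my box with five dozen liquor jugs. ") * 100
packed = zlib.compress(text, 9)

assert len(packed) < len(text)      # it really does compress
ratio = len(text) / len(packed)     # plaintext bytes per compressed byte
```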


 --
 __  Lotus Artificial Life  http://alife.co.uk/  [EMAIL PROTECTED]
  |im |yler  The Mandala Centre   http://mandala.co.uk/  Niagra falls.


--

From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: Idea or 3DES
Date: 26 Jun 2000 16:24:44 GMT

Simon Johnson [EMAIL PROTECTED] wrote:

 It is optimized against both linear and differential cryptanalysis,
 whereas the original form of DES was only secure against differential
 cryptanalysis.

IDEA was designed before linear cryptanalysis was known.  That IDEA is
resistant to linear cryptanalysis is useful and interesting, but doesn't
necessarily imply that this was a design decision.

 Moreover, IDEA doesn't have any steps that might be troublesome in
 software.

Actually, IDEA's multiplication mod 2^{16} + 1 is a real nuisance in
software, even though it's theoretically lovely. ;-)
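The nuisance is the corner case: the operands are 16-bit words but the modulus is 2^16 + 1, so the value 0 has to stand in for 2^16. A minimal sketch of the standard convention (not an optimized implementation; fast code avoids the division with a low/high-halves trick):

```python
def idea_mul(a, b):
    """Multiplication mod 2^16 + 1, with 0 representing 2^16 (IDEA convention)."""
    a = a or 0x10000          # reinterpret the encoding 0 as 2^16
    b = b or 0x10000
    r = (a * b) % 0x10001     # 65537 is prime, so r is never 0
    return r & 0xFFFF         # map 2^16 back to the encoding 0

assert idea_mul(0, 0) == 1    # (-1) * (-1) mod 65537
assert idea_mul(1, 5) == 5
```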

 Even better, all the operations in the F-function work on 8-bit words,

What?  IDEA is entirely based on 16-bit operations, not 8-bit ones.  Are
you getting confused with SAFER?

 IDEA is faster than Triple-DES; DES is perhaps the securest cipher on
 the planet, I say this because it has the weight of 30-35 years of
 analysis.

Which is pretty impressive for a cipher which was designed in the
1970s.  I think you mean 25 years.  ;-)

As for securest ciphers, consider BEAR or LION with a BBS and
HMAC-SHA1.

 In conclusion, though Triple-DES is old and clunky, it's the best out
 there for guaranteed security. If you want a faster, more robust and
 newer algorithm, IDEA is the one to go for.

Don't forget patents: IDEA is patented.

I agree that Triple-DES is the cipher to use for maximum paranoia.  I
thoroughly disagree about IDEA, though.  I suspect that, had Zimmermann
not used IDEA in PGP, it would have been 


Cryptography-Digest Digest #112, Volume #12  Mon, 26 Jun 00 18:13:01 EDT

Contents:
  Re: Compression & Encryption in FISHYLAND (James Felling)
  Re: breaking encryption - help! ("Joseph Smith")
  Re: How Uncertain? (Future Beacon)
  Re: Key agreement in GSM phones (Paul Rubin)
  Re: Surrendering Keys, I think not. (S. T. L.)
  Re: Algo's with no easy attacks? ("Joseph Ashwood")
  Re: Compression and known plaintext in brute force analysis (restatements caused by 
the missing info  thread) ("Joseph Ashwood")
  Re: breaking encryption - help! (Steve Basford)
  Re: Surrendering Keys, I think not. ("Douglas A. Gwyn")
  Re: Weight of Digital Signatures (Shawn Willden)
  Re: On a notation issue of Feistel ciphers (Mok-Kong Shen)
  Re: Key agreement in GSM phones (David A. Wagner)
  Re: RPK (David A. Wagner)
  SSL CipherSuites in Netscape  Internet Explorer (Peter Kwangjun Suk)
  Re: Encryption on missing hard-drives ("Douglas A. Gwyn")
  Re: How Uncertain? ("Douglas A. Gwyn")
  Re: Surrendering Keys, I think not. (Simon Johnson)



From: James Felling [EMAIL PROTECTED]
Subject: Re: Compression & Encryption in FISHYLAND
Date: Mon, 26 Jun 2000 14:40:59 -0500



"SCOTT19U.ZIP_GUY" wrote:

 [EMAIL PROTECTED] (James Felling) wrote in
 [EMAIL PROTECTED]:

  snip
 
  .
 
  I don't follow ya.  I never intended to suggest that compression
  will make it weaker, I just wanted to point out an interesting
  observation about how compression can make a cipher stronger
  (which it can't).
 
 It CAN make a cypher stronger, but it is not guaranteed to.  There are
 advantages regarding unicity distance, and also regarding the amount of
 cyphertext available for analysis, but there are always cases where it
 will actually make your situation worse (such as encrypting highly
 random data -- it will probably worsen the statistical character of your
 data while actually lengthening the plaintext).

 I am of the opinion that compression before encryption is almost always
 a good idea, with special emphasis on the almost.
 

   I may be wrong, but it seems your position is starting to crack.

No.  I will state that 99.9% of the time compression is a good thing.
However, every so often there will be situations in which it is less than
optimal, and such circumstances can be noticed with a reasonable degree of
accuracy, so it is not really an issue.  I would use a headerless compression
of some sort, and if such compression were 1-1 it would be a good thing, but I
would have to consider a number of characteristics of the compressor before
deciding upon ANY compression scheme for a specific application.


 I will go so far as to say that most compression methods in use may be
 of help to the attacker. The main reason is that they give the attacker
 a black-and-white answer as to whether or not the guessed key leads
 to a possible solution.

How is that? They may allow one to discard possibilities more easily (i.e.
decompression fails), but one still needs to make the attempt to decompress.
A much more important concern is how well it does at filling the space of
possible plaintexts.

I concede that the theoretical optimum for compressors is a bijective
compressor, but I would still rather have a highly efficient compressor
that is non-bijective than a poor bijective one.  I feel that your
compressor is an OK compressor, but before setting up any actual application
of that technology, I would have to test it against others as to how it
performs with the likely traffic.

 This is due to the fact that most random files will not
 decompress and then compress back to the stated random file. By using
 most compression schemes you give the attacker, who may know nothing
 about the message being sent, a way to eliminate most files immediately
 without having to check whether the guessed file is a valid file for some
 word, video or audio application. If one chooses a bijective (1-1)
 compressor this problem does not exist. The compression methods
 that have been modified to make them bijective, namely the Huffman, RLE
 and arithmetic ones, also compress better after the methods were modified
 to make them bijective. There is no reason to believe this effect would
 not occur in other compression methods. It is just that people don't
 give it much thought yet. The NSA and phony crypto gods have let people
 put these improvements on the back burner for obvious reasons.

 David A. Scott
 --
 SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
 http://www.jim.com/jamesd/Kong/scott19u.zip
 Scott famous encryption website NOT FOR WIMPS **no JavaScript allowed**
 http://members.xoom.com/ecil/index.htm
 Scott rejected paper for the ACM
 http://members.xoom.com/ecil/dspaper.htm
 Scott famous Compression Page WIMPS allowed ** JavaScript OK**
 http://members.xoom.com/ecil/compress.htm
 **NOTE EMAIL address is for SPAMERS***
 I leave you with this final thought from President Bill 


Cryptography-Digest Digest #113, Volume #12  Mon, 26 Jun 00 23:13:01 EDT

Contents:
  Re: Surrendering Keys, I think not. ("Harvey Rook")
  Re: Surrendering Keys, I think not. (zapzing)
  Re: Compression and known plaintext in brute force analysis (restatements caused by 
the missing info  thread) (zapzing)
  Re: How Uncertain? (Mok-Kong Shen)
  Re: Idea or 3DES (Chem-R-Us)
  Re: Encryption on missing hard-drives (Mike Andrews)
  Re: Weight of Digital Signatures (Rex Stewart)
  Re: Weight of Digital Signatures ("Lyalc")
  Re: How Uncertain? ("ben handley")
  Re: How Uncertain? (wtshaw)
  Re: Compression and known plaintext in brute force analysis (restatements caused by 
the missing info  thread) (wtshaw)
  Re: Surrendering Keys, I think not. (wtshaw)
  unable to use HIGH ENCRYPTION on IIS 5.0 (1024 bit key) ("Phil")
  Re: Idea or 3DES (Steve)



From: "Harvey Rook" [EMAIL PROTECTED]
Subject: Re: Surrendering Keys, I think not.
Date: Mon, 26 Jun 2000 15:21:54 -0700

How do you convince the opponent that you are using an OTP? Because you told
him, and you provided a sample OTP + plaintext? This is security through
obscurity. In the long run it doesn't work.

Harvey Rook
[EMAIL PROTECTED]


"Simon Johnson" [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]... The problem with all
these suggestions, is that all these OTP or
 equivlent keys take an equal space to the message.

 What i'm suggesting is more subtle:

 I'm saying that since the output of a cipher is
 indistinguishable from random data, that it could be used to
 masquerade as a OTP.

 You'd take another document, XOR it with the cipher-text; this
 would give you a dummy key. You then surrender this key, instead
 of the proper key used with the algorithm.

 I think this is a good idea, because the legitimate owner of the
 encrypted file only has to enter their encryption key to decrypt
 the file.  It takes nearly no extra effort to make the false key,
 and doesn't affect normal use of the file at all.

 There is no way they could prove that it isn't an OTP, unless
 they brute-forced the underlying algorithm.





--

From: zapzing [EMAIL PROTECTED]
Subject: Re: Surrendering Keys, I think not.
Date: Mon, 26 Jun 2000 22:15:18 GMT

In article 8j8ac4$pvc$[EMAIL PROTECTED],
  Simon Johnson [EMAIL PROTECTED] wrote:
 In article [EMAIL PROTECTED],
   "Douglas A. Gwyn" [EMAIL PROTECTED] wrote:
  Simon Johnson wrote:
   I was wondering how they would ever be able to *prove* that this
   key is correct. Since one of the requirements for the AES is
   that the output of data encryption produces cipher-text that
   cannot be told apart from random data. If some person said the
   cipher-text was a message encrypted using an OTP, then the
   police must brute-force the underlying algorithm to prove
   otherwise.
 
  The decryption key (which is what must be provided) would produce
  putative plaintext that could readily be validated.  With nearly
  any decent cryptosystem, using the wrong decryption key produces
  "random" noise, not a coherent plaintext, so it would be obvious.

 You're missing my point entirely.

 To point this out:

 Prerequisites:

 A 'good' encryption algorithm and a key, E_k().
 A real piece of plaintext, T_0.
 A piece of non-incriminating plaintext, T_1.

 Method:
 C = E_k(T_0)
 Dummy-Key = C XOR T_1

 The officials can't prove it isn't a one-time pad, so they are forced
 to recover the plaintext using the dummy key provided:
 T_1 = C XOR Dummy-Key.
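The construction can be sketched directly. In this sketch the ciphertext is random bytes standing in for E_k(T_0), on the post's own premise that good cipher output is indistinguishable from random; both plaintexts are made-up placeholders.

```python
from os import urandom

T0 = b"attack at dawn"     # the real plaintext T_0 (made up)
T1 = b"buy more milk!"     # innocuous cover text T_1, same length
assert len(T0) == len(T1)

C = urandom(len(T0))       # stands in for C = E_k(T_0)
dummy_key = bytes(c ^ t for c, t in zip(C, T1))

# Whoever is handed dummy_key "decrypts" C to the cover text:
revealed = bytes(c ^ k for c, k in zip(C, dummy_key))
assert revealed == T1
```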

The police may want to see that the recipient also has a copy
of the OTP. The recipient, however, cannot have a message
he has not received yet. Dummy_Key, you will observe, depends
on the message sent. What happens if the police intercept the
message, preventing the intended recipient from receiving it,
and then demand that the recipient produce the OTP?

--
If you know about a retail source of
inexpensive DES chips, please let
me know,  thanks.



--

From: zapzing [EMAIL PROTECTED]
Subject: Re: Compression and known plaintext in brute force analysis (restatements 
caused by the missing info  thread)
Date: Mon, 26 Jun 2000 22:20:24 GMT

In article [EMAIL PROTECTED],
  [EMAIL PROTECTED] wrote:
 zapzing wrote:
  If you still disagree, I challenge you to
  present a "compression" algorithm that will
  compress *all* files without loss of
  information.

 Actually, it's theoretically impossible, assuming the input alphabet is the
 same as the output alphabet. Otherwise, one could keep feeding the output
 back into the input until you reached a minimum size (1 byte or whatever).


Oh, you gave it away!
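The impossibility is just counting: there are more n-bit strings than strictly shorter strings, so some input must fail to shrink. A quick check of the count:

```python
# 2**n bit strings have length exactly n, but only 2**n - 1 strings
# (including the empty string) have length strictly less than n, so any
# lossless scheme must map some n-bit input to an output at least as long.
n = 16
inputs = 2 ** n
shorter = sum(2 ** k for k in range(n))   # lengths 0 .. n-1

assert shorter == inputs - 1
assert shorter < inputs
```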

 --
 Darren New / Senior MTS  Free Radical / Invisible Worlds Inc.
 San