Cryptography-Digest Digest #200

2001-04-21 Thread Digestifier

Cryptography-Digest Digest #200, Volume #14  Sat, 21 Apr 01 13:13:01 EDT

Contents:
  Re: ANOTHER REASON WHY AES IS BAD (SCOTT19U.ZIP_GUY)
  Re: "UNCOBER" = Universal Code Breaker (Joe H Acker)
  Re: Cryptanalysis Question: Determining The Algorithm? (SCOTT19U.ZIP_GUY)
  Re: Concerning US.A.4979832 (Mok-Kong Shen)
  Re: View from the top ("Dramar Ankalle")
  Re: ANOTHER REASON WHY AES IS BAD ("Tom St Denis")
  Re: Cryptanalysis Question: Determining The Algorithm? ("Tom St Denis")
  Re: "UNCOBER" = Universal Code Breaker (Mok-Kong Shen)
  Re: Random and not random (John Savard)
  Re: View from the top (Michael Davis)
  Re: View from the top ("Dramar Ankalle")
  Re: Random and not random (Mok-Kong Shen)
  Re: View from the top (Michael Davis)
  Re: View from the top ("Dramar Ankalle")
  Re: Better block cipher pre/post whiten step (John Savard)
  Re: Better block cipher pre/post whiten step ("Tom St Denis")
  Re: Better block cipher pre/post whiten step ("Tom St Denis")
  Re: Better block cipher pre/post whiten step (Mok-Kong Shen)
  Re: Better block cipher pre/post whiten step ("Tom St Denis")



From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: ANOTHER REASON WHY AES IS BAD
Date: 21 Apr 2001 15:20:55 GMT

[EMAIL PROTECTED] (Darren New) wrote in [EMAIL PROTECTED]:

SCOTT19U.ZIP_GUY wrote:
   Actually, Tom, as usual you're quite wrong. If one looks at an OTP
 you would have to think of the OTP data itself as part of the
 encryption program, or of the program necessary to make the OTP string.

So why do you think that doesn't apply to the AES cyphers as well?


   Actually, I do think it should include the AES short keys of 256 bits.
Why do you think I mentioned scott19u and its key, which is over a
million bytes in length? If you read the start of the thread you will
see that.

David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


--

From: [EMAIL PROTECTED] (Joe H Acker)
Subject: Re: "UNCOBER" = Universal Code Breaker
Date: Sat, 21 Apr 2001 17:25:48 +0200

Joe H Acker [EMAIL PROTECTED] wrote:

 Joseph Ashwood [EMAIL PROTECTED] wrote:
 
  "Joe H Acker" [EMAIL PROTECTED] wrote in message
  news:[EMAIL PROTECTED]...
   If the random source is truly random, it doesn't do any harm when some
   of its output is discarded, except for a performance slowdown. So yes,
   tests are useful (a) to continually test whether the hardware has failed
   or appears to work correctly, and (b) to protect against the very, very
   bad luck of a true random source happening to output the complete text
   of Shakespeare's Macbeth (very unlikely).
  
  
  Actually it can do a great deal of harm. A short, rather extreme
  demonstration:
  take a perfect random number generator that generates binary bits;
  throw away all the output bits that are 1.
  Is the sequence predictable?
 
 You may not filter the sequence heuristically. I was talking about
 testing large sequences and when they fail the test, discarding them
 *completely*. Discarding an output sequence of a tRNG completely can
 never do any harm, except a performance slowdown, given that the
 discarded sequence is large enough and not just 1 bit in length. That's
 provable.

Sorry about the duplicate answers. After some thinking, I no longer
believe that my claim is provable. It was a quick shot I'd like to
apologize for. Still, I believe that filtering out large sequences that
look very non-random does more good than harm, given that it's
a tRNG and not a pRNG. For example, it doesn't appear to be a security
problem whether a 128-bit sequence has 2^128-2 or 2^128 possible outputs.
But an all-1 or all-0 sequence can be a security problem, because it may
indicate that the tRNG is broken.
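
As an illustration of the block-level check described above, here is a
minimal sketch in C. The get_trng_block() function is a hypothetical
stand-in for a real hardware source; only whole 128-bit blocks are ever
discarded, so at most two of the 2^128 possible outputs are lost.

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define BLOCK_LEN 16   /* 128-bit blocks */

/* Hypothetical stand-in for a hardware TRNG; a real driver would read
 * from the device instead of calling rand(). */
static void get_trng_block(uint8_t block[BLOCK_LEN])
{
    for (int i = 0; i < BLOCK_LEN; i++)
        block[i] = (uint8_t)(rand() & 0xFF);
}

/* Reject only the all-zero and all-one blocks: these are the outputs most
 * likely to indicate a failed source. Discarding these two whole blocks
 * merely shrinks the output space from 2^128 to 2^128-2 values, unlike
 * bit-level filtering, which skews every output. */
static int block_looks_broken(const uint8_t block[BLOCK_LEN])
{
    int all0 = 1, all1 = 1;
    for (int i = 0; i < BLOCK_LEN; i++) {
        if (block[i] != 0x00) all0 = 0;
        if (block[i] != 0xFF) all1 = 0;
    }
    return all0 || all1;
}

int main(void)
{
    uint8_t block[BLOCK_LEN];
    do {
        get_trng_block(block);
    } while (block_looks_broken(block));   /* discard failed blocks completely */

    for (int i = 0; i < BLOCK_LEN; i++)
        printf("%02x", block[i]);
    printf("\n");
    return 0;
}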

Regards,

Erich

--

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Cryptanalysis Question: Determining The Algorithm?
Date: 21 Apr 2001 15:35:19 GMT

[EMAIL PROTECTED] (Leonard R. Budney) wrote in 
[EMAIL PROTECTED]:

The NSA no doubt has a bestiary of bad ciphers, including all hand
ciphers, where the effort of breaking is so trivial that they could
simply run a random message through all of them by brute-force, and
they probably succeed most of the time. (With the advent of PGP, GPG,
etc., hopefully that is changing.)


  Yes Len with PGP G

Cryptography-Digest Digest #200

2000-11-21 Thread Digestifier

Cryptography-Digest Digest #200, Volume #13  Tue, 21 Nov 00 19:13:01 EST

Contents:
  Re: Legal issues for hobbyists (Steve Portly)
  Re: Pseudo random sequence generation for xor encryption (OTP) (David Schwartz)
  Re: A Simple Voting Procedure (David Wagner)
  Re: A Simple Voting Procedure (David Wagner)
  Re: vote buying... ("Tony T. Warnock")
  Re: A Simple Voting Procedure (David Schwartz)
  Capcom Encryptions ("Zen")
  Re: Proof of possession (David Wagner)
  Re: A Simple Voting Procedure (David Schwartz)
  Re: Proof of possession ("Matt Timmermans")
  Re: My new book "Exploring RANDOMNESS" (Richard Heathfield)
  Re: need help classifying my program (Richard Heathfield)
  Re: Pseudo random sequence generation for xor encryption (OTP) (Tom St Denis)
  Re: A poorman's cipher (Tom St Denis)
  Re: Legal issues for hobbyists (Tom St Denis)
  Re: Randomness from key presses and other user interaction (Terry Ritter)
  Re: Q: fast block ciphers (Tim Tyler)
  Re: Q: fast block ciphers (Tim Tyler)
  Re: Pseudo random sequence generation for xor encryption (OTP) (Simon Johnson)
  Re: A poorman's cipher (Mack)
  Re: Randomness from key presses and other user interaction (David Schwartz)
  RE: More about big block ciphers ("Manuel Pancorbo")
  Re: Legal issues for hobbyists (Mack)



From: Steve Portly [EMAIL PROTECTED]
Subject: Re: Legal issues for hobbyists
Date: Tue, 21 Nov 2000 16:56:52 -0500



David Schwartz wrote:

 John Savard wrote:

  It's when you are a little guy who wants to let people download off
  the Internet that you have a problem.

 It depends upon how little. If you can get enough download volume, you
 may be able to get your product classified as 'retail', in which case
 there are no key-length limits. Unless your key length is your selling
 point, you should be able to make a restricted key length version, build
 volume, and then apply for retail status.

 DS

As Tom St. Denis pointed out elsewhere in this thread, creating an efficient
cipher is quite difficult. I believe there is a subset of individuals who could
write an expensive-key program in C without being able to stringently classify
it.  About 5 percent of the population could program an expensive-key system
that would exceed NIST guidelines if it were presented for classification.  A
cipher that is only 80% efficient would still be dangerous in the hands of a
terrorist if the cipher is strong.



--

From: David Schwartz [EMAIL PROTECTED]
Subject: Re: Pseudo random sequence generation for xor encryption (OTP)
Date: Tue, 21 Nov 2000 14:20:23 -0800


Ivan Skytte Jørgensen wrote:
 
 What is the best way of producing a pseudo random sequence for use with
 xor encryption?
 
 Huge precomputed sequences stored on each computer seem impractical.
 
 I was thinking of seeding a random generator with a shared secret
 combined with a reasonably sized per-session "unique" seed, and
 periodically re-seeding the random generator with pieces of the
 cleartext.
 
 Both ends of the encryption channel are considered secure. The cleartext
 has some structure, so some of the cleartext will be known by
 attackers, but not all of it.

I would suggest using rc4/arcfour. It already does all of this and is
well understood and well tested.
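
For reference, here is a minimal sketch of the RC4/arcfour keystream
generator being suggested: the key-scheduling step followed by the output
generation. This is the textbook algorithm rather than any particular
library's API; in practice you would also discard the first part of the
keystream and never reuse a key for two messages.

#include <stddef.h>
#include <stdint.h>

typedef struct { uint8_t S[256]; uint8_t i, j; } rc4_state;

/* Key-scheduling algorithm (KSA): permute S under the key. */
static void rc4_init(rc4_state *st, const uint8_t *key, size_t keylen)
{
    for (int k = 0; k < 256; k++)
        st->S[k] = (uint8_t)k;
    uint8_t j = 0;
    for (int k = 0; k < 256; k++) {
        j = (uint8_t)(j + st->S[k] + key[k % keylen]);
        uint8_t t = st->S[k]; st->S[k] = st->S[j]; st->S[j] = t;
    }
    st->i = st->j = 0;
}

/* Pseudo-random generation algorithm (PRGA): XOR the keystream into buf,
 * so the same call encrypts and decrypts. */
static void rc4_crypt(rc4_state *st, uint8_t *buf, size_t len)
{
    for (size_t n = 0; n < len; n++) {
        st->i = (uint8_t)(st->i + 1);
        st->j = (uint8_t)(st->j + st->S[st->i]);
        uint8_t t = st->S[st->i]; st->S[st->i] = st->S[st->j]; st->S[st->j] = t;
        buf[n] ^= st->S[(uint8_t)(st->S[st->i] + st->S[st->j])];
    }
}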

DS

--

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: A Simple Voting Procedure
Date: 21 Nov 2000 22:26:21 GMT
Reply-To: [EMAIL PROTECTED] (David Wagner)

David Schwartz  wrote:
   But with a receipt scheme, they can only claim that the receipt is
their receipt. They can't prove it.

Then what good is the receipt?  I don't see any point in having a receipt
if it doesn't prove anything!

One problem is that the goalposts seem to keep changing; every time
someone points out a problem, you change your proposal of how to build a
voting system, and I find it difficult to keep track of all the changes.

Why don't you work out a detailed, concrete proposal, listing all the
security goals and properties of a specific proposed scheme, and describe
why you feel it is immune to all of the attacks discussed here and elsewhere?

--

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: A Simple Voting Procedure
Date: 21 Nov 2000 22:27:49 GMT
Reply-To: [EMAIL PROTECTED] (David Wagner)

David Schwartz  wrote:
Paul Rubin wrote:
But with a receipt scheme, they can only claim that the receipt is
  their receipt. They can't prove it.
 
 DUH, then they can't establish how they voted.

   Sure they can. They can, for example, take a photograph of the receipt
being printed with them in the picture and in the voting booth.

Well, look, then that very picture _is_ itself a proof of how you voted,
and it allows vote coercion and vote buying.

Look, you can't have it both ways

Cryptography-Digest Digest #200

1999-09-08 Thread Digestifier

Cryptography-Digest Digest #200, Volume #10   Wed, 8 Sep 99 13:13:04 EDT

Contents:
  Re: Different Encryption Algorithms (John Savard)
  Re: Linear congruential generator (LCG)
  Re: compression and encryption (Patrick Juola)
  Re: THE NSAKEY (jerome)
  Re: Linear congruential generator (LCG)
  Re: compression and encryption (SCOTT19U.ZIP_GUY)
  Re: Confused about public key encryption (DJohn37050)
  Re: NSAKEY as an upgrade key  (Was: NSA and MS windows) ("Trevor Jackson, III")
  Re: Different Encryption Algorithms (Anton Stiglic)
  Re: GnuPG 1.0 released
  Re: Hash of a file as key (Anton Stiglic)
  Re: GnuPG 1.0 released (JPeschel)
  Re: NSA and MS windows (SCOTT19U.ZIP_GUY)
  Re: Hash of a file as key ("Richard Parker")
  Re: MUM III (3 Way Matrix Uninvertable Message) (Tom St Denis)
  Re: argument against randomness (Tim Tyler)
  Re: THE NSAKEY (SCOTT19U.ZIP_GUY)
  Re: Random and pseudo-random numbers
  Re: NSA and MS windows (Patrick Juola)
  Re: Linear congruential generator (LCG) (Tim Tyler)
  Re: Random and pseudo-random numbers (Tim Tyler)
  Re: compression and encryption (Tom St Denis)



From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Different Encryption Algorithms
Date: Wed, 08 Sep 1999 15:41:39 GMT

"entropy" [EMAIL PROTECTED] wrote, in part:

I'm doing a high school research paper on different encryption algorithms,
such as CAST, IDEA, blowfish, RCx, DES, etc.   Could anyone point me to
informative web sites pertaining to the differences between these encryption
methods?

IDEA, DES, RC6, and Blowfish are described on my web page, if that
helps.

John Savard ( teneerf- )
http://www.ecn.ab.ca/~jsavard/crypto.htm

--

From: [EMAIL PROTECTED] ()
Subject: Re: Linear congruential generator (LCG)
Date: 8 Sep 99 14:24:24 GMT

David Goodenough ([EMAIL PROTECTED]) wrote:
: However, on the subject of LCGs, I seem to remember that a certain
: operating system once used an LCG as its "random number generator"
: (sic), which had a strong tendency for the lower bits to go in a
: repeating cycle.  Is this just a property of that particular poor
: choice of a and b, or is this a problem with all LCGs?

Yes, it is a problem with them all (at least with the usual power-of-two
modulus): each bit depends only on itself and the less significant bits,
and thus the last n bits always cycle through 2^n states (or fewer).
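
A quick way to see this for yourself, in C: the constants below are the
well-known Numerical Recipes LCG, chosen here simply as an example of a
power-of-two-modulus generator; printing the low 3 bits of successive
outputs shows them repeating with period at most 8.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t x = 12345;                        /* arbitrary seed */
    for (int i = 0; i < 16; i++) {
        x = 1664525u * x + 1013904223u;        /* x = (a*x + c) mod 2^32 */
        printf("%u ", x & 7u);                 /* low 3 bits: period <= 8 */
    }
    printf("\n");
    return 0;
}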

John Savard

--

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: compression and encryption
Date: 8 Sep 1999 11:14:49 -0400

In article 7r5jp2$[EMAIL PROTECTED],
Shaun Wilde [EMAIL PROTECTED] wrote:

should I compress my data before or after encryption? (Binary data, with
possibly repeated blocks, i.e. .exe files etc.)

1) If I compress before encryption, the final data block is small.
2) If I compress after encryption, the data block is much larger (hardly any
saving, as the encryption removes any repetitiveness that exists in the
original data).

From the above I would say go for the first option; however, I have a concern,
and it is as follows.

If someone were trying to break the encryption, all they would have to do is:

a) try a key
b) try to decompress
c) if decompression works (no errors), the odds are that they have broken the
code; else repeat.

This would lead to an automated attack, whereas the second approach would, in
my opinion, require a more interactive approach, as you would need to know
what sort of data exists in the original to know whether you have decrypted
successfully.

Six of one, half a dozen of the other.  Lots of headerless compression
methods exist, under which almost any input will decompress without error.
Similarly, "what sort of data exists" is a sufficiently general question,
to which sufficiently general answers are known, that automated attacks
are extremely practical.

In general, the result of an incorrect decryption with a good algorithm
is almost always "indistinguishable from random."  All I need is
a good test for randomness, and if the test fails, then I'm on to
something -- this applies irrespective of whether my plaintext is
English, German, .EXE files, or financial records.

I'd recommend precompression on the can't-hurt-may-help theory.
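
To illustrate the kind of automated check described above, here is a small
sketch in C; the threshold is illustrative rather than tuned. A chi-square
test on byte frequencies flags a candidate decryption whose bytes are far
from uniform, whatever the underlying plaintext language or format.

#include <stddef.h>
#include <stdint.h>

/* Chi-square statistic of byte frequencies against a uniform distribution.
 * For random-looking data of reasonable length the statistic stays near 255
 * (the degrees of freedom); structured plaintext such as English text or
 * .EXE code scores far higher. */
static double chi_square_bytes(const uint8_t *buf, size_t len)
{
    size_t count[256] = {0};
    for (size_t i = 0; i < len; i++)
        count[buf[i]]++;

    double expected = (double)len / 256.0;
    double chi = 0.0;
    for (int b = 0; b < 256; b++) {
        double d = (double)count[b] - expected;
        chi += d * d / expected;
    }
    return chi;
}

/* Illustrative decision rule: a candidate decryption that is clearly
 * non-uniform is worth a closer look. */
static int looks_nonrandom(const uint8_t *buf, size_t len)
{
    return chi_square_bytes(buf, len) > 2.0 * 255.0;
}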

-kitten



--

From: [EMAIL PROTECTED] (jerome)
Subject: Re: THE NSAKEY
Date: 8 Sep 1999 14:50:56 GMT

On Wed, 08 Sep 1999 11:43:36 GMT, Tom St Denis wrote:

You might not believe this but David Wagner is a smart, talented person.  He
is not 'attached' to Bruce as you might think, he has done work with Rivest
and others as well.  I think this thread is way out of line.


http://www.counterpane.com/cpaneinfo.html lists D. Wagner as part of
Counterpane personnel, and B. Schneier is the president of Counterpane.

I don't take a position in this debate; I simply point out that cryptography
is a small world and it isn't exactly fair

Cryptography-Digest Digest #200

1999-03-08 Thread Digestifier

Cryptography-Digest Digest #200, Volume #9Mon, 8 Mar 99 03:13:04 EST

Contents:
  Re: Has anyone given easy-to-understand descriptions of encryption methods? (John Savard)
  Re: British Crypto Fascists (R. Knauer)
  Re: wipe free space (Albert P. Belle Isle)
  Re: RNG = encryption ("Steve Sampson")
  Re: British Crypto Fascists (R. Knauer)
  Quantum PRNG (R. Knauer)
  Re: checksum algorithm ? (wtshaw)
  Re: RNG = encryption (wtshaw)
  CRYPTO, TRADEMARKS, BRAND NAMES & DOMAIN NAMING ([EMAIL PROTECTED])
  Re: Testing Algorithms [moving off-topic] (Somniac)
  Re: Client-server encryption key negotiation...? ("Chris Odom")



From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Has anyone given easy-to-understand descriptions of encryption methods?
Date: Sun, 07 Mar 1999 21:38:49 GMT

[EMAIL PROTECTED] wrote, in part:

I've been searching the Web and Usenet for some time, but I have not found
what I've been looking for--a detailed description of encryption methods in
language that I can understand.  It would be nice to read a description of a
particular algorithm, say blowfish, broken down into steps and described in
terms understandable to people who don't already know the jargon of
cryptography or advanced mathematics.

My web site comes as close to what you are looking for as anything on
the Web might, if I may be so bold as to say so myself.

I definitely avoid advanced mathematics wherever possible. As for the
jargon of cryptography, you should be able to pick up some of that
from my pages as well, but I've striven to be as nontechnical as
possible.

John Savard (teneerf is spelled backwards)
http://members.xoom.com/quadibloc/index.html

--

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: British Crypto Fascists
Date: Mon, 08 Mar 1999 01:10:30 GMT
Reply-To: [EMAIL PROTECTED]

On Sun, 07 Mar 1999 23:01:43 GMT, [EMAIL PROTECTED] (Haldor) wrote:

I wonder if anyone behind this nonsense knows anything about crypto.
The only reason crime pays is that the people fighting it are stupider
than criminals.

I think we all know the answer to that question. Here in Canada they
are hemming and hawing over the question as well, likewise driven by
earnest but ignorant bureaucrats and politicians. 

The only reason the truth wins out is that it is not embraced by
criminals, and that by definition includes all politicians.

Bob Knauer


"The smallest minority on earth is the individual. Those who deny individual
rights cannot claim to be defenders of minorities."
-- Ayn Rand

--

From: [EMAIL PROTECTED] (Albert P. Belle Isle)
Subject: Re: wipe free space
Date: Mon, 08 Mar 1999 01:35:47 GMT
Reply-To: [EMAIL PROTECTED]

On 7 Mar 1999 18:46:55 -, brandon [EMAIL PROTECTED] wrote:

Over the last year or so I've come across lots of different programs for wiping
free space on hard disks. First there was BCwipe, then Eraser, Scramdisk, PGP,
even Norton Utilities has this option if I remember correctly.

Some programs take a very long time to complete (Eraser, for example), while
others are quite brisk little buggers. Are the quickies not doing a proper job,
or are the long hauls doing unnecessary overtime? Obviously the best one takes
the least time possible to do the job properly.

Can anyone shed light on this one? Which is the best program to use?


Brandon:

For a given number of overwrites-per-sector with pre-determined
patterns, all overwrite functions should be limited by the disk
transfer rate (PIO3, PIO4, UltraDMA33, UltraDMA66, etc.).

Consequently, for the same disk accessed through the same driver by
the same operating system, the overwrite rate to a given standard
should be the same for all the programs which you're trying. 

If one appears to magically complete its passes at a higher rate
(usually accompanied by much-less-or-no disk activity), it usually
means yet another clueless implementation without cache-flushing of
each overwrite data buffer.
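
As an illustration of the flushing point, here is a POSIX sketch of a single
overwrite pass with one fixed pattern (real wipers make several passes with
different patterns, and on Windows must also deal with the VCACHE behaviour
described below): each buffer is forced out of the OS cache before the next
one is written.

#include <fcntl.h>
#include <string.h>
#include <sys/stat.h>
#include <unistd.h>

/* One overwrite pass over an existing file, calling fsync() after every
 * buffer so the data actually reaches the disk instead of sitting in the
 * write-back cache. */
static int wipe_pass(const char *path, unsigned char pattern)
{
    struct stat st;
    if (stat(path, &st) != 0)
        return -1;

    int fd = open(path, O_WRONLY);
    if (fd < 0)
        return -1;

    unsigned char buf[4096];
    memset(buf, pattern, sizeof buf);

    off_t remaining = st.st_size;
    while (remaining > 0) {
        size_t n = remaining < (off_t)sizeof buf ? (size_t)remaining : sizeof buf;
        if (write(fd, buf, n) != (ssize_t)n) { close(fd); return -1; }
        if (fsync(fd) != 0) { close(fd); return -1; }   /* flush, don't just "commit" to cache */
        remaining -= (off_t)n;
    }
    close(fd);
    return 0;
}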

The "diskwipe.c" module in 32-bit PGP 5.5 for Windows was apparently
written by a "Windows programmer" who didn't understand exactly what
those nifty-looking mapped file functions in the Win32 API really do.
Consequently, all its "secure file-wiping" isn't.

 (Hint - they "commit" to VCACHE - not flush to disk. You can see the
same mistake in the memory allocation/deallocation part of the beta
source code for Bruce Schneier's "Yarrow" PRNG.)

The old reliable 16-bit PGP2.63 has properly-written cache-flushing
calls, and works reliably under DOS. However, many people peddling
"Windows front-ends" for it are apparently oblivious to the fact that
Win95's VCACHE ignores 16-bit cache-flushing calls (contrary to
assurances of "backwards compatibility" by MSFT technical pe