Cryptography-Digest Digest #817, Volume #8       Wed, 30 Dec 98 18:13:03 EST

Contents:
  Re: New Book "The Unknowable" (R. Knauer)
  Re: Security through obscurity in the smartcard world ([EMAIL PROTECTED])
  Re: Security through obscurity in the smartcard world (John Savard)
  Why no Standard C/R Password Protocol? (John Savard)
  Re: All-or-nothing encryption idea? ([EMAIL PROTECTED])
  Re: Anybody heard of these people? ("Sam Simpson")
  First issue of Journal of Craptology (Lars Ramkilde Knudsen)
  MTP Complex Number Cipher (R. Knauer)
  Re: U.S. Spying On Friend And Foe (Jim Dunnett)
  Re: On leaving the 56-bit key length limitation ([EMAIL PROTECTED])
  Re: Free ENCRYPTION Programs (32b) (David Hamilton)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: New Book "The Unknowable"
Date: Wed, 30 Dec 1998 20:10:15 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 30 Dec 1998 19:16:19 GMT, [EMAIL PROTECTED] wrote:

>Correct!  (That's assuming, of course, that the formal system
>only proves true theorems, i.e., is truthful.)
>Rgds,
>GJC
>P.S.  The sufficiently large t turns out to be the complexity
>of the formal axiomatic system that is being used, plus a
>constant c that doesn't depend on the choice of the formal
>axiomatic system.

While we have you here, perhaps you might like to comment on something
I just posted in another thread on sci.crypt the other day.

But first let me point out that I have been a fan of your theories for
some time now, having first learned of them on sci.crypt over a year
ago when we had a huge discussion/debate on crypto-grade randomness
going. Since then I have read your major papers on randomness and its
relation to algorithmic complexity.

The post below was addressing the use of random numbers for the One
Time Pad cryptosystem. Here is that post with minor typos corrected:

+++++
.... it seems that Chaitin's definition of randomness does no good for
purposes of crypto. Consider the following procedure.

Begin with a plaintext and compress it using the best compression
procedure available. Call this sequence CP for Compressed Plaintext.
The assumption is that CP is the minimal representation for the
original plaintext based on compression.

Now employ algorithmic complexity by finding the minimal algorithm
that will reproduce CP. But that algorithm is just:

print (CP)

because there is no other algorithm that can reduce CP to a smaller
size.

Therefore CP is a random ciphertext which cannot be deciphered based
on any discernible pattern, since it is "random" according to
algorithmic complexity - that is, the algorithm that produces it cannot
be made substantially smaller.

IOW whatever algorithm is employed to reproduce the sequence must
contain the literal sequence without any reduction in size, since it
is already as small as it can get by prior compression (see Chaitin,
op. cit.). So the cryptanalyst faces an impossible task since there is
no pattern for him to use to decipher the message.

But we know better - the cipher can be broken by decompressing it.
Therefore it would seem that randomness, based on algorithmic
complexity, is not a suitable measure of crypto strength.
+++++
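The gap described above - between looking algorithmically random and being cryptographically secure - can be illustrated in a few lines. This is only a toy sketch, using zlib as a stand-in for "the best compression procedure available":

```python
import zlib

# Toy illustration: a well-compressed text has little redundancy left
# for a generic pattern-finder to exploit, yet anyone who knows (or
# guesses) the compression scheme inverts it trivially.
plaintext = b"the quick brown fox jumps over the lazy dog " * 20
cp = zlib.compress(plaintext, 9)   # "CP": a near-minimal representation

# CP is much shorter than the plaintext and looks statistically random...
assert len(cp) < len(plaintext)

# ...but it is not a secure ciphertext: decompression recovers everything.
assert zlib.decompress(cp) == plaintext
```

The "minimal algorithm" for CP in the post's sense is essentially `print(CP)` plus the decompressor, which is exactly the pattern the cryptanalyst exploits.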

Your comments, please.

Bob Knauer

"He knows nothing and he thinks he knows everything. That points
clearly to a political career."
--George Bernard Shaw


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Security through obscurity in the smartcard world
Date: Wed, 30 Dec 1998 20:12:13 GMT

In article <76b57t$[EMAIL PROTECTED]>,
  Gary Howland <[EMAIL PROTECTED]> wrote:
> ...
> But I don't want to get into the GSM debate.  I want comment on
> the principle of adding security (i.e. cost and expense) using
> obscurity.  My reason for wanting debate, first on the question of
> "is security through obscurity" valid, and second on "how?", is
> because I intend to write a report on the benefits of obscurity,
> and the techniques one should use when creating proprietary
> algorithms.  So in addition to comments on the validity of the
> proprietary algorithms, I would also like comments on the best ways
> of creating such proprietary algorithms.  I am going to outline a
> few of the techniques in my paper, such as the subtle modification
> of existing algorithms, adding (better than removing) rounds,
> dangers of mixing algorithms that have weak keys, etc.  Any comments
> on the design of proprietary algorithms based on tried and trusted
> algorithms would be much appreciated. (...)

    Before breaking a non-trivial system, an attacker must analyze it.
    Obscurity is a type of defense that makes analysis difficult.
    There are basically two avenues here:

 1. Try to keep the system secret. Generally speaking this is not a
    good idea because in most cases we must assume that the attacker
    will gain access to the encryption mechanism and will be able to
    analyze it anyway. If you are confident that the attacker will not
    gain access then, by all means, you should keep your system
    secret. An argument used here is that most people or organizations
    are not able to develop secure systems themselves without the
    benefit of peer review. This certainly is correct but there is a
    good solution: use a peer reviewed system that is meant to be
    modified by the user. Here is an idea: modify 3DES by including
    two invertible functions you wrote yourself between the DES
    encryptions.

 2. Try to make analysis difficult even if the design of the system is
    public. This, I believe, is a valid design principle. The basic
    idea is to design a "variable" system, i.e. a system that changes
    its profile depending on the secret key. Here are some ideas on
    how to achieve this with a symmetric cipher:

2.1 Define an enormous number of encrypting functions and choose one
    depending on the secret key. An example: Define a set of 256 DES
    keys which may be known to the attacker. Now use as a cipher 16DES
    (encrypt 16 times using DES) and for each encryption choose one of
    the 256 available DES keys using the secret key as an index. You
    will need 16 indexes of 8 bits each, so the secret key will have
    128 bits. The attacker knows everything except the secret key -
    still it is not clear how he can attack this cipher.
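The key-indexing structure of idea 2.1 can be sketched briefly. This is a hedged sketch only: a keyed XOR pad stands in for DES (a real implementation would use an actual cipher from a crypto library), and all names are illustrative:

```python
import hashlib

# Idea 2.1: a pool of 256 *public* per-round keys; the 128-bit secret
# key is used only as 16 indexes (8 bits each) into that pool.
POOL = [hashlib.sha256(b"public round key %d" % i).digest()
        for i in range(256)]                    # known to the attacker

def toy_round(block: bytes, round_key: bytes) -> bytes:
    # Stand-in for one DES encryption: XOR with a key-derived pad.
    assert len(block) <= 32
    pad = hashlib.sha256(round_key + b"pad").digest()[:len(block)]
    return bytes(a ^ b for a, b in zip(block, pad))

def encrypt_16x(block: bytes, secret_key: bytes) -> bytes:
    assert len(secret_key) == 16                # 16 indexes = 128-bit key
    for idx in secret_key:                      # pick 1 of 256 pool keys
        block = toy_round(block, POOL[idx])
    return block

def decrypt_16x(block: bytes, secret_key: bytes) -> bytes:
    for idx in reversed(secret_key):            # XOR rounds self-invert
        block = toy_round(block, POOL[idx])
    return block
```

With `secret = bytes(range(16))`, `decrypt_16x(encrypt_16x(msg, secret), secret)` returns the original block; the attacker sees POOL and the algorithm, but not the 16 indexes.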

2.2 Analyze cryptanalytic methodology and design a very complicated
    cipher that is just too difficult to analyze. This is a no-no in
    academic cryptology, but all the same I think it is a valid
    method. Such a cipher will almost certainly *not* attract public
    cryptanalysis and the accepted wisdom is that a cipher should be
    considered secure only if it has been cryptanalyzed extensively by
    competent cryptanalysts. This is a convincing argument but there
    are two possible responses: a) one could try to quantify how
    difficult it is to analyze the cipher and use this as a measure of
    the cipher's security, b) one can always combine this obscure
    cipher with 3DES, the most analyzed cipher in history.

2.3 Design a cipher where the sequence of arithmetic operations
    applied to the plaintext is not constant but variable depending on
    the key. The AES candidate FROG (http://www.tecapro.com/aesfrog.htm)
    is such a design because it changes the destination of operations
    depending on the key. A previous design of mine called GodSave
    (http://www.tecapro.com/godsave.htm) uses massive overkill (at the
    price of speed) and is extremely variable. In fact the sequence of
    operations applied to the plaintext depends both on the key and the
    plaintext itself. I wrote this cipher before becoming acquainted
    with academic cryptology; GodSave is also very difficult to
    analyze (point 2.2 above).

2.4 Design a "cipher generator". The cipher generator itself is peer
    reviewed but the individual ciphers are kept secret. The idea here
    is to produce either source code or executable modules using a
    true random input. This is not as farfetched as it sounds: GodSave
    uses several primitive hash functions that were machine produced.

--
http://www.tecapro.com
email: [EMAIL PROTECTED]

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Security through obscurity in the smartcard world
Date: Wed, 30 Dec 1998 18:49:43 GMT

Gary Howland <[EMAIL PROTECTED]> wrote, in part:

>But I don't want to get into the GSM debate.  I want comment on
>the principle of adding security (i.e. cost and expense) using
>obscurity.  My reason for wanting debate, first on the question of
>"is security through obscurity" valid, and second on "how?", is
>because I intend to write a report on the benefits of obscurity,
>and the techniques one should use when creating proprietary
>algorithms.  So in addition to comments on the validity of the
>proprietary algorithms, I would also like comments on the best ways
>of creating such proprietary algorithms.  I am going to outline a
>few of the techniques in my paper, such as the subtle modification
>of existing algorithms, adding (better than removing) rounds,
>dangers of mixing algorithms that have weak keys, etc.  Any comments
>on the design of proprietary algorithms based on tried and trusted
>algorithms would be much appreciated.

Mostly, the accepted wisdom is that security through obscurity is
completely invalid, and should be avoided as a snare and a delusion.

I don't go quite that far. I think that with conservative design
practices, one can come up with a proprietary algorithm that is close
enough to one that has been well analyzed to be reasonably safe.

If one wished to base one's cipher on DES, of course, one had better
use larger S-boxes, to avoid needing to do extensive analysis.

But what you must also do is ensure that your design does not *depend*
on obscurity for its security. The obscurity should be only an
additional security measure, never critical to the security of your
design.

John Savard
http://www.freenet.edmonton.ab.ca/~jsavard/index.html

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Why no Standard C/R Password Protocol?
Date: Wed, 30 Dec 1998 18:49:31 GMT

Since hash functions aren't export controlled, why isn't there a nice,
simple, non-proprietary standard for entering passwords over the
Internet that doesn't require sending passwords in the clear?

Asking myself this question, I've come up with some answers...

1) Suppose we just use a simple keyed hash function on the password
itself. The challenge is the key, the response the hash.

Now, the system at the other end of the line has to have a copy of the
password itself - rather than a hash of the password - stored to check
if the response was correct.

2) OK, let's store the passwords at the other end hashed.

I type the password in at a dialog box at my end. My program hashes it
first by the fixed hash method, and then hashes it a second time, via
the keyed hash method, in response to the challenge.

At the other end of the line, only the hashed version of the password
is stored - but it's enough for someone to log in with, using a hacked
version of the program I was using, that takes the hash, rather than
the password, as input.


At this point, I began to wonder - would public-key cryptography, or
at least true encryption (after all, the password is a shared secret)
be required to get around this problem?

But then, I saw that a _partial_ improvement could be made.

3) A hash is stored at the other end, using salt. Along with a key to
a second keyed hash, the salt is also mentioned in the challenge
message.

Now, as my password is hashed with that system's salt, someone with
access to that system's password file can still get enough information
to enable them to sign on to that system with my password - but other
systems with which I use that same password, if any, will have stored
it with different salt values.
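Scheme 3 can be sketched with standard-library primitives. A sketch only - the function names are illustrative, not any standard protocol:

```python
import hashlib
import hmac
import os

# Scheme 3: the server stores hash(salt, password); the challenge
# carries both the salt and a fresh nonce; the response is a keyed
# hash of the stored value.

def stored_verifier(password: str, salt: bytes) -> bytes:
    # The fixed, salted hash kept in the "password" file.
    return hashlib.sha256(salt + password.encode()).digest()

def make_challenge(salt: bytes):
    return salt, os.urandom(16)               # (salt, nonce) sent to client

def client_response(password: str, salt: bytes, nonce: bytes) -> bytes:
    inner = stored_verifier(password, salt)   # first hash: fixed, salted
    return hmac.new(nonce, inner, hashlib.sha256).digest()  # second: keyed

def server_check(verifier: bytes, nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(nonce, verifier, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

The residual weakness described above survives in the sketch: anyone holding the stored verifier can compute a valid response to any challenge, but verifiers from sites using different salts are not interchangeable.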


There's still a loss of security. In the original case, where the
communications line was secure, the password file didn't enable people
to type the password - and everything was under the control of the one
trusted computer during logons.

The problem seems to require public-key cryptography: store a public
key in the "password" file, and you can verify a signature without
being able to forge it.

And that explains the lack of a simple non-proprietary standard. But
perhaps I'm missing some simple idea, some way for A to prove to B
that he knows something that B does not know, although B does know
something derived from it.

John Savard
http://www.freenet.edmonton.ab.ca/~jsavard/index.html

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: All-or-nothing encryption idea?
Date: Wed, 30 Dec 1998 20:20:25 GMT

In article <[EMAIL PROTECTED]>,
  Darren New <[EMAIL PROTECTED]> wrote:
> Just thinking on it, would it be reasonable to use the following scheme
> to encrypt data such that the entire file must be decrypted at least
> once before one could check for a valid decryption?
>
> 1) Generate a random stream of data R of the same length as the
> plaintext P.
>
> 2) Hash R with MD5 or SHA-1 or some other suitable cryptgraphic hash.
>
> 3) Use the result of that hash as the key to encrypt P once, yielding I.
>
> 4) Interleave the random stream R with the intermediate text I on a
> byte-by-byte basis, such that the final file is twice as big, consisting
> of R[0]:I[0]:R[1]:I[1]:R[2]:I[2]:..., yielding W (for lack of a better
> letter).
>
> 5) Encrypt W with the shared-secret key, yielding the cyphertext to be
> sent.
>
> It would seem you need to decrypt the entire cyphertext to find all the
> bytes of the key to test?  Would it be possible to crack the beginning
> of I without knowing the end? Does this help at all?
>
> Of course, this doubles the file size, but a similar technique could be
> used just putting one bit of R for each block (minus one bit) of P.
>
> --
> Darren New / Senior Software Architect / MessageMedia, Inc.
> Overheard: "Does that go on the parallel port or the sushi cable?"
>
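The five quoted steps can be sketched as follows. A minimal sketch only: a SHA-256 counter keystream stands in for the block cipher (any real cipher would do), and the function names are illustrative:

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Stand-in stream cipher: XOR against a SHA-256 counter keystream.
    out = bytearray()
    for i in range(0, len(data), 32):
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
    return bytes(a ^ b for a, b in zip(data, out))

def aon_encrypt(p: bytes, shared_key: bytes) -> bytes:
    r = os.urandom(len(p))                    # step 1: random stream R
    inner_key = hashlib.sha256(r).digest()    # step 2: hash R
    i = keystream_xor(inner_key, p)           # step 3: encrypt P -> I
    w = bytes(b for pair in zip(r, i)         # step 4: interleave R and I
              for b in pair)
    return keystream_xor(shared_key, w)       # step 5: outer encryption

def aon_decrypt(ct: bytes, shared_key: bytes) -> bytes:
    w = keystream_xor(shared_key, ct)         # undo the outer layer
    r, i = w[0::2], w[1::2]                   # de-interleave all of W
    return keystream_xor(hashlib.sha256(r).digest(), i)
```

Note that decryption must recover *all* of W before the inner key - the hash of R - can even be computed, which is the all-or-nothing property being aimed at.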

 A much better approach that will not change the file size at all
is to use scott19u. It is an all-or-nothing encryption method.

David A. Scott

--
http://cryptography.org/cgi-bin/crypto.cgi/Misc/scott19u.zip
http://members.xoom.com/ecil/index.htm


------------------------------

From: "Sam Simpson" <[EMAIL PROTECTED]>
Subject: Re: Anybody heard of these people?
Date: Wed, 30 Dec 1998 12:46:58 -0000

Looks worrying....

The web site says:  "Rest assured that email you received is written by your
correspondent and by no one else. ASM ensures that email is authentic by
incorporating digital signatures that cannot be forged."

Yeah, right...

And the User Manual says:  "The strong encryption used in this product is
based on the world's best patented algorithms and is virtually unbreakable."

Sounds a little vague to me.  I can find no reference to either the
symmetric cipher or the hash function that is used.  Source code is not
available.

Even funnier, I tried installing the program and got an "Unable to locate
DLL" message - RASAPI32.DLL.

The file calls itself a "Pretty Easy Privacy application" - but I find no
supporting evidence of this!


Why would one use this in preference to PGP?  Erm, pass.


Sam Simpson
Comms Analyst
-- http://www.hertreg.ac.uk/ss/ for ScramDisk hard-drive encryption & Delphi
Crypto Components.  PGP Keys available at the same site.



[EMAIL PROTECTED] wrote in message <[EMAIL PROTECTED]>...
>Hi,
>
>There is an outfit on the net advertising an automatic e:mail encryption
>system based on Diffie/Hellman.
>
>I'm just curious to know if anyone has heard of them.
>
>They can be found at
>
>http://www.nc-tech.net/nctmap.htm
>
>Comments appreciated.
>
>Thanx



------------------------------

From: Lars Ramkilde Knudsen <[EMAIL PROTECTED]>
Crossposted-To: sci.crypt.research
Subject: First issue of Journal of Craptology
Date: 30 Dec 1998 12:49:28 -0800


Hi,
 
First issue contains 3 papers. Download all three, or any one of them,
from
 
http://www.esat.kuleuven.ac.be/~rijmen/crap/index.html 
 or
http://www.ii.uib.no/~larsr/crap.html 
 
Happy holidays

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: MTP Complex Number Cipher
Date: Wed, 30 Dec 1998 21:13:14 GMT
Reply-To: [EMAIL PROTECTED]

Here is an MTP (Many Time Pad) cipher based on complex numbers.

1) Create a random pad with a TRNG. Group the bits into 7-bit numbers.
Group these into triples, (a, b, c), to build two complex numbers,
(a + ib) and (c + id), using each 7-bit ASCII plaintext number as 'd'
in the second complex number.

2) Form the product (a + ib)*(c + id) and output the real part as the
first character in the ciphertext and the imaginary part as the
second: (ac - bd), (ad + bc).

3) Decipher by treating the first two outputs from #2 as a complex
number, multiplying by the complex conjugate, (a - ib), and dividing
by the squared modulus (a**2 + b**2). This gives (c + id), which in
turn gives 'd' since 'c' is known.
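The scheme can be sketched per character with exact integer arithmetic. A toy sketch only; the names are illustrative, and (a, b) must not both be zero so the modulus is nonzero:

```python
# Pad supplies the triple (a, b, c); each 7-bit plaintext code supplies d.

def encrypt_char(a: int, b: int, c: int, d: int):
    # (a + ib)(c + id) = (ac - bd) + i(ad + bc)
    return a * c - b * d, a * d + b * c

def decrypt_char(a: int, b: int, c: int, x: int, y: int) -> int:
    # Multiply (x + iy) by the conjugate (a - ib), divide by |a + ib|^2:
    # the real part recovers c, the imaginary part recovers d.
    m = a * a + b * b                 # squared modulus; nonzero by assumption
    assert (x * a + y * b) // m == c  # sanity check: real part is c
    return (y * a - x * b) // m       # d
```

For example, with pad triple (3, 5, 7) and d = 65 ('A'), encryption outputs (-304, 230) and decryption recovers 65.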

Are there any realistic analytic attacks possible if the pad is used
multiple times? It would seem that because of the way that the
ciphertext is built, it is not possible to learn the identity of any
group of three numbers from a known plaintext attack.

IOW, even if I know the ciphertext pair, (X, Y), belongs to a known
plaintext character, 'd', I cannot recover (a, b, c) even though
X = (ac - bd) and Y = (ad + bc).

Bob Knauer

"He knows nothing and he thinks he knows everything. That points
clearly to a political career."
--George Bernard Shaw


------------------------------

From: [EMAIL PROTECTED] (Jim Dunnett)
Subject: Re: U.S. Spying On Friend And Foe
Date: Wed, 30 Dec 1998 21:49:53 GMT
Reply-To: Jim Dunnett

On Tue, 29 Dec 1998 18:57:09 -0800, Anthony Stephen Szopa
<[EMAIL PROTECTED]> wrote:

>> Germany's BND, too, has apparently cooperated with the U.S. encryption
>> rigging scheme through Siemens Defense Electronics Group of Munich. 

...and a British supplier of cryptographic equipment, whose
name begins with 'R' has also been playing this very nasty
game...

-- 
Regards, Jim.                | A drunk man is more likely to find a
olympus%jimdee.prestel.co.uk | woman attractive. So if all else fails,
dynastic%cwcom.net           | get him drunk.
nordland%aol.com             | - Dr Patrick McGhee, who coaches women
marula%zdnetmail.com         |   on successful dating.         
Pgp key: wwwkeys.uk.pgp.net:11371

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: On leaving the 56-bit key length limitation
Date: Wed, 30 Dec 1998 22:30:20 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (wtshaw) wrote:
> In article <769jjn$p6e$[EMAIL PROTECTED]>,
> [EMAIL PROTECTED] wrote:
>
> >
> > 6. The final word on cryptographic strength is thus not to be found
> > in enforced export controls for key length. It is to be found in our
> > own drawing boards in terms of a system's "unicity distance" and its
> > derived design issues, which is feasible to deal with and lies in our
> > hands.
>
> You say something important, but it serves to further minimize the meaning of
> keylength as it has been thrown around.

Yes, I intended to highlight both sides of the issue -- even though the WA
thinks otherwise ;-)

>In short, keylength is not much
> of an absolute cross-algorithm measure of much of anything, certainly not
> cryptological strength.  It matters more what the actual algorithm is and
> if it is crackable without brute forcing the keys.  Some do not exhibit a
> unicity distance at all,

That is one of the problems of the current definition for "unicity distance"
-- which I hope to address here for public comments, soon.


> >
> > 7. To reach the heart of the matter, one must devise ways to thwart
> automatic surveillance decoding -- which goes beyond merely
> > making it harder or theoretically-impossible to decipher the
> > messages, as dealt with by TSCSs. The objective here is to make
> > decryption either ambiguous or ambiguously related to the key, even
> if successful (say, by collusion, forced escrow, etc.). So, the
> > attacker would have difficulties to detect that a key does NOT work,
> > that it DOES work, and what the decrypted message is, from a possible
> > list of choices.
>
> This is characteristic of some ciphers, but they tend to have a bigger
> number keylength than 56 bits.  And, the block size can be large enough to
> slow a search down to a crawl.

Yes, actively denying efficient surveillance is a way to protect privacy,
since it negates current scaling laws.

> > To close, in order to extract the full benefit from such approach to
> > security as commented in the seven items above, I believe that one
> > must first revisit the concept of "unicity distance" -- since it is
> > usually regarded more as a "proof-of-concept" definition than a
> > practical tool.  Which is IMO due to a series of unfortunate
> > historical facts -- beginning with the name, since it is not a
> > "distance" (i.e., it is not a metric function that provides
> > distance).
>
> I agree.  It was based on a limited set of strength deficient algorithms,
> without insight into the spectrum of them we have today.  Still, many are
> mislead into thinking that unicity distance is something more than just a
> measure that can be applied to many weaker ciphers.

My (as yet) unsubstantiated remark is that the usefulness of "unicity
distance" lies perhaps not so much in helping to "grade" cipher systems
(where it may fail miserably even when used as defined), but in what it
says about the most a cipher system can disclose -- when any number of
its cipher messages are attacked with unlimited computational
resources, including time.
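For reference, the quantity under discussion is Shannon's unicity distance, U = H(K)/D: key entropy divided by plaintext redundancy. A quick sketch, assuming the common textbook estimate of roughly 3.2 bits/character of redundancy for English:

```python
# Shannon's unicity distance: U = H(K) / D, where H(K) is key entropy
# in bits and D is plaintext redundancy in bits per character
# (~3.2 for English: log2(26) minus an estimated ~1.5 bits/char
# of actual entropy).

def unicity_distance(key_bits: float, redundancy_bits_per_char: float) -> float:
    return key_bits / redundancy_bits_per_char

u = unicity_distance(56, 3.2)   # a 56-bit key over English: ~17.5 characters
```

This is why frequent key changes, keeping each key's traffic under the unicity distance, can make even weak ciphers practical, as noted above.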

> >
> > BTW, on leaving the 56-bit key length limitation we may well bid
> > farewell to security systems which do not take into account the
> > message's statistics and perfunctorily pad bits -- which is a funny
> > flaw, since the attackers of such systems always tend to do
> > otherwise.
> >
> Those who ignore the past are bound to have what it holds bite them on
> the foot, or elsewhere.  Certain features incorporated into many ciphers
> are misguided, if not the product of basic ignorance, while others are
> indeed good; racing stripes painted on an elephant will not increase its
> agility.
>
> As a result of my studies of somewhat deficient ciphers I sometimes
> implement with others, the obvious is that frequent key changes, working
> within a unicity distance, can make them quite practical.
>

Frequent key changes have the further benefit of restoring trust that the key
was not attacked *at* the encrypting host.

> However, it would be premature to accept a 56 bit limit imposed by a
> collection of dunderheads.

It is not a question of "acceptance" -- since denial was not offered ;-)
Neither voting ;-)

Rather, my comments about "unicity distance" motivate a different attitude
towards WA's current limitations on key-length, that could be better described
by the French word nonchalance -- "having an air of easy unconcern or
indifference"  ;-)

Cheers,

Ed Gerck


------------------------------

From: [EMAIL PROTECTED] (David Hamilton)
Crossposted-To: alt.privacy,fido7.crypt,talk.politics.crypto
Subject: Re: Free ENCRYPTION Programs (32b)
Date: Wed, 30 Dec 1998 22:56:41 GMT

=====BEGIN PGP SIGNED MESSAGE=====

[EMAIL PROTECTED] wrote:

>Don't let assholes who know nothing about encryption make
>your minds up. Look at the source code and check the program
>out yourselves.

It takes very little knowledge of cryptography to ask the 6 questions that I
have. It does take some knowledge of cryptography to answer them though. And
you can't answer them. Your insults will not prevent people from realising
what follows from your fear of attempting to give answers.

Give reasons why people should use your software. Any supporting evidence? Or
just your usual claims without evidence?

By the way. In a recent post, you said:
>The reason that only that area has
>changed is becasue you have been brain washed in to false sense
>of security of CBC by clever folks like Mr NSA Bruce and pompous
>assholes like David Hamilton.
I challenged you to find a posting of mine where I had ever given a view on
the merits or demerits of CBC. Have you done so yet? You haven't have you?
Because such a posting doesn't exist. Another false statement of yours.

>David Scott
>
> While you're at it, enter my contest; it is free

False. It will cost time/effort.


David Hamilton.  Only I give the right to read what I write and PGP allows me
                           to make that choice. Use PGP now.
I have revoked 2048 bit RSA key ID 0x40F703B9. Please do not use. Do use:-
2048bit rsa ID=0xFA412179  Fp=08DE A9CB D8D8 B282 FA14 58F6 69CE D32D
4096bit dh ID=0xA07AEA5E Fp=28BA 9E4C CA47 09C3 7B8A CE14 36F3 3560 A07A EA5E
Both keys dated 1998/04/08 with sole UserID=<[EMAIL PROTECTED]>
=====BEGIN PGP SIGNATURE=====
Version: PGPfreeware 5.5.3i for non-commercial use <http://www.pgpi.com>
Comment: Signed with RSA 2048 bit key

iQEVAwUBNoqvDMo1RmX6QSF5AQE/aQf+Kv5EXeGr7M3xo0mn2EYEqtawjQZMFI0W
x6QIilr5ZrgbIwFSjBw17mS8C5Z5b1wYb5xEI2WxcIbcmjZuVDaQhaK655DuMvn5
C+wSVP0xIAayi9WQLSs6knUzoc21KI8vV7SfM1qg+2vMnwjlQcisVaV0I0v/hsaB
J7kHt6ZuTUAH4z+l5tQ3NXg6FTXjEq/MDzCK2jW5aoI9y76NP3abGIj6B7M5WgXZ
OmO8/HapwTOMrcmFwdkKNzOt9f+7saay9tRW6AgL7QbJyj9oJC43ISzh01GagVaB
82hqS5UqUEIdPmSyGPx5quTcRtUsI9TYTCKayVO21/u7vgKUYoQ8BA==
=d/Sy
=====END PGP SIGNATURE=====

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
