Cryptography-Digest Digest #115, Volume #11 Sun, 13 Feb 00 16:13:01 EST
Contents:
Re: Period of cycles in OFB mode (Tim Tyler)
Re: Has some already created a DATA DIODE? (Terry Ritter)
Re: Block chaining (Tim Tyler)
Q: Division in GF(2^n) (Mok-Kong Shen)
Re: Which compression is best? (Tim Tyler)
Re: Is there a list server for this newsgroup? (Mok-Kong Shen)
Re: SHA1 and longer keys? (Tom St Denis)
Re: Which compression is best? (wtshaw)
Re: Which compression is best? (wtshaw)
Re: Guaranteed Public Key Exchanges (Mok-Kong Shen)
Re: Message to SCOTT19U.ZIP_GUY (wtshaw)
Re: Basic Crypto Question 2 (wtshaw)
Re: Is there a list server for this newsgroup? (Tony L. Svanstrom)
Basic Crypto Question 3 ([EMAIL PROTECTED])
Re: Guaranteed Public Key Exchanges (Ralph Hilton)
Re: New encryption algorithm ABC. ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Period of cycles in OFB mode
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 17:03:06 GMT
Helger Lipmaa <[EMAIL PROTECTED]> wrote:
: David Wagner wrote:
[problems using counter mode?]
:> Do you mean using a LFSR to drive the block cipher?
:> That sounds like a good idea, if it was what you meant.
: Or may be Gray codes :-)
Gray codes /could/ be even worse. At least with a counter a few bits
change on an increment once in a while. With Gray codes you only ever
have one bit change at a time.
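A couple of lines of Python illustrate the point (a sketch only):

  # Successive Gray-code values differ in exactly one bit, while a plain
  # counter occasionally flips several bits at once (e.g. 0111 -> 1000).
  def gray(n):
      return n ^ (n >> 1)

  def bits_changed(a, b):
      return bin(a ^ b).count("1")

  for i in range(1, 17):
      print(i,
            bits_changed(i - 1, i),              # counter: 1, 2, 1, 3, ...
            bits_changed(gray(i - 1), gray(i)))  # Gray code: always 1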
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Legalise IT.
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Has some already created a DATA DIODE?
Date: Sun, 13 Feb 2000 17:15:17 GMT
On Sun, 13 Feb 2000 13:35:54 GMT, in <886bvq$ods$[EMAIL PROTECTED]>, in
sci.crypt Tom St Denis <[EMAIL PROTECTED]> wrote:
>[...]
>AlgM is a good place to start, just don't use LCG's as the underlying
>PRNG.
I doubt that is nearly enough.
>AlgM with two 'long' Fibonacci generators is secure.
Really? Care to back that up?
We have been through this before, and I have quoted at length from the
attack references. As far as I know, MacLaren-Marsaglia techniques
have appeared exactly twice in cryptographic literature -- and they
have *failed* twice. Yet even with this information you continue to
claim strength where there is a damn good reason to suspect that there
is no strength. Why do you do it, and why do you do it in response to
newbies?
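For readers who haven't met it, here is a rough sketch of the
MacLaren-Marsaglia combiner (Knuth's Algorithm M) - illustration only; the
table size and the toy lagged-Fibonacci generator are arbitrary choices of
mine, not anyone's proposed parameters, and nothing below is claimed to be
strong:

  def algorithm_m(gen_x, gen_y, table_size=64):
      # one generator fills a table, the other selects which entry to emit
      table = [next(gen_x) for _ in range(table_size)]
      while True:
          j = next(gen_y) % table_size      # second generator picks a slot
          out = table[j]
          table[j] = next(gen_x)            # first generator refills it
          yield out

  def toy_lfg(seed, lag_a=24, lag_b=55, mod=2**32):
      # toy additive lagged-Fibonacci generator, for demonstration only
      state = [(seed * (i + 1) + 1) % mod for i in range(lag_b)]
      i = 0
      while True:
          v = (state[(i - lag_a) % lag_b] + state[(i - lag_b) % lag_b]) % mod
          state[i % lag_b] = v
          i += 1
          yield v

  g = algorithm_m(toy_lfg(12345), toy_lfg(67890))
  print([next(g) for _ in range(5)])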
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Block chaining
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 17:24:48 GMT
David Wagner <[EMAIL PROTECTED]> wrote:
: In article <[EMAIL PROTECTED]>, Tim Tyler <[EMAIL PROTECTED]> wrote:
:> However... Plaintext Block Chaining (PBC), and Plaintext FeedBack (PFB)
:> modes allow parallel processing in the encryption direction.
: Are these ones secure, when used with non-random plaintexts?
: (e.g., English text as the plaintext)
: I'm not worried so much about chosen-text attacks as whether there is
: the possibility that they might share some of the weaknesses of ECB mode,
: but to a lesser extent. Any thoughts?
A cursory examination suggests that birthday-type attacks on ECB mode,
which depend on repeated plaintext blocks, would need repeats of two
consecutive plaintext blocks (instead of one) to work against PBC mode.
Even then there'd be the possibility that the repeated cyphertext had
other causes besides repeated plaintext. However this may be a component
of the reason why these modes do not see much use.
Chosen plaintexts are also a concern ;-/
I find the lack of any feedback from the output of the block cypher to
its inputs rather aesthetically displeasing.
Also, while encryption can be done in parallel, decryption cannot.
This may waste the time of the brute-force attacker - but otherwise
generally seems undesirable.
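For concreteness, a minimal sketch of PBC as I read it - C[i] = E(P[i] xor
P[i-1]) - in Python; the block function below is a toy stand-in, *not* a real
cipher, and the IV handling is an assumption of mine:

  def xor_block(a, b):
      return bytes(x ^ y for x, y in zip(a, b))

  def pbc_encrypt(encrypt_block, plain_blocks, iv):
      prev, out = iv, []
      for p in plain_blocks:             # prev depends only on plaintext, which
          out.append(encrypt_block(xor_block(p, prev)))   # is known up front, so
          prev = p                       # the E() calls could all run in parallel
      return out

  def pbc_decrypt(decrypt_block, cipher_blocks, iv):
      prev, out = iv, []
      for c in cipher_blocks:            # needs the *previous plaintext*,
          p = xor_block(decrypt_block(c), prev)    # so this direction is serial
          out.append(p)
          prev = p
      return out

  toy_e = lambda b: bytes((x + 37) % 256 for x in b)   # toy byte permutation
  toy_d = lambda b: bytes((x - 37) % 256 for x in b)
  blocks = [b"HELLOWOR", b"LD_BLOCK", b"HELLOWOR"]
  ct = pbc_encrypt(toy_e, blocks, b"\x00" * 8)
  assert pbc_decrypt(toy_d, ct, b"\x00" * 8) == blocks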
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Dates in calendars are often closer than they appear.
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Q: Division in GF(2^n)
Date: Sun, 13 Feb 2000 19:19:23 +0100
I learned from a mailing list that there is a US patent US5890800
'Method and device for the division of elements of a Galois field'
(http://www.patents.ibm.com/details?pn=US05890800) for computing
A/B in GF(2^n). The algorithm amounts to computing, by squaring and
multiplication of the numbers A and B represented in n bits, the
expression
A^(2^n) * B^(2^n - 2)
The writer of a post in the mailing list then reasoned that this
reduces to A/B, since x^(2^n - 1) = 1 for every nonzero x.
It seems, however, that the algorithm is doing computations all
the way in Z_(2^n) instead of GF(2^n). So I wonder how the binary
number thus obtained can be the intended result. Would someone
please kindly help? Thanks in advance.
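(For reference, here is the textbook identity worked in a small field -
GF(2^8) with the AES reduction polynomial, my own choice of example, not the
patent's method - where every step stays inside the field. My puzzle is how
the patent can obtain the same result while apparently working in Z_(2^n).)

  # Division in GF(2^8) via A * B^(2^8 - 2), multiplication done modulo
  # x^8 + x^4 + x^3 + x + 1 (0x11B).
  def gf_mul(a, b, poly=0x11B, nbits=8):
      r = 0
      while b:
          if b & 1:
              r ^= a
          b >>= 1
          a <<= 1
          if a & (1 << nbits):
              a ^= poly
      return r

  def gf_div(a, b):
      # b^(2^8 - 2) = b^(-1) for nonzero b, since b^(2^8 - 1) = 1
      inv, e = 1, b
      for _ in range(7):        # 254 = 0b11111110: square-and-multiply
          e = gf_mul(e, e)
          inv = gf_mul(inv, e)
      return gf_mul(a, inv)

  assert gf_div(gf_mul(0x57, 0x83), 0x83) == 0x57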
M. K. Shen
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 18:04:03 GMT
David Hopwood <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
:> : John Chandler wrote:
:> :> I just noticed a post crediting "DS" with suggesting using
:> :> adaptive compression in _both_ directions before encrypting.
:>
:> : If the forward pass does a good compression job, I wonder how much
:> : additional compression could be expected from the backward pass.
:>
:> The backward pass is not intended to compress. AFAICS, its purpose is
:> primarily to make sure that the information necessary to decrypt plaintext
:> is not present in any fragment of the message.
: Depending on the compression algorithm, the backward pass is not guaranteed
: to do anything substantive at all. Many adaptive algorithms will simply
: give up and copy the data verbatim (with a tag indicating that is what has
: been done) when they find that it isn't compressible by a significant
: amount - which it won't be after the forward pass. [...]
This is true in general - but is not true for David's compressor which
he has suggested be used in this manner.
:> This means (for one thing) that an analyst is effectively forced to
:> decrypt the whole message in order to recover any decompressed text at all
:> - which you will probably need to do before you can reject a key.
: The analyst is not necessarily forced to do any such thing, unless you
: place much stronger constraints on the compression algorithm than are
: normally assumed [...]
The analyst certainly *should* be. This is a fairly central reason behind
doing this in the first place. If the attacker can still recover plaintext
from a small fragment of the file, you've not done things correctly - your
system probably contains an implementation problem.
: The problem seems to be that this construction tries to use a
: compression algorithm for mixing, which it is not designed for.
I agree with the sentiment here. I feel David is using a compressor
as a diffuser, mainly because he has one handy, and it seems to work
pretty well when doing serial processing.
After the first compression pass (with some unknown algorithm) you /may/
still need to diffuse in one direction.
/Apart/ from compressors, I'm not aware of other structures designed to
diffuse information in significant volumes in one direction. A proper,
full-file diffuser may not be practical on grounds of performance on serial
hardware. It may be overkill anyway, since /some/ diffusion in one
direction is already likely to have taken place.
If anyone knows of a good structure (apart from a compressor) that
performs genuine diffusion in significant volumes, in at least one
direction, works reasonably fast on serial machines, and doesn't expand
the data too much, I'm interested in knowing more about it ;-)
: If you need an all-or-nothing-transform, use something that has been
: analysed for that specific use; don't try to construct one badly from a
: compression algorithm.
A sentiment I'd love to agree with. I have access to some wonderful
diffusers (based on lattice gases) that I'd /love/ to use - but they
need parallel hardware to work fast. It's serial hardware I'm
concerned about.
At the moment, the nearest thing I know of is a large-scale
pseudo-random transposition - a process which doesn't diffuse at all - but
/does/ split any known plaintext over a large number of blocks, "diluting"
it in the process.
I hardly consider it to be the right tool for the job, though ;-(
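As a toy illustration of the kind of keyed transposition I mean (Python,
illustration only - the PRNG seeding below is purely for demonstration and
has no claimed strength):

  import hashlib, random

  def keyed_transpose(data: bytes, key: bytes, invert=False) -> bytes:
      # Fisher-Yates shuffle of the whole message, driven by a key-seeded PRNG.
      # It spreads ("dilutes") any known fragment across the output, but - as
      # said above - it performs no real diffusion.
      rng = random.Random(hashlib.sha1(key).digest())   # demo PRNG only
      perm = list(range(len(data)))
      rng.shuffle(perm)                                  # key-dependent permutation
      out = bytearray(len(data))
      if invert:
          for i, j in enumerate(perm):
              out[j] = data[i]
      else:
          for i, j in enumerate(perm):
              out[i] = data[j]
      return bytes(out)

  msg = b"attack at dawn, attack at dawn"
  scrambled = keyed_transpose(msg, b"key")
  assert keyed_transpose(scrambled, b"key", invert=True) == msg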
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Flashlight: case for holding dead batteries.
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Is there a list server for this newsgroup?
Date: Sun, 13 Feb 2000 19:46:00 +0100
No Brainer wrote:
>
> I was just wondering if there is a list server linked to this
> newsgroup...it's just a hell of a lot easier to compose replies etc etc.
At least on my PC I don't yet see any difference in the effort of
composing replies in the two cases. (Perhaps you meant that you
need to stay connected online longer in one case?)
M. K. Shen
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: SHA1 and longer keys?
Date: Sun, 13 Feb 2000 18:36:10 GMT
In article <886n6a$vlq$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:
>
>
> > > For i = 2 to 8
> > >   result = HMAC_SHA1(password, sessionkeys(i) + HMAC_SHA1(password,
> > >            result + sessionKeys(i-1))) + result
> > > Next
> > >
> > > returns 160 bytes I use as key. password is the literal password
> > > string.
> > > Is this secure at all? If not, what's the best way to get a larger
> > > key than the result of SHA1?
>
> >
> > You would never need keys this big coming from a user-supplied
> > passwd. But let's suppose you want a 320 bit key... [using sha] Just
> > do this.
> >
> > K1 = SHA(PASSWD + SALT)
> > K2 = SHA(K1 + PASSWD)
> >
> > K = K1 || K2
> >
> > It's basically a CBC style chaining mode. 'Salt' should be some >64
> > bit value [which is publicly known, but unique to each session key]
> > appended to the password in the first hash.
>
> Well, thanks for the reply but it didn't really answer my question. I
> know I don't need a 1280 bit key - this was just an example. However, my
> question was whether my chaining method using HMAC_SHA1 is secure. Is it?
>
> Thanks,
>
> John Stone
As long as the 'sessionkeys[]' array is new upon each iteration? I
suppose it's ok. All you have to do is salt the first hash, then chain
the previous result.
Simpler would be
K[0] = SHA(passwd + SALT)
for i = 1 to 8 do
  K[i] = SHA(passwd + K[i - 1])
Where giving out one chunk of the key, say K[0], doesn't give away the
rest, but finding passwd obviously gives it all.
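In Python that chaining looks like this (a sketch only; the variable names
and salt length are my own choices):

  import hashlib, os

  def stretch_key(passwd: bytes, salt: bytes, chunks: int = 8) -> bytes:
      k = [hashlib.sha1(passwd + salt).digest()]        # K[0] = SHA(passwd + SALT)
      for i in range(1, chunks):
          k.append(hashlib.sha1(passwd + k[i - 1]).digest())  # K[i] = SHA(passwd + K[i-1])
      return b"".join(k)                                 # 8 * 20 = 160 bytes of key

  key = stretch_key(b"correct horse", os.urandom(8))
  assert len(key) == 160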
Tom
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Which compression is best?
Date: Sun, 13 Feb 2000 12:12:32 -0600
In article <[EMAIL PROTECTED]>, Helger Lipmaa <[EMAIL PROTECTED]> wrote:
> Tim Tyler wrote:
> > You're compressing in order to add strength, and frustrate the analyst -
> > *not* simply to remove redundancy present in the plaintext. Size simply
> > is not the only criteria which is relevant.
>
> This is done by encryption; frustrating the analyst has never been an
> objective of compression. Leave security to cryptography and compression to
> the compressor!
>
Compression may be necessary prior to encryption to meet expectations of
the whole process. A form of compression may just as well add optional
keyspace to the overall algorithm.
Something I am involved in is a good example: Consider that a social
security number is a good identifier, but leaving it in the clear would be a
bad idea for privacy's sake. The record that contains the number may only
have so much character space assigned to that field. By using base
translation, a form of compression that can sport keys as well, the SS# can
be shortened so that when it is encrypted by the GVA, the result has the
same number of characters as the original number, or nearly as few, but it
is an inductive ciphertext, making it all but impossible to break back to
the original number without having the keys. Cute?
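A toy sketch of the base-translation step (illustration only - the alphabet
and base here are arbitrary choices of mine, and the keyed/encryption part is
left out):

  ALPHABET = "0123456789ABCDEFGHJKMNPQRSTUVWXYZ"   # base 33; order could be keyed

  def to_base(n: int, alphabet: str) -> str:
      digits, base = [], len(alphabet)
      while n:
          n, r = divmod(n, base)
          digits.append(alphabet[r])
      return "".join(reversed(digits)) or alphabet[0]

  def from_base(s: str, alphabet: str) -> int:
      n = 0
      for ch in s:
          n = n * len(alphabet) + alphabet.index(ch)
      return n

  ssn = 123456789
  short = to_base(ssn, ALPHABET)        # 9 digits -> 6 characters in base 33
  assert from_base(short, ALPHABET) == ssn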
--
If Al Gore wants to be inventor of the internet, complain that he
did a lousy job. If he admits to not doing much, complain that he
is a slacker. Now, do we want him in charge of security?
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Which compression is best?
Date: Sun, 13 Feb 2000 12:01:52 -0600
In article <[EMAIL PROTECTED]>, "Douglas A. Gwyn"
<[EMAIL PROTECTED]> wrote:
> Tim Tyler wrote:
> > While this seems to cover pretty much 0% of them. How can you
> > /know/ that any confidence you have in your cryptosystem is
> > justifiable? You can't. You have to juggle probabilities.
>
> You prove an appropriate theorem using appropriate methods
> (as opposed to blathering about how you don't know all methods
> of attack).
A ship may feel lost if it can see no land or does not know the land it
might see. You must work with things you know, not complain too bitterly
about things you don't, and try to keep your mind ready to deal with
possibly relevant information that does not fit in with your prior
knowledge.
--
If Al Gore wants to be inventor of the internet, complain that he
did a lousy job. If he admits to not doing much, complain that he
is a slacker. Now, do we want him in charge of security?
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Guaranteed Public Key Exchanges
Date: Sun, 13 Feb 2000 20:24:38 +0100
No Brainer wrote:
>
> Shen,
>
> On Fri, 11 Feb 2000 08:19:32 +0100, Mok-Kong Shen <[EMAIL PROTECTED]>
> wrote:
>
> <snip>
>
> > As far as I know, you could only 'approximately', not 'absolutely',
> > solve your problem, in that either you have a third party certifying
> > the public key, possibly via a hierarchy of trust centers, or have
> > so-to-say the 'public' (this could in the special case simply be
> > one person) certifying that, i.e. in the manner of the web of trust of
> > PGP. (Interestingly the two approaches could be compared to
> > centralized vs. distributed computing.) There has to be somewhere
> > some trust, i.e. something on which you put your (subjective,
> > hopefully not blind) faith, and that's something one can have
> > no proof of (by definition).
>
> Aaah...so if I had a third party (or other party) certifying the public key it
> should be OK?
>
> Two questions though; how would it be done over the Internet and what is
> "approximately"? :)
>
> AND...if I did use this model, could I still obtain the public key via the
> Internet?
Sorry if my use of the word 'approximately' was confusing. I meant that
you don't have 100.00% security even if you use certificates etc.,
since you have somehow to 'trust' something. So you don't 'really'
solve your problem in the absolute sense. (Of course you can get
the certificates via the internet.)
M. K. Shen
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Message to SCOTT19U.ZIP_GUY
Date: Sun, 13 Feb 2000 12:19:46 -0600
In article <[EMAIL PROTECTED]>, "Douglas A. Gwyn"
<[EMAIL PROTECTED]> wrote:
> Tim Tyler wrote:
> > Encrypting something twice does not double the time to break.
> > Speaking *very* roughly, if anything, it squares it.
>
> If the encryption uses the same key, then it doubles the time
> for a brute-force key search.
Then, you divide the keyspace and use two machines for the same search if
the cipher is not a group. If it is, you need only one.
--
If Al Gore wants to be inventor of the internet, complain that he
did a lousy job. If he admits to not doing much, complain that he
is a slacker. Now, do we want him in charge of security?
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Basic Crypto Question 2
Date: Sun, 13 Feb 2000 12:24:19 -0600
In article <[EMAIL PROTECTED]>, "Douglas A. Gwyn"
<[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:
> > 1.Which are the Top 5 PRNG's for practical cryptosystems?
>
> There aren't any. Random number generation and cryptology are
> different applications with different requirements.
Except that an encryption function may be, or be part of, a generator. I
seem to remember that being mentioned as a possible requirement for an AES
submission: that it might be used in random-number generation.
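A generic sketch of what I mean (illustration only; the block function below
is a stand-in, not a recommendation): run any keyed block transformation over
a counter and you have a generator.

  import hashlib

  def ctr_stream(key: bytes, nblocks: int):
      # SHA-1 truncated to 16 bytes stands in for a block cipher; with AES in
      # its place this is the familiar counter-mode key stream.
      for counter in range(nblocks):
          yield hashlib.sha1(key + counter.to_bytes(8, "big")).digest()[:16]

  stream = b"".join(ctr_stream(b"secret key", 4))   # 64 pseudorandom bytes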
--
If Al Gore wants to be inventor of the internet, complain that he
did a lousy job. If he admits to not doing much, complain that he
is a slacker. Now, do we want him in charge of security?
------------------------------
From: [EMAIL PROTECTED] (Tony L. Svanstrom)
Subject: Re: Is there a list server for this newsgroup?
Date: Sun, 13 Feb 2000 20:23:43 +0100
Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> No Brainer wrote:
> >
>
> > I was just wondering if there is a list server linked to this
> > newsgroup...it's just a hell of a lot easier to compose replies etc etc.
>
> At least on my PC I don't yet see any difference in the effort of
> composing replies in the two cases. (Perhaps you meant that you
> need to stay connected online longer in one case?)
That's all just because of bad software; get a good off-line newsreader and
you'll see news/usenet as something "new and improved". ;-)
I'm saving hours each month (most likely each week) by using MacSOUP
instead of Netscape's Communicator for news.
/Tony
--
/\___/\ Who would you like to read your messages today? /\___/\
\_@ @_/ Protect your privacy: <http://www.pgpi.com/> \_@ @_/
--oOO-(_)-OOo---------------------------------------------oOO-(_)-OOo--
DSS: 0x9363F1DB, Fp: 6EA2 618F 6D21 91D3 2D82 78A6 647F F247 9363 F1DB
------------------------------------------------------------------------
 \O/ \O/          ©1999 <http://www.svanstrom.com/?ref=news>    \O/ \O/
------------------------------
From: [EMAIL PROTECTED]
Subject: Basic Crypto Question 3
Date: Sun, 13 Feb 2000 20:02:48 GMT
Cascading Multiple Block Ciphers:
1. If a plaintext is encrypted with three ciphers, is it as strong as
the strongest cipher in the chain or as weak as the weakest?
2. Could there be some subtle interaction between the algorithms that may
reduce security or reduce the keyspace?
3. Is there an optimum way of combining ciphers together, or rules...
assuming that the cascade is made up of block ciphers of the same block
size and key length? I.e. should one choose IDEA, 3DES, CAST128, or
Blowfish, IDEA, CAST128, or Blowfish, RC5 and 3DES... what are the
criteria?
4. What if the ciphers have different block sizes and key lengths - is it
still ok to cascade them? Blowfish, Twofish, IDEA?
5. Is it too complex to alternately encrypt the plaintext blocks with
the different ciphers in one pass? Does that make sense?
------------------------------
From: Ralph Hilton <[EMAIL PROTECTED]>
Subject: Re: Guaranteed Public Key Exchanges
Date: Sun, 13 Feb 2000 21:01:04 +0100
Reply-To: [EMAIL PROTECTED]
=====BEGIN PGP SIGNED MESSAGE=====
Hash: SHA1
On Mon, 14 Feb 2000 00:19:59 +0800, No Brainer <[EMAIL PROTECTED]>
wrote:
>Paul,
>
>On Fri, 11 Feb 2000 15:14:14 -0500, Paul Koning <[EMAIL PROTECTED]>
>wrote:
>
><snip>
>
>> The issue is: how do you bootstrap this? I.e., how do you get that
>> first key, the one from A? The same problem exists in X.509 and
>> similar certificate systems, the only difference is that these use
>> trees while the WOT uses graphs.
>
>That's the golden question Paul; how DO you get the first key?
>
>Surely there must be *some* way to get a key over the Internet in a
>trusted manner without someone "intercepting/modifying" it on the way
>through?
Yes:
Alice and Bob wish to establish a key; all communications are monitored by
Charlie. The communications would have to appear in public, though, to avoid
unobserved modification. But the fact of the key exchange being public is
irrelevant.
They agree on a large prime p and a root r (a primitive root mod p). Alice
secretly chooses a large number a and Bob chooses a large number b (both <
the prime).
Alice calculates r^a mod p and publishes it.
Bob calculates r^b mod p and publishes it.
Alice computes (r^b )^a mod p and obtains the key s.
Bob computes (r^a )^b mod p and obtains the key s.
So - for some small numbers -
p = 7, r = 3
Alice chooses 3 and Bob chooses 5.
Alice calculates 3^3 = 27, and 27 mod 7 = 6.
Bob calculates 3^5 = 243, and 243 mod 7 = 5.
Both are published.
Alice calculates 5^3 = 125, and 125 mod 7 = 6.
Bob calculates 6^5 = 7776, and 7776 mod 7 = 6.
Charlie can't derive the key 6 from the data he has as easily as Alice and
Bob can so one can increase the size of the numbers to a point where it
isn't feasible for Charlie to calculate the key s.
So - for somewhat larger numbers:
p = 46337, r = 71
Alice chooses 537 and Bob chooses 197.
Alice calculates 71^537 mod 46337 = 16906.
Bob calculates 71^197 mod 46337 = 12543.
Both are published.
Alice calculates 12543^537 mod 46337 = 37289
Bob calculates 16906^197 mod 46337 = 37289
Without knowing the numbers Alice and Bob chose, it is quite difficult for
Charlie to derive s at this size, although still doable. With larger numbers
s would be secure.
s might be an unusable result, so prior agreement would have to deal with
such cases.
I've got a windoze program that does simple calculations with 32 bit
integers showing the process if you want it.
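Or, as a few lines of Python using the numbers above (pow() does the modular
exponentiation; these parameters are of course far too small for real use):

  p, r = 46337, 71
  a, b = 537, 197                 # Alice's and Bob's secret exponents

  A = pow(r, a, p)                # Alice publishes this
  B = pow(r, b, p)                # Bob publishes this

  s_alice = pow(B, a, p)          # Alice's shared secret
  s_bob   = pow(A, b, p)          # Bob's shared secret
  assert s_alice == s_bob
  print(A, B, s_alice)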
The method was developed by Whitfield Diffie and Martin Hellman in 1976.
=====BEGIN PGP SIGNATURE=====
Version: 6.5.1ckt
Comment: Fingerprint: 8E22 FC69 3FB3 F53A 0B0D 4392 409D AE0D 1173 21D0
iQA/AwUBOKb/R0Cdrg0RcyHQEQItPwCfX7L7Yj/uyIeZCfBiX5YYA7AzYYMAn1nw
AIePfNjUbEaVPo+8Cyxql0oG
=5XJp
=====END PGP SIGNATURE=====
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: New encryption algorithm ABC.
Date: Sun, 13 Feb 2000 20:06:12 GMT
In article <T6yp4.8832$[EMAIL PROTECTED]>,
"Bogdan Tomaszewski" <[EMAIL PROTECTED]> wrote:
> I discovered a new encryption algorithm. I call it ABC - Alternative Binary
> Code.
> It permits the use of keys of essentially unrestricted length, with the
> level of safety depending only on the length of the key.
> Do you know of any programs for testing entropy or for breaking keys?
> Is there somewhere I can check whether a message, after encoding, meets the
> conditions of a good algorithm?
To find out if the algorithm is good, you should post it here in public. I
think there are many people here who can at least tell you when it happens
to be not so good. If you test the cyphertext output with some programs, you
will probably only learn that it is very insecure if it is indeed very
insecure. But if your algorithm passes the tests, this might just mean that
it is quite insecure, just not insecure enough to be detected by simple
statistical analysis.
I'm not an expert, just a newbie, but here's one test that might work:
Take a very large chunk of plain text (e.g. a book in electronic form)
and encrypt it to raw binary data. Then take the best compression tool
you know and compress the cyphertext file. If there's a significant
amount of compression as a result, your algorithm might be weak.
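Something like this (Python; 'encrypt' is a placeholder for the algorithm
under test, and zlib stands in for "the best compression tool you know"):

  import zlib

  def redundancy_check(encrypt, plaintext: bytes) -> float:
      ct = encrypt(plaintext)
      return len(zlib.compress(ct, 9)) / len(ct)   # well below 1.0 is suspicious

  # Example with a deliberately weak toy "cipher": a fixed single-byte XOR.
  weak = lambda data: bytes(b ^ 0x5A for b in data)
  sample = b"It was the best of times, it was the worst of times. " * 200
  print(redundancy_check(weak, sample))            # prints a ratio far below 1.0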
Greetings,
John Stone
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************