Cryptography-Digest Digest #952, Volume #10      Sat, 22 Jan 00 02:13:02 EST

Contents:
  discussions/articles about third party authentication models? (Peter Broadhead)
  Re: simplistic oneway hash ([EMAIL PROTECTED])
  Re: MIRDEK: more fun with playing cards. (Paul Rubin)
  Re: ECC vs RSA - A.J.Menezes responds to Schneier ([EMAIL PROTECTED])
  Re: apology (Jeff Lockwood)
  Re: MIRDEK: more fun with playing cards. ("r.e.s.")
  Re: NIST, AES at RSA conference (Hammer)
  Re: ECC vs RSA - A.J.Menezes responds to Schneier (Tom St Denis)
  Re: NIST, AES at RSA conference (Nicol So)
  Re: What's with transposition? ("Douglas A. Gwyn")
  Re: Combination of stream and block encryption techniques ("Douglas A. Gwyn")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Peter Broadhead)
Subject: discussions/articles about third party authentication models?
Date: Sat, 22 Jan 2000 01:20:13 GMT


I am interested in conceptual models or practical implementations of
third party authentication in otherwise two party data transactions,
in which there are large numbers of different potential parties at
each end of the data exchanges.  

By this I mean models in which Ted gets to authorise (or refuse) a
transaction of data about Ted, conducted between Alice and Bob, where
there may be a large number of potential Alices and Bobs.

This is of course the model for electronic funds transfer, where Alice
and Bob are (say) the Bank and the Supermarket, and Ted is the
customer.  

EFT as I understand it is based upon Ted verifying his identity to his
Bank by sending, over an encrypted link, a cleartext passphrase known
to the Bank (ie Ted's PIN).  This is fine provided Ted only deals with
one or two banks, or else has an excellent memory for the many
different PINs he needs if he deals with many more.  (I am neglecting
the option for Ted to choose the same PIN for every account with all
his different banks, as I wish to avoid the "crack one, crack them
all" scenario.)

I am particularly interested to know whether there are any articles or
discussions of conceptual models where there are a very large number
of "banks" (and a very large number of "supermarkets"), and there is
not the capacity for Ted to have the same passphrase with each "bank".
So Ted now needs the capacity to authorise (through authentication),
preferably very simply, transactions between a large number of
different parties.


The best example I can think of is as follows:

Ted has a smartcard on which he has a private key and an RSA
encryption app. He supplies his public key to each "bank". Whenever he
wishes to authorise a transaction, the "bank" sends a unique random
session id which Ted returns encrypted by the card with his private
key.  The "bank" then checks the result using his public key, and
proceeds with the transaction if the check returns true.
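The flow just described can be sketched in a few lines (a toy
illustration only: the key is absurdly small, a real system would use
padded RSA signatures rather than raw modular exponentiation, and the
names here are made up):

```python
import secrets

# Hypothetical toy RSA key pair -- the primes are tiny for illustration;
# a real card would hold a key of 1024 bits or more.
p, q = 61, 53
n = p * q                            # public modulus (given to each "bank")
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (stays on the card)

def card_sign(challenge):
    # What Ted's card does: "encrypt" the session id with the private key.
    return pow(challenge, d, n)

def bank_verify(challenge, response):
    # What the "bank" does: check the response with Ted's public key.
    return pow(response, e, n) == challenge

session_id = secrets.randbelow(n)         # the bank's unique random session id
response = card_sign(session_id)          # computed on the card
assert bank_verify(session_id, response)  # bank proceeds if this check is true
```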

Access to the encryption function on the card would be governed by a
PIN or biorecognition (eg a thumbprint) on the card.  (The
thumbprint-reading aspect is not sci-fi, as there is at least one
European health insurer currently piloting a print-recognition
smartcard.)

The card would also have the capacity to provide the public key to a
"bank", in a registration process, so that Ted can register himself
with as many "banks" as he likes, whenever he likes, by slotting the
card in a registration session.

My preference would be to have the card app do the key generation, or
else have an off-card app within Ted's control (eg on his PC),
together with a routine to load the new key set onto the card if and
only if Ted has "opened" the card using his PIN or thumb or whatever.

Can anyone point me to any discussions of conceptual models or
articles which discuss this sort of approach? Is there anyone doing it
in practice?

The reason I ask is that this approach would provide a universal
recognition/authentication token within Ted's control, with only Ted
holding the private key (not some key generated by a "trusted"
certification authority).

Such a token would have numerous applications, for example, avoiding
the need to remember umpteen different passwords on various computer
systems (drives me batty:-).  Just slot the smartcard and thumb the
print-reading window on the card, and away you go.

The other thing I like about it is that it doesn't require any central
control of identity info.  If for whatever reason Ted wants to have an
alias, he simply has a second card for that alias.

I realise there are further issues about key revocation and the need
for open standards.  But none of these are insurmountable.  Probably
the fastest way to blow the standards problems into the weeds would be
for Microsoft to adopt this approach as an alternative to password
authentication in their operating systems, and encourage hardware
manufacturers to build card readers into PCs.

?

PB



------------------------------

From: [EMAIL PROTECTED]
Subject: Re: simplistic oneway hash
Date: Sat, 22 Jan 2000 01:54:28 GMT

[EMAIL PROTECTED] wrote:

> Schneier says "no weaknesses in MD2 have been found", and cites a
> 1993 study.  Has something been discovered since then?

Yes.  RSA Labs now recommends against using it.  See:
ftp://ftp.rsasecurity.com/pub/pdfs/bulletn4.pdf

--Bryan


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: MIRDEK: more fun with playing cards.
Date: 22 Jan 2000 02:03:13 GMT

In article <[EMAIL PROTECTED]>, CLSV  <[EMAIL PROTECTED]> wrote:
>Ah, you forget a step I believe, shouldn't it be
>5) the modulo of the sum of the cards swapped gives
>   the index to the card whose value is the output
>
>The card swapping is indeed fast but the administration
>in my head always slows things down. But that's probably because
>I'm not a fast thinker.

Don't try to do the arithmetic mod 52.  Just do it mod 13, based
on (say) K=0, A=1, 2=2, ..., 10=10, J=11, Q=12 to locate the 
output column of the output card.  To get the row of the output
card, add the *suits* of the swapped cards, mod 4.  I like the
visual mnemonic of Diamond=0, Spade=1, Heart=2, Club=3.  Very cute.
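For instance, with that encoding, the lookup might be sketched as
follows (representing a card as a (rank, suit) pair is my own
assumption, not part of the original proposal):

```python
# Cards are (rank, suit) pairs: ranks K=0, A=1, ..., 10=10, J=11, Q=12;
# suits Diamond=0, Spade=1, Heart=2, Club=3, as in the mnemonic above.

def output_position(card_a, card_b):
    # Column of the output card: sum of the swapped cards' ranks, mod 13.
    col = (card_a[0] + card_b[0]) % 13
    # Row of the output card: sum of the swapped cards' suits, mod 4.
    row = (card_a[1] + card_b[1]) % 4
    return col, row

# Swapping the 7 of Hearts (7, 2) and the Queen of Spades (12, 1):
# column (7 + 12) % 13 = 6, row (2 + 1) % 4 = 3.
col, row = output_position((7, 2), (12, 1))
```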

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: ECC vs RSA - A.J.Menezes responds to Schneier
Date: Sat, 22 Jan 2000 02:43:34 GMT

On Tue, 18 Jan 2000 13:26:04 -0600, Mike Rosing
<[EMAIL PROTECTED]> wrote:

>Moore's law isn't academic.  

Moore's "law" isn't a law. It is an observation. Or, if you like, an
expression of faith.  "Because it has been so for this long, it will
be so in future."

I observe that a falling body accelerates at 9.8 m/sec^2.
That observation says nothing about when the body will hit the
ground, or what will happen when it does.

To put it another way, the assumption that Moore's "law" will hold
forever is like the old joke about jumping off a twenty-storey
building and, as you go past the 10th floor, thinking "so far, so
good".  It is a statement of faith that ignores any analysis of
exogenous factors.  It is a bold (and unprovable) assumption that no
such factors exist or matter in relation to Moore's "law".

It is interesting to compare this to the thread about "Predicting
Graphs".


------------------------------

From: Jeff Lockwood <[EMAIL PROTECTED]>
Subject: Re: apology
Date: Sat, 22 Jan 2000 13:34:08 +1100

On Tue, 18 Jan 2000, Mike Rosing wrote:

> Jeff Lockwood wrote:
> > 
> > I finally realised what has happened. The silly thing I came up with is
> > rubish. The methods I used would have been learned by some of you very
> > early on, as examples of what not to do. If I were to continue with it, I
> > would be most likely to only come up with more such examples.
> > 
> > I will waste no more of your time with nonsense , and apologise for the
> > time I have wasted already.
> 
> It's not wasted if you consider it one step on the learning ladder.
> Keep reading, keep learning, and keep trying.  It's a tall ladder :-)
> 
> Patience, persistence, truth,
> Dr. mike
> 
> 

I posted that when I found out I had posted a version with a mistake in
it.  The thing was supposed to output B XOR A, not just B.  I also
realised that this would probably make little difference.  It might even
present an even simpler problem, for all I know.

If a file is processed more than once, it does much better, but I'm not
going to be posting that here. I've already wasted enough of your time.


Jeff Lockwood <[EMAIL PROTECTED]>

PGP public key:

  homepages.ihug.com.au/~satan/pgpkey.asc


------------------------------

From: "r.e.s." <[EMAIL PROTECTED]>
Subject: Re: MIRDEK: more fun with playing cards.
Date: Fri, 21 Jan 2000 19:22:07 -0800

"Joseph Ashwood" <[EMAIL PROTECTED]> wrote ...
: I see two major problems with the key setup.
:
: Using the original method of determining the state values for ARC4 leads to
: a very large number of keys that have repeated values, not a problem in a
: computer but it is a problem for a human with a deck of cards.
:
: Doing the math to figure out the order of the cards using the original
: method leads to a great number of potential problems, ie I can create a
: password that would be completely impossible to use.

Could you give specific examples of these two claims?
I don't understand either one of them, but I seem to
be in a poor "operating mode" today.


What do you mean by "keys that have repeated values"?

Are you speaking of problems with the "y=y+S(x)+K(x)"
part of ARC4's setup phase, now involving 52 iterations?

I don't see where the math problems might enter.

: Both of these lead me to believe that if we do chose to base it on ARC4 we
: would need to replace the key setup.

I agree, in the sense that more than 52 iterations would
be advisable in the keying phase, especially if an IV is
being used, as is recommended.

(When I proposed using cards for ARC4's stream generator,
I suggested leaving the state initialization method up to
the user, so that using a shared shuffled deck as the key
would be a possibility if that were deemed acceptable.)


: My initial thoughts on the concept turn
: out to be rather a long process, but I believe it yields enough ambiguity to
: the initial state to be usable, however I'm still trying to solve for the
: allowed entropy with various length passwords/phrases, and even then I'm not
: sure if it's useful for a human.
:                 Joe


--
r.e.s. "Mistr Typo"
[EMAIL PROTECTED]



------------------------------

From: Hammer <[EMAIL PROTECTED]>
Subject: Re: NIST, AES at RSA conference
Date: Sat, 22 Jan 2000 03:15:50 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (John Savard) wrote:
[snip]
> Aside from the results so far achieved against some of the candidates
> that did not become finalists, however, I'm not surprised that, given
> the current state of our knowledge, little useful can be done to show
> weaknesses in the AES finalists. As Bruce Schneier noted, they're all
> submitted by teams which include some of the world's top cryptanalysts
> in the open community, so it's hardly surprising they don't have
> weaknesses that even their colleagues could easily find.
>
Is this not a paradoxical situation?  If all the top cryptanalysts are
submitters, who's left to check the work of the top cryptanalysts?
Sub-top cryptanalysts?? ;-)

With all due respect, and just for clarification, I'm assuming you are
not saying "they must all be good because they were done by the
top-level folks"?  I mean that question most humbly; I just want to
clearly understand your point.

Would there not be - even if only for career/ego satisfaction - at
least some cross-checking among the teams?  In some fields, I think
this would be done in hopes of eliminating one of the other candidates
by finding a weakness, thereby furthering the chances of one's own
team's submission winning.  (A theoretical question.)

Interesting issue.  Thanks for your thoughts on it.

hammer



------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: ECC vs RSA - A.J.Menezes responds to Schneier
Date: Sat, 22 Jan 2000 03:22:28 GMT

In article <86asr6$sko$[EMAIL PROTECTED]>,
  Greg <[EMAIL PROTECTED]> wrote:
>
> > Not in practice.  The security of the key generally depends
> > only on the size of the key.
>
> So what you are saying is that given a 10kbit key, no matter
> how I construct the key (say make up of 500 20 bit primes, it
> cannot be broken because the number is just too large for anyone
> to figure out?  Is that what you are saying? I am not saying this
> is wrong.  If it is true, I would like to know.  That would be
> intriguing.

No, he is saying that if you know about this structure you can use ECM
to factor it.  If you had a random n-bit number, NFS would be faster.
If you know this n-bit number is the product of four n/4-bit
numbers... maybe ECM is faster.
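A toy illustration of the point (my own construction; plain trial
division stands in for ECM, since the principle is the same -- the
work is set by the size of the smallest factor, not by the total size
of the modulus):

```python
def is_prime(m):
    # Simple trial-division primality test, fine at this scale.
    if m < 2:
        return False
    f = 2
    while f * f <= m:
        if m % f == 0:
            return False
        f += 1
    return True

def trial_factor(n, bound=1 << 20):
    # Pull out all prime factors below `bound` by trial division.
    factors = []
    d = 2
    while d < bound and d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Build a ~190-bit modulus from ten 20-bit primes ...
primes = []
p = 1 << 19
while len(primes) < 10:
    p += 1
    if is_prime(p):
        primes.append(p)
n = 1
for q in primes:
    n *= q

# ... and watch it fall to trial division, despite its total size.
recovered = trial_factor(n)
assert recovered == primes
```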

>
> > > and with a quick check because the strength lies in the
> > > key itself.  With ECC, the strength lies in the curve, not the
> > > key,
> >
> > The security of ECC certainly does depend on the choice of
> > private key because if it is small, then it can be found
> > by direct search.
>
> But can you not say that about any key search for any cryptosystem?
>
> The strength of ECC is in the key space being too large to
> search and too complex to readily determine the factor, which
> is a result of the curve and field being used.  If you build
> an ECC with field size n and you don't want to use key values
> less than n/2, then when you go to build the key, make certain
> that one of the bits between n/2 and n are set.  Just one of them
> will solve this problem you point out, particularly if a key space
> of n/2 is unsearchable to begin with.
>
> > Further, if it can be guessed that the private key lies in
> > a restricted range,...
>
> ...then there is something wrong, but what ever it is that is
> wrong has nothing to do with ECC or any other cryptosystem.
>
> > With RSA, if the top 20 bits of a key somehow get revealed,
> > that info is of no use to NFS. But with ECC if the top 20
> > bits get revealed, then you can speed up attacks on the key
> > by a factor of 1000.
>
> You are correct, but again this has absolutely nothing to do
> with ECC itself.  However, if this is such an important issue,
> let's talk about it for a moment.
>
> Given an elliptic curve of 500 bits and an RSA key of 500 bits,
> if you expose half the bits, which key is likely to be cracked in
> our life time?
>
> And if there was a failure somewhere in the cryptosystem that
> leaks even one bit, I would be concerned about using that
> cryptosystem, not because it uses ECC, but because of the leak.
>
> --
> The only vote that you waste is the one you never wanted to make.
> RICO- we were told it was a necessary surrender of our civil
liberties.
> Asset Forfeiture- the latest inevitable result of RICO.
> http://www.ciphermax.com/book



------------------------------

From: Nicol So <[EMAIL PROTECTED]>
Subject: Re: NIST, AES at RSA conference
Date: Sat, 22 Jan 2000 00:17:24 -0500
Reply-To: see.signature

Serge Vaudenay wrote:
> 
> The AES process is the only standardization process where
> cryptanalysts work for free.

I don't see anything wrong with that.  I'm not aware of any standards
activities in which participants are paid for their participation.  Most
often, the individuals attending standards meetings are paid by their
employers, who *are* the real participants.  The participating
organizations absorb the travel expenses of their own employees and
sometimes sponsor the meetings.  If you are a self-employed consultant,
then you don't get paid, period.  Of course, the participants believe
that the standards will somehow benefit them, but standards often
benefit many more parties than just the participants, and they don't
seem to mind.

Think about it: people do unpaid charity work every day, and most of
these good-hearted individuals don't get any special recognition.
Those who don't feel like participating simply don't. You can think of
participating in the AES process as public service. If you don't feel
like participating, just let other volunteers do it.

> They get no honorarium and no publication. 

In my opinion, the prestige associated with being a co-designer of the
AES cipher is worth a lot more than any honorarium you can reasonably
expect to get. As for the part of not getting a publication out of the
effort, I'm not sure about it. If, in the process of cryptanalyzing the
candidate ciphers, you discover a new cryptanalytic technique, you can
always try to turn it into a journal paper (or at least a conference
paper). Does the AES process require you to publish your results in a
way that gives you no publication credit?

> If you think about previous "analysis" of the
> 10 rejected candidates, there are seldom real significant attacks and
> most of them are indeed quite secure.

Unless you want to allow multiple winners, which would reduce
interoperability, it is quite unavoidable that some high-quality
candidates don't get chosen. Situations like this exist in real life,
and people seem to accept it.  In an election for a public office, it
doesn't matter that two candidates are comparable in many ways; only one
will be chosen for the office.

> Actually, if an expert do not have any personal interest about AES, he
> should better wait for the final standard before doing some substantial
> work. In the meanwhile he can work for other standards.

I honestly cannot endorse the above recommendation. If anyone believes that
it is important that the "best" candidate be chosen as the winner, and
that he is in a position to help make that happen, he should contribute
his analytic efforts while the winner is still being chosen, not
afterward.

-- 
Nicol So, CISSP // paranoid 'at' engineer 'dot' com
Disclaimer: Views expressed here are casual comments and should
not be relied upon as the basis for decisions of consequence.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: What's with transposition?
Date: Sat, 22 Jan 2000 06:50:06 GMT

Markus Eiber wrote:
> give me a hint or a literature reference for how one can cryptanalize the
> transposition of a binary coded plaintext?

I can't give you a recipe; basically the approach
has to be tailored to the situation.
If you're trying to match up sections of "columns"
as in my example, you need to correlate suitably-sized
stretches of ciphertext such that the alignment is
among the most likely according to the source model,
iteratively extending just as in the example.
(Correlation reduces to counting matching bits.)
If you're trying to multiple-anagram similar ciphertexts,
you need to search for good alignments again according
to the source model.  (Those "maximum likelihood
estimation" procedures are driven by the source model.
There are also methods for building the model from the
data itself, if you have no a priori knowledge of the
plaintext characteristics, but this is a relatively
difficult application.)  All I can suggest is to get a
good training in statistics and data analysis and then
apply the methods creatively.
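As a concrete (and entirely artificial) miniature of the
counting-matching-bits step, one might score candidate alignments of
two ciphertext stretches like this.  The bit strings below are made
up, and a real attack would score against a source model rather than
look for an exact copy:

```python
def match_score(a, b):
    # Correlation reduced to counting agreeing bit positions.
    return sum(x == y for x, y in zip(a, b))

def best_alignment(segment, column):
    # Slide `segment` along `column`; keep the offset where the
    # matching-bit count is highest (the most likely alignment).
    offsets = range(len(column) - len(segment) + 1)
    best = max(offsets,
               key=lambda i: match_score(segment, column[i:i + len(segment)]))
    return best, match_score(segment, column[best:best + len(segment)])

# An 8-bit stretch hidden at offset 7 inside a longer "column":
column = "1100101" + "10111010" + "0110"
offset, score = best_alignment("10111010", column)   # offset 7, score 8
```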

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Combination of stream and block encryption techniques
Date: Sat, 22 Jan 2000 06:54:36 GMT

Mok-Kong Shen wrote:
> Could we perhaps say that the fundamental component of a stream
> cipher is its key stream,

No!  That's just the most mindless implementation.
Real-world stream ciphers tend not to be of that form.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
