Cryptography-Digest Digest #451, Volume #10      Tue, 26 Oct 99 14:13:04 EDT

Contents:
  Re: Slide this.... ("J. Byron")
  Re: Slide this.... ("J. Byron")
  Another Newbie question ("S.Azam")
  Re: Out of Order Winnowing (Bill McGonigle)
  Re: This compression argument must end now (Patrick Juola)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ([EMAIL PROTECTED])
  Re: Portable crypt() function ("P.C. Teo")
  Re: This compression argument must end now (Geoffrey T. Falk)
  Re: Twofish: Optimization for GF(2^8) multiplication desired
  Re: Unbiased One to One Compression
  Re: Newbie question (Volker Hetzer)
  Re: Unbiased One to One Compression (SCOTT19U.ZIP_GUY)
  Re: Twofish Description Improved ([EMAIL PROTECTED])
  Re: some information theory (very long plus 72K attchmt) (Anton Stiglic)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("Trevor Jackson, III")
  Re: RC 4: security & law (Paul Koning)
  Re: Smartcard and RNG ([EMAIL PROTECTED])

----------------------------------------------------------------------------

From: "J. Byron" <[EMAIL PROTECTED]>
Subject: Re: Slide this....
Date: Tue, 26 Oct 1999 08:36:04 -0400


Tom St Denis wrote in message:
>
><snip insanely stupid post>

Name calling?? That seems rather simple.

>
>Hey why not cryptanalyze your cipher and present it properly.  Also
>state the benefits (faster, more compact, simpler, etc...).

Can someone not make a free contribution to the field without comment? Where
is the analysis of your XTEA ????

J. Byron

>
>Tom
>




------------------------------

From: "J. Byron" <[EMAIL PROTECTED]>
Subject: Re: Slide this....
Date: Tue, 26 Oct 1999 08:44:50 -0400

Tom St Denis wrote in message:
(copied from another sci.crypt post)

>THAT'S BECAUSE I AM IN HIGHSCHOOL YOU FRIGGIN DOLT!

More name calling....
Hmmm

J. Byron




------------------------------

From: "S.Azam" <[EMAIL PROTECTED]>
Subject: Another Newbie question
Date: Tue, 26 Oct 1999 13:50:54 +0100

I am a student doing a review project on the field of cryptography and
more specifically the use of quantum cryptography.  I was wondering if
anyone could point me in the direction of the FAQ's for this group or
any other relevant information.  If you could I would be most grateful.

Yours sincerely

S.Azam

------------------------------

From: [EMAIL PROTECTED] (Bill McGonigle)
Subject: Re: Out of Order Winnowing
Date: Tue, 26 Oct 1999 09:22:19 -0400

In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

> You might be interested in a humble article of mine entitled
> 
>    An Alternative Argumentation to Rivest's Chaffing and Winnowing
> 
> at  http://home.t-online.de/home/mok-kong.shen/#paper3

It's a nice article.  The only flaw I see is that if the binary software
you're using has been approved for export control, then you can't hide
extraneous data in the MAC, since They already know how the MAC works.

They still probably can't prove that you used the binary to generate the
message, of course.


-Bill
=====
[EMAIL PROTECTED] / FAX: (419) 710-9745
Dartmouth-Hitchcock Medical Center Clinical Computing

------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: This compression argument must end now
Date: 26 Oct 1999 10:04:47 -0400

In article <7v3cb3$155c$[EMAIL PROTECTED]>,
SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
>In article <_d6R3.6864$[EMAIL PROTECTED]>, gtf[@]cirp.org (Geoffrey 
>T. Falk) wrote:
>>In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:
>>>David Scott has recently invented perhaps the first compression system
>>>suitable for cryptography known to man.  Since *all* other known
>>>compression routines potentially weaken your encypherment - and add
>>>"unnecessary" information to the file - I believe this event should not be
>>>under-estimated.
>>
>>That is quite an overstatement. David's system might be suitable for
>>cryptography, if it compresses the data well. The much-hyped "one-on-one"
>>property does not guarantee this at all.
>>
>>For example, here is my favourite compression algorithm: Copy the input
>>file to the output file. This algorithm is "one-on-one." But it is
>>worthless for cryptography.
>    I don't think most people would consider a direct copy a compression
>program but you are free to think so if you wish.

Actually, most compression theorists use it often as a counterexample
to inflated, bogus, and false claims.

I wonder if there's a connection here?  Nah, couldn't be.

        -kitten

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Tue, 26 Oct 1999 14:08:45 GMT

In article <[EMAIL PROTECTED]>,
  "Trevor Jackson, III" <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:

>>The situation is very different if the choice of ciphers is unknown to
>>the attacker, and for that it is necessary that the choice of the
>>ciphers depend on the key. (There is curious exception: to encrypt the
>>next block choose the cipher completely at random and send only the
>>resulting ciphertext. To decrypt, try in sequence the various ciphers
>>until you get intelligible plaintext.)
>
>It appears to me to be extremely difficult to define "intelligible
>plaintext".  If the plaintext space is so well defined that the
>difference between plaintext and noise is detectable, then inadequate
>compression has been used.

If the plaintext is compressed, then you decompress it and see if it is
intelligible.

>And if I'm sending noise, e.g., a message key, it is impossible by
>definition.

If you are sending a random key then you can add its hash.

There are always ways to send intelligible plaintext. What is interesting
about this method is that it is now impossible for the attacker to know
which block has been encrypted by which cipher, and this renders any
attack that requires more than 30 or so plaintexts impossible too. If
you encrypt using four ciphers then decryption will be on average 2.5
times slower, since the receiver must try 2.5 ciphers per block on
average, but with the speed of modern ciphers this is not a big problem.
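The scheme described above can be sketched in a few lines. This is a toy illustration only, not real cryptography: the four "ciphers" are stand-in single-byte XOR transforms, and a truncated SHA-256 digest plays the role of the appended hash that lets the receiver recognize a correct decryption.

```python
import hashlib
import secrets

# Toy stand-ins for four real ciphers: single-byte XOR transforms.
CIPHER_KEYS = [b"\x11", b"\x22", b"\x33", b"\x44"]

def xor_bytes(data, key):
    return bytes(b ^ key[0] for b in data)

def encrypt(msg):
    # Append a short hash so the receiver can recognize a good decryption,
    # then encrypt with one of the ciphers chosen completely at random.
    tagged = msg + hashlib.sha256(msg).digest()[:8]
    return xor_bytes(tagged, secrets.choice(CIPHER_KEYS))

def decrypt(ct):
    # Try each cipher in sequence until the embedded hash checks out.
    for key in CIPHER_KEYS:
        pt = xor_bytes(ct, key)
        msg, tag = pt[:-8], pt[-8:]
        if hashlib.sha256(msg).digest()[:8] == tag:
            return msg
    return None
```

With four ciphers the loop tries 2.5 of them on average, which is exactly the slowdown quoted above.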


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: "P.C. Teo" <[EMAIL PROTECTED]>
Subject: Re: Portable crypt() function
Date: 26 Oct 1999 22:25:48 +0800

Is the manual at the web site not available for downloading?
I couldn't download the manual (230 pages in Acrobat format).
Is there anywhere else I can find it?

Chad Hurwitz <[EMAIL PROTECTED]> wrote in article
<7v0004$[EMAIL PROTECTED]>...
> In article <7uvfcj$[EMAIL PROTECTED]>,
> Chad Hurwitz <[EMAIL PROTECTED]> wrote:
> >
> >Is there a standard function that will be in a library on both a
> >Microsoft platform AND a Unix platform?  I can't find any crypt
> >library functions in Micro$oft's Visual C++ compiler help.
> >
> >If not, is there source publically available for a portable
> >crypt(key,message) and decrypt(key,crypt_message) or are these kind of
> >things government regulated?
> >
> >-- 
> >/-------------------------------------------------------------------\
> >| spam if you can, spam if you dare, spam if you must, i won't care |
> >| spam is futile,  spam is free,  spam is filtered,  so i won't see |
> >\-------------------------------------------------------------------/
> 
> 
> to answer my own question i found a nice link
> 
> http://www.eskimo.com/~weidai/cryptlib.html
> 
> -- 
> /-------------------------------------------------------------------\
> | spam if you can, spam if you dare, spam if you must, i won't care |
> | spam is futile,  spam is free,  spam is filtered,  so i won't see |
> \-------------------------------------------------------------------/
> 

------------------------------

From: gtf[@]cirp.org (Geoffrey T. Falk)
Subject: Re: This compression argument must end now
Date: Tue, 26 Oct 1999 14:48:47 GMT

In article <7v3cb3$155c$[EMAIL PROTECTED]>,
SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
>In article <_d6R3.6864$[EMAIL PROTECTED]>, gtf[@]cirp.org (Geoffrey 
>T. Falk) wrote:
>>In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:
>>>David Scott has recently invented perhaps the first compression system
>>>suitable for cryptography known to man.  Since *all* other known
>>>compression routines potentially weaken your encypherment - and add
>>>"unnecessary" information to the file - I believe this event should not be
>>>under-estimated.
>>
>>That is quite an overstatement. David's system might be suitable for
>>cryptography, if it compresses the data well. The much-hyped "one-on-one"
>>property does not guarantee this at all.
>>
>>For example, here is my favourite compression algorithm: Copy the input
>>file to the output file. This algorithm is "one-on-one." But it is
>>worthless for cryptography.
>    I don't think most people would consider a direct copy a compression
>program but you are free to think so if you wish.

The definition of a compression program is simply a program that
optimizes the size of the output given a particular probability
model. The direct copy is a compression algorithm according to
a certain probability model, namely, the one in which every
binary message of length L is equally likely to occur (with
probability 2^-L).

David, what are the characteristics of the probability model
behind your algorithm?

>  My understanding is that arithmetic coding is patented but I may look at
>it some day in the future.  I think Huffman compression is very good
>for many things if you don't stick with the direct copying that you
>favored in the paragraph above.

Huffman works best only for sources with uncorrelated source
statistics, and even then it is only optimal when the symbol
probabilities are exact powers of 1/2. Arithmetic coding is still
OK even when the probabilities do not satisfy this.
If your front-end coder can get rid of the correlation between
symbols, then one of these methods makes a great back-end coder
to even out the symbol probabilities. Does your method do this?
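The gap between Huffman coding and the entropy bound is easy to demonstrate numerically. A small sketch (the symbol probabilities are chosen purely for illustration):

```python
import heapq
import math

def huffman_lengths(probs):
    # Build a Huffman tree and return each symbol's code length.
    # Heap entries carry a unique min-symbol index as a tiebreaker.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1          # every merge adds one bit
        heapq.heappush(heap, (p1 + p2, min(s1 + s2), s1 + s2))
    return lengths

def expected_length(probs):
    return sum(p * l for p, l in zip(probs, huffman_lengths(probs)))

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs)

# Probabilities that are exact powers of 1/2: Huffman meets the bound.
assert abs(expected_length([0.5, 0.25, 0.25]) - entropy([0.5, 0.25, 0.25])) < 1e-9

# Otherwise Huffman pays a per-symbol penalty that arithmetic coding avoids.
assert expected_length([0.4, 0.3, 0.3]) > entropy([0.4, 0.3, 0.3])
```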

>>The quality of the compression makes vastly more difference to the
>>difficulty of attacking the system. This is because the attacker
>>must decrypt the entire message anyways before he can test if it
>>is a valid compressed file. Whereas, to test source statistics he
>>only needs a partial break.
>  What are you talking about?

I am talking about the fact that the attacker cannot compute
Comp(Decomp(X)) unless he knows X.

> There is so little thought given
>to compression routines that it is likely only a portion of a message
>would have to be decrypted before it could be tested as a valid
>file for the compression that was used. Even worse, if a purely
>random file was compressed the attacker could rule out many
>possible solutions by examining only the first few blocks. This is
>a case where bad compression is much worse than no compression
>at all, since it is giving info to the attacker.

This does not make much sense. What do you mean by "random"?
What is your probability model? Please give an example.

Regards
g.

-- 
 I conceal nothing. It is not enough not to lie. One should strive
 not to lie in a negative sense by remaining silent.  ---Leo Tolstoy
ADDRESS ALTERED TO DEFLECT SPAM. UNSOLICITED E-MAIL ADS BILLED $500
Geoffrey T. Falk    <gtf(@)cirp.org>    http://www.cirp.org/~gtf/

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Twofish: Optimization for GF(2^8) multiplication desired
Date: 26 Oct 99 14:31:52 GMT

[EMAIL PROTECTED] wrote:
: Due to help from some of you I could understand the basics of Galois field
: multiplication in the Twofish MDS and RS multiplications, but this method
: was found to be extremely slow. Hence I generated a 255-element array for
: table-based Galois field multiplication, basically by raising a generator
: polynomial to powers between 1 and 254.

I know I can use a 256 byte array to simplify the process of removing
multiples of the polynomial used to reduce a 16-bit "product" to eight
bits. This would be done by taking the numbers from 0000 to FF00 that end
in a zero byte, and finding what they are modulo the polynomial.
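That table-driven reduction can be sketched as follows. The reduction polynomial 0x169 (x^8 + x^6 + x^5 + x^3 + 1, which I believe is the one Twofish's MDS multiply uses) is only an example; substitute whichever modulus your field uses.

```python
POLY = 0x169  # x^8 + x^6 + x^5 + x^3 + 1; swap in your own modulus

def clmul8(a, b):
    # Carryless (polynomial) multiply of two 8-bit values: up to 15 bits.
    r = 0
    for i in range(8):
        if (b >> i) & 1:
            r ^= a << i
    return r

def mod_poly(x, poly=POLY):
    # Straightforward bitwise reduction of a 16-bit value mod poly.
    for i in range(15, 7, -1):
        if (x >> i) & 1:
            x ^= poly << (i - 8)
    return x

# 256-entry table: the residue of each value 0x0000..0xFF00 that ends
# in a zero byte, exactly as described above.
REDUCE = [mod_poly(hi << 8) for hi in range(256)]

def gf_mul(a, b):
    # Reduce the 16-bit product with one table lookup and one XOR
    # (the low byte is already fully reduced, so it passes through).
    p = clmul8(a, b)
    return REDUCE[p >> 8] ^ (p & 0xFF)
```

The table works because reduction is linear over GF(2): the residue of the high byte and the (already reduced) low byte can be XORed together.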

As for the polynomial multiplication which replaces addition by an XOR,
one way to do it would be to have, say, a 16 by 16 multiplication table in
memory.

Another technique would be base expansion.

To multiply 5 by 7, multiply, say, 101 by 111 in *decimal* arithmetic, and
you get:

    101
    111
    ---
    101
   1010
  10100
  -----
  11211

which can be translated, digit by digit, to the polynomial product 11011.
By using a bigger base for the arithmetic than the numbers are actually
in, carries are prevented.

Using base-3 arithmetic, one can do a single XOR by doing addition, since
adding two binary digits never exceeds 2. For an 8-bit by 8-bit product,
one has eight partial products, so base-9 arithmetic would be required as
a minimum to prevent carries.
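The base-expansion trick can be checked directly: write each operand's bits as decimal digits, multiply with ordinary integer arithmetic, then take each digit mod 2. Base ten exceeds the eight overlapping partial products of an 8-bit multiply, so no carries occur.

```python
def carryless_mul(a, b):
    # Reference XOR-based (polynomial) multiply, for comparison.
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def carryless_mul_via_decimal(a, b):
    # Read the binary digits of a and b as *decimal* numbers, multiply
    # normally, then reduce each decimal digit mod 2.
    da = int(bin(a)[2:])                  # 5 -> 101
    db = int(bin(b)[2:])                  # 7 -> 111
    digits = str(da * db)                 # 101 * 111 = 11211
    return int("".join(str(int(d) % 2) for d in digits), 2)

assert carryless_mul_via_decimal(5, 7) == carryless_mul(5, 7) == 0b11011
```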

John Savard

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Unbiased One to One Compression
Date: 26 Oct 99 14:40:57 GMT

Tim Tyler ([EMAIL PROTECTED]) wrote:
: However, let's not get this out of proportion.  Your objection only
: applies to one particular form of one-on one compression.  Further
: it only applies when a certain alignment problem occurs.  Most computer
: files are only stored to the nearest byte anyway, so I have difficulty
: in seeing your objection as much of a practical problem.  I doubt even
: embedded application typically try to encrypt files which are not 8-bit
: aligned frequently.

The point is that compression fairly naturally produces files that are an
arbitrary number of bits in length, whether or not it is one-to-one. At
first, I wasn't sure such a technique was practical. I may have
misunderstood the posts in which David A. Scott first discussed this idea,
but eventually I found that I could at least approximate one-to-one
compression. But if I try to obtain byte alignment, which I thought Mr.
Scott was claiming he was able to obtain, the price of keeping one-to-one
compression is bias in the last byte.

Of course, the objection is trivial, but what I gained by going
"one-to-one" was also trivial, since it, too, only concerned the *last
symbol in the message*...so another part of my point is that if you are
going to be so fussy as to use one-to-one compression, then you should
take into account this trivial objection - because it equals in magnitude
the rationale behind the compression method to begin with!

John Savard

------------------------------

From: Volker Hetzer <[EMAIL PROTECTED]>
Subject: Re: Newbie question
Date: Tue, 26 Oct 1999 17:39:25 +0200

> 1. An explanation of how a hash key works.  I need to create some software
> that incorporates this functionality in its basic form but I don't understand how
> it works.  Basically I have written a program and I want to be able to offer several
> users a unique key each that will unlock the software after the expiration date of
> the full working demo.
A keyed hash can be used in two different ways:

First, you can simply encrypt the hash with the key. Verification then works
by decrypting the hash and comparing it to the hash of the data in question.

Second, you APPEND the key to the data, hash the data (which now includes the key),
remove the key from the data, and distribute the data and the hash. Someone
without the key is unable to reproduce the correct hash input, so their hash
comparison will always fail. Verification works just like the original hash
creation: append the key, hash, and compare the computed hash with the hash you
received with the data.
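The second construction can be sketched as below. SHA-256 and the key value are illustrative choices, not anything from the post; a real design would use HMAC instead of bare key-appending, which among other things resists length-extension tricks.

```python
import hashlib

KEY = b"shared-secret"   # hypothetical key known only to issuer and program

def make_tag(data, key=KEY):
    # Hash the data with the key appended; anyone without the key cannot
    # reproduce the hash input, so they cannot forge a valid tag.
    return hashlib.sha256(data + key).hexdigest()

def verify(data, tag, key=KEY):
    # Recompute the tag the same way it was created and compare.
    return make_tag(data, key) == tag

tag = make_tag(b"licensed until 2000-12-31")
assert verify(b"licensed until 2000-12-31", tag)
assert not verify(b"licensed until 2099-12-31", tag)   # tampered data fails
```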

> 
> 2. A good recommendation for a solid beginners book on encryption.
I've learned a lot from Bruce Schneier's "Applied Cryptography".

Greetings!
Volker
-- 
Hi! I'm a signature virus! Copy me into your signature file to help me spread!

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unbiased One to One Compression
Date: Tue, 26 Oct 1999 16:33:07 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
>: In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>
>:>If you *don't* have this, then the compressor, when compressing, can
>:>choose more than one compressed file to compress to.
>
>:        I still believe compression should be such that when compressing
>: there is only one file that it should compress to. However when uncompressing
>: one can bend the rules and have more than one file to decompress to. But
>: when making more than one decompression possible the user may have to use
>: his brain. 
>:     Please comment on the above if you wish. I hope you can see why I
>: think one-to-one compression is just a first step and that you can, if
>: careful, push it farther.
>
>This is an interesting idea.
>
>I am immediately reminded of the "Jefferson wheel cypher".
>
>For those unfamiliar, this was a physical device consisting of a large
>number of "lettered" rings.  Each ring has all 26 letters around the
>rim, but in no sensible order.  These rings may be placed in any order.
>The order is the cypher's key.
>
>Encryption involves turning the wheels until the message is spelled out in
>a long line.  The device is then rotated, and another line is used as the
>cyphertext.
>
>Decryption takes place by putting the message on one line of the
>rings.  The user then turns the machine until the message is revealed
>on one of the remaining 25 lines.
>
>Allowing multiple decompressions seems related to this idea.
>
>As you say, the disadvantage is that the user needs to use his brain.
>
>This brings to mind another idea:
>
>In cryptography, it is customary to consider that the opponent has the
>cyphermachine at his disposal.  However, there may be contexts where
>the enemy is much more likely to encounter a decrypting device than an
>encrypting one.  This might happen (e.g.) when considering the broadcast
>of encrypted satellite television signals.  *Everyone* has a decrypter -
>but obtaining an encrypter may require considerable espionage.
>
>With this sort of thought in mind, the idea of a "one way" compression
>routine seems to be of potential interest.
>
>A one way compressor would be a compression scheme where owning (and
>reverse engineering) the decompressor does not help much with 
>constructing a compressor.  Constructing a working compressor will be
>*possible*, but very difficult.
>
>It may be that something like signing broadcasts can produce roughly
>the same benefits - but perhaps there might be a use for "one way
>compression".
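The wheel cypher quoted above is easy to simulate. The scrambled rings here are generated from a fixed seed purely for illustration; in a real device the rings and their stacking order (the key) would be secret.

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
_rng = random.Random(1)

# Ten rings, each a scrambled alphabet; their stacking order is the key.
WHEELS = ["".join(_rng.sample(ALPHABET, 26)) for _ in range(10)]

def read_row(text, offset):
    # Set `text` on one row of the stacked rings, then read the row
    # `offset` positions further around each wheel.
    out = []
    for ch, wheel in zip(text, WHEELS):
        i = wheel.index(ch)
        out.append(wheel[(i + offset) % 26])
    return "".join(out)

def decrypt_candidates(ciphertext):
    # The receiver lines up the ciphertext and inspects the other 25
    # rows, picking the one that reads as sensible text (the "use your
    # brain" step the discussion mentions).
    return [read_row(ciphertext, k) for k in range(1, 26)]

ct = read_row("HELLO", 7)            # sender reads off any other row
assert "HELLO" in decrypt_candidates(ct)
```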

 Tim, why do you think others seem not to realize the benefits of a
compression scheme that does not give information to an attacker trying to
break an encryption scheme?




David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Twofish Description Improved
Date: Tue, 26 Oct 1999 15:38:16 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (John Savard) wrote:
> I've been so busy, and when I have returned to work on my web page,
> I've generally chosen easier things to do, like adding yet another
> Quadibloc cipher, or even adding discussions of new topics like data
> compression...
>
> but I have *finally* gone back to my brief description of Twofish, and
> updated it to include a description of the key generation process. The
> description of the key generation process does not have complete
> detail, so some reference to Bruce's paper will still be required to
> implement the cipher, but all the steps are covered.
>
> http://www.ecn.ab.ca/~jsavard/co040802.htm
>
> Maybe someday soon, I'll even describe Kerberos, as I have been
> meaning to for a very long time...

When you describe Kerberos, you may want to include something I read
(though I don't remember where): one suggestion to improve its security
was to use SPEKE for the initial authentication step.

csybrandy
>
> John Savard ( teneerf<- )
> http://www.ecn.ab.ca/~jsavard/crypto.htm
>



------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: some information theory (very long plus 72K attchmt)
Date: Tue, 26 Oct 1999 11:55:55 -0400

>

One thing I will agree with you on is that we do not seem to be
understanding each other.  I believe this conversation has deviated
from its initial purpose, and we seem to be the only ones continuing it.
It is as if we come from two different schools, and terms whose
definitions seem trivial to one are not the same to the other.
I suggest we end this, including the accusations of ignorance,
and pursue reading and discussion of other posts.

Anton



------------------------------

Date: Tue, 26 Oct 1999 11:55:13 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column

Vernon Schryver wrote:

> In article <[EMAIL PROTECTED]>,
> Trevor Jackson, III <[EMAIL PROTECTED]> wrote:
>

[snip silliness]

>
>
> >                  It can be used to verify the correct interoperation of
> >the target cipher prior to committment.
>
> The trouble with negotiating protocols or ciphers is in picking a "target,"
> or the equivalent of agreeing on the merchandise and the price.  By the
> time you have a "target cipher" whose "correct interoperation" you can
> verify, you've already the problems in negotiating, just as by the time
> you carry home your melons from the market, the negotiating problems are
> long past.

No.

By the time you _start_ a cipher selection process you already have a working cipher.
There is no need to invent problems here.  If you have a specific problem in mind, spit
it out.  If not, the handwaving is getting repetitive.

>
>
> Well, some of the major PPP problems involve the equivalent of you deciding
> after you get home that you made a bad deal, and going back to the market
> to try to find and haggle with the vendor.  If you say that the equivalent
> problems won't happen with negotiating ciphers, I'll not believe you.
> Note that some of the other major PPP problems are due to the sloppiness
> of a bunch of vendors who ignore the clear requirements in the main
> PPP RFC's concerning renegotiating parameters.  Those errors have caused
> significant security problems.  (I'm talking of the common junk that
> gives up upon receiving an LCP Configure-Request after reaching OPENED)
>
> >                                         And since fallback/revert is always
> >available the complexity of error/dispute handling is dramatically reduced.
>
> That's another wonderfully funny statement.
> Those who have actually implemented protocols with "fallback/revert" know
> that such features are extremely fertile sources of major problems in
> the real world.

Pure garbage.  How do you distill it to such a degree of purity?

The premise of DCN is that there is already a cipher selected.  A cipher change
involves a single message from the end suggesting the change and a single response
from the end receiving the suggestion.  This is the minimum necessary to maintain sync
between the ends.
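That two-message exchange can be sketched as below. This is my own toy rendering of what the post describes, with invented message names: the proposer keeps using the old cipher until the acknowledgement arrives, so the two ends never disagree about which cipher is in effect.

```python
class Endpoint:
    def __init__(self, cipher_id):
        self.current = cipher_id      # cipher in use right now
        self.proposed = None          # outstanding suggestion, if any

    def propose(self, new_id):
        self.proposed = new_id
        return ("CHANGE", new_id)     # the single suggestion message

    def receive(self, message):
        kind, cid = message
        if kind == "CHANGE":
            self.current = cid        # receiver switches and confirms
            return ("ACK", cid)       # the single response message
        if kind == "ACK" and cid == self.proposed:
            self.current = cid        # proposer switches only on the ack
            self.proposed = None
        return None

a, b = Endpoint(1), Endpoint(1)
ack = b.receive(a.propose(2))
a.receive(ack)
assert a.current == b.current == 2    # both ends stay in sync
```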

If you feel this is too complicated to implement, perhaps you ought to find another
line of work.


>  We also know that the salescritters sell such features
> to unsuspecting users and system administrators as the opposite of
> what they usually are, sources of bugs and major sources of complexity.
>
> > ...
> >> Much of PPP negotiating is for small numbers, with one peer proposing a
> >> value, and the other saying either "ok" or "let's do X instead."  Almost
> >> all of the rest of PPP negotiating consists of one peer saying "let's do
> >> Z" and the peer saying either "yes" or "no".  That's simpler than
> >> your cipher negotiating, but it also has lots of real life problems.
> >
> >In what sense is it similar?  We want one small number, the ID of the cipher.
> >What additional complexity do you see associated with cipher selection other than
> >cipher selection? (Yes that phrasing is purposefully chosen).
>
> Carefully phrased or not, I do not understand any technical content in that
> question.

I'm asking for an example of the complexity you think is in DCN.  Not in PPP, which is
a bad model, or SSL, which is worse, or modems, which are the worst I've heard.

>
>
> For an example that might answer the question, the job of the CCP (PPP
> compression control protocol) is to get the PPP peers to agree on a 16-bit
> ID for a compression algorithm.  The point of the LCP authentication
> Conf-Req/Rej/Nak packets is to pick a similar, small ID of an
> authentication protocol.  How is that harder than picking the ID of a
> cipher?
>
> >> ...
> >Please point to an example where a higher-level user of a channel is unable to
> >detect the phase transition.  Both PPP and SSL fail this test, as the presence or
> >absence of a channel counts as detectable.
>
> What do you mean by "phase transition"?  Or "absence of a channel"?  Do
> you seriously mean to claim or imply that PPP ever exists, works, or starts
> to work or something similar when there is no "channel" between the peers?
> I think SSL assumes that it has TCP/IP underneath, which is less than what
> any cipher negotiating protoocol will assume for transport.  What do you
> think?
>
> No, please do not bother responding.

OK.


------------------------------

From: Paul Koning <[EMAIL PROTECTED]>
Subject: Re: RC 4: security & law
Date: Tue, 26 Oct 1999 11:45:53 -0400

> > > The law changes almost daily, but all crypto software still
> > > has to undergo a review by the NSA before export is approved.
> > > Basically, you have to demonstrate that the program cannot be
> > > used as-is with larger key sizes. If you're a small guy they'll
> > > probably delay you for years with paperwork.

FWIW, that ("delay... for years") doesn't match my experience.
2-3 months is more typical.

        paul

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Smartcard and RNG
Date: 26 Oct 1999 09:31:42 -0700

In article <[EMAIL PROTECTED]>,
Medical Electronics Lab  <[EMAIL PROTECTED]> wrote:
>Kwong Chan wrote:
>> 
>> Could anyone please provide any information on the implementation of
>> RNGs in smartcards for key exchange.
>
>Check out this paper:
>http://www.helsbreth.org/random/smokerng/detecting
>_random.html



That's a cool paper, but I don't think it answers the original
poster's question, assuming that by "smart card" he means an ISO 7816
card or similar.  These draw power from the reader, and there's no
battery and not enough room for the memory and circuitry which you
describe.  Even if there were, if the poster has the ability to put a
circuit into a smartcard chip, he can simply buy or license an RNG core
from one of the companies which provide such things.[0]

I have heard from a researcher that the RNG on many smart cards is quite
poor (some to the point of returning the same "random" value after each
reset!), but his paper is not published yet.  Your best bet would be to
get samples of the cards you are interested in using and test them
yourself.  At least that way you might be able to catch the truly lame
errors and eliminate those cards from your program.


[0] background: there are a number of smartcards which do public-key
key generation, so that the private key never has to leave the card.
Of course that means that you need to generate your random numbers
on the card as well.



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
