Cryptography-Digest Digest #304, Volume #9       Tue, 30 Mar 99 17:13:03 EST

Contents:
  Re: Does anyone know how to solve vigenere and tranposition cipher? (Jim Gillogly)
  Re: Wanted: free small DOS Encryption program (Robert G. Durnal)
  Re: Live from the Second AES Conference (Robert Harley)
  Re: What is fast enough? (John Savard)
  Re: What is fast enough? (Paul Schlyter)
  Re: Live from the Second AES Conference (Robert Harley)
  Re: True Randomness & The Law Of Large Numbers (R. Knauer)
  Re: Live from the Second AES Conference (David Wagner)
  Re: How do I determine if an encryption algorithm is good enough? (Darren New)
  Re: RNG quality in browsers? ("Sassa")
  Re: strong brain-embedded decryption algorithm (wtshaw)
  freeware implementation of one-time pad? (Charles Blair)
  Re: Wanted: free small DOS Encryption program ("karl malbrain")
  Re: Live from the Second AES Conference (Terje Mathisen)
  Re: True Randomness & The Law Of Large Numbers (R. Knauer)

----------------------------------------------------------------------------

From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: Does anyone know how to solve vigenere and tranposition cipher?
Date: Tue, 30 Mar 1999 08:52:58 -0800

colgates wrote:
> 
> Hi, I need help from anyone who knows how to decipher transposition and
> Vigenere/Beaufort ciphers.
> I got a ciphertext but am not sure how to decipher it. Please help. Thanks.

Sure, post 'em.  Somebody ought to be able to help you,
depending on the type of transposition and the amount of
Vig/Beau ciphertext.  If both transposition and Vig/Beau
are applied to the same cipher it could be more challenging.
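For the Vig/Beau case, the usual first step once there is enough ciphertext
is a key-length estimate.  A small index-of-coincidence sketch follows (my
illustration, not part of the original reply; the ciphertext string is just
a placeholder):

/* Index-of-coincidence scan for estimating a Vigenere/Beaufort key
 * length: split the ciphertext into k columns and average the IC of
 * each column; the true key length tends to show an IC near the
 * English value of about 0.066 rather than the flat value near 0.038.
 */
#include <stdio.h>
#include <ctype.h>

static double column_ic(const char *txt, int start, int step)
{
    long count[26] = {0}, n = 0, sum = 0;
    int i, c;

    for (i = start; txt[i] != '\0'; i += step)
        if (isalpha((unsigned char)txt[i])) {
            count[toupper((unsigned char)txt[i]) - 'A']++;
            n++;
        }
    for (c = 0; c < 26; c++)
        sum += count[c] * (count[c] - 1);
    return (n > 1) ? (double)sum / ((double)n * (double)(n - 1)) : 0.0;
}

int main(void)
{
    const char *ct = "REPLACEWITHACTUALCIPHERTEXT";   /* placeholder */
    int k, j;

    for (k = 1; k <= 10; k++) {                /* candidate key lengths */
        double avg = 0.0;
        for (j = 0; j < k; j++)
            avg += column_ic(ct, j, k);
        printf("key length %2d: average IC %.4f\n", k, avg / k);
    }
    return 0;
}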

-- 
        Jim Gillogly
        Sterday, 8 Astron S.R. 1999, 16:51
        12.19.6.1.3, 5 Akbal 16 Cumku, Fifth Lord of Night

------------------------------

From: [EMAIL PROTECTED] (Robert G. Durnal)
Crossposted-To: comp.security.misc
Subject: Re: Wanted: free small DOS Encryption program
Date: 30 Mar 1999 15:53:18 GMT

In <7dpqev$ovj$[EMAIL PROTECTED]>, Milton Pomeroy
<[EMAIL PROTECTED]> wrote:
: Wanted - a free DOS-based encryption program which is small, fast,
:          strong and friendly

: Explanation

: I want recommendations of encryption software to store small amounts of
: sensitive information (up to 30kbytes) for my own use - i.e. I encrypt it,
: and I decrypt it.  Since I plan to carry the encrypted datafile and
: encryption software on floppy disk and use it on various PCs (some of which
: may not be owned by me), I plan to use it from DOS (don't want to load it on
: PC, don't want any temporary decrypted data left on the PC's hard-disk).  The
: PCs will be running DOS, Win95/8, or WinNT.  Typically, I'd run it from the
: floppy in a DOS-Window.
{SNIP}
        Look at TinyIdea, at ftp://garb0.uwasa.fi/pub/crypto/idea3a.zip by
Fauzan Mirza or TinyFish at http://www.ip.pt/~ip200075/files/tinyfish.zip
or ftp://ftp.funet.fi/pub/crypt/msdos/tinyfish.zip. Both of these are under 
512 bytes (yes, BYTES!) in length; the RAM usage depends on the size of the
file to be encrypted. The first may not be public domain, but the second is
based on Bruce Schneier's BLOWFISH which IS public domain. For more 
information, see my site:
        http://www.afn.org/~afn21533/rgdprogs.htm and follow the index.
\\\\\\\\\\\\
My home page URL=http://members.tripod.com/~afn21533/   Robert G. Durnal
Hosting HIDE4PGP, HIDESEEK v5.0, PGE, TinyIdea (link)   [EMAIL PROTECTED]
and BLOWFISH in both Windows and mini-DOS versions.  [EMAIL PROTECTED]
EAR may apply, so look for instructions.
        

------------------------------

From: Robert Harley <[EMAIL PROTECTED]>
Subject: Re: Live from the Second AES Conference
Date: 30 Mar 1999 18:47:38 +0200


Dianelos Georgoudis ([EMAIL PROTECTED]) wrote:
>    Speed may be easy to measure, but I think its significance is overrated

I concur absolutely with this opinion.


*** What is going on? ***

Is the AES process supposed to choose a very fast algorithm that is
somewhat secure or a very secure algorithm that is somewhat fast?

I sincerely hope it is the latter but if the discussions being
reported are anything to go by, it looks like the process is off
track.

The purpose of the whole thing is to replace DES and that is not
because of DES's performance: it is because DES is not secure enough.


As Dianelos mentioned, speed is a relatively easy issue.  That's
presumably why everyone is rushing to measure it today and guesstimate
it for the future.

But if speed were really the primary concern, the choice would be much
easier.  The fastest candidates are clearly RC6 running on x86 and DFC
running on Alpha, so we could narrow it down to two.  Great!

In fact speed is by far the lesser concern compared to security.


Does it matter if transactions take three seconds (or milliseconds or
whatever) instead of two?  It matters a little, but it just is not a big
deal; after all, they have to take some amount of time.

Would it matter if many billions of dollars worth of transactions were
intercepted or endangered or plain held up because of a catastrophic
failure of an AES algorithm?  That would be a nightmare and must be
avoided.


Thus it seems clear that the important question is NOT:

  Which is the fastest algorithm (with decent security)?


but instead:

  Which is the most secure algorithm (with decent performance)?


This is the more difficult question but it is the one we have to ask
and the one we have to answer.

I hope the guys at NIST can still see the wood for the trees.

Bye,
  Rob.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: What is fast enough?
Date: Tue, 30 Mar 1999 16:18:16 GMT

[EMAIL PROTECTED] wrote, in part:

>Isn't anything above 1MB/sec considered fast enough? I mean my hd controller
>only works at 4.5MB/sec anyways!

So something like 10MB/sec would be useful for encrypting your hard disk
without slowing it down too much...it depends very strongly on the
application.
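As a rough worked example (my own numbers, and assuming the cipher work is
not overlapped with the disk transfer): with a 4.5 MB/sec disk and a 10
MB/sec cipher the effective rate is 1/(1/4.5 + 1/10), about 3.1 MB/sec, or
roughly a 30% slowdown; with a 1 MB/sec cipher it drops to about 0.8
MB/sec, more than a factor of five below the raw disk rate.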

John Savard (teneerf is spelled backwards)
http://members.xoom.com/quadibloc/index.html

------------------------------

From: [EMAIL PROTECTED] (Paul Schlyter)
Subject: Re: What is fast enough?
Date: 30 Mar 1999 19:37:37 +0200

In article <7dp8g7$9u3$[EMAIL PROTECTED]>,
 <[EMAIL PROTECTED]> wrote:
 
> Isn't anything above 1MB/sec considered fast enough?
 
This depends very much on your application.  In some cases 1 MB/sec
isn't enough, while in other cases 1 kB/s may be much more than
enough.
 
What do you need this speed for?
 
-- 
================================================================
Paul Schlyter,  Swedish Amateur Astronomer's Society (SAAF)
Grev Turegatan 40,  S-114 38 Stockholm,  SWEDEN
e-mail:  [EMAIL PROTECTED]    [EMAIL PROTECTED]   [EMAIL PROTECTED]
WWW:     http://hotel04.ausys.se/pausch    http://welcome.to/pausch

------------------------------

From: Robert Harley <[EMAIL PROTECTED]>
Subject: Re: Live from the Second AES Conference
Date: 30 Mar 1999 17:42:15 +0200


Dianelos Georgoudis ([EMAIL PROTECTED]) writes:
>    After lunch, Prof. Koeune from Belgium, also a sometime
>    participant in sci.crypt, presented a paper on the implementation
>    of several AES candidates on smart cards. He chose two instances:
>    the low cost, 8-bit Intel 8051 and the advanced, 32-bit ARM with
>    1kB of RAM. The approximate results are as follows:
>
>                         8051    ARM
>        Cipher   RAM     cycles  cycles
>        E2       344      9K      2K
>        RC6      205     14K      1K
>        Rijndael  49      4K      2K
>        Twofish   49      ?      10K

I did an implementation of DFC for ARM and timed it at 540 cycles.

That works out at a little over 60 MBits/sec on the ARM chip here
beside me which is *great* performance for portable and embedded
applications.

If anyone wants the source code, just let me know.  It's for ARM Linux
but should be easy to port to any decent development environment.
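As a rough sanity check on that figure (the 128-bit block size and a clock
of about 266 MHz are my assumptions; the clock rate is not stated above):

/* Back-of-the-envelope throughput for 540 cycles per 128-bit block,
 * assuming a ~266 MHz ARM; the clock rate is an assumption.
 */
#include <stdio.h>

int main(void)
{
    const double cycles_per_block = 540.0;    /* measured figure quoted above */
    const double bits_per_block   = 128.0;    /* AES candidate block size     */
    const double clock_hz         = 266e6;    /* assumed clock rate           */

    double blocks_per_sec = clock_hz / cycles_per_block;
    double mbits_per_sec  = blocks_per_sec * bits_per_block / 1e6;

    printf("%.1f Mbit/s\n", mbits_per_sec);   /* about 63 Mbit/s */
    return 0;
}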

Bye,
  Rob.


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 30 Mar 1999 16:48:40 GMT
Reply-To: [EMAIL PROTECTED]

On 30 Mar 1999 10:14:12 -0500, [EMAIL PROTECTED] (Herman
Rubin) wrote:

>I suggest you try to understand the terms. 

Please indicate exactly where I fail to "understand the terms", by
pointing me to what you claim is the correct understanding in Feller's
book. But first, since you may have missed the earlier posts, realize
that we are discussing the uniform Bernoulli process (UBP) and its
close cousin, the uniform random walk in one dimension. 

Until you show me in clear unequivocal manner the error of my ways, I
simply cannot take your word for it that I do not understand the
terms. I am sure you can appreciate why.

>The result of a physical process is "truly random", in that it
>has a probability distribution. This does not mean that it has
>the particular independence and distribution properties which
>you seem to think can be assumed;

We are discussing the UBP. Tell us where there is a problem with using
that as a model for an *idealized* TRNG. I use the term "idealized" in
the exact same manner that a mathematician uses it to discuss the
properties of a perfect circle. The fact that perfectly circular
objects do not exist in the physical world does not prevent us from
discussing the concept of the perfect circle as an idealized model of
real wheels. Likewise, nothing reasonable prevents us from discussing
the properties of an idealized TRNG, modeled after the UBP, as the
model for real TRNGs.

>the actual probability distribution is unknowable,

If so, then how could anyone ever claim that statistical tests are
applicable? Even the Central Limit Theorem requires assumptions about
the probability distribution. Yet all you hear on sci.crypt is that
one can utilize "standard statistical tests" to decide with a
reasonable certainty that a TRNG is not truly random.

Anyway, according to our specification for a TRNG, the probability
distribution for finite sequences is uniform.

+++++
A TRNG is a process which is capable of generating all possible finite
sequences equiprobably, namely in an independent and equidistributed
manner.
+++++

There is no difference in that kind of specification compared to the
specification of a perfect circle as the locus of all points in a
plane which are equidistant from a given point. Both specifications
describe an ideal condition, one which can never be met in the physical
world. But that does not invalidate the usefulness of those
specifications. Just as knowing the specification for a perfect
circle can assist you in designing a method for making wheels of high
precision (e.g., using a lathe), knowing the specification for an
idealized TRNG can assist you in designing a method for making
crypto-grade random sequences of high precision (e.g., using
radioactive decay).
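To put that specification in concrete terms, here is a small simulation
sketch (my illustration): for small N it tallies how often each of the 2^N
possible sequences occurs. The C library rand() is only a stand-in bit
source, emphatically not a TRNG; for an ideal UBP every count would
converge to TRIALS/2^N.

/* Tally the frequency of every possible N-bit sequence drawn from an
 * (assumed) unbiased bit source.  rand() is a stand-in, not a TRNG.
 */
#include <stdio.h>
#include <stdlib.h>

#define N      4           /* bits per sequence               */
#define TRIALS 1000000L    /* number of sequences to generate */

int main(void)
{
    long count[1 << N] = {0};
    long t;
    int i;

    srand(12345);
    for (t = 0; t < TRIALS; t++) {
        unsigned seq = 0;
        for (i = 0; i < N; i++)
            seq = (seq << 1) | (unsigned)(rand() & 1);  /* one "coin flip" */
        count[seq]++;
    }
    for (i = 0; i < (1 << N); i++)     /* each count should be near TRIALS/16 */
        printf("%2d: %ld\n", i, count[i]);
    return 0;
}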

>and it is not of the simple parametric type usually posited.

Give us an example of a "simple parametric type usually posited". I
have no earthly idea what you are talking about. You speak far too
obscurely for an Informed Layman (tm) to understand.

>It may or may not be so for practical purposes.

Please translate that into something an Informed Layman (tm) can
understand.

Bob Knauer

"The laws in this city are clearly racist. All laws are racist.
The law of gravity is racist."
- Marion Barry, Mayor of Washington DC


------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Live from the Second AES Conference
Date: 30 Mar 1999 10:46:19 -0800

In article <[EMAIL PROTECTED]>, Jim Gillogly  <[EMAIL PROTECTED]> wrote:
> Oh.  I had assumed we were talking about modern general-purpose
> ciphers rather than theoretical ciphers that were broken in
> specific ways.  I'm less interested in the latter, and will bow
> out of the thread.

Ok.  I just wanted to point out that the "provable security" isn't
necessarily quite as provable as you might think -- there are subtle
pitfalls.

> My suggestion was that to use E2.RC6, where each of E2 and RC6 is to
> be used in its 128-bit mode, you would need a 256-bit key, with all
> 256 bits independently chosen.  E2 would use 128 bits, and RC6 would
> use 128 bits, each using its own key schedule.

Ok, I see now.
But then I don't see how this can be reasonably promulgated as an
AES standard, because it has only (at most) 128 bits of strength, not
256 bits of strength.  I believe that if we standardize on an algorithm
with a 256-bit key, it should have 256 bits of strength.  Moreover, it
was a design requirement for AES that the ciphers be able to support
128 bit key lengths, which E2.RC6 would not satisfy.

Thus, I don't think cascading multiple algorithms will give us a good
AES standard.  (Doesn't mean it's not a useful technique, just that it
doesn't immediately solve the problem of finding an AES cipher.)

------------------------------

From: Darren New <[EMAIL PROTECTED]>
Subject: Re: How do I determine if an encryption algorithm is good enough?
Date: Tue, 30 Mar 1999 19:20:47 GMT

R. Knauer wrote:

> There are several experiments that students of modern physics can
> perform at a university laboratory which are quite simple to set up
> and do, yet the results of those experiments are very extraordinary.

Completely off-topic here, but what else is new?

My favorite is to go someplace where they're selling polarized sunglasses.
Hold up one pair to the lights, and hold up a second pair turned 90
degrees. Blocks all the light, right? Now, without moving the first two,
slip a third pair in between at a 45-degree angle. Add more filters, get
more light.  Voila!
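For the classically inclined, a quick Malus's-law check of those numbers
(my addition; the fractions are of the light that already passed the first
filter, and filter losses are ignored):

/* Malus's law: intensity after a polarizer is I * cos^2(angle between
 * the light's polarization and the filter axis).
 */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double PI = 3.14159265358979;
    double deg45 = PI / 4.0, deg90 = PI / 2.0;

    double crossed = cos(deg90) * cos(deg90);                  /* 0 -> 90       */
    double stacked = cos(deg45) * cos(deg45)                   /* 0 -> 45 -> 90 */
                   * cos(deg45) * cos(deg45);

    printf("two crossed filters pass %.2f\n", crossed);              /* 0.00 */
    printf("with a 45-degree filter between them: %.2f\n", stacked); /* 0.25 */
    return 0;
}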


-- 
Darren New / Senior Software Architect / MessageMedia, Inc.
     San Diego, CA, USA (PST).  Cryptokeys on demand.
"Practical Necromancy: Chapter One - Proper Use of Shovels"

------------------------------

From: "Sassa" <[EMAIL PROTECTED]>
Crossposted-To: comp.infosystems.www.browsers.misc
Subject: Re: RNG quality in browsers?
Date: Tue, 30 Mar 1999 21:00:46 +0200
Reply-To: [EMAIL PROTECTED]

hi


29-Mar-99 10:33 you wrote:
> Your RNG (which uses 5 RNGs in it its implementation) has a "state" of
> 160 bits. Is this state to be stored on the user's harddisk, so the
> pseudo-random sequence continues between uses of the browser?

As I understood it, you are appealing to the fact that it is difficult to
find any truly random process inside a computer, so even the clock cannot
give you a completely random number. I agree with this point.

I don't know how a browser should do it, but I did it this way: I save that
state to disk, and after each boot I update it slightly using new
pseudo-random values. So you would have to crack the previous states to
figure out the current state, and I suppose that task is not easy. Note
that you would have to start tracking the person some time _before_ you get
the idea to crack some secret information issued by him. It is a pity that
any leak of information about previous states may make cracking easier.


So I hope it will still be reasonably secure :)
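A minimal sketch of that save-and-update flow (my illustration; the mixing
function is a toy stand-in rather than SHA-1 or any real primitive, and the
file name and pool size are arbitrary):

/* Keep a 160-bit pool on disk; on every start fold in a little fresh
 * (weakly random) data before using it to seed the session PRNG.
 * The mix() below is a toy, for illustration only.
 */
#include <stdio.h>
#include <time.h>

#define POOL_BYTES 20                       /* 160 bits of state */

static void mix(unsigned char pool[POOL_BYTES], unsigned long x)
{
    int i;
    for (i = 0; i < POOL_BYTES; i++) {      /* NOT a real hash */
        x = x * 69069UL + 1UL;
        pool[i] ^= (unsigned char)(x >> 16);
    }
}

int main(void)
{
    unsigned char pool[POOL_BYTES] = {0};
    FILE *f;

    f = fopen("prng.state", "rb");          /* previous state, if any */
    if (f != NULL) {
        if (fread(pool, 1, POOL_BYTES, f) != POOL_BYTES)
            ;                               /* first run or short file: keep zero pool */
        fclose(f);
    }
    mix(pool, (unsigned long)time(NULL));   /* fold in some fresh data */

    /* ... seed the session PRNG from pool here ... */

    f = fopen("prng.state", "wb");          /* persist the updated state */
    if (f != NULL) {
        fwrite(pool, 1, POOL_BYTES, f);
        fclose(f);
    }
    return 0;
}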



> How this 160 bits of state is derived from physical sources of randomness
> (such as the system clock) is important as well. This seed should not be
> taken from the system clock directly, relying on the algorithm to make it
> random. Since the pRNG algorithm (essentially a deterministic process)
> cannot be assumed to be secret, the attacker can narrow down the space of
> his search, by assuming some accuracy on the clock used.

> Mika

If you have any difficulty understanding what I mean, post me.


--
   Sassa

Apiary Inc.
  ______
@()(_)
/\\

[EMAIL PROTECTED]



------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: strong brain-embedded decryption algorithm
Date: Tue, 30 Mar 1999 12:41:15 -0600

In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (John Savard) wrote:

> [EMAIL PROTECTED] (DuBose8) wrote, in part:
> 
> >Is there such a thing as a brain-embedded decryption algorithm? Someone
> >mentioned this on a mailing list but I've never heard the term.
> 
> I guess they mean something a person can work out in his or her head, as
> opposed to something that has to use a computer - which might be bugged.

The answer is yes, in the form of the obscure-language scenario such as was
used with the code-talkers.  Cryptography flows into linguistics as it does
into mathematics; there is nothing to replace the utility of having an
extensive database, a distinct vocabulary, or a highly organized set of
interlaced theorems: in effect, an organized code-book that is
incomprehensible to an enemy.

Studies have been done of people, mostly twins, who could communicate with
their own language, developed mainly prior to other language training. 
Though perhaps not an algorithm in the sense you think of it, such a mode
of behavior clearly can be thought of as functionally cryptographic when
viewed by others.
-- 
Too much of a good thing can be much worse than none.

------------------------------

From: [EMAIL PROTECTED] (Charles Blair)
Subject: freeware implementation of one-time pad?
Date: 30 Mar 1999 20:31:28 GMT

   It should be easy to write something in which the user gives the
plaintext and the pad, and the program creates the cyphertext, with
the user bearing responsibility for the pad being random and used
only once.  Has anyone made available a ``standard'' implementation?
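A minimal sketch of the kind of program described, assuming XOR as the
combining operation; the pad's randomness and single use remain entirely
the user's responsibility, and decryption is the same operation run on the
ciphertext.  File names and usage below are illustrative, not any existing
``standard'':

/* XOR the plaintext with the pad, byte for byte, writing the result to
 * an output file.  The pad must be truly random, at least as long as
 * the text, and never reused.
 */
#include <stdio.h>

int main(int argc, char *argv[])
{
    FILE *text, *pad, *out;
    int t, p;

    if (argc != 4) {
        fprintf(stderr, "usage: otp <textfile> <padfile> <outfile>\n");
        return 1;
    }
    text = fopen(argv[1], "rb");
    pad  = fopen(argv[2], "rb");
    out  = fopen(argv[3], "wb");
    if (!text || !pad || !out) {
        fprintf(stderr, "cannot open files\n");
        return 1;
    }
    while ((t = getc(text)) != EOF) {
        if ((p = getc(pad)) == EOF) {
            fprintf(stderr, "pad is shorter than the text\n");
            return 1;
        }
        putc(t ^ p, out);
    }
    fclose(out);
    return 0;
}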

------------------------------

From: "karl malbrain" <[EMAIL PROTECTED]>
Crossposted-To: comp.security.misc
Subject: Re: Wanted: free small DOS Encryption program
Date: Tue, 30 Mar 1999 12:14:37 -0800


<[EMAIL PROTECTED]> wrote in message
news:7dqoa9$hqi$[EMAIL PROTECTED]...
> In article <7dpqev$ovj$[EMAIL PROTECTED]>,
>   Milton Pomeroy <[EMAIL PROTECTED]> wrote:
> > Wanted - a free DOS-based encryption program which is small, fast,
> >          strong and friendly
> >
> > Explanation
> >
> > I want recommendations of encryption software to store small amounts of
> > sensitive information (up to 30kbytes) for my own use - i.e. I encrypt
it,
> > and I decrypt it
(...)

>  I think you can do no better for your request than SCOTT16U.ZIP; it is
> available world wide, is free, and the source code is included. However,
> it was written by me, an outsider to the established crypto community.
> But it has no holes and is not a weak-keyed method of the sort you will
> most likely end up using. It is very hard to find something strong. The
> more positive press you see on a method, the more likely it is broken by
> the NSA or such.
(...)

The <<key>> to your request is the small amount of ENCRYPTED OUTPUT
required.

Whatever method you choose, run a BINOMIAL distribution on the output
BITS/CHARS/WORDS of larger texts for UNIFORM FREQUENCY COUNTS.  In other
words, you should get 0 & 1 bits 50/50, 00/01/10/11 counts at .25, each byte
at 1/256, each BI-GRAM at 1/65536, etc.  Karl M
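A small sketch of that kind of frequency check (my illustration), counting
single bits and byte values in a ciphertext file; the thresholds used to
flag a skewed byte are arbitrary, and a real test would use a proper
chi-square:

/* Count bit and byte frequencies in a file; for good cipher output the
 * fraction of 1 bits should sit near 1/2 and each byte value near 1/256.
 */
#include <stdio.h>

int main(int argc, char *argv[])
{
    long bits[2] = {0, 0}, bytes[256] = {0};
    long total = 0;
    int c, i;
    FILE *f;

    if (argc != 2 || (f = fopen(argv[1], "rb")) == NULL) {
        fprintf(stderr, "usage: freq <ciphertext-file>\n");
        return 1;
    }
    while ((c = getc(f)) != EOF) {
        bytes[c]++;
        for (i = 0; i < 8; i++)
            bits[(c >> i) & 1]++;
        total++;
    }
    if (total == 0)
        return 1;
    printf("fraction of 1 bits: %.4f (want 0.5000)\n",
           (double)bits[1] / (8.0 * (double)total));
    for (i = 0; i < 256; i++)                   /* crude skew check */
        if (bytes[i] > 2 * (total / 256) || bytes[i] < total / 512)
            printf("byte %02x: fraction %.5f (want %.5f)\n",
                   i, (double)bytes[i] / (double)total, 1.0 / 256.0);
    return 0;
}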



------------------------------

From: Terje Mathisen <[EMAIL PROTECTED]>
Subject: Re: Live from the Second AES Conference
Date: Tue, 30 Mar 1999 21:06:00 +0200

Bruce Schneier wrote:
> 
> On Mon, 29 Mar 1999 16:15:16 GMT, [EMAIL PROTECTED]
> (John Savard) wrote:
[snip]
> >If everybody's C code is compiled on the same compiler, one may be
> >comparing optimizations or something, but one isn't comparing compilers.
> 
> One is comparing both how well the coder optimized his code, and how
> well the compiler optimizes the particular algorithm.  For example,
> the Borland C compiler can't do rotates well.  Any algorithm using
> rotates will look relatively worse than an algorithm that does not, if
> compared using a Borland compiler.  This relative difference won't
> exist if the algorithms are compared using a Microsoft compiler.

Indeed.

Even though there might exist crypto algorithms which would happen to
compile into near-optimal code on almost all compilers, I believe a new
standard encryption algorithm is more than important enough to deserve
being implemented in hand-optimized asm code for all major CPU
architectures.

I.e., there is no particular reason to handicap an algorithm just because
it uses a normal CPU instruction which is hard or impossible to express
directly in portable C.
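The classic example is the 32-bit rotate that several candidates use
heavily; the usual portable C idiom is below (my illustration), and whether
it becomes one instruction or a shift/OR sequence depends entirely on the
compiler:

/* Portable 32-bit left rotate, n in 1..31.  Some compilers recognize
 * this pattern and emit a single rotate instruction; others (the
 * Borland case discussed above) emit shifts and an OR, which is the
 * benchmark skew being complained about.
 */
#include <stdio.h>

static unsigned long rotl32(unsigned long x, unsigned n)
{
    x &= 0xFFFFFFFFUL;                      /* in case long is 64 bits */
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFFUL;
}

int main(void)
{
    printf("%08lx\n", rotl32(0x80000001UL, 3));   /* prints 0000000c */
    return 0;
}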

This is why I really like the AES analysis submitted by B. Schneier's
group, where they compared the relative speed of a theoretically perfect
asm implementation of each algorithm.

The numbers they came up with seem to correlate well with what good
coders have been able to do on several of the algorithms.

Terje

-- 
- <[EMAIL PROTECTED]>
Using self-discipline, see http://www.eiffel.com/discipline
"almost all programming can be viewed as an exercise in caching"



------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 30 Mar 1999 20:53:31 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 30 Mar 1999 09:07:54 -0700, "Tony T. Warnock"
<[EMAIL PROTECTED]> wrote:

>For a Bernoulli process with p=.5 the standard deviation is
>Sqrt(N)/2. The normalized Z-score for your bounds of +-5% is then .05N/(Sqrt(N)/2) or
>.10*Sqrt(N).

I looked up the term "normalized Z-score" in Feller's two volumes (3rd
ed.), and could not find anything. Please explain what that means, and
what it measures.

Is it the area under the Gaussian for the range +-5% of the mean? If
so, I believe we had a different name for it - but then physicists are
notorious for renaming things differently from what mathematicians
call them.

> This means that the Z score is growing with increasing N.

Yes, that means that the dispersion is increasing. IOW, the ink is
diffusing away from the origin with time.

>A sample
>computation shows that for N=100 that 72% of the samples lie within +-5% of the
>mean.

I would point out that the other 28% of the sequences (over 1 out of
4) that are not near the origin amount to a non-trivial fraction, even
for as few steps as 100.

I had made my statement for values of N much larger than 100. For
example, typical stream cipher keystreams are considerably longer than
100 bits - more like 1,000,000 bits to pick some order of magnitude.
What is the Z-score for that number of bits?
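(For what it is worth, plugging N = 1,000,000 into the formula quoted above
gives Z = 0.10 * Sqrt(1,000,000) = 100; at that length the +-5% band is
about 100 standard deviations wide, so essentially every sample falls
inside it.)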

But is what you have presented the way to model this, consistent with
the constraint that the ensemble has 2^N members? IOW, the question is
not being phrased for arbitrarily large N, but for a finite albeit
large value.

I realize that the statistics of infinitely large sequences is
fundamentally different from that of finite sequences, however large.
Are you sneaking in an infinite limit calculation somewhere in your
considerations?

Is this whole debate caused by a disagreement between statisticians, who
use calculations that assume infinite limits, and combinatorialists, who
do their calculations on finite sequences? All of the manifest
"abnormality" (excessive bias) seen in finite sequences disappears
when those sequences grow to infinite length.

Bob Knauer

"The laws in this city are clearly racist. All laws are racist.
The law of gravity is racist."
- Marion Barry, Mayor of Washington DC


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
