Cryptography-Digest Digest #280, Volume #9       Wed, 24 Mar 99 21:13:03 EST

Contents:
  Conventional Encryption with PGP 6.0.2i (Ronan HARLE)
  Re: Random Walk (R. Knauer)
  Re: large big huge numbers ([EMAIL PROTECTED])
  Re: Random Walk (R. Knauer)
  Re: Securid Card ("Steve Matthews")
  Re: compare RSA and D-Hellman ([EMAIL PROTECTED])
  Decorrelated Fast Cipher (plus other block ciphers) ([EMAIL PROTECTED])
  Re: Random Walk ("karl malbrain")
  Re: Conventional Encryption with PGP 6.0.2i (Tom McCune)
  Re: compare RSA and D-Hellman (Scott Fluhrer)
  Re: RSA key distribution ([EMAIL PROTECTED])
  Re: Encryption and the Linux password files (Nathan Kennedy)
  Another TEA question ([EMAIL PROTECTED])
  Re: Encryption and the Linux password files ("Douglas A. Gwyn")

----------------------------------------------------------------------------

From: Ronan HARLE <[EMAIL PROTECTED]>
Subject: Conventional Encryption with PGP 6.0.2i
Date: Wed, 24 Mar 1999 23:04:39 +0100

Hello,

There is an option for a conventional encryption in PGP 6.0.2i, but I
didn't find any information about the algorithm (and its strength) used
for this purpose in the manual. Does anyone know something about this?

Thanks for your help
-- 
Ronan Harle ([EMAIL PROTECTED])
    "The world is moving so fast these days that the person who says it
can't be done is generally interrupted by someone doing it."    
--Fosdick

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Random Walk
Date: Wed, 24 Mar 1999 23:20:34 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 24 Mar 1999 14:55:20 -0600, Jim Felling
<[EMAIL PROTECTED]> wrote:

>The problem you seem to be having 

I am not having any problem. I am reporting what I read - sometimes
even quoting it directly.

I used to believe what you are saying but have modified my position. I
no longer believe that algorithmic tests can tell you anything
*definitive* about the "non-randomness" of a process which generates
finite sequences. The reason is that there is no a priori measure of
what constitutes either randomness or non-randomness for finite
sequences. The matter is indeterminate on either side of the truth
table.

The best that statistical testing can do for you is act as a
diagnostic warning - alerting you to run tests on the TRNG you are
using to generate the sequences.

>Given a string of numbers, I can conduct tests to see
>if it is biased in a specific way with statistical tests.

Who says that lack of bias is a property of finite true random
sequences? Where is that proven in a mathematics book?

It is true that lack of bias is a property of infinite true random
sequences (Borel normality), but there is no reason to believe that a
property attributable to infinite random sequences (like Borel
normality) is attributable to finite random sequences.

>I cannot say
>that it definitely is biased due to those tests, but I can say that only 1
>in N numbers possesses the properties that it does

This is exactly where I claim the error is being made - a
misapplication of the frequency interpretation of probability.

You are claiming that the "probability" of occurrence is 1/N based on
an analysis of a "large" number of samples, deducing that probability
from the frequency of occurrence of that property in the limit of large
sample count.

There is no reason mathematically to believe that just because a very
large sample of finite sequences behaves in a certain way on the
average, that any given sample of one finite sequence must have that
same property. Feller's book is full of examples of the misapplication
of the law of large numbers. 

> and thus flag the
>possibility that the source generating the number I am using is not bias
>free. I also cannot definitely say that it is bias free as I am only testing a
>very tiny subset of all possible biases.

I concur with this approach - the use of statistical tests as
diagnostics.

Let me ask you a crucial question. You buffer a 10,000 bit sequence
and submit it to statistical testing before outputting it, so you can
prevent a broken TRNG from creating a bad keystream. You get a
diagnostic warning and shut the TRNG down for further tests. You find
that the TRNG is not broken - that the original tests have given a
false indication.

Are you going to output that sequence you buffered knowing that it
failed the original tests but was nevertheless generated by a TRNG
that is performing to specification?
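
The kind of diagnostic being discussed can be sketched with the standard
monobit (frequency) test. The helper name and the 0.01 significance
threshold below are illustrative, and - in keeping with the point above -
a failure is only a warning, never proof of non-randomness:

```python
import math

def monobit_diagnostic(bits, alpha=0.01):
    """Frequency (monobit) test: flags a sequence whose 0/1 balance
    deviates more than chance would suggest.  A failure is a diagnostic
    warning only - a working TRNG can still produce such a sequence."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)       # +1 per one, -1 per zero
    s_obs = abs(s) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))   # two-sided tail probability
    return p_value, p_value >= alpha            # (p-value, passed?)

# A 10,000-bit sequence with a 60/40 bias trips the warning:
p, ok = monobit_diagnostic([1] * 6000 + [0] * 4000)
```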

>I agree but I will maintain that there can be indicative statistical tests
>of randomness useful in the decision that a discrete process which
>generates finite sequences is producing output with potential weaknesses
>to certain methods of analysis.

We appear to agree that pseudo-randomness is a useful criterion for
diagnostics only.

Bob Knauer


"The important thing is to stop lying to yourself. A man who lies 
to himself, and believes his own lies, becomes unable to recognize 
the truth, and he ends up losing respect for himself as well as for 
others. When he has no respect for anyone, he yields to his impulses, 
indulges in the lowest forms of pleasure, and behaves in the end like 
an animal, in satisfying his vices."
--Dostoevsky (The Brothers Karamazov)


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: large big huge numbers
Date: Wed, 24 Mar 1999 22:40:10 GMT



> If you don't need certaintly,  then use a single Miller-Rabin test followed
> by a single Lucas-Lehmer test.  Or use the Frobenius-Grantham algorithm.
>

What are these methods?
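
For what it's worth, the first of these can be sketched briefly.
Miller-Rabin is a probabilistic primality test: a composite n survives k
independent rounds with probability at most 4^-k, so a handful of rounds
gives near-certainty. A minimal sketch (function name illustrative):

```python
import random

def miller_rabin(n, rounds=20):
    """Returns False if n is certainly composite, True if n is
    probably prime (error probability <= 4**-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):          # quick trial division
        if n % p == 0:
            return n == p
    # write n - 1 = d * 2**r with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                    # a witnesses that n is composite
    return True
```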

Tom

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Random Walk
Date: Wed, 24 Mar 1999 23:44:55 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 24 Mar 1999 21:48:57 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
wrote:

>Because in this thread you seem to be the only one who claims that
>anyone maintains that there is a reliable test for "randomness".

I am not the only one who claims that. Feller claims that, Li &
Vitanyi claim that, Williams & Clearwater claim that, et al.

>At least four of us have explained to you what role statistical
>tests actually have in this connection; we're all in general
>agreement, and consistent with established statistical theory.

You have failed to make the connection between True Randomness and
Establishment Statistical Theory.

The best that statistics can claim is that the process which generates
finite true random sequences has the *appearance* of randomness based
on the characteristics of infinite random sequences.

But there is no a priori reason for that to be the case - in fact,
Feller shows that a surprisingly large number of sequences do not
have this appearance of randomness that comes from expectations
derived from using the law of large numbers.

>A complete explanation would amount to a short course in statistics,

No amount of statistical theory is going to change the manifest fact -
discovered from direct calculation - that statistical tests do not
provide decisive information on true randomness. All they provide is
information about what might be expected of a "large" sample.

>which is not feasible in this forum even if any of us were inclined
>to present it.

Not only is no one inclined to present it, but no one is inclined to
pay any attention to it either. This is not Ding Dong School.

>It has been suggested that you go off and actually
>learn the subject so you will understand what the rest of us are
>saying.

And it has been suggested that you go off and actually learn the
subject so you will understand what the experts are saying.  You have
failed miserably at making a rational case for your position, so it is
you who is in real need of learning the subject at hand.

And BTW, for the record - I am certainly not alone in the position I
am maintaining. There have been others on sci.crypt who have stated
the same basic tenets. It is I who have taken the time to dig more
deeply into this matter and report it here - by consulting the real
experts.

By contrast all you, and the others you claim are supportive of your
position, have done is bluster and pontificate all over the place in a
futile attempt to confuse the issue.

Ironically, it is really a very simple matter to grasp what Feller is
saying - all you have to do is recognize that statistical theory is
not as applicable to finite true random processes as you were led to
believe when you were a student.

Bob Knauer

"The important thing is to stop lying to yourself. A man who lies 
to himself, and believes his own lies, becomes unable to recognize 
the truth, and he ends up losing respect for himself as well as for 
others. When he has no respect for anyone, he yields to his impulses, 
indulges in the lowest forms of pleasure, and behaves in the end like 
an animal, in satisfying his vices."
--Dostoevsky (The Brothers Karamazov)


------------------------------

From: "Steve Matthews" <[EMAIL PROTECTED]>
Subject: Re: Securid Card
Date: Wed, 24 Mar 1999 22:39:38 -0000

=====BEGIN PGP SIGNED MESSAGE=====
Hash: SHA1

Hi,
Thanks for your informative posting.
Am I correct in assuming SDI also now produces a 'soft' version of the
token card (SoftID?) which can reside on client machines? If so, how
does this differ (if at all) from the token card implementation?

The company I work for included SDI/ACE Server support within our PPP
RAS product (as a NAS talking to the server). It seems to work well,
although take-up in the UK market sector has seemed a little slow from
my (possibly limited) experience. I have only come across one or two
large (very security-conscious) multinationals who actually use it.

r's
Steve

- --
_________________________________________
                  PGP  Key  ID:  0x95E1EB05

'Flair is a term used by those that have never ridden'
                       -Miguel Indurain
_________________________________________


=====BEGIN PGP SIGNATURE=====
Version: PGPfreeware 6.0.2i

iQA/AwUBNvlpqqVIZcSV4esFEQJXtACfSSGO+iXrnRHy6qHdJg9c3WKbeKYAn32m
HSgwcw0qfZZwBDe1iKuXdVCq
=kt/2
=====END PGP SIGNATURE=====




------------------------------

From: [EMAIL PROTECTED]
Subject: Re: compare RSA and D-Hellman
Date: Thu, 25 Mar 1999 00:21:04 GMT

Here is the actual (I believe :) ) method for picking DH values.


Pick a large prime A, share that between the two people.

Person1 picks another large prime ( call it x ), that is relatively prime to A
(note: if his private number x is prime then you need not check this).

Person2 picks another large prime (call it y).

Person one sends A^x and person two sends A^y.  They both can now calculate
A^(x*y).


If A is not prime you run into the chance of x mod A = 0.

Tom


------------------------------

From: [EMAIL PROTECTED]
Subject: Decorrelated Fast Cipher (plus other block ciphers)
Date: Thu, 25 Mar 1999 00:22:54 GMT

Does anybody know anything about DFC?  Just wondering if it's any good.


BTW, I want to make a collection of papers (in PDF, PS or TXT format) of all
the stream and block ciphers I can find.  If you can help point some out for
me I would be glad to add them to my collection (which will be made public on
my website soon).

Thanks,
Tom


------------------------------

From: "karl malbrain" <[EMAIL PROTECTED]>
Subject: Re: Random Walk
Date: Wed, 24 Mar 1999 16:34:57 -0800

R. Knauer <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> On Mon, 22 Mar 1999 17:37:59 -0800, "karl malbrain" <[EMAIL PROTECTED]>
> wrote:

(... Previously replied to section deleted...)

> >Again, for sequences of question (e.g. all TEXT encoded under a
> ><<time-framed>> key stream, a statistical test over EQUIDISTRIBUTED
> >group-wise ciphertext (to 24 bits is good enough for 1KK total text)
would
> >make any INDEPENDENT criteria fall-out.
>
> Does anyone here understand what this poster just said?

Please excuse the BIFURCATED response to your reply.  These are the results
of having implemented REDOC III with a 96x96 Key Schedule.  24 bits refers,
naturally, to TRI-GRAMS, and 1KK to 1 megabyte of text.

The crypt produced by REDOC-III produces equidistributed and independent
sequences -- analyzed up through TRI-GRAMS, and if the key is changed every
1MB, adequate security (IMHO) is achieved.

However, don't take this out-of-context!! Karl M






------------------------------

From: [EMAIL PROTECTED] (Tom McCune)
Subject: Re: Conventional Encryption with PGP 6.0.2i
Date: Thu, 25 Mar 1999 01:21:18 GMT

=====BEGIN PGP SIGNED MESSAGE=====

In article <[EMAIL PROTECTED]>, Ronan HARLE <[EMAIL PROTECTED]> wrote:

>There is an option for a conventional encryption in PGP 6.0.2i, but I
>didn't find any information about the algorithm (and its strength) used
>for this purpose in the manual. Does anyone know something about this?
>
>Thanks for your help

It will use whichever algorithm you have set as preferred in PGP
Preferences - Advanced tab: 128-bit CAST, 128-bit IDEA, or Triple DES,
reported to have an effective key size of 112 bits.  The real security
of this option rests on the passphrase you use.

=====BEGIN PGP SIGNATURE=====
Version: PGP Personal Privacy 6.0.2
Comment: http://www.Tom.McCune.net - Default: RSA 2047  0x90321F49

iQEVAwUBNvmPjGR4bNCQMh9JAQHguQf7B3ztVgeOaFqmyfkei1az/hDEKNq96Tgn
YbA7V5MWrYGwf8bEprZRC1lImiqIGjjNJAwq5MEhhn2PNGbWDD8QGIIwunxmYYG5
2Q59xB3itURM87E8jNQICaAhw8ZnHKNFwXBZxZm72zDDlfUjMNeWFM3pS1rxNQSq
QbKXpgAACC3bPs/rJdEPCdid2fgr9UvGMQT5D5acs50bQ7hvT84/A3t7MmAwtuBY
1RLq3S1dR0TD9O7jQQY2kZxrDGIFsFHF/bzYwVTZHV5fwXbE8B7WTB7VB6mnpXLW
9nRXdTpFF+ws6gt3S5Oj1VlgULkos8G97YtNdd0K/DeDMeR3MgPkoA==
=5faH
=====END PGP SIGNATURE=====

------------------------------

From: Scott Fluhrer <[EMAIL PROTECTED]>
Subject: Re: compare RSA and D-Hellman
Date: Thu, 25 Mar 1999 01:28:36 GMT

In article <7dbvh5$5a9$[EMAIL PROTECTED]>,
        [EMAIL PROTECTED] wrote:

>Here is the actual (I believe :) ) method for picking DH values.
>
>
>Pick a large prime A, share that between the two people.
They also pick a number g (and it helps security if that number is a
"generator" for prime A, that is, g^x mod A takes on A-1 possible values).
There are always lots of generators for any prime A, and if you can factor
A-1, it's easy to test a candidate g to see if it's a generator (and you can
pick an A where you know the factorization of A-1, so the difficulty of
factoring is not a problem here)
>
>Person1 picks another large prime ( call it x ), that is relatively prime to A
>(note: if his private number x is prime then you need not check this).
Actually, x need not be prime, and having 0 <= x < A-1 is sufficient (and,
strictly speaking, not necessary -- a larger range just doesn't give you any
additional security)
>
>Person2 picks another large prime (call it y).
Again, y does not need to be prime
>
>Person one sends A^x and person two sends A^y.  They both can now calculate
>A^(x*y).
Nope: person one sends g^x mod A and person two sends g^y mod A.  They both can
now calculate g^(x*y) mod A.

After all, A^x and A^y are bound to be *extremely* large numbers.  For
example, if A and x are both 512 bits, A^x is circa 10^160 digits long, and
you probably won't be able to send a number that large over a network
connection before the heat death of the universe.
>
>
>If A is not prime you run into the chance of x mod A = 0.
So?  That doesn't mean g^x mod A (or A^x for that matter) is any particular
value.

A real reason for making A prime is to make the DLP harder: for composite A,
all the attacker does is take the factorization of A, and solve the DLP for
each component prime.  This is a lot easier than solving the DLP for a
similarly sized prime A.
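
To make the corrected exchange concrete, here is a toy run with
deliberately tiny, illustrative numbers (a real exchange needs a large
prime A; here g=5 happens to be a generator mod 23):

```python
# Toy Diffie-Hellman following the corrected description above.
# Tiny parameters for illustration only.
A = 23          # shared prime modulus
g = 5           # a generator mod 23
x = 6           # person one's secret exponent (need not be prime)
y = 15          # person two's secret exponent

msg1 = pow(g, x, A)      # person one sends g^x mod A
msg2 = pow(g, y, A)      # person two sends g^y mod A

key1 = pow(msg2, x, A)   # person one computes (g^y)^x mod A
key2 = pow(msg1, y, A)   # person two computes (g^x)^y mod A
assert key1 == key2      # both now share g^(x*y) mod A
```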

-- 
poncho


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: RSA key distribution
Date: Thu, 25 Mar 1999 01:36:06 GMT

In article <7dbfqf$ij2$[EMAIL PROTECTED]>,
  "Roger Schlafly" <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote in message <7db0vg$8fe$[EMAIL PROTECTED]>...
> >The banking industry demanded strong primes, so I gave them a simple and
> >fast technique for generating them.
>
> Given the choice between strong primes and weak primes, naturally
> the bankers prefer strong primes. The question is whether the primes
> constructed by your method are any stronger or safer than other and
> more straightforward methods.

Define 'weak prime'.  I have never seen this terminology used.

>
> >  And while
> >requiring strong primes may not be a mathematical necessity, it makes the
> >user community more at ease with the standard.  This last fact, in and of
> > itself, gives value to the technique.
>
> Translation: Snake oil is valuable if people are suckered by it.

No.

Correct translation:

Having strong primes guarantees that keys are invulnerable to the P-1 and
P+1 factoring algorithms at very little cost.

Even if the attacks are astronomically unlikely to succeed on random primes,
with strong primes they are guaranteed not to succeed. As mathematicians
say: epsilon may be small, but it is still greater than 0.

They guard against an unlikely attack, but a legit one nevertheless.
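
For the curious, the P-1 attack in question (Pollard's p-1 method) can be
sketched in a few lines. It factors N quickly exactly when some prime
factor p of N has p-1 built entirely from small primes - the condition a
'strong' prime is chosen to rule out. The bound and test numbers below
are illustrative:

```python
import math

def pollard_p_minus_1(n, bound=1000):
    """Pollard's p-1: finds a factor p of n when p-1 is composed
    entirely of primes <= bound ("smooth").  A strong prime p is
    chosen so p-1 has a large prime factor, defeating this attack."""
    a = 2
    for k in range(2, bound):
        a = pow(a, k, n)              # after k steps, a = 2^(k!) mod n
        d = math.gcd(a - 1, n)
        if 1 < d < n:
            return d                  # nontrivial factor found
    return None

# 1013 * 1019: since 1013 - 1 = 2*2*11*23 is smooth, 1013 falls out fast,
# while 1019 - 1 = 2*509 has the large factor 509 protecting it.
factor = pollard_p_minus_1(1013 * 1019)
```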

I object to having them because:

(1) Calling them 'strong' primes gives the naive user the impression that if
the key ISN'T made from strong primes, it must be weak. You yourself used
the term 'weak prime'.

(2) The attacks (P+1, P-1) have been superseded by ECM.

(3) For randomly chosen primes they really are very unlikely to succeed.

(4) No one in his/her right mind would use P+1 or P-1 when ECM is available.

The issue is whether the slight increase in time to generate the primes
is worth the very trivial amount of added security. Bankers say yes.
They are the customers. When one's entire customer base demands a
feature one puts it in.

It has also been argued that P-1 and P+1 are the most likely attacks to
be applied because:

A. They are simple to implement and require less expertise.
B. They are more widely known.
C. They have been around the longest.
D. Not everyone is in his/her right mind.

Ron Rivest admits he was among those to suggest 'strong' primes back in the
late '70s.  If they had been called Rivest Primes rather than Strong Primes,
I doubt whether this discussion would now be occurring.


------------------------------

From: Nathan Kennedy <[EMAIL PROTECTED]>
Subject: Re: Encryption and the Linux password files
Date: Thu, 25 Mar 1999 05:42:11 +0800

Sundial Services wrote:
> 
> I'm rather startled to find that Linux still uses a login password
> scheme that is based on a publicly-available password file that is
> easily replaced or substituted.  Systems have been "taken over" in that
> way.

They certainly have.  Systems are taken over in many ways, which is why no
competent sysadmin would leave this hole open in a critical situation.

> I was wondering who has done something to replace that password-scheme
> with a stronger system. 

Shadow passwords have been around for ages and are default with 2.1.x up. 
That's all there is to it.  Only programs like login need to access the
password hashes.
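
The split is easy to see in the two file formats themselves; a minimal
illustration with made-up sample lines (the hash shown is not a real one):

```python
# Made-up sample lines illustrating the passwd/shadow split.
passwd_line = "alice:x:1000:1000:Alice:/home/alice:/bin/bash"
shadow_line = "alice:$1$aQkd93Lf$0123456789abcdefghijkl:10675:0:99999:7:::"

user, pw_field = passwd_line.split(":")[:2]
assert pw_field == "x"            # world-readable /etc/passwd holds no hash

shadow_user, pw_hash = shadow_line.split(":")[:2]
assert pw_hash.startswith("$1$")  # actual hash lives in root-only /etc/shadow
```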

> It seems obvious to me that a public-key
> cryptography system (for example) could be used to create a much
> stronger authorization system 

No.  Unless you're referring to a secure transport mechanism like ssh,
which is also old stuff.  You were talking about human authentication, and
no human wants to type 256 hex digits to log in.

> -- e.g. using a password built in to the
> kernel when it's built, and different for every computer -- which would
> not only render password-files worthless if stolen, but would make it
> impossible to "infect" the password file with records from another
> source, or to substitute the password-file with one of your own.

Bad solution.  Too awkward, and if you're worried about the password file
being stolen, use shadow passwords, plain and simple.  They'd have to have
already compromised the system to access the shadow passwords, at which
point it would be irrelevant.  Now if you're talking about stealing the
box, you'd need fs encryption.

Nate

------------------------------

From: [EMAIL PROTECTED]
Subject: Another TEA question
Date: Thu, 25 Mar 1999 00:29:24 GMT

Does anybody know where I can find info on TEA-x (extended TEA)?  I would
really like to know.


Tom


------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Encryption and the Linux password files
Date: Wed, 24 Mar 1999 22:04:54 GMT

John Savard wrote:
> >Any box with a competent sys admin has a shadowed passfile, and not
> >simply the default /etc/password.
> True, and the default password file has to exist for compatibility reasons
> with some old database software that uses it to find users...

But the password file doesn't contain the (encrypted) passwords
when shadowing is used.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
