Cryptography-Digest Digest #664, Volume #9        Sat, 5 Jun 99 13:13:03 EDT

Contents:
  Re: New Computer & Printer for Dave Scott (Illia Kuriakin)
  Function Feedback Registers ([EMAIL PROTECTED])
  Re: Viability of encrypted flash cards? ([EMAIL PROTECTED])
  Re: Challenge to SCOTT19U.ZIP_GUY ([EMAIL PROTECTED])
  Re: Challenge to SCOTT19U.ZIP_GUY ([EMAIL PROTECTED])
  Re: 8Bit encryption code. Just try and break it. - code3.ecr (0/1) 
([EMAIL PROTECTED])
  Re: The BRUCE SCHNEIER  Tirade ([EMAIL PROTECTED])
  Re: DES Effective Security Q ("karl malbrain")
  Re: Challenge to SCOTT19U.ZIP_GUY (Frank Gifford)
  Re: OTP Problems ([EMAIL PROTECTED])
  Re: New Computer & Printer for Dave Scott ([EMAIL PROTECTED])

----------------------------------------------------------------------------

From: Illia Kuriakin <[EMAIL PROTECTED]>
Subject: Re: New Computer & Printer for Dave Scott
Date: Sat, 05 Jun 1999 06:57:59 -1000

I will donate $100 for David Scott's new computer.
We can load it with a C compiler of Our choice,
and a spell checker.

If we get 9 donors of $100 each, we can purchase it
and ship it to David as a completed package. I am serious, 
please sign up today.

------------------------------

From: [EMAIL PROTECTED]
Subject: Function Feedback Registers
Date: Sat, 05 Jun 1999 13:45:46 GMT

I was wondering if there are references to FFRs (Function Feedback
Registers).  My example (in C below) works with four byte-sized
registers, but unfortunately the cycle length is only about 2^31.98,
so it's not perfect.  Basically the register works like an LFSR with
taps, but when you shift the cells (they are not necessarily bits or
bytes) they pass through a function before settling into the next
cell.  For example, the one below would be

Taps -> f1(x) -> A -> f2(x) -> B -> f3(x) -> C -> f4(x) -> D -> output
(Tapping on A and B)

/* Four 8-bit cells, each with its own 256-entry substitution table
   ("function") applied as the cell settles into place.              */
unsigned char func[4][256], regis[4];

/* Clock the register once and return the cell that is shifted out.
   (Named ffr_clock rather than clock to avoid the C library name.)  */
unsigned char ffr_clock(void)
{
        unsigned char temp, o;

        temp = regis[3] ^ regis[0];     /* tap the first and last cells */

        o = regis[0];                   /* this byte becomes the output */
        regis[0] = func[0][(unsigned)regis[1]];
        regis[1] = func[1][(unsigned)regis[2]];
        regis[2] = func[2][(unsigned)regis[3]];
        regis[3] = func[3][(unsigned)temp];

        return o;
}

It seems to me that this would be non-linear, and as long as the
tables are kept private it may have some cryptographic value
(although the cycle is not maximal length).
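
For what it's worth, here is a minimal sketch of how the register
might be keyed and driven as a keystream generator.  The ffr_key
routine, the srand()-based shuffle, and the test main() are only
placeholders for illustration, not part of the design above:

#include <stdio.h>
#include <stdlib.h>

/* Assumes the func/regis/ffr_clock definitions above. */
extern unsigned char func[4][256], regis[4];
extern unsigned char ffr_clock(void);

/* Fill each table with a key-dependent permutation of 0..255 and give
   the cells an initial fill.  A real design would use a proper key
   setup instead of srand()/rand().                                   */
void ffr_key(unsigned long seed)
{
        int t, i, j;
        unsigned char swap;

        srand(seed);
        for (t = 0; t < 4; t++) {
                for (i = 0; i < 256; i++)
                        func[t][i] = (unsigned char)i;
                for (i = 255; i > 0; i--) {     /* Fisher-Yates shuffle */
                        j = rand() % (i + 1);
                        swap = func[t][i];
                        func[t][i] = func[t][j];
                        func[t][j] = swap;
                }
                regis[t] = func[t][t];          /* arbitrary initial fill */
        }
}

int main(void)
{
        int i;

        ffr_key(12345UL);
        for (i = 0; i < 16; i++)                /* first 16 keystream bytes */
                printf("%02x ", (unsigned)ffr_clock());
        printf("\n");
        return 0;
}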

Tom
--
PGP public keys.  SPARE key is for daily work, WORK key is for
published work.  The spare is at
'http://members.tripod.com/~tomstdenis/key_s.pgp'.  Work key is at
'http://members.tripod.com/~tomstdenis/key.pgp'.  Try SPARE first!


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: [EMAIL PROTECTED]
Crossposted-To: alt.security,talk.politics.crypto
Subject: Re: Viability of encrypted flash cards?
Date: Sat, 05 Jun 1999 14:55:42 GMT

On Thu, 03 Jun 1999 22:37:39 GMT, "Cor!"
<[EMAIL PROTECTED]>
wrote:
>>Do you think PGP is used by criminals more than anyone
>>else?
>I don't know; I never thought about it before. Are there any statistics
>on this?

If there were statistics as to the type of messages that PGP is used
on, then that would imply that someone can tell.

I hope not.

------------------------------

Date: Fri, 04 Jun 1999 23:52:27 -0400
From: [EMAIL PROTECTED]
Subject: Re: Challenge to SCOTT19U.ZIP_GUY

Tim Redburn wrote:
> 
> On Fri, 04 Jun 1999 20:24:05 GMT, [EMAIL PROTECTED]
> (SCOTT19U.ZIP_GUY) wrote:
> 
> >
> >  I have worked on many aircraft simulations and OFP;s one of the main
> >problems that seems to occur over and over is that other people keep
> >missing the obvious errors in the code becasue most people inheirently
> >put faith on the comments and this leads to major maistakes that take
> >years to find and fix.
> 
> Quite the opposite. Comments tell you what the programmer intended.
> It is then easier then to verify that the code actually works
> as intended. If you don't know what the code was meant to do, how can
> you debug it ?

Actually it's harder than that. Comments often tell you what the
_original_ programmer _originally_ intended.  The secret to good writing
(of any kind) is rewriting.  Rewriting implies that the intentions of
the author evolve during the process.  Thus the final intentions of the
author(s) may be arbitrarily far from the original intentions.

A truism of software maintenance (which is mostly software analysis) is
that the value of comments is often negative.  More often than not.  The
cost of pursuing a false trail based on the comments is high.  It can
pollute your concept pool long after you've learned better (holographic
memory effects I suppose).

This explains one of the first rules of maintenance: delete the
comments, then study the code.  Review the comments (skeptically) later
to resolve problems and to find warnings about issues not apparent from
the code.

The Grail of good software is self-documenting code.  That does NOT mean
comments.
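
A contrived C fragment, just to show the distinction; the constants
and names here are invented for illustration:

#include <stdio.h>

/* Comment-dependent: the reader must trust that "two weeks" is still
   what the constant means after the next edit.                       */
#define INTERVAL 1209600                /* two weeks, in seconds */

/* Self-documenting: the units and the intent live in the names, and
   the compiler keeps them attached to the code.                      */
enum { SECONDS_PER_WEEK = 7 * 24 * 60 * 60 };
static const long key_rollover_seconds = 2 * SECONDS_PER_WEEK;

int main(void)
{
        printf("%d %ld\n", INTERVAL, key_rollover_seconds);
        return 0;
}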

------------------------------

Date: Sat, 05 Jun 1999 00:00:35 -0400
From: [EMAIL PROTECTED]
Subject: Re: Challenge to SCOTT19U.ZIP_GUY

SCOTT19U.ZIP_GUY wrote:
> 
> In article <[EMAIL PROTECTED]>, Geoff Thorpe <[EMAIL PROTECTED]> wrote:
> >Hi there,
> >
> >Hey, he writes poor code. He refuses to adequately document his own
> >"techniques" (the mindless accusations that only stupid people don't
> 
>    Only a few think the coding is poor.

How many think the coding is great?   I have yet to see one.

> I don't have to follow rules

You should have terminated this sentence at this point.

> that others blindly follow. And the code is full of many comments. But
> it should be clear to any one who has a working knowledge of C and
> some baiscs of how a PC works. Yes I don't make every one happy
> that would be an impossible job.

Try making one person barely satisfied.  Until you have done that you
risk being accused of bad faith.  I.e., you aren't really trying.

Pick someone, anyone, and answer their questions honestly.  Do what they
ask in terms of clarifying your code.  It won't make the code worse.  I
promise.

> 
> David A. Scott
> --
>                     SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
>                     http://www.jim.com/jamesd/Kong/scott19u.zip
>                     http://members.xoom.com/ecil/index.htm
>                     NOTE EMAIL address is for SPAMERS

------------------------------

Date: Fri, 04 Jun 1999 23:23:48 -0400
From: [EMAIL PROTECTED]
Subject: Re: 8Bit encryption code. Just try and break it. - code3.ecr (0/1)

Michael J. Fromberger wrote:
> 
> In <[EMAIL PROTECTED]> [EMAIL PROTECTED] writes:
> 
> >Michael J. Fromberger wrote:
> >>
> >> Cryptography is no place for secrecy ... an ironic axiom, to be sure,
> >> but true.
> 
> >Do you believ thn that the United States would be better off in terms
> >of communication security if the NSA/CIA/FBI/DIA/TLA published all of
> >their working methods?
> 
> >I think not.
> 
> Well, you're certainly entitled to your opinion. :)
> 
> My essential point is that you cannot simply create ciphers and
> protocols in a vacuum, and expect them to be strong.  An organization
> such as the NSA circumvents this problem slightly, by retaining a
> large staff of well-trained cryptographers and cryptanalysts.  This,
> one could reasonably argue, gives them the ability to simulate the
> analysis, testing, and review that happens in the more open academic
> community -- except that it is self-contained.
> 
> The original discussion, as you recall, was about the release of
> source code for cryptographic software -- not government algorithms.
> If you are truly concerned with the strength and security of a piece
> of software, it would be the height of folly to trust a piece of
> random code whose source you weren't even allowed to analyze...not
> only the encryption algorithms, but how they're employed, how keys and
> data are handled and stored, and so forth.

You are missing the point.  There are no academic security
organizations.  All actual security organizations are non-academic. 
Thus the theoretical basis for your argument is weak.

The historical basis is nonexistent.  All real security organizations
operate on the basis of Need To Know (NTK).  Lacking NTK, one does not
obtain protected information.  This fundamental principle applies to
issues as weak as corporate confidentiality, medical records, and
personal records, and to issues as strong as code-word + color code +
top secret information.  Note that the ranking of security includes
levels beyond top secret.  One of the highest is top
secret/black/crypto.

Since you have failed to take these issues into account in your
explanation of the principles of cryptographic development, your
conclusion is worthless.

On a more abstract basis, your whole proposal appears biased.  Consider
that we must assume our ciphers can be broken.  Attempting to compensate
for a lack of critical R&D staff by using completely open review might
reduce the weaknesses in our cipher systems, but it guarantees that any
weakness found by a defector* will be exploited for maximum gain.  The
purpose of academic crypto is strong crypto.  The purpose of operational
crypto is securing information.  Applying the methods of the former to
the goals of the latter is not useful.

* defector: one who fails to cooperate in the context of a prisoner's
dilemma.  See Axelrod 1984, "The Evolution of Cooperation".

> 
> If you have a cryptosystem that is strong even when its details are
> known, then keeping it a secret will only make it MORE difficult to
> break, not less.  But since there is not currently any known method
> for proving a cryptosystem's strength, this is a virtually pointless
> theorem.  Anyone who is truly concerned with security needs to be able
> to see how the internals of their encryption system -- software or
> hardware -- are put together.

No.  Absolutely not.  Anyone truly concerned with security is interested
in minimizing the damage that the flaws he MUST assume are present may
cause.  Secrecy is a Good Thing in this effort.

When amateurs develop security systems they make mistakes.  Getting
someone reasonably competent, even another amateur, to check things out
is not a way to eliminate cipher weaknesses, it is a way to eliminate
the class of simple mistakes amateurs make.

Generalizing that process to conclude that all cipher development
everywhere should be open is fatuous on the face of it.  You have
already observed that organizations of a certain size are capable of
adequate internal review.  So there must be a boundary that describes
the critical mass for internal review.  IMO that boundary must exclude
amateurs, singular & isolated teams of any size, and everyone willing to
generalize specific trivial cases to universal rules.  Your conclusion
appears to be that there is no such critical mass less than that of a
massive government agency.

Tell us, does the NSA maintain secrecy to protect their ciphers from
adversaries or to protect their massive budget from the public?

> 
> The most common response to this is to argue: "But most people don't
> have the knowledge necessary to review all the source code themselves,
> so how could this help them anyway?"  And sure, that's true.
> 
> But if the details are available to the cryptographic community at
> large, someone who does not have the requisite technical knowledge can
> at least take advantage of the expertise of well-respected
> cryptographers and cryptanalysts.  If the details are obscured, you
> have to trust the person who wrote the code, when she assures you it's
> secure.  If we could all truly trust each other, we wouldn't NEED
> cryptography.

Again you are working from the basis of "someone who does not have the
requisite technical knowledge".  I.e., an amateur.  Why do you apply
such a limited concept to the efforts of professionals?  An amateur, by
your own definition, needs to solicit the support of professionals.  But
that relationship does not need to be public, nor does it require the
publication of the query or the result.

The same gaps appear in an equivalent assertion regarding the need for a
professional to solicit the advice of other professionals, or
specialists.  Peer review does not require public review.

This is not an adequate conceptual basis for the conclusion that
follows.

> 
> And so, I continue to assert that there should be no black boxes in
> cryptography.

This conclusion is not supported by historical fact, or any consistent
theory.  It is nothing more than a bare assertion.  As such it is a
valid expression of an opinion, but that opinion is worthless until you
explain how and why you reached it.

The fact is there should be no black boxes in your reasoning regarding
crypto.

------------------------------

Date: Fri, 04 Jun 1999 23:29:05 -0400
From: [EMAIL PROTECTED]
Crossposted-To: talk.politics.crypto,alt.privacy
Subject: Re: The BRUCE SCHNEIER  Tirade

John Savard wrote:
> 
> [EMAIL PROTECTED] wrote, in part:
> 
> >Not quite the same.  The key is static and an offsite backup of it is a
> >perfect image.  The data is variable and an offsite backup of it is
> >stale.
> 
> >The technical term for this is fail-safe.  Volatile data isn't.
> 
> I'm getting puzzled here.
> 
> My original reasoning for stating that the one-time-pad is useless for
> filesystem encryption, a la Scramdisk, is that if one can store X megs
> of key securely - and have access to it whenever you need your data -
> then you could just put your X megs of data "on a floppy, in a wall
> safe" or whatever.
> 
> Clearly, if you disagreed, you may have thought of a different way to
> use the OTP, and it seems you have.
> 
> On-site, you have a computer which, when it becomes unattended, has
> the sensitive data on it encrypted by a sheet from the OTP. The OTP is
> kept with security guaranteed by _volatility_, something that can't be
> applied to the data ... and the correct page of the OTP can be
> recovered from an off-site backup.
> 
> While I still wouldn't consider such a scheme a reasonable alternative
> to conventional secret-key encryption in almost any case, still, you
> have indeed proven your point: you have come up with a scheme by means
> of which the OTP could provide security for filesystem encryption.
> 
> I can't really persuade myself that any device for volatile key
> storage would *really* be safer than a decent conventional encryption
> algorithm, even so, but that is a matter of taste, and does not
> invalidate your design.
> 
> John Savard ( teneerf<- )
> http://members.xoom.com/quadibloc/index.html

In truth it requires an esoteric set of conditions for the volatile pad
technique to be useful, but given those conditions, it appears to be
practical.
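
For concreteness, a minimal sketch of the mechanical half of the
scheme.  The names are invented, and the pad storage, off-site
retrieval, and wiping are only indicated by placeholder comments:

#include <stddef.h>
#include <string.h>

/* XOR a buffer against one page of the pad.  The same call encrypts
   and decrypts; the security rests entirely on how the pad page is
   held (volatile locally, recoverable from the off-site backup) and
   on never reusing a page.                                           */
void xor_with_pad(unsigned char *data, const unsigned char *pad_page,
                  size_t len)
{
        size_t i;

        for (i = 0; i < len; i++)
                data[i] ^= pad_page[i];
}

/* Going unattended: encrypt in place, then let the local copy of the
   pad page vanish.                                                   */
void go_unattended(unsigned char *data, unsigned char *pad_page,
                   size_t len)
{
        xor_with_pad(data, pad_page, len);
        memset(pad_page, 0, len);   /* placeholder for a real volatile wipe */
}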

Thanks for the feedback, I appreciate it.

So far the score is pro:1 vs. con:(unknown but assumed very large)

------------------------------

From: "karl malbrain" <[EMAIL PROTECTED]>
Subject: Re: DES Effective Security Q
Date: Sat, 5 Jun 1999 08:32:49 -0700


Nicol So <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> While I don't have any problem with the result of Biham and Shamir (that
> independent subkeys increase the amount of chosen plaintexts needed for
> differential cryptanalysis only by a very modest amount), I would not
> interpet it as saying that independent subkeys don't help much.

On reading their book last night, what they say is that their attack is on
those 16 sub-keys reached through the S-boxes -- not on the original 56
bits.  In other words they weren't that interested in the initial key
combination selection matrices.

> In my opinion, differential cryptanalysis is not a practical attack
> against DES, as the conditions necessary for its applicability seldom
> occur in practice, if ever.  To date, exhaustive search with fast,
> parallel hardware seems to remain the most practical attack.  Is there
> an attack against DES that requires only a handful of known
> plaintext-ciphertext pairs and 2^32 trial encryptions?  I don't know.  I
> haven't heard of one but I can't rule out its existence either.  Without
> reliable knowledge of the best practical attack against DES there is, it
> would be difficult or impossible to know the security implications of
> independent subkeys.

Well, you're correct here.  Trial encryptions won't be an option -- I
don't see how they ever are, except for SMART-CARDS or some similar
exposed mechanism.  For distributed applications, they aren't.

Biham and Shamir state that S-boxes are a non-linear function -- and that
the linear algebra required to DETERMINISTICALLY compute the sub-keys as you
suggest is too complicated.  Unfortunately, I'm not a mathematician.

> (For the sake of argument, assume that the attack postulated above does
> exist and it is the best practical attack there is.  Further assume that
> independent subkeys increase its complexity to 2^80 trial encryptions.
> Under such assumptions, independent subkeys help *a lot*.)

Implementing the <<official>> key schedule is <<more>> complicated than
distilling 768 key bits instead of stopping at 56.  Actually, as Jim G
notes, B & S give 2^64 chosen plaintexts (trial encryptions) as the
maximum complexity for differential cryptanalysis.  Since it's a 64-bit
block cipher, I'm not sure that's saying much.
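
To make the 768-vs-56 arithmetic concrete, a sketch of the two keying
interfaces.  The type and function names are invented;
des_key_schedule() merely stands in for a real key schedule:

#include <stdint.h>
#include <string.h>

#define DES_ROUNDS 16

typedef struct {
        uint64_t subkey[DES_ROUNDS];    /* low 48 bits of each entry used */
} des_subkeys;

/* Standard keying: 56 key bits, 16 round keys derived by the key
   schedule (PC-1, rotations, PC-2).  Defined elsewhere.              */
void des_key_schedule(uint64_t key56, des_subkeys *ks);

/* Independent subkeys: no derivation, the caller supplies all
   16 x 48 = 768 key bits directly.                                   */
void des_independent_keys(const uint64_t key768[DES_ROUNDS],
                          des_subkeys *ks)
{
        memcpy(ks->subkey, key768, sizeof ks->subkey);
}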

Karl M



------------------------------

From: [EMAIL PROTECTED] (Frank Gifford)
Subject: Re: Challenge to SCOTT19U.ZIP_GUY
Date: 5 Jun 1999 12:45:40 -0400

In article <[EMAIL PROTECTED]>,
Tim Redburn <[EMAIL PROTECTED]> wrote:
>...
>Even if you have personal objections to that style, why wont
>you put your objections to one side and keep everybody
>happy by writing scott19u.zip in that style. 
>...

Actually, I would recommend that David create a separate version of the
program which is functionally identical.  It would have comments about why
things are done a certain way, cleaner code for the arrays that are
accessed on 19-bit boundaries, and so on.  Even if the new version takes
ten times longer to run, it would be enough for people to walk through the
code and test it.  After all, right now, people are interested in testing
the security of the algorithm.

The other version can still be there as the 'fully optimized' implementation.
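
By way of illustration only -- this is generic 19-bit packing, not
David's actual layout or code -- a readable reference version could
access the array one bit at a time:

#include <stddef.h>
#include <stdint.h>

#define WORD_BITS 19

/* Fetch the n-th 19-bit word from a packed byte array, assuming
   least-significant-bit-first packing.  Slow but easy to verify.    */
uint32_t get_word19(const unsigned char *buf, size_t n)
{
        size_t   bit = n * WORD_BITS;       /* absolute starting bit */
        uint32_t word = 0;
        int      i;

        for (i = 0; i < WORD_BITS; i++, bit++)
                if (buf[bit / 8] & (1u << (bit % 8)))
                        word |= (uint32_t)1 << i;

        return word;                        /* value in [0, 2^19)    */
}

/* Store a 19-bit word at the same position, bit by bit.             */
void put_word19(unsigned char *buf, size_t n, uint32_t word)
{
        size_t bit = n * WORD_BITS;
        int    i;

        for (i = 0; i < WORD_BITS; i++, bit++) {
                if (word & ((uint32_t)1 << i))
                        buf[bit / 8] |= (unsigned char)(1u << (bit % 8));
                else
                        buf[bit / 8] &= (unsigned char)~(1u << (bit % 8));
        }
}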

-Giff


-- 
Too busy for a .sig

------------------------------

Date: Fri, 04 Jun 1999 23:43:10 -0400
From: [EMAIL PROTECTED]
Subject: Re: OTP Problems

Matthias Bruestle wrote:
> 
> Mahlzeit
> 
> Andrew Haley ([EMAIL PROTECTED]) wrote:
> > Matthias Bruestle ([EMAIL PROTECTED]) wrote:
> 
> > : I doubt it [iButton] can withstand a skillfull student. (See papers of 
>Kuhn&Anderson)
> 
> > Is this assertion based on any knowledge?
> 
> > The last time that I saw Markus Kuhn he recommended something like an
> > iButton to counter his methods of attack.
> 
> As I remember the iButton is a batterie puffered RAM. So throwing it
> in liquid nitrogene should preserve the data long enough to disassemble
> and repower it.

This is based on the assumption that you can disassemble the device
without destroying it.  This assumption is probably invalid for any
organization less than "national technical means".

Certainly this is not a project you'd hand to a student as you
previously suggested.

> 
> There is somewhere a paper (maybe also from Kuhn) about the security
> of computer modules for nuclear weapons. Maybe this could be called
> tamperproof.
> 
> Mahlzeit
> 
> endergone Zwiebeltuete
> 
> --
> PGP: SIG:C379A331 ENC:F47FA83D      I LOVE MY PDP-11/34A, M70 and MicroVAXII!
> --
> When a program is useful it must be changed,
> when it is useless it must be documented.

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: New Computer & Printer for Dave Scott
Date: Sat, 05 Jun 1999 16:06:46 GMT


> If we get 9 donors of $100 each, we can purchase it
> and ship it to David as a completed package. I am serious,
> please sign up today.

Why not teach him how to write proper code/algorithms first? Maybe
something like a programming course, or, if we want to splurge, a comp
sci course...

Tom
--
PGP public keys.  SPARE key is for daily work, WORK key is for
published work.  The spare is at
'http://members.tripod.com/~tomstdenis/key_s.pgp'.  Work key is at
'http://members.tripod.com/~tomstdenis/key.pgp'.  Try SPARE first!


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
