Cryptography-Digest Digest #363, Volume #11 Sun, 19 Mar 00 07:13:01 EST
Contents:
Re: Assistance needed With finding an algy that does not (Nemo psj)
Re: Enigma encryption updated (Adam D) (Nemo psj)
Re: Opinions? ("Marc Howe")
Re: KDC + secret key == public key? (Don Davis)
Re: Enigma encryption updated (Adam D) (Nemo psj)
Re: Card shuffling (DMc)
Re: 64-bit Permutations (wtshaw)
unknown verifier designated verifier proofs (David A Molnar)
Re: Card shuffling (Mok-Kong Shen)
Re: Card shuffling (Mok-Kong Shen)
Re: Card shuffling (Mok-Kong Shen)
Concerning UK publishes "impossible" decryption law ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (Nemo psj)
Subject: Re: Assistance needed With finding an algy that does not
Date: 19 Mar 2000 04:13:45 GMT
lol Yup it was, sorry about that mate. I'll have to not post when it's
4:00 am. I found what my problem was in any case.
------------------------------
From: [EMAIL PROTECTED] (Nemo psj)
Subject: Re: Enigma encryption updated (Adam D)
Date: 19 Mar 2000 04:16:02 GMT
On a further note, if it's true that when I encrypt something with that it
can't be decrypted, then my algy is secure. Try to guess why.
-Pure
------------------------------
From: "Marc Howe" <[EMAIL PROTECTED]>
Subject: Re: Opinions?
Date: Sun, 19 Mar 2000 04:16:12 GMT
Certainly the intellect here is great! Discussion like this is one of the
reasons to live. But, to the point, it seems apparent to me that chaos
theory is the notion that a system may be unpredictable by practical
measures (certainly a good opposition to psychology can be made with it),
but specifically I meant that, just as Einstein believed, everything has a
cause and that, given a set of circumstances (events), the outcome is
determined accordingly.
For instance, a hurricane's wake may appear chaotic, but each branch, speck
of dust, etc. had only a certain way it could react to the hurricane given
its properties and the hurricane's properties.
It is definitely a good point to hold that most things are
knowable/understandable. However, it must also be pointed out that our
conceptual abilities are limited in relation to the complexity of the system
of this universe.
I do not know if I am expressing what I wanted to express, but any further
thought on the issue is great!
Thanks again.
Marc
------------------------------
From: [EMAIL PROTECTED] (Don Davis)
Subject: Re: KDC + secret key == public key?
Date: Sun, 19 Mar 2000 04:15:15 GMT
> could one use a Kerberos variant to get digital signatures and
> PK encryption with a symmetric algorithm?
hello, mr. hranicky,
i explored your idea in some detail some ten years ago,
when i worked on kerberos at project athena. strictly
speaking, the short answer to your question is "no": what
you're describing is called a message relay, or sometimes
a notarization service. a digital signature has a crucial
difference: non-repudiation. by sharing your messages with
the KDC, you make non-repudiation impossible; non-repudiation
is the defining property of a digital signature per se.
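A minimal sketch of such a key-translation relay, and of why key-sharing with the KDC rules out non-repudiation. The keys, names, and messages here are hypothetical, and HMAC-SHA256 merely stands in for whatever MAC a real relay would use:

```python
import hashlib
import hmac
import secrets

# Hypothetical long-term keys that alice and bob each share with the KDC.
k_alice = secrets.token_bytes(32)
k_bob = secrets.token_bytes(32)

def mac(key, msg):
    # HMAC-SHA256 stands in for whatever MAC a real relay would use.
    return hmac.new(key, msg, hashlib.sha256).digest()

# alice "signs" a message by MACing it under her KDC-shared key.
msg = b"transfer 100 to bob"
tag_alice = mac(k_alice, msg)

# The KDC verifies alice's tag and re-MACs the message under bob's key
# (key translation / notarization).
assert hmac.compare_digest(tag_alice, mac(k_alice, msg))
tag_for_bob = mac(k_bob, msg)

# bob can verify the relayed tag against his own KDC-shared key...
assert hmac.compare_digest(tag_for_bob, mac(k_bob, msg))

# ...but anyone holding k_alice -- the KDC itself -- can forge alice's
# "signature" on any message, so the construction cannot offer
# non-repudiation: alice can always claim the KDC framed her.
forged = mac(k_alice, b"transfer 100 to eve")
```

The relay lets two parties who share no key communicate with integrity, but every tag is forgeable by the KDC, which is precisely the difference from a digital signature.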
that said, there is a close parallel between public-key
certificates and kerberos' use of the ticket-granting ticket,
just as you describe. this analogy is not superficial, but
underlies much of the way the kerberos protocol works.
the analogy is most evident in kerberos V5's user-to-user
protocol, which ralph swick and i designed. i must confess
that i didn't understand the public-key analogy, when i was
working on u2u; it was roger needham (cambridge u., uk) who
later told me, during a visit to athena in the late 1980's,
that our u2u protocol was equivalent to a notion of "private-
key certificates". he suggested we should write a paper
about this idea of doing public-key-style stuff with symmetric
keys, even though we ourselves hadn't realized that we'd _had_
the idea. here is the paper we wrote:
D. Davis and R. Swick, "Network Security via Private-Key
Certificates," USENIX 3rd Security Symposium Proceedings,
(Baltimore; Sept. '92). Also in ACM Operating Systems Review,
v.24, #4 (Oct. 1990).
http://world.std.com/~dtd/relay/relay.PS (58 Kbytes)
Abstract: We present some practical security protocols
that use private-key encryption in the public-key style.
Our system combines a new notion of private-key certificates,
a simple key-translation protocol, and key-distribution.
These certificates can be administered and used much as
public-key certificates are, so that users can communicate
securely while sharing neither an encryption key nor a
network connection.
our paper addressed many of the points you make, and some
others besides. note that at that time, in the late '80's,
DES was commonly called a "private-key cipher", instead of
a "symmetric-key cipher", as we say nowadays. today, i
would of course title the paper, "network security via
symmetric-key certificates."
> I was goofing around with the gss-sample program in the stock MIT
> Kerberos distribution and noticed that the GSSAPI protocol appears
> to have support for some kind of signature.
only public-key implementations of GSSAPI support digital
signatures. GSSAPI does not support any variation of our
"symmetric-key certs" idea, though a microsoft engineer has
written an internet draft about adding to the gssapi some
support for krbV5's user2user protocol.
- don davis
former athena staff
boston, ma
-
------------------------------
From: [EMAIL PROTECTED] (Nemo psj)
Subject: Re: Enigma encryption updated (Adam D)
Date: 19 Mar 2000 04:17:59 GMT
Well, hell, if I knew how to describe it with math I would. I wouldn't even
know where to begin. Hrmm, well, that's another weekend of reading. (jots
down in notebook)
-Pure
------------------------------
From: DMc <[EMAIL PROTECTED]>
Subject: Re: Card shuffling
Date: Sun, 19 Mar 2000 05:23:47 GMT
On Sat, 18 Mar 2000 10:58:24 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:
>Although I have never played a card game, I do know the action of
>shuffling.
There are different kinds of "shuffling" a card deck. One of those
kinds is the "riffle." Another is the "cut." I have some experience
experimenting with the riffle and cut only.
>Evidently a veteran player shuffles (nearly) perfectly, while a
>novice often does less well and a kid maybe very poorly.
In my experience, very few people can riffle a 52 or 54 card deck
(nearly) perfectly. Diaconis is reported to have stated that he could.
Most expert conjurors probably can as well.
>Does there exist any objective means to determine (or help to
>determine) the relative quality of shuffling, or is one left to
>rely on pure subjectivity in deciding on that issue?
Not that I am aware of at the moment. My opinion (only) is that
the mathematical community has declared randomness an impossible
concept to objectively quantify.
>If one let a card deck be processed through a number of successive
>inferior quality shuffling, it seems plausible that the result
>will asymptotically approach perfectness. Is it possible to
>say something more than simply the fact that near perfectness
>will ultimately be reached without knowing how fast the limit
>is being approached?
My understanding is that Diaconis theorized about the first seven
riffles starting with an ordered card deck. Many subsequent math
persons and card book authors decided this meant one has to riffle
a card deck at least seven times between deals to ensure "real"
randomness. I call it the seven shuffle theory, and I think it
hogwash. I think Diaconis would not agree with this bald extension
of his theory.
>I am aware that there is lot of fuzziness in my questions. But
>perhaps we could nonetheless have some discussions.
>(There exist mathematical works on card shuffling based on a
>certain defined way of 'perfect shuffling'. I am interested
>however in shuffling done by humans, which almost always have
>deviations from that.)
I have found studying the results of "perfect riffling" to be
useful in my imperfect understanding of this subject matter.
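One crude but objective handle on how scrambled a riffled deck is comes from counting rising sequences: k riffles of a sorted deck can produce at most 2^k of them, while a well-mixed 52-card deck averages around 26. The sketch below simulates riffles with the Gilbert-Shannon-Reeds model, an idealization that only approximates what any particular human shuffler does:

```python
import random

def gsr_riffle(deck, rng):
    """One Gilbert-Shannon-Reeds riffle: cut at a Binomial(n, 1/2) point,
    then interleave, dropping from each packet with probability
    proportional to its remaining size."""
    n = len(deck)
    cut = sum(rng.random() < 0.5 for _ in range(n))
    left, right = deck[:cut], deck[cut:]
    out, i, j = [], 0, 0
    while i < len(left) or j < len(right):
        a, b = len(left) - i, len(right) - j
        if rng.random() < a / (a + b):
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out

def rising_sequences(deck):
    """Count maximal rising sequences: 1 + the number of values whose
    successor sits earlier in the deck.  k riffles of a sorted deck
    yield at most 2^k of them."""
    pos = {card: idx for idx, card in enumerate(deck)}
    return sum(1 for c in range(len(deck) - 1) if pos[c + 1] < pos[c]) + 1

rng = random.Random(7)
deck = list(range(52))
for k in range(1, 8):
    deck = gsr_riffle(deck, rng)
    print(k, rising_sequences(deck))
```

After one or two riffles the count stays tiny (the deck is visibly ordered); by around seven it approaches the random-deck average, which is the observation behind the "seven shuffle" folklore.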
[EMAIL PROTECTED]
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: 64-bit Permutations
Date: Sat, 18 Mar 2000 22:56:03 -0600
In article <8b12e5$nd2$[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
> You are in fact restating the point I was trying to make to begin with. ;-)
>
> ..and to answer your question, I could possibly use that as a cipher at the
> time when computers can store and process such amounts of data (but by then
> it would also be useless as a cipher).
>
>
> In a previous article, Tim Tyler <[EMAIL PROTECTED]> writes:
> >[EMAIL PROTECTED] wrote:
> >
> >: A non-repetitive sequence of all 2^64 elements in the set of 64 bit
> >: sequences is 64*(2^64) bits long. I am then talking of a permutation
> >: of 64-bit _values_.
> >
> >How could you /possibly/ use that as a cypher?!
> >
I refer to my Onega Base Translation Cipher, in which plaintext is in base
38, 8 characters per block, and ciphertext is in base 64, 7 characters per
block. 42 bits can be transposed according to a key and the 64-character
set can be substituted.
Default keys are:
Trans(On): abcdef ghijkl mnopqr stuvwx yz0123 456789 ()[]{}
Subs(On): abcdefghijklmnopqrstuvwxyzABCDEF
GHIJKLMNOPQRSTUVWXYZ0123456789+/
Note that the choice of transposition element designators can be entirely
arbitrary, but the substitution elements are those of the actual ciphertext
set. Any permutation keyspace, whether transposition or substitution, is
determined by the number of elements: 296 bits for 64 elements and 170
bits for 42, as above.
Such a cipher works well as a simple example of problems of large keys. To
recreate the whole key structure would take a fair amount of ciphertext.
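Those keyspace figures are just log2(n!), the number of bits needed to index a permutation of n elements; a quick check:

```python
import math

def permutation_keyspace_bits(n):
    # A permutation of n distinct elements has n! possibilities,
    # i.e. log2(n!) bits of key material.
    return math.log2(math.factorial(n))

print(round(permutation_keyspace_bits(64)))  # 296
print(round(permutation_keyspace_bits(42)))  # 170
```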
--
To see the results of GW Bush's shadow, visit the Valley;
notice the miserable conditions he allows to fester.
------------------------------
From: David A Molnar <[EMAIL PROTECTED]>
Subject: unknown verifier designated verifier proofs
Date: 19 Mar 2000 07:40:04 GMT
This is another idea I'm trying to work out. It may be (even) less
comprehensible than the previous posts on "encrypting to unknown public
key", for which I apologize in advance.
In 1996, Jakobsson, Impagliazzo, and Sako introduced the notion of a
"designated verifier proof." This is a proof of the form
Either a secret S is true
OR
I know the secret key of the party V
The party V is called a "designated verifier." The idea is that you can
use these to build proofs of knowledge which are both non-interactive
*and* non-transferable. The way this works is as follows :
Say I build a proof of the above form. I show it to V. Then V either
believes that S is true, or that I've compromised his secret key.
We'll assume that V believes his secret key is safe. Now he must
believe S.
If V tries to show the proof to anyone else...well...in most settings V
knows his own secret key. So there is no reason for anyone else to
believe that S is true. The proof is therefore non-transferable.
This is an insanely cool tool. There's only one minor drawback - in the
constructions I know about, you need to know a chameleon commitment scheme
for which V possesses the secret information required to forge
decommits. This generally means you need to know the identity of V.
This is kind of annoying if you want to apply these proofs in an anonymous
protocol.
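For concreteness, here is a toy version of the discrete-log chameleon commitment these constructions use, with tiny, insecure parameters chosen only to illustrate the trapdoor: whoever holds V's secret exponent can open a single commitment to any message, which is exactly the forgeability the designated-verifier property turns on.

```python
# Toy discrete-log chameleon (trapdoor) commitment.  The parameters are
# tiny and insecure -- illustration only.
p = 467            # small safe prime
q = 233            # (p - 1) // 2, the order of the subgroup we work in
g = 4              # 2^2, a generator of the order-q subgroup mod p
x = 57             # V's trapdoor (secret key)
h = pow(g, x, p)   # V's public commitment key

def commit(m, r):
    # Pedersen-style commitment: c = g^m * h^r mod p.
    return (pow(g, m, p) * pow(h, r, p)) % p

# The prover commits to m with randomness r.
m, r = 42, 99
c = commit(m, r)

# V, knowing x, can forge an opening of the *same* c to any message m2:
# we need m + x*r = m2 + x*r2 (mod q), so r2 = r + (m - m2) * x^(-1) (mod q).
m2 = 7
r2 = (r + (m - m2) * pow(x, -1, q)) % q
assert commit(m2, r2) == c   # same commitment, two valid openings
```

Since V can open any commitment any way he likes, a proof built on commitments under h convinces only V himself; anyone else must allow that V forged it.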
The question : can you create a designated verifier proof for V without
knowing the identity of V ?
The first attempt : create a kind of "blinded public key" for the
chameleon commitment, along the lines of a blind recipient-hiding public
key system. That is, a blinded public key such that we can form
commitments which can be forged by the corresponding private key, yet
the original public key cannot be inferred from the blinded key.
Then use the blinded public key for commitments to make a designated
verifier proof. The resulting proof can be faked by V just as if it had
used the "real" public key.
The problems :
Designated verifier proofs cannot be recipient-hiding.
The whole "designated verifier" property depends on
everyone being able to tell that the commitments can
be forged by V. If the commit doesn't reveal V's
identity, then V can simply keep quiet about his
ability to forge the commit and it's all over.
We can't have a blinded public key with a non-recipient
hiding commitment. That would defeat the purpose of
blinding the public key!
Second try :
Have a blinded public key which produces a "blinded
commitment." Similar to Chaum's notion of a "blind
signature", this blinded commitment is not immediately
verifiable or forgeable by the designated verifier.
Instead, it must be "unblinded" by some other party.
This unblinding transforms the proof into something
which looks just like any other designated verifier
proof. In particular, it can't be linked to the
blinded public key used to create it.
I think this should be possible for the discrete-log
chameleon commitment discussed in the original
designated verifier proof paper by applying Wagner-style
blinding. I'll work out the details in the morning...
Problems :
The secret S could identify a proof, even if none of
the encryption or designation does.
Subliminal channels in these proofs, or marked proofs???
Any other tries, comments, applications, etc. are welcome.
Thanks,
-David Molnar
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Card shuffling
Date: Sun, 19 Mar 2000 11:27:23 +0100
Douglas A. Gwyn wrote:
>
> Mok-Kong Shen wrote:
> > There are a number of tests described. But there is no single
> > numerical value that can be computed and taken as the (standard)
> > measure of 'randomness' comparable to the measures obtained
> > in physics.
>
> That's because you're asking the wrong question. What you *should*
> ask for is a test that provides a quantifiable amount of evidence
> for or against the hypothesis that a process is operating according
> to spec, where the spec includes some component of randomness. E.g.
> for a possibly "random" permutation, the spec could be that each
> index in sequence is assigned to any so-far-unassigned cell with
> equiprobability. To contrast the "to spec" hypothesis with *every*
> possible "not to spec" hypothesis, one needs a theory that develops
> in the main space and the dual space simultaneously. Kullback has
> already presented this in "Information Theory and Statistics" (1959).
> For general guidance on using "weight of evidence" in making
> rational decisions, see Good's writings.
Why was I asking the wrong question? For the players of a particular
session, isn't it a vital question whether the (particular) deck
that they play with has been well shuffled? Note that Knuth gives
a plethora of tests for detecting non-randomness (not proving
randomness!) of given sequences but never gives a practical
'measure' of randomness (I guess that's impossible). On the other
hand, what we consider in the present case is ONE shuffled deck.
I earnestly doubt whether we can seriously say (after somebody
has shuffled a deck) how 'well' he has performed the task. If the
deck is the same as before the shuffling, then of course that's a
bad shuffling. But in other cases it's pretty hard to say, I
conjecture. To avoid misunderstanding, let me stress that it is not
the 'average' in some sense of his performance but the 'result' of
one particular 'shuffling' action by him that interests the players
of the game and hence needs to be evaluated.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Card shuffling
Date: Sun, 19 Mar 2000 11:27:35 +0100
Jim Reeds wrote:
>
> Summary: effectiveness of card shuffling is studied by the theory of random
> walks on groups, in particular, by the rate of convergence to uniformity.
That seems to be indeed useful for comparing the relative performance
(in some sense 'average') of different persons doing the shuffling
task. However, if one person on a specific session of the game
shuffles a deck, could we say how well (effective) that deck
has been shuffled? (This is the first question in my original post.)
I surmise this is impossible. I have no doubt that there is a rate
of convergence to uniformity of the kind that you mentioned,
presumably being a numerical value dependent on the person performing
the shuffling, and that that rate can be experimentally determined
to some satisfactory degree. However, given that kind of data, I still
rather doubt whether we can seriously say how a (particular) deck
gets increasingly well (effectively) shuffled as it is successively
handled either by the same or different persons. (This is the
second question in my original post.) It is at least conceivable
that two shuffles exactly cancel each other or that one gets back
after a number of shuffles to the original deck that one starts
with. Hence I don't yet clearly see that my second question is
solvable at all in a sense that is relevant for the players in a
particular session of the game.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Card shuffling
Date: Sun, 19 Mar 2000 11:42:40 +0100
DMc wrote:
>
> My understanding is that Diaconis theorized about the first seven
> riffles starting with an ordered card deck. Many subsequent math
> persons and card book authors decided this meant one has to riffle
> a card deck at least seven times between deals to ensure "real"
> randomness. I call it the seven shuffle theory, and I think it
> hogwash. I think Diaconis would not agree with this bald extension
> of his theory.
Would a deck successively shuffled by seven persons have the
same quality (some degree of satisfaction from the point of
view of the players) independent of the persons doing the
shuffling, i.e. whether they are veteran players, novices, or
young kids? I think one could plausibly entertain doubts about
any positive answer to that.
M. K. Shen
------------------------------
From: [EMAIL PROTECTED]
Subject: Concerning UK publishes "impossible" decryption law
Date: Sun, 19 Mar 2000 11:42:56 GMT
As mentioned before, instead of building strange and potentially
dangerous traps or bullshit like this, you can also:
1. Grab a copy of urandom lib or some other random number generator
2. Write a program that takes 1..n symmetric keys and can encrypt 1..n
files each with its own key--- use PGP for the symmetric encryption
3. Implement an option that automatically encrypts large random files
with a random key unknown to you
4. To decrypt, you have to specify 1..m keys with m<=n
Since you don't know the random key used to encrypt the random junk
files, no law can force you to reveal it. Since encrypted random files
(hopefully) aren't distinguishable from encrypted non-random files, the
attacker does not know m. Now when someone knocks on your door and wants
you to expose all m keys, you can wipe out some of them (if on disk;
otherwise just not tell about them) and only expose keys that decrypt
unsuspicious data. Since it cannot be decided whether the rest of the
encrypted files have been encrypted with a purely random key or not, no
law (of a civilized country) can force you to reveal all keys, because
the number of keys is unknown to the attacker. Make sure you always use
the "add random junk file" encryption option, so nobody can take the use
of this option as a positive sign that you were trying to hide
additional files. (When asked, you'll say "this option was turned on by
default.")
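A sketch of steps 1-4, with a toy SHA-256 counter-mode stream cipher standing in for PGP's symmetric encryption (illustration only, not a vetted cipher):

```python
import hashlib
import secrets

def keystream_encrypt(key, data):
    """Toy SHA-256 counter-mode stream cipher; XORing twice with the
    same key decrypts.  Stands in for PGP's symmetric mode -- NOT a
    vetted cipher, illustration only."""
    out = bytearray()
    for block in range(0, len(data), 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block:block + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# Step 2: encrypt n real files, each under its own key.
real_files = [b"secret plans", b"love letters"]
keys = [secrets.token_bytes(32) for _ in real_files]
ciphertexts = [keystream_encrypt(k, f) for k, f in zip(keys, real_files)]

# Step 3: add a junk file of random bytes, encrypted under a random key
# that is generated here and never stored anywhere.
junk = secrets.token_bytes(64)
ciphertexts.append(keystream_encrypt(secrets.token_bytes(32), junk))

# Step 4: decrypt with the m <= n keys you still hold; the junk file
# stays opaque even to you, so there is nothing left to hand over.
recovered = [keystream_encrypt(k, c) for k, c in zip(keys, ciphertexts)]
assert recovered == real_files
```

All three ciphertexts are byte strings with no structure distinguishing the junk from the real files, which is the property the whole argument rests on.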
Do not recommend the utility for use in uncivilized countries.
Best regards,
Erich Steinmann
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************