Cryptography-Digest Digest #970, Volume #9 Mon, 2 Aug 99 12:13:03 EDT
Contents:
Re: How to write REALLY PORTABLE code dealing with bits (Was: How Big is a Byte?)
(DaveP)
Re: Current export laws (David Ochel)
Re: NBE: Not crackable by brute force key search (Volker Hetzer)
Re: How to write REALLY PORTABLE code dealing with bits (Was: How Big is a Byte?)
(Alan Braggins)
Re: A most useful cipher (SCOTT19U.ZIP_GUY)
Re: the defintion of Entropy ("Tony T. Warnock")
Re: With all the talk about random... (Patrick Juola)
Re: With all the talk about random... ("Tony T. Warnock")
Re: CFB mode with same initialization vector (SCOTT19U.ZIP_GUY)
Re: Is breaking RSA NP-Complete ? (Anton Stiglic)
Re: Cryptanalysis of R250 (SCOTT19U.ZIP_GUY)
Re: the defintion of Entropy (Anton Stiglic)
Re: the defintion of Entropy (Anton Stiglic)
Re: With all the talk about random... ("Tony T. Warnock")
Re: question: SHA --> stream cipher (Francois Grieu)
Re: Is breaking RSA NP-Complete ? ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: DaveP <[EMAIL PROTECTED]>
Crossposted-To:
alt.folklore.computers,alt.comp.lang.learn.c-c++,comp.lang.c++,microsoft.public.vc.language
Subject: Re: How to write REALLY PORTABLE code dealing with bits (Was: How Big is a
Byte?)
Date: Mon, 02 Aug 1999 09:24:02 -0400
It appears that we oldtimers _do_ remember it right. I dug deep on my
bookshelf and found "Introduction to Computer Networks" from Digital
Equip. Corp. (1974). Its glossary lists:
"BYTE
A binary element string operated upon as a unit and usually shorter
than a computer word, e.g., six-bit, eight-bit, or nine-bit bytes."
It has evolved to where it's almost always eight bits now, but that was
not always true. Now let's get back to the main topic of this group and
quit bickering.
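Before we do, though, one sketch for the REALLY PORTABLE part of the
subject line: ISO C still allows non-eight-bit bytes, so a minimal,
portable sketch asks <limits.h> instead of assuming eight:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("bits per byte here: %d\n", CHAR_BIT);

    /* mask explicitly rather than assuming eight-bit bytes */
    unsigned long word = 0x12345678UL;
    unsigned char low = (unsigned char)(word & 0xFFUL);  /* low octet */
    printf("low octet: 0x%02X\n", (unsigned)low);
    return 0;
}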
--
Dave P.
"A man who carries a cat by the tail learns something he can
learn in no other way" --Mark Twain
------------------------------
From: David Ochel <[EMAIL PROTECTED]>
Subject: Re: Current export laws
Date: Mon, 26 Jul 1999 12:14:41 +0200
Hi,
[EMAIL PROTECTED] wrote:
>
> Does anyone know where I can find the current export laws for
> cryptography? (It's not like I'm gonna follow them anyway. =] )
http://cwis.kub.nl/~frw/people/koops/lawsurvy.htm
cu David
------------------------------
From: Volker Hetzer <[EMAIL PROTECTED]>
Subject: Re: NBE: Not crackable by brute force key search
Date: Mon, 26 Jul 1999 13:06:33 +0200
Mickey McInnis wrote:
> Well, you could design a cryptosystem where this wasn't true, and there
> might be reason to do so. One example would be a system that made a
> hash of the original cleartext, appended the hash to the cleartext, and
> then encrypted.
>
> While this sounds like you've deliberately weakened your cryptosystem,
> it's not necessarily a bad idea.
Yes, but this is just a matter of definition. What you call a weakened
cryptosystem I call (from an enemy's point of view) an assumption about
the plaintext, namely that the last few bits and all the previous ones
must stand in a certain relationship. It's not much different from, for
instance, an assumption that the deciphered message is guaranteed to be
7-bit ASCII.
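Either way the enemy gains the same thing from it: the appended hash is
a recognizer for key guesses during a brute-force search. A toy sketch
in C (the tiny key, the checksum standing in for a hash, and the XOR
"cipher" are all made up for illustration):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* toy stand-ins, invented for illustration only */
static uint8_t toy_hash(const uint8_t *p, size_t n)
{
    uint8_t h = 0x5A;
    for (size_t i = 0; i < n; i++)
        h = (uint8_t)(h * 31 + p[i]);
    return h;
}

static void toy_crypt(uint8_t *buf, size_t n, uint16_t key)
{
    for (size_t i = 0; i < n; i++)
        buf[i] ^= (uint8_t)(key >> (8 * (i & 1)));   /* XOR "cipher" */
}

int main(void)
{
    uint8_t msg[8] = "secret!";       /* seven bytes of plaintext */
    msg[7] = toy_hash(msg, 7);        /* append hash(plaintext)   */
    toy_crypt(msg, 8, 0xBEEF);        /* encrypt plaintext||hash  */

    /* the enemy's brute force: the hash recognizes the right key */
    for (uint32_t k = 0; k <= 0xFFFF; k++) {
        uint8_t trial[8];
        memcpy(trial, msg, 8);
        toy_crypt(trial, 8, (uint16_t)k);
        if (toy_hash(trial, 7) == trial[7])
            printf("key 0x%04X verifies\n", k);
    }
    return 0;
}

Note that roughly 1 in 256 wrong keys will also pass the one-byte check
here; that is exactly the "assumption about the plaintext" doing its
filtering, no more and no less.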
Greetings!
Volker
------------------------------
From: Alan Braggins <[EMAIL PROTECTED]>
Crossposted-To:
alt.folklore.computers,alt.comp.lang.learn.c-c++,comp.lang.c++,microsoft.public.vc.language
Subject: Re: How to write REALLY PORTABLE code dealing with bits (Was: How Big is a
Byte?)
Date: 02 Aug 1999 15:40:02 +0100
[EMAIL PROTECTED] (Eli Y. Orkman) writes:
> Okay, I'm sorry I fell for this silly troll.
That's the weirdest excuse for getting something completely wrong
I've seen in a long time. It's traditional to say "_I_ was trolling
all along". No one would believe you, but at least it makes some
kind of sense.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: A most useful cipher
Date: Mon, 02 Aug 1999 15:38:13 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
(wtshaw) wrote:
>When we type on a keyboard, we do a most un-ASCII type of thing: While
>presented with 47 standard lower case character keys, we shift into
>alternate upper case ones to give us the basic 94 in the lower ASCII set.
>We also routinely use the space key and double return for a blank line
>between paragraphs.
>
>Well, 47 characters, plus 1 for shift and 1 for space gives us a set of 49
>characters, quite handy for converting to certain other bases.
>
>Mafra, there is such a place, converts base 49 to base 26 by way of base
>seven information units, hepits. Each character in base 49 can be
>represented by 2 hepits, 5 characters as 10 hepits, 10 characters as 20
>hepits. These 10 or 20 hepits can be transposed according to a key. And,
>they can be converted to 6 or 12 common alphabetic letters, base 26.
>
>The base set and order for 49 characters is the 47 character keys on the
>keyboard, top left to right, top to bottom, to bottom right, plus shift
>and space. That's right, letters in two cases, digits, all the embossed
>symbols. The application front end cuts multiple spaces to one in
>preformatting, but two spaces are made to represent a double-return blank
>line.
>
>I could have made a substitution key of 49, but left it alone, choosing
>fairly simple keys for this application. Default keys (for Mafra 20) are:
>Subs(Ma): abcdefghijklm nopqrstuvwxyz
>Trans(Ma): abcde fghij klmno pqrst
>
>In summary, in Mafra, plaintext groups are 5 or 10 characters, ciphertext
>groups, 6 or 12. The application will NOT see tabs, multiple spaces, or
>single returns, but handles everything else on the keyboard, outputting
>only alphabetic letters, either case OK. If you need to make a list,
>double space it.
>
It seems like all you are doing is a form of compression, which in itself
is a good idea, since the "entropy per bit" of the message increases. The
next step after this entropy increase would be to encrypt the message.
Have you compared this to dynamic Huffman compression? You can get one at
my site. Or, if you're limiting the message to some fixed set of
printable characters plus the space and the "new line" and "double new
line", you could make the starting tree of the dynamic Huffman use only
that number of characters (symbols). My methods all currently use all 256
symbols, for all combinations of 8 bits, but there is no reason a much
smaller subset could not be used.
I could write such a limited one for you, because if I limit the subset
of symbols used it would greatly increase the "entropy per bit", provided
the users of sci.crypt can come up with such a subset. The smaller the
subset the better. The big advantage of this type of compression as a
first pass before encryption is that if a message is intercepted and the
"enemy" uses a wrong key, he will get a compressed file that, when
uncompressed, will contain only the subset of symbols used, so there are
more apt to be false messages.
However, I for one would not like to see the "new line" or "double new
line" as a symbol in the subset, because it is easy to make a reader
program that does line wrapping, so why waste a symbol on it.
Is anyone interested in this idea for compression? The user is still free
to pick the encryption method of his choice. It just would be nice if, as
a first pass before encryption, we could use a common method to greatly
increase the entropy of the message without using a compression that has
headers or is not "one to one".
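For what it is worth, the arithmetic behind the 5-character groups in
the quoted scheme checks out: five base-49 characters give 49^5 =
282,475,249 combinations, which fit in six base-26 letters since 26^6 =
308,915,776. A quick sketch of the packing in C (the sample digit
values are made up):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint64_t b49 = 1, b26 = 1;
    for (int i = 0; i < 5; i++) b49 *= 49;      /* 49^5 = 282,475,249 */
    for (int i = 0; i < 6; i++) b26 *= 26;      /* 26^6 = 308,915,776 */
    printf("49^5 = %llu <= 26^6 = %llu\n",
           (unsigned long long)b49, (unsigned long long)b26);

    /* pack five base-49 digits (sample values, made up) into one
       integer, then re-express it as six base-26 letters */
    const int group[5] = {10, 0, 48, 7, 23};
    uint64_t v = 0;
    for (int i = 0; i < 5; i++) v = v * 49 + (uint64_t)group[i];
    char out[7];
    out[6] = '\0';
    for (int i = 5; i >= 0; i--) { out[i] = (char)('a' + v % 26); v /= 26; }
    printf("as base-26 letters: %s\n", out);
    return 0;
}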
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: the defintion of Entropy
Date: Mon, 02 Aug 1999 08:49:14 -0600
Reply-To: [EMAIL PROTECTED]
[EMAIL PROTECTED] wrote:
> In article <[EMAIL PROTECTED]>,
> "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> > [EMAIL PROTECTED] wrote:
> > > If the period of the output is infinite then it's truly random and
> > > the entropy in bits is infinite as well. Simple as that.
> >
> > Nope. Consider 0101101110111101111101111110...
> > This has "infinite period" but is very far from random.
> >
>
> Then you are going to say the entropy of a 63-bit LFSR is 2^63 - 1 bits
> and I will say you are wrong. Your sequence can be represented with a
> few bits. A truly random source is independent and infinite in
> length. Maybe I forgot to mention that... oh well.
>
> Tom
> --
> PGP key is at:
> 'http://mypage.goplay.com/tomstdenis/key.pgp'.
> Free PRNG C++ lib:
> 'http://mypage.goplay.com/tomstdenis/prng.html'.
>
> Sent via Deja.com http://www.deja.com/
> Share what you know. Learn what you don't.
Actually, the sequence cannot be computed with finite storage, only with
storage growing as log(number of bits produced). It still isn't random,
even though all k-bit combinations occur with the proper frequency. This
is true even though there is a preponderance of 1's.
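The storage claim is easy to check against a sketch of the generator (a
minimal C rendering, assuming the blocks of Gwyn's sequence are a 0
followed by k ones, k = 1, 2, 3, ...): the only state it carries is the
counter k, about log2 of the number of bits produced so far.

#include <stdio.h>

int main(void)
{
    /* a 0 followed by k ones, for k = 1, 2, 3, ...
       the only state is k itself: O(log n) bits after n outputs */
    for (unsigned k = 1; k <= 7; k++) {
        putchar('0');
        for (unsigned j = 0; j < k; j++)
            putchar('1');
    }
    putchar('\n');   /* begins 0101101110111101111101111110... */
    return 0;
}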
Tony
------------------------------
From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: With all the talk about random...
Date: 2 Aug 1999 09:39:54 -0400
In article <[EMAIL PROTECTED]>,
Robert C. Paulsen, Jr. <[EMAIL PROTECTED]> wrote:
>[EMAIL PROTECTED] wrote:
>> Nothing is truly random though. Why do you think we have laws of
>> physics and developing laws of quantum mechanics? Just cuz you can't
>> explain something doesn't mean it's random.
>>
>> I think there is a misconception. A truly random number just occurs.
>> Nothing dictates what it will be or when it will be (such as counting
>> or detecting alpha particles). An unpredictable number (such as the
>> various methods on real life sources) is just that but not really
>> random.
>>
>> If you could look at a piece of americium and see alpha particles leave
>> you could predict what your counter will produce, but since that's
>> believed to be difficult we assume it's 'random'. Simple as that.
>
>No it's not. Here is a quote from a passage in _The Character of Physical
>Law_ by Richard Feynman from the chapter "Probability and Uncertainty --
>the Quantum Mechanical view of nature"
>
>================[ quote ]=============================
>It is impossible to predict in any way, from any information ahead of
>time [the results of a measurement of quantum behavior]. That means that
>physics has given up, if the original purpose was - and everybody thought
>it was - to know enough so that given the circumstances we can predict
>what will happen next. ...
As much as I admired the late Professor Feynman, he's wrong, dead wrong,
in this little snippet. It's easy to do partial predictions of
measurements of quantum behavior; I predict a larger piece of americium
will produce more radioactive decays than a smaller piece. Q.e.d.
More generally, although the interpretation that Dr. Feynman gives
above is the most commonly accepted one, Sir Arthur Clarke's second
law -- "When a distinguished but elderly scientist says that something
is impossible, he is almost certainly wrong" -- is relevant here.
-kitten
------------------------------
From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: With all the talk about random...
Date: Mon, 02 Aug 1999 08:53:42 -0600
Reply-To: [EMAIL PROTECTED]
[EMAIL PROTECTED] wrote:
> In article <[EMAIL PROTECTED]>,
> "Robert C. Paulsen, Jr." <[EMAIL PROTECTED]> wrote:
> > That seems like a natural explanation to me too, but when I made
> > such a suggestion in another thread a few weeks back several people
> > replied saying essentially that ...
> >
> > a) There was no quantum indeterminacy involved in dice rolling, and
> > b) quantum indeterminacy was not required to get true randomness
> > from rolling dice.
> >
> > As far as I know, the only behavior in the universe known to
> > involve true randomness is is from quantum effects.
> >
> > Other stochastic effects, chaos, complexity, etc. are just ways of
> > describing or dealing with situations where we lack enough
> > information to make predictions based on the underlying determinacy,
> > even though this information is obtainable in principle.
> >
> > It is not unheard of for quantum randomness to make itself known on
> > a macroscopic scale -- a Geiger counter is the obvious example.
> > Perhaps rolling dice is another example. I really don't know if the
> > results of dice rolling actually is effected by quantum
> > indeterminacy but it would be interesting to see a "proof" one
> > way or the other.
>
> Nothing is truly random though. Why do you think we have laws of
> physics and developing laws of quantum mechanics? Just cuz you can't
> explain something doesn't mean it's random.
>
> I think there is a misconception. A truly random number just occurs.
> Nothing dictates what it will be or when it will be (such as counting
> or detecting alpha particles). An unpredictable number (such as the
> various methods on real life sources) is just that but not really
> random.
>
> If you could look at a piece of americium and see alpha particles leave
> you could predict what your counter will produce, but since that's
> believed to be difficult we assume it's 'random'. Simple as that.
>
> I am no physics person, so if some of my facts are off please note. But
> the idea still remains. Nothing is truly random.
>
> Tom
Your facts are off. Quantum mechanics is truly random (at least in the
current theory).
It's different from a lack-of-knowledge type of randomness and can lead to
different results. The lack of marginal distributions is (at least to me)
the most interesting part.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: CFB mode with same initialization vector
Date: Mon, 02 Aug 1999 15:57:42 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Daniel
Vogelheim) wrote:
>Hello group,
>
>why is encryption in CFB mode insecure, if you use the same
>initialization vector multiple times?
>
>Applied Cryptography, 2nd ed., says so (chapter 9.6, p.201 middle),
>but doesn't explain. I have been unable to figure it out on my own,
>and a search turned up nothing. Where can I find a paper explaining
>details?
>
>Any help is greatly appreciated.
>
>Sincerely,
>Daniel Vogelheim
I don't like to advertise or quote too many people, but all
the in-vogue chaining modes are weak. I did a quick
AltaVista search on your question, and the link it came up
with may answer some of it. You can also look at Ritter's
page or the FAQ.
http://www.rsa.com/rsalabs/faq/html/2-1-4.html
Most commercial applications use methods similar to the above.
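As for the IV question itself: in CFB the first ciphertext block is
C1 = P1 XOR E_K(IV), so two messages encrypted under the same key and
IV leak the XOR of their first plaintext blocks, and identical messages
give identical ciphertexts. A toy sketch in C (the 8-byte "block
cipher" is made up for illustration):

#include <stdint.h>
#include <stdio.h>

/* made-up 8-byte "block cipher", for illustration only */
static void toy_encrypt(const uint8_t in[8], uint8_t out[8], uint64_t key)
{
    for (int i = 0; i < 8; i++)
        out[i] = (uint8_t)((in[i] + (key >> (8 * i))) * 167u ^ 0x5A);
}

int main(void)
{
    const uint64_t key = 0x0123456789ABCDEFull;
    const uint8_t iv[8] = "fixed_iv";          /* reused IV: the mistake */
    const uint8_t p1[8] = "attack!!", p2[8] = "retreat!";
    uint8_t ks[8], c1[8], c2[8];

    toy_encrypt(iv, ks, key);                  /* first CFB keystream block */
    for (int i = 0; i < 8; i++) {
        c1[i] = p1[i] ^ ks[i];
        c2[i] = p2[i] ^ ks[i];
    }

    /* an eavesdropper XORs the two first blocks: E_K(IV) drops out */
    for (int i = 0; i < 8; i++) printf("%02x", c1[i] ^ c2[i]);
    printf("  <- c1 XOR c2\n");
    for (int i = 0; i < 8; i++) printf("%02x", p1[i] ^ p2[i]);
    printf("  <- p1 XOR p2 (identical)\n");
    return 0;
}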
One way you can test for a weakness, in my mind anyway, is
to find an encryption method that hides little of the method from the
user and does not change the length of the message being encrypted.
First encrypt with one key, then reverse the byte order of the file and
encrypt with a second key. Change one bit in the file, then try to
recover the plaintext. If the whole file is trashed (use a binary file
compare), then the encryption chaining method might be OK. If only a few
blocks are wrong, the chaining method sucks, and you need something
like the methods used in scott16u or scott19u, which have been proven
secure against such modern attacks as the "slide attack" that Mr. BS and
DW think is hot stuff, though it is most likely decades behind what the
NSA uses.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Is breaking RSA NP-Complete ?
Date: Mon, 02 Aug 1999 11:09:01 -0400
About the class ZPP:
Referring to D. S. Johnson, "A Catalog of Complexity Classes".
I quote some definitions and discuss some stuff:
def. R: the class R consists of all decision problems solved by
polynomial-time RTMs.
def. RTM: A random Turing Machine is an NDTM (Non-Deterministic TM)
such that, for each possible input string, either there are no accepting
computations or else at least half of all computations are accepting.
Intuitively, this means that if a _decisional_ problem PROB is in R,
there exists an algorithm A such that:
-given an element e not in PROB, A(e) will answer no,
-given an element e in PROB, A(e) will answer yes with probability at
least 0.5.
We can call it k times; if it always answers no, the probability that e
is nevertheless in PROB is at most 0.5^k.
def. COMPOSITE: n is in COMPOSITE if there is an integer b,
1 < b < n, such that b divides n.
fact: COMPOSITE is in R.
proof: it is known (refer to the references in the reference... :) that
an integer n is composite if and only if more than half of the integers
b, 1 <= b < n, are "witnesses" to this in a technical sense that can be
verified in polynomial time given just n and b.
So we define an algorithm as such: for an input n, pick a random number
b, 1 < b < n, and say yes if and only if b is a witness to the
compositeness of n.
The total time is O(log^3 n), i.e. polynomial in the size of the input
number n.
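As a concrete sketch of such a witness test, here is one standard
instance, the Miller-Rabin witness criterion (my choice; Johnson's
catalog does not pin one down). For an odd composite n, at least 3/4 of
the bases b are witnesses in this sense, comfortably more than the half
that R requires. In C, assuming a compiler with unsigned __int128
(e.g. gcc):

#include <stdint.h>
#include <stdio.h>

static uint64_t mulmod(uint64_t a, uint64_t b, uint64_t m)
{
    return (uint64_t)((unsigned __int128)a * b % m);
}

static uint64_t powmod(uint64_t b, uint64_t e, uint64_t m)
{
    uint64_t r = 1;
    b %= m;
    while (e) {
        if (e & 1) r = mulmod(r, b, m);
        b = mulmod(b, b, m);
        e >>= 1;
    }
    return r;
}

/* returns 1 iff b witnesses the compositeness of odd n > 2 */
static int is_witness(uint64_t b, uint64_t n)
{
    uint64_t d = n - 1;
    int s = 0;
    while ((d & 1) == 0) { d >>= 1; s++; }   /* n - 1 = 2^s * d, d odd */
    uint64_t x = powmod(b, d, n);
    if (x == 1 || x == n - 1) return 0;      /* b is not a witness */
    for (int i = 1; i < s; i++) {
        x = mulmod(x, x, n);
        if (x == n - 1) return 0;
    }
    return 1;                                /* n is certainly composite */
}

int main(void)
{
    printf("%d\n", is_witness(2, 221));  /* 221 = 13*17, so prints 1 */
    printf("%d\n", is_witness(2, 97));   /* 97 is prime, so prints 0 */
    return 0;
}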
COMPOSITE's complementary problem is PRIME_NUMBER, so PRIME_NUMBER
is in co-R.
Two other facts that have been proven in the literature:
- PRIME_NUMBER is also in R.
- If the Extended Riemann Hypothesis holds, then COMPOSITE is in P.
(Of course, I will not prove these statements here; I personally don't
know the proofs ;)
def. ZPP = R intersected with co-R.
fact: (deduced from the facts above) PRIME_NUMBER is in ZPP.
Anton
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Cryptanalysis of R250
Date: Mon, 02 Aug 1999 16:02:50 GMT
In article <[EMAIL PROTECTED]>, The Evil Anti-Rick <[EMAIL PROTECTED]>
wrote:
>Does anyone have any cryptanalysis information on the R250 PRNG
>algorithm and/or cryptanalysis information of byte-wise XOR stream
>ciphers?
>
> As an exercise, I wrote a simple XOR stream-cipher using R250 as the
>RNG. That was the easy part. I looked through all my old textbooks for the
>equations needed to break such a cipher and came up empty-handed. I recall
>that it seemed pretty straightforward, and it was presented as an academic
>exercise for first-year crypto students.
>
> Any pointers, links, answers and/or useful incantation would be
>appreciated. Pointless criticisms or other wastes of bandwidth will be
>re-directed to /dev/null.
>
>--R. Pelletier,
>Sys Admin, House Galiagante
>and Amateur Cryptographer
It is not so much that you are using R250 or whatever. It is
that if you use the same key twice, then if your enemy
has one plaintext/ciphertext message pair he can
easily recover the string of bits used in the XOR,
and with it decrypt all future messages.
See attacks on OTP ciphers where the same key is used more
than once.
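A toy demonstration of that recovery in C (the keystream and both
messages are made up for illustration):

#include <stdio.h>

int main(void)
{
    /* any fixed keystream reused for two messages: R250 output,
       LFSR output, whatever (these values are made up) */
    const unsigned char ks[16] = {0x3e,0x91,0x5c,0x07,0xa2,0x48,0xd1,0x66,
                                  0x2b,0xf0,0x19,0x84,0x7d,0xc3,0x5a,0x0e};
    const char *p1 = "known plaintext";   /* attacker knows this pair */
    const char *p2 = "secret message!";   /* attacker wants this one  */
    unsigned char c1[15], c2[15], rec[16] = {0};

    for (int i = 0; i < 15; i++) {
        c1[i] = p1[i] ^ ks[i];
        c2[i] = p2[i] ^ ks[i];
    }

    /* recover the keystream from the known pair, then strip message 2 */
    for (int i = 0; i < 15; i++)
        rec[i] = (c1[i] ^ p1[i]) ^ c2[i];
    printf("recovered: %s\n", (const char *)rec);  /* secret message! */
    return 0;
}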
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: the defintion of Entropy
Date: Mon, 02 Aug 1999 10:30:04 -0400
> It's a lot easier than that. The entropy in bits of a sequence is the
> number of bits required to represent the sequence. Truly random
> sources have an infinite number of bits and thus total entropy. PRNG
> algorithms have a finite number of bits and can be compressed if they
> recycle. PRNG sequences can also be compressed to the PRNG state as
> well. Like the output of an LFSR is 2^n - 1 bits long but you only need
> n bits to re-create the sequence...
>
> Simple no?
Almost the right definition. The only thing is that you measure the
number of bits required to represent the _next_ bit in your sequence. So
if you have 2^n - 1 bits which can be encoded with n bits, you need to
compute the average number of bits needed to produce _one_ bit of that
sequence: for the 63-bit LFSR mentioned earlier, that is 63/(2^63 - 1)
bits of entropy per output bit, which is practically zero.
Anton
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: the defintion of Entropy
Date: Mon, 02 Aug 1999 10:26:41 -0400
[EMAIL PROTECTED] wrote:
> In article <[EMAIL PROTECTED]>,
> Anton Stiglic <[EMAIL PROTECTED]> wrote:
> > Keith A Monahan wrote:
> >
> > > If that was an intuitive approach, I'd hate to see a non-intuitive
> approach.
> > >
> > > Keith
> > >
> >
> > The non-intuitive approach is really more non-intuitive :)
> >
>
> But your first approach was wrong anyways.
>
> What says ...00000000... (some infinite-length bit string) is more/less
> random than ...1001110001... ?
>
> If the period of the output is infinite then it's truly random and the
> entropy in bits is infinite as well. Simple as that.
I don't think you understood the approach (I probably did not explain it
well enough). Entropy is a measure of knowledge (information). When in my
example I was using S -> 000000... (S spits out 0 all the time), you
could think of S as being a deterministic algorithm which we are CERTAIN
will always spit out 0 (because, for example, we have seen the code!).
When computing entropy, you need to first compute your probabilities; if
you have no info, you assign equal probability to each output (a uniform
distribution).
Entropy was developed by Shannon, in 1948 (Bell System Technical
Journal...).
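That certainty shows up directly in Shannon's formula
H = - sum_i p_i log2(p_i). A minimal sketch in C (the two example
distributions are made up):

#include <math.h>    /* link with -lm */
#include <stdio.h>

static double entropy_bits(const double *p, int n)
{
    double h = 0.0;
    for (int i = 0; i < n; i++)
        if (p[i] > 0.0)
            h -= p[i] * log2(p[i]);
    return h;
}

int main(void)
{
    double uniform[2] = { 0.5, 0.5 };  /* no knowledge of the next bit */
    double certain[2] = { 1.0, 0.0 };  /* we have seen the code: always 0 */
    printf("H(uniform) = %.2f bits per symbol\n", entropy_bits(uniform, 2));
    printf("H(certain) = %.2f bits per symbol\n", entropy_bits(certain, 2));
    return 0;
}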
Hope this helps!
Anton
------------------------------
From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: With all the talk about random...
Date: Mon, 02 Aug 1999 09:00:34 -0600
Reply-To: [EMAIL PROTECTED]
Dave Knapp wrote:
> [EMAIL PROTECTED] wrote:
> >
> > Nothing is truly random though. Why do you think we have laws of
> > physics and developing laws of quantum mechanics? Just cuz you can't
> > explain something doesn't mean it's random.
>
> It's either truly random or it's nonlocal. Me, I prefer truly random,
> as do most other physicists.
>
> > If you could look at a piece of americium and see alpha particles leave
> > you could predict what your counter will produce, but since that's
> > believed to be difficult we assume it's 'random'. Simple as that.
>
> No, the randomness is _when_ the Am decays, not what happens to the
> alpha particle once it is emitted.
>
> -- Dave
It seems to be a bit of both. At least in terms of entanglements.
------------------------------
From: [EMAIL PROTECTED] (Francois Grieu)
Subject: Re: question: SHA --> stream cipher
Date: Mon, 02 Aug 1999 18:00:16 +0200
David Bernier <[EMAIL PROTECTED]> wrote:
> (0) Choose a secret random IV of 160 bits, say w_0
> (1) Send w_0 securely to recipient [e.g. through public-key algorithm]
> (2) let w_{i+1} = SHA[w_i] for i=0,1,2,3,4...
> (3) The bit-stream [cryptographically strong pseudo-random numbers]
> is then obtained by concatenating w_0, w_1, w_2, ....
I assume your pseudo-random numbers are XORed with the plaintext
to obtain the ciphertext: c_i = p_i XOR w_i
A serious weakness is that if an opponent knows or guesses some of
the plaintext, say p_i, all the plaintext that follows can be obtained:
p_{i+1} = c_{i+1} XOR SHA[p_i XOR c_i]
A better solution is w_i = SHA(K|i) where | is concatenation,
and K is a key.
For better security (maybe even provable under the assumption that
SHA has certain properties), K and i should be combined in more
elaborate ways involving several SHA iterations; see
<http://www-cse.ucsd.edu/users/mihir/papers/hmac.html>
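A minimal sketch in C of the w_i = SHA(K|i) idea, using OpenSSL's
one-shot SHA1() (an assumption of convenience; any SHA-1 implementation
would do). The key and message are placeholders, and this is
illustrative only, not a vetted construction:

#include <openssl/sha.h>   /* link with -lcrypto */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* w_i = SHA(K|i), with i as a big-endian 32-bit counter */
static void keystream_block(const unsigned char *key, size_t keylen,
                            uint32_t i, unsigned char w[SHA_DIGEST_LENGTH])
{
    unsigned char buf[64];
    memcpy(buf, key, keylen);
    buf[keylen]     = (unsigned char)(i >> 24);
    buf[keylen + 1] = (unsigned char)(i >> 16);
    buf[keylen + 2] = (unsigned char)(i >> 8);
    buf[keylen + 3] = (unsigned char)i;
    SHA1(buf, keylen + 4, w);
}

int main(void)
{
    const unsigned char key[16] = "0123456789abcde";   /* placeholder K */
    unsigned char text[] = "attack at dawn";           /* placeholder p */
    unsigned char w[SHA_DIGEST_LENGTH];
    size_t len = sizeof text - 1;

    for (size_t n = 0; n < len; n++) {
        if (n % SHA_DIGEST_LENGTH == 0)                /* fresh w_i */
            keystream_block(key, sizeof key,
                            (uint32_t)(n / SHA_DIGEST_LENGTH), w);
        text[n] ^= w[n % SHA_DIGEST_LENGTH];           /* c_i = p_i XOR w_i */
    }
    for (size_t n = 0; n < len; n++)
        printf("%02x", text[n]);
    printf("\n");
    return 0;
}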
Francois Grieu
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Is breaking RSA NP-Complete ?
Date: Mon, 02 Aug 1999 15:51:14 GMT
Safuat Hamdy wrote:
> And PLEASE everybody recall that classes like P, NP, PSPACE and so on
> refer to DECISIONAL problems ("does some property hold on some object",
> i.e. "is some word in a particular set?"),
Sort of right.
Formally they refer to languages, where a language is a
set of strings. They are the particular sets, not the
problems of deciding whether an object is in the set.
> hence in a precise sense a statement like
> "factoring is (not) NP-hard" is simply junk.
Wrong. NP-hard is a class of functions, not languages or
decision problems. A function is NP-hard if deciding some
NP-complete language is polynomial-time (Turing) reducible
to computing it.
--Bryan
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************