Cryptography-Digest Digest #505, Volume #11       Fri, 7 Apr 00 00:13:00 EDT

Contents:
  Re: Q: Entropy (Bryan Olson)
  Re: Is this code crackable? ("1198")
  Re: GSM A5/1 Encryption (David A. Wagner)
  Re: Factoring Composite Numbers (NFN NMI L.)
  Re: Even more crypto humor ! (NFN NMI L.)
  Re: Is this code crackable? (Mario Lyken)
  Re: Q: Simulation of quantum computing (NFN NMI L.)
  Re: Magnetic Remenance on hard drives. (was: Re: Evidence Eliminator - Who is trying 
to silence our program? It's not working...) ("Sam")
  Re: Factoring Composite Numbers (lordcow77)
  Re: Factoring Composite Numbers (Tom St Denis)
  Re: Gray code computation? ("Trevor L. Jackson, III")
  Q's on Crypt. and jobs (Jon Pierre Fortney)
  Re: Q: Entropy (David Hopwood)
  Re: Q: Entropy (David Hopwood)

----------------------------------------------------------------------------

From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: Q: Entropy
Date: Fri, 07 Apr 2000 00:28:18 GMT

[EMAIL PROTECTED] wrote:
> Mok-Kong Shen writes:
>
> > This means that a single (particular) message does have entropy.
>
> I'm afraid that you are misunderstanding the quote.
>
> A "message" in that context is a total potential communication, the
> set of all possibilities for that message, not any individual concrete
> message.  In crypto, it's the cyphertext you're looking at, which
> could (due to your ignorance) possibly decode to any of a number of
> plaintexts.
>
> Reread the start of the section: "Information theory defines the
> amount of information in a message as the minimum number of bits
> needed to encode all possible meanings of that message."  A "message"
> is something with many possible meanings, not an already-decoded
> result.

An individual message text, meaning a particular choice from
a message space, does indeed carry information and therefore
has a well-defined entropy (but only in the context of the
probability space from which it is drawn).  According to
Shannon, the set of possible messages has "equivocation",
while a specific message has information.  Both are of the
same dimension, called entropy.

Here's an example that may help explain what I've been
writing about in the "entropy" threads:

Suppose Alice will shuffle a deck of 52 cards, and choose
one at random.  She will not look at it, but will show it to
Bob, who will then give her one of two (truthful) messages:
either "the card is an ace" or "the card is not an ace."

The equivocation of an unknown card is

   log_2(52) = 5.70 bits.

The entropy of the message space, in bits, is

     48/52 * log_2(52/48)
   + 4/52  * log_2(52/4)
   = 0.391


If Alice receives the "the card is an ace" message, then
there are four equally probable possibilities, so the
equivocation of the card has dropped to 2 bits.  But the
entropy of the message space was only 0.391 bits.  How could
the entropy have dropped from 5.7 bits to 2.0 bits if she
only received 0.391 bits in the message?

The two messages carry different amounts of information. The
information in a message, in bits, is the log base two of
one over its probability.

    the card is an ace     -> log_2(52/4)  = 3.70 bits
    the card is not an ace -> log_2(52/48) = 0.115 bits


(Note that the entropy of the message space is the average
(expected) information of a message.  The reference in
question, /Applied Cryptography/, considers only spaces in
which all messages are equally probable, and in such cases
the entropy of the message space is equal to the
information in any one of the message texts.)

Now the calculations make sense.  There were 5.70 bits of
equivocation, then Alice received 3.7 bits of information,
and so there are 2.0 bits of equivocation left.

In cases in which Bob gives Alice the "not an ace" message,
she should have 5.70 - 0.115 = 5.585 bits of equivocation
left.  There are 48 possible cards remaining, all equally
probable, and the log base two of 48 is 5.585.
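
Here is a quick numerical check of the figures above (my own sketch
in Python, not part of the original post):

    import math

    # Information, in bits, carried by an event of probability p.
    def info_bits(p):
        return math.log2(1 / p)

    p_ace, p_not_ace = 4 / 52, 48 / 52

    print(math.log2(52))                          # 5.700  equivocation of the card
    print(p_ace * info_bits(p_ace)
          + p_not_ace * info_bits(p_not_ace))     # 0.391  entropy of the message space
    print(info_bits(p_ace))                       # 3.700  "the card is an ace"
    print(info_bits(p_not_ace))                   # 0.115  "the card is not an ace"
    print(math.log2(52) - info_bits(p_ace))       # 2.000  left after "ace"
    print(math.log2(52) - info_bits(p_not_ace))   # 5.585  left after "not an ace"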


--Bryan
--
email: bolson at certicom dot com



------------------------------

From: "1198" <[EMAIL PROTECTED]>
Subject: Re: Is this code crackable?
Date: Fri, 7 Apr 2000 08:33:44 +0800


>
>Or even more importantly, "still available".
>
>The classic example is a spy out in the field -- he can carry with


You have given an interesting CLASSIC example... but it is not very desirable
in modern operations.

Once the spy leaves HQ, the problem has already started, if not before.  The
longer he stays in the field before he can get at the target information, the
higher the risk: the key file or OTP can be stolen, damaged, or altered by a
virus, or he can be intercepted or killed.  One of the biggest giveaways is the
direction in which he travels, so it is not desirable to travel too often
between HQ and the destination.  I am not against better encryption, but it
only contributes to a single link in the whole process.



------------------------------

From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Re: GSM A5/1 Encryption
Date: 6 Apr 2000 17:21:34 -0700

In article <8cimfj$cod$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
> My question is, if you were to design a voice encryption system,
> how would you go about overcoming this weakness and
> susceptibility to attack (cf. silence frames).

I'd use a secure stream cipher, i.e., one that resists known- and
chosen-plaintext attacks, like any decent modern cipher should.
Then the silence frame stuff is irrelevant -- you just don't have
to worry about it, period.

It's only relevant in the case of A5/1 because A5/1 is, by modern
standards, considered weak: it is susceptible to known-plaintext
attacks.
However, if one wants to evaluate the practical impact of the attacks,
one has to inquire about the availability of known plaintext, and the
point I was making is that silence frames may provide exactly the
known plaintext required by the shortcut attacks on A5/1.
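
To make the silence-frame point concrete, here is a tiny sketch (my
own illustration, not code from GSM): with a stream cipher, known
plaintext for a frame immediately hands the attacker that frame's
keystream, which is exactly the raw material the shortcut attacks on
A5/1 start from.  A cipher that resists known-plaintext attacks is
designed so that recovered keystream is useless for finding the key.

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    keystream = bytes([0x3A, 0x91, 0x07, 0xC4])       # stands in for the cipher output
    silence_frame = bytes(4)                          # predictable "silence" plaintext
    ciphertext = xor_bytes(silence_frame, keystream)

    recovered = xor_bytes(ciphertext, silence_frame)  # the attacker's computation
    assert recovered == keystream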

All of these details come about only because the GSM folks failed to
use a secure stream cipher.

------------------------------

From: [EMAIL PROTECTED] (NFN NMI L.)
Subject: Re: Factoring Composite Numbers
Date: 07 Apr 2000 01:04:01 GMT

<<1)  How long would it take to factor 'n' using the latest in factorization.>>

Gaaach.  This involves Big-O (or little-o, I forget) notation.  Generally, the
latest methods, using a whole bunch of computers, can factor a general
150-digit number in a few months.

<<2)  The most basic of all factorization methods is to divide 'n' up to the
sqrt('n').  If I were to further decrease this space by 99% so that the
total space to search was 1% of sqrt('n'), would this be a viable decrease or
would this expectancy still be too great?>>

It would make it much faster, of course, but it would not change the rate at
which the method slows down.  I.e., 1% of a kazillion years for baby-dividing
a 150-digit number is still a kazillion years.
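
For what it's worth, here is a minimal sketch (mine, not from the
thread) of the naive method in Python; cutting the search range to 1%
only removes a constant factor from a cost that still grows
exponentially in the number of digits of n:

    from math import isqrt

    # Try divisors up to (percent/100) * sqrt(n); return a factor or None.
    def trial_divide(n, percent=100):
        if n % 2 == 0:
            return 2
        limit = isqrt(n) * percent // 100
        d = 3
        while d <= limit:
            if n % d == 0:
                return d
            d += 2
        return None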

-*---*-------
S.T. "andard Mode" L.
STL's Quotation Archive: http://quote.cjb.net

------------------------------

From: [EMAIL PROTECTED] (NFN NMI L.)
Subject: Re: Even more crypto humor !
Date: 07 Apr 2000 01:05:58 GMT

<<I feel xor.>>

This is only funny at first sight if you routinely say it "Zore" and not "Ex
Or", like I do.  :-P

Next thing you know, there'll be a function called daan or quon.

-*---*-------
S.T. "andard Mode" L.
STL's Quotation Archive: http://quote.cjb.net

------------------------------

From: [EMAIL PROTECTED] (Mario Lyken)
Subject: Re: Is this code crackable?
Date: Fri, 07 Apr 2000 01:07:08 GMT

[EMAIL PROTECTED] (Dan Day) wrote:

>The classic example is a spy out in the field

Another is navy ships at sea. Although I really don't know how the world's
navies encrypt their communications, it certainly seems reasonable that
they would use one-time pads.
-- 
"Mario Lyken" is actually 0378 916542 <[EMAIL PROTECTED]>.
 01234 56789 <- Use this key to decode my email address and name.
              Play Five by Five Poker at http://www.5X5poker.com.

------------------------------

From: [EMAIL PROTECTED] (NFN NMI L.)
Subject: Re: Q: Simulation of quantum computing
Date: 07 Apr 2000 01:08:37 GMT

<<Can one simulate a quantum computer with a common digital computer?>>

Of course.  Anything a quantum computer can do, a classical computer can do. 
Both are universal.  I suppose the difference lies in different ways of
interpreting "emulation".
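
A minimal sketch (my illustration, not from the post) of how such a
simulation goes: represent the register as a state vector and apply
gates as matrices.  The catch is that an n-qubit register needs 2^n
complex amplitudes, so the simulation, while always possible, is in
general exponentially expensive.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    state = np.array([1.0, 0.0], dtype=complex)    # the |0> state
    state = H @ state                              # equal superposition of |0> and |1>
    print(np.abs(state) ** 2)                      # measurement probabilities: [0.5 0.5]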

-*---*-------
S.T. "andard Mode" L.
STL's Quotation Archive: http://quote.cjb.net

------------------------------

From: "Sam" <[EMAIL PROTECTED]>
Crossposted-To: alt.privacy,alt.security.pgp
Subject: Re: Magnetic Remenance on hard drives. (was: Re: Evidence Eliminator - Who is 
trying to silence our program? It's not working...)
Date: Thu, 6 Apr 2000 18:16:50 -0700



EE Support wrote in message ...
>On Wed, 15 Mar 2000 22:15:17 +0000, Withheld <[EMAIL PROTECTED]>

snip

>Disks use encoding systems to maximise the on/off states available to
>the heads at a certain speed.
>
>What goes to the disk as
>010101010101010101010101010
>
>Is magnetically saved more like
>000000011111100000111111111
snip

Just curious: do you mean the data element (i.e. a 0) is repeated on the disk,
but the order remains the same?  For example, data "01101" going to the disk
would be recorded as "00001111111100001111"?

Thanks



------------------------------

Subject: Re: Factoring Composite Numbers
From: lordcow77 <[EMAIL PROTECTED]>
Date: Thu, 06 Apr 2000 18:20:18 -0700

In article <[EMAIL PROTECTED]>, Tom St Denis
<[EMAIL PROTECTED]> wrote:
>
>so great for RSA-style numbers.  A good start is to understand why
>
>x^2 = y^2 (mod n), x != +-y (mod n)
>
>is an important relationship.... if this is true, then gcd(x - y, n) is
>a non-trivial factor of n.  This property is supposedly used in QS and NFS.
>

I will attempt to explain in slightly more detail how this
property is used to factor large numbers. The strategy is to
find pairs a_i, b_i such that (a_i)^2 = b_i (mod n) and b_i is
smooth over a certain factor base comprised of all the primes
under some limit (there are variations on this, such as
allowing b_i to have one large prime factor). One must now find
a subset of the b_i's whose product is a perfect square. This
is done with linear algebra: each b_i is represented by a
vector over GF(2) giving the parity of each prime's exponent in
its factorization, and one looks for a linearly dependent set
of these vectors. If one has collected more relations than
there are primes in the factor base, such a set S is guaranteed
to exist. Multiplying together the relations for i in S gives
Product(a_i, i in S)^2 = Product(b_i, i in S) (mod n), and the
right-hand side is a perfect square, say y^2. With
x = Product(a_i, i in S), gcd(x - y, n) then yields either a
trivial split (1 or n) or a non-trivial factor of the
composite.
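
To make that concrete, here is a toy sketch of the idea (my own
illustration, not code from the thread): a Dixon-style factorization
in which both the relation search and the GF(2) "linear algebra" step
are done by brute force, so it is only suitable for very small n.

    from math import gcd, isqrt
    from itertools import count, combinations

    def factor_base(limit):
        # All primes up to 'limit' (naive trial-division sieve).
        primes = []
        for p in range(2, limit + 1):
            if all(p % q for q in primes if q * q <= p):
                primes.append(p)
        return primes

    def exponent_vector(b, primes):
        # Exponents of b over the factor base, or None if b is not smooth.
        exps = []
        for p in primes:
            e = 0
            while b % p == 0:
                b //= p
                e += 1
            exps.append(e)
        return exps if b == 1 else None

    def dixon(n, fb_limit=30):
        primes = factor_base(fb_limit)
        relations = []                              # pairs (a_i, exponent vector of b_i)
        for a in count(isqrt(n) + 1):
            vec = exponent_vector(a * a % n, primes)
            if vec is not None:
                relations.append((a, vec))
            if len(relations) > len(primes) + 3:    # a dependent subset now exists
                break
        # Brute-force search for a subset whose exponent sums are all even.
        for r in range(1, len(relations) + 1):
            for subset in combinations(relations, r):
                sums = [sum(col) for col in zip(*(v for _, v in subset))]
                if all(s % 2 == 0 for s in sums):
                    x = 1
                    for a, _ in subset:
                        x = x * a % n
                    y = 1
                    for p, s in zip(primes, sums):
                        y = y * pow(p, s // 2, n) % n
                    g = gcd((x - y) % n, n)
                    if 1 < g < n:
                        return g                    # non-trivial factor
        return None                                 # only trivial splits found

    print(dixon(84923))   # 84923 = 163 * 521; expect one of the factors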




------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Factoring Composite Numbers
Date: Fri, 07 Apr 2000 02:16:36 GMT



lordcow77 wrote:
> 
> In article <[EMAIL PROTECTED]>, Tom St Denis
> <[EMAIL PROTECTED]> wrote:
> >
> >so great for RSA-style numbers.  A good start is to understand why
> >
> >x^2 = y^2 (mod n), x != +-y (mod n)
> >
> >is an important relationship.... if this is true, then gcd(x - y, n) is
> >a non-trivial factor of n.  This property is supposedly used in QS and NFS.
> >
> 
> I will attempt to explain in slightly more detail how this
> property is used to factor large numbers. The strategy is to
> find pairs a_i, b_i such that (a_i)^2 = b_i (mod n) and b_i is
> smooth over a certain factor base comprised of all the primes
> under some limit (there are variations on this, such as
> allowing b_i to have one large prime factor). One must now find
> a subset of the b_i's whose product is a perfect square. This
> is done with linear algebra: each b_i is represented by a
> vector over GF(2) giving the parity of each prime's exponent in
> its factorization, and one looks for a linearly dependent set
> of these vectors. If one has collected more relations than
> there are primes in the factor base, such a set S is guaranteed
> to exist. Multiplying together the relations for i in S gives
> Product(a_i, i in S)^2 = Product(b_i, i in S) (mod n), and the
> right-hand side is a perfect square, say y^2. With
> x = Product(a_i, i in S), gcd(x - y, n) then yields either a
> trivial split (1 or n) or a non-trivial factor of the
> composite.

Err that's what I meant to say...  Geez I have lots to learn...

Tom

------------------------------

Date: Thu, 06 Apr 2000 22:45:39 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Gray code computation?

The easy way to generate gray codes in sequence is with a counter.  IIRC the
code corresponding to the counter value is code = count XOR (count >> 1).
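
A short sketch of that mapping and its inverse (my own code, not the
code that was originally posted in the thread):

    def to_gray(count):
        # Binary-reflected Gray code for a counter value.
        return count ^ (count >> 1)

    def from_gray(gray):
        # Invert the mapping by folding the bits back down.
        count = 0
        while gray:
            count ^= gray
            gray >>= 1
        return count

    print([format(to_gray(i), "03b") for i in range(8)])
    # ['000', '001', '011', '010', '110', '111', '101', '100']
    # (successive codes differ in exactly one bit)
    assert all(from_gray(to_gray(i)) == i for i in range(1 << 16))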

Joseph Ashwood wrote:

> A while back someone posted an algorithm for computing gray
> codes in series.  I thought it was under the "gray code"-like
> thread but I can't seem to find it on Deja; could someone
> repost the code that was there?  I'll be using it to test a
> cipher.
>                     Joe


------------------------------

From: [EMAIL PROTECTED] (Jon Pierre Fortney)
Subject: Q's on Crypt. and jobs
Date: Fri, 07 Apr 2000 03:19:58 GMT

Over the past year or so I’ve become very interested in Cryptography and am 
interested in pursuing it as a possible career, whether it be in industry or 
academia.  I thought this may be a good place to post some questions, so if 
you’re willing to wade through my posting any advice and/or comments would be 
more than welcome.  Thanks in advance!

One of the things that I'm wanting is some hints on the general direction I 
should go with my studying.  Let me tell you briefly what I've done to give 
you some idea where I'm at.  I suppose some of you are familiar with Doug 
Stinson's book Cryptography: Theory and Practice.  From what I gather it is a 
"standard" book that introduces a lot of basic crypto systems and a fair, 
though not by any means overwhelming, amount of math.  That was the text for 
the class I sat in on last semester and while we only covered about half of it 
in class I've basically worked through the whole thing.  I've also worked 
through Neal Koblitz' book A Course in Number Theory and Cryptography.  That 
was quite a bit more complicated mathematically and not so "practical."  It 
covered basic number theory, finite fields, and the basics of elliptic curves. 
 I've also just started to work through Koblitz' Algebraic Aspects of 
Cryptography, which looks like it has some stuff on Polynomial rings and quite 
a bit more on Elliptic Curves.  I've also been reading Bruce Schneier's book 
Applied Cryptography, very computer based, almost no math.  My basic question 
is "Where should I go from here?"

My knowledge of computers/programming is basically nil, so what I'm planning 
on doing next is trying to hit that skill set.  I love the math, but as of yet 
I haven't really been able to play around with any "live" systems, which I 
would really like to do.  Besides, I suppose if I ever actually want to get a 
job that part is essential.  What programming languages would be the best to 
know?  Also, do you know of any sources of cryptosystem code that I could play 
with?  I've found some on the web.  What other sorts of computer stuff should 
I try to learn?  Is computer security considered a part of crypto by most 
people?  (They certainly seem different to me.)  I.e., if I were to go for a job 
in the private sector do you know if I would be expected to know that as well? 
 Any hints or suggestions on how to best learn/familiarize myself with this 
aspect would be very much welcome.  As I said, I know almost nothing about 
computers.  

Next my questions venture to the mathematics aspects.  Above I've mentioned 
some of the types of math related to crypto that I've at least seen.  (Though 
I'm by no means an expert on it.)  What other branches of math would you all 
suggest trying to study?  Is algorithm theory something I should look into?  
Also, what about protocol analysis?  What sort of math feeds into that?  I've 
had some very basic Information Theory and Coding Theory, and it seems they 
pop up in crypto, but how much?  Is it important to know that?  

Finally, I’m wondering about jobs in crypto. (esp non-government jobs).  How 
extensive does my knowledge of crypto have to be to get one, and where is the 
best place to start to look for one!?  I’m moving to LA at the start of the 
summer so if anyone has any ideas about possible places to look for jobs in 
crypto or somehow related to crypto out there I’d be extremely grateful for 
your advice.

Thanks again, 
Jon Pierre Fortney

------------------------------

Date: Fri, 07 Apr 2000 02:20:35 +0100
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Q: Entropy

=====BEGIN PGP SIGNED MESSAGE=====

Mok-Kong Shen wrote:
> But let me quote from Schneier's Applied Cryptography (which may not
> be familiar to you physicists but is well-known in cryptology):
> 
>      Formally, the amount of information in a message M is
>      measured by the entropy of a message.
> 
> This means that a single (particular) message does have entropy.

No, that sentence, if you take it in context, means something like
"entropy is a formalisation of the idea of 'amount of information'".

It's also clear from context that
a) the chapter does not give a formal mathematical treatment of entropy
   (Schneier refers to Shannon's papers and [Gal68] for that).
b) messages are referred to only in the context of a particular
   distribution.

Here is the full quote (in the lines starting #, from the first page
of Chapter 11 of AC2):

# Information theory defines the *amount of information* in a message
# as the minimum number of bits needed to encode all possible meanings
# of that message, assuming all messages are equally likely. For
# example, the day-of-the-week field in a database contains no more
# than 3 bits of information, because the information can be encoded
# with 3 bits:
# [example encoding snipped]
#
# If this information were represented by corresponding ASCII
# character strings, it would take up more memory space but would
# not contain any more information. Similarly, the "sex" field of
# a database only contains 1 bit of information, even though it might
# be stored as one of two 6-byte ASCII strings: "MALE" or "FEMALE".
#
# Formally, the amount of information in a message M is measured by
# the entropy of a message, denoted by H(M).

As the notation H(M) is used in information theory, M is a random
variable, not a message or bit string. It's quite common, even among
mathematicians when speaking informally, to fudge the difference
between a random variable and an outcome of that random variable,
and "Applied Cryptography" is not the right book in which to look for
a more formal treatment than that.

# The entropy of a message indicating sex is 1 bit; the entropy of a
# message indicating the day of the week is slightly less than 3 bits.

Note that the examples given are distributions, not particular messages.
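
A small sketch (mine, not Schneier's) of H(M) evaluated on those two
distributions, to make the point explicit:

    import math

    def entropy(probs):
        # Shannon entropy, in bits, of a discrete distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))    # sex field: exactly 1 bit
    print(entropy([1 / 7] * 7))   # day of the week: ~2.807 bits, "slightly less than 3"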


[Gal68] R. G. Gallager,
        Information Theory and Reliable Communications,
        New York: John Wiley & Sons, 1968.

- -- 
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5  0F 69 8C D4 FA 66 15 01


=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv

iQEVAwUBOO03xTkCAxeYt5gVAQFotQf+N/erSJ+G69xIPMvmJBwHoNuv/8Z4Z1sM
SaSzlcOJ858nz9NGxh2B3VX/eGm/yXGtZNH7Zrr8j5D4QQJdryjefxIr/ZC6rMMc
ve8eyAAkqfJDn/U2peA44TMBRDlia0+dEml0X+SyyqJYlM3c7p9o+PbWGSR9Mrgh
17WTqJW6D8ubDh40dPVql8xxxuOYZ7xEroIMyvR6F9s9WOz5S6s9vS59CToi7Z7p
WQr1rQM05C0ndecJgqPxt//vDixLm8fPCIerIw62gW5feTKPOVyeuRWIHfDwOvMh
HKsw1W+7hkuxvYqAimBpNsXFWzkjdJ+hq5zpAg5w8jKi9QN2v7YdEg==
=Rzcn
=====END PGP SIGNATURE=====



------------------------------

Date: Fri, 07 Apr 2000 02:54:55 +0100
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Q: Entropy

=====BEGIN PGP SIGNED MESSAGE=====

Mok-Kong Shen wrote:
> David Hopwood wrote:
> >
> > Yes - which means that if you fix any specific pattern of 40 outcomes
> > in advance, you won't see that pattern from an unbiased coin, either.
> 
> In any scientific experiment one never (CAN never) 'fix' any
> results in advance.

In a scientific experiment you should always fix the hypothesis in
advance.

Here, the hypothesis, parameterised by a bit sequence x, is "the outcome
of 40 tosses of this unbiased coin will not be x_1..x_40". A hypothesis
of this form for any x will never be proven wrong in practice by
experiment, if we are talking about tosses of real coins (if we are
simulating coin tosses using an RNG with a high output rate, change
the sequence length to 100 and the same principle applies).
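
A back-of-the-envelope sketch (my own numbers) of why such a
counterexample is never seen in practice:

    p_match = 2.0 ** -40                  # chance one 40-toss run equals a fixed x_1..x_40
    runs_per_year = 365.25 * 24 * 3600    # one 40-toss run per second, say
    print(p_match)                        # ~9.1e-13
    print(1 / (p_match * runs_per_year))  # ~35,000 years, on average, before a match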

This of course demonstrates a limitation of the scientific method in
cases where counterexamples to the hypothesis occur rarely. It's the
reason why you can't prove that software is bug-free by experiment,
for example (which in that case is called "testing").

> The result of a scientific experiment comes out, AFTER the experiment
> is done, as far as I am aware.

Well, in most cases experiments are done with the expectation of a
particular result (consistent with the hypothesis). If the result is
not as expected, the experimenters do a great deal of checking of their
assumptions, equipment, and the details of the experimental set-up
before concluding that the hypothesis was wrong.

- -- 
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5  0F 69 8C D4 FA 66 15 01


=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv

iQEVAwUBOO0/3jkCAxeYt5gVAQGE/Af+K7BOPGlCDo3ntyWrKhQtWgLJYE9CDupT
U7dti2X/ySWWW96Iztrt+VBRtPjgWrSC5TtAyvKYvIyE8hopKaTkgA6P3LDyGpNM
2DCtUu5jWQRXnFvu4Tq48PHSbq2kDGPDo2imu0ZZVFn1KMCJDt4Smg8F/bkqgjvo
mdyKHjVuQfFN3ri/ICkvPAXe29Yb0M+JrQ8mAbP5elJcEWlzshX/Y7RmNIYKEWF3
MTLjE62Tpl5x4n40ePVDiY4gnHLawqrgLjuQ6aQ2XpTSUkQoqrNHVFPQXvXQ0pi8
tp2vIEAa1DYve79IbHhNgGT7ynx5T/0GDb3hkO7frF7H1myWIDYNJw==
=tsR/
=====END PGP SIGNATURE=====


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
