
Cryptography-Digest Digest #280, Volume #14   Tue, 1 May 01 19:13:00 EDT

Contents:
  Eurocrypt presentation on NSS (Dan Bailey)
  Re: Censorship Threat at Information Hiding Workshop (Paul Pires)
  Re: RSA BRUTE FORCE (Jack Brennen)
  Re: Style of discussions (John Bailey)
  Any one got a Clue on configuring freeswan? (mickey)
  Re: RSA BRUTE FORCE ([EMAIL PROTECTED])
  Feige-Fiat-Shamir and Guillou-Quisquater Identification Schemes (Brice Canvel)
  Re: Feige-Fiat-Shamir and Guillou-Quisquater Identification Schemes (Quisquater)
  Re: RSA BRUTE FORCE (John Savard)
  Re: I do not feel secure using your program any more. (James Felling)
  Re: Censorship Threat at Information Hiding Workshop (Darren New)
  Re: Intacta.Code ... (Thomas Christensen)
  Re: Intacta.Code ... (Thomas Christensen)
  Re: Censorship Threat at Information Hiding Workshop (Paul Pires)



From: Dan Bailey [EMAIL PROTECTED]
Subject: Eurocrypt presentation on NSS
Date: Tue, 1 May 2001 16:06:40 -0400

Recently, there have been postings on this group claiming that the NSS scheme
outlined in the NSS paper to be presented at Eurocrypt on May 8 has been
attacked, and that any NSS signature can be easily forged.  There have
also been pointers to a web site making these claims without providing any
details.  It is unfortunate that such allegations are widely disseminated
without detailed supporting material. We should stress that these attacks
do not undermine the security of NSS and would like to comment on these
recent discoveries.

First, there are indeed two attacks that have been put forth against the NSS
signature scheme as described in the Eurocrypt paper.  The first pertains
to an oversight in the paper, where a signature validation condition was
oversimplified.  This permitted an easy (and easily countered) forgery
attack via linear algebra. It appears Jacques Stern has found a version of
this attack as well, and a preliminary announcement has been posted to his
website. Lacking the details of Stern's methods, we cannot comment further
at this time.  However, we would like to stress that the signatures posted
on his website are *not* valid signatures, as they fail to satisfy several
defining conditions sketched in the Technical Note referred to below.

The second is a very interesting strengthening of a statistical transcript
attack first described in our paper.  This attack shows a manner in which
transcripts of a few tens of thousands of signatures could be sufficient
to leak significant information.  This is countered by strengthening the
encoding/masking methods already mentioned in the NSS paper as a defense
against averaging attacks.

Further details will be provided during our Eurocrypt presentation. NTRU's
current software toolkits already include the strengthened defenses.

Again, we would like to emphasize that these attacks do not undermine the
security of NSS.  They were directed at certain encoding/validation
aspects of the Eurocrypt paper and do not impact the underlying NSS
algorithm.

You can learn more about the NSS algorithm by reviewing detailed Technical
Note #017: Enhanced Encoding and Verification Methods for the NTRU
Signature Scheme on our web site located at
http://www.ntru.com/technology/tech.technical.htm.

Posted on behalf of Jeffrey Hoffstein, co-Vice President Research and
Founder, NTRU



--

From: Paul Pires [EMAIL PROTECTED]
Subject: Re: Censorship Threat at Information Hiding Workshop
Date: Tue, 1 May 2001 13:17:58 -0700


Roger Schlafly [EMAIL PROTECTED] wrote in message 
news:ahDH6.125$[EMAIL PROTECTED]...
 Paul Pires [EMAIL PROTECTED]
  is to establish a tangible asset which can be protected and valued. 14 to
  20 years sounds about right to me. I have never understood why the term
  and the renewal provisions are so different for copyright versus patent.
  I suspect there were some vested interests in the publishing industry
  that had to be appeased. 56 years? That's absurd.

 Neither patents nor copyrights can be renewed. Patents expire 20 years
 after the application date. Copyrights last for the life of the author, plus
 70 years.

Bad wording on my part. Extensions are not renewals. The point remains that
Congress doesn't feel the need to constantly muck with patent terms.

 Every 20 years or so, when the Mickey Mouse copyright is about to
 run out, Congress extends the copyright term for another 20 years.
 The last extension was challenged in the courts (Eldred v. Reno), but
 the courts have upheld the extension (so far). For more info, see:

Nice reference. I have to agree with the dissenting opinion that retroactively
extending a term is in conflict with the limits placed on Congress.

By securing for limited times to authors and inventors the exclusive right to
their respective writings and discoveries.

If a term can be extended retroactively then it cannot be seen as very limited.
Consideration


Cryptography-Digest Digest #280, Volume #13   Wed, 6 Dec 00 04:13:00 EST

Contents:
  Re: Math background required for Cryptology ? (Tom St Denis)
  Re: What's better CAST in PGP or Blowfish 128bit? (Tom St Denis)
  Re: Possibly-new attack on D-H? (Paul Rubin)
  Re: About governments and my ex-relatives in Finland and the U.S.A. ...  ("John A. Malley")
  Re: [Question] Generation of random keys (Benjamin Goldberg)
  Re: MD5 byte order (Benjamin Goldberg)
  Re: [Question] Generation of random keys ("John A. Malley")
  Re: Journal of Craptology (David A Molnar)
  Re: About governments and my ex-relatives in Finland and the U.S.A. ...  (Error_404)
  Re: newbie: how to persuade my managment not to do our own home-grown encryption? (Jon Haugsand)
  Re: Simulataneous encryption and authentification (was IBM's new algorithm) (David Wagner)
  Re: ARCFOUR (RC4) used for CipherSaber-N (Glide)
  Re: Smart Card vs 1.44 Disk (Francois Grieu)
  Re: About governments and my ex-relatives in Finland and the U.S.A. ...  (Richard Heathfield)
  Re: What's better CAST in PGP or Blowfish 128bit? (Bill Unruh)
  Re:  Are AES algorithms export restricted? (Bill Unruh)
  Re: Revised cipher (Jorgen Hedlund)



From: Tom St Denis [EMAIL PROTECTED]
Subject: Re: Math background required for Cryptology ?
Date: Wed, 06 Dec 2000 03:29:07 GMT

In article BhgX5.134660$[EMAIL PROTECTED],
  "Ryan J Schave" [EMAIL PROTECTED] wrote:
 I have recently become interested in cryptology.  Unfortunately my knowledge
 of math is pretty weak.  I imagine this small detail will hold me back from
 learning everything I can about cryptology.  I have pulled out my old math
 books from college and looked at the TOC of each of them.

 What topics in math should I have a firm grasp of before I can expect to get
 the most out of cryptology?  Obviously many topics in math are based on other
 topics, but I don't want to spend time teaching myself stuff that I won't
 use in my study of cryptology.

 Hope this makes sense.

Hmm, well, you should be familiar with programming languages such as C.
You should also have a familiarity with how a processor works (assembly
language at least), and some finite math and linear algebra.

If you want to get into PK crypto you need to know a lot of Number
Theory.  If you want to get into symmetric crypto you need to know a lot
of discrete mathematics.

All in all, if you want to be an avid amateur I would suggest high
school level maths.

If you want to be a pro, go to university and take compsci/math courses.

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

--

From: Tom St Denis [EMAIL PROTECTED]
Subject: Re: What's better CAST in PGP or Blowfish 128bit?
Date: Wed, 06 Dec 2000 03:26:53 GMT

In article 90jlus$2rpc$[EMAIL PROTECTED],
  "Noname" [EMAIL PROTECTED] wrote:
 I need strong algorithm and it can be slow in encrypt/decrypt. I need
the
 best:o).

You need to learn about crypto is what you need.

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

--

From: Paul Rubin [EMAIL PROTECTED]
Subject: Re: Possibly-new attack on D-H?
Date: 05 Dec 2000 20:17:06 -0800

Tom St Denis [EMAIL PROTECTED] writes:
 I believe in PGP new primes are chosen for each new DH/DSS key. 

No, there's a table of fixed Sophie Germain primes of various lengths.
They are generated according to an algorithm given in a comment in
the code.  It also says how long it took to generate each one.  For
the 1024 bit one I think it was on the order of an hour.
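
A Sophie Germain prime is a prime p for which 2p+1 is also prime (2p+1 is then
a "safe" prime, the usual choice for a fixed D-H modulus).  Purely as an
illustration -- this is not PGP's generation algorithm, and the 256-bit size is
a toy choice so the search finishes quickly -- such a search could look like
the following Python sketch:

# Illustrative sketch only -- not PGP's code.  Finds a random prime p such
# that 2p + 1 is also prime.
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % small == 0:
            return n == small
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_sophie_germain_prime(bits=256):
    """Try random odd candidates until both p and 2p+1 are (probably) prime."""
    while True:
        p = random.getrandbits(bits) | (1 << (bits - 1)) | 1   # force size, oddness
        if is_probable_prime(p) and is_probable_prime(2 * p + 1):
            return p

print(random_sophie_germain_prime(256))

Both p and 2p+1 must be prime simultaneously, which makes such primes much
rarer than ordinary primes and is why the 1024-bit generation mentioned above
plausibly took on the order of an hour.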

--

From: "John A. Malley" [EMAIL PROTECTED]
Subject: Re: About governments and my ex-relatives in Finland and the U.S.A. ... 
Date: Tue, 05 Dec 2000 20:19:54 -0800


JustBrowsing wrote:
 
[snip]

 Actually I'm kidding, but I can't believe this is a real message. Probably a
 test to see who's really awake in this newsgroup :)

Sometimes I wonder if these posts aren't truly:

1. Steganographic messages to some person or cell of persons.
2. Code signals to some person or cell of persons (some key word or
phrase is in the message and means something to those with the code book
or agreed upon use)

The crafted tone and content lead the casual reader to dismiss the
postings as inconsequential ramblings. This may deliberately hide the
true meaning.

It's just unusual to see these postings appear when they do, in
clusters, followed by relatively long periods of silence.

Or maybe they are just someone's idea of a joke on the readers of
sci.crypt and other USENET groups?

John A. Malley
[EMAIL PROTECTED]

--

From: Benjamin Goldberg [EMAIL PROTECTED]
Subject: Re: [Question] Generation of random keys
Date: Wed, 06 Dec 2000 04:20:31 GMT

Per Claesson wrote:
 
 Alan Rouse wrote:
 
  The original post on this thread was requesting source code to
  ge


Cryptography-Digest Digest #280, Volume #12  Mon, 24 Jul 00 12:13:01 EDT

Contents:
  Re: random malar-key (Mike Rosing)
  Re: Rabin's information dispersal algorithm (Mike Rosing)
  Re: Random Appearance (Paul Koning)
  Re: Proving continued possession of a file (Ríkharður Egilsson)
  Re: Hash function? (Boris Kazak)
  Re: Random Appearance ("Tony T. Warnock")
  Re: Proving continued possession of a file (Mark Wooding)
  books by W. F Friedman (Charles Blair)
  Re: New stream cipher ("Trevor L. Jackson, III")
  Re: Practical application of Ciphersaber (Was: RC4-- repetition length?) ("Trevor L. Jackson, III")
  Re: Hash function? (Sander Vesik)
  Re: Hash function? ("Scott Fluhrer")
  Re: Practical application of Ciphersaber ("Scott Fluhrer")



From: Mike Rosing [EMAIL PROTECTED]
Subject: Re: random malar-key
Date: Mon, 24 Jul 2000 08:18:46 -0500

Abyssmal_Unit_#3 wrote:
 
 don't worry, it only takes power away (can u spare it?), it doesn't (shouldn't)
 introduce any unwanted aberrations of thought.
 heee !   ;-))
 
 ok, maybe try hooking up like an ECG instead? how about an aura detector?  ;-D

EEGs would work fine.  You just have to sample at 10 kHz and integrate for
at least 1000 samples.  The result (modulo your sample bit width) will be
random.
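
A toy illustration of that integrate-and-truncate idea (random.gauss() below
is only a stand-in for a real EEG/ADC sample stream -- an assumption made so
the example runs, not a claim about real EEG statistics):

# Sum a block of noisy samples, keep the sum modulo a small power of two,
# then optionally debias the resulting bits.
import random

def noisy_sample():
    """Stand-in for one reading from a digitizer (centered, with jitter)."""
    return int(512 + random.gauss(0, 30))

def raw_word(block_size=1000, bit_width=8):
    """Integrate a block of samples; keep the sum modulo 2**bit_width."""
    return sum(noisy_sample() for _ in range(block_size)) % (1 << bit_width)

def von_neumann_debias(bits):
    """Classic whitening step: pair 01 -> 0, pair 10 -> 1, discard 00 and 11."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

words = [raw_word() for _ in range(16)]
bits = [(w >> i) & 1 for w in words for i in range(8)]
print("raw 8-bit words:", words)
print("debiased bits:  ", von_neumann_debias(bits))

Any real use would of course need statistical testing of the actual hardware,
not a simulation.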

It's stuff like this that got me into crypto in the first place.  About 15 or
so years ago I built an EEG digitizer.  (I also was pumping signals into my
head, but it worked, so I stopped :-)  What I wanted to do was ship EEG signals
around, mostly for gaming and music purposes, but I could see lots of other
applications.  A direct link to my thoughts is something worth protecting,
and I think strong crypto is required.

I gave up on EEG's, but crypto is still fun!

Patience, persistence, truth,
Dr. mike

--

From: Mike Rosing [EMAIL PROTECTED]
Subject: Re: Rabin's information dispersal algorithm
Date: Mon, 24 Jul 2000 08:34:48 -0500

Wei Dai wrote:

 IDA is a method of producing k pieces from a message such that any n of
 them can be used to recover the message. If the message length is m,
 then each piece has size m/n, and it takes O(m*k) operations to
 generate the pieces and O(m*n) operations to recover the message. The
 new method improves the run-time to O(m+m*(k-n)) operations to generate
 the pieces and O(m+n^2+m*r) operations to recover the message, where r
 is the number of pieces among the first n pieces that are missing.

 The new method treats the message as the values of a degree n-1
 polynomial evaluated at points 0,...,n-1. The first n pieces are
 produced simply by cutting the message into n equal sized chunks. The
 other k-n pieces are produced by interpolating the polynomial at k-n
 other points. Recovery involves interpolating the values of the
 polynomial at only the missing points. The coefficients of the
 polynomial are never explicitly determined.

I assume you mean the coefficients are not determined during recovery,
but they must be known during generation.  How does one know which
points are "missing" if you don't know the coefficients?
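
To make the construction Wei Dai describes concrete, here is a minimal sketch
over a small prime field, one field element per piece.  The field, the chunk
values, and the plain Lagrange-style evaluation are my own illustrative
choices, not Wei Dai's code, and the sketch skips the efficiency refinements
he mentions:

# The message is viewed as the values of a degree n-1 polynomial at points
# 0..n-1; the k-n extra pieces are its values at further points; recovery
# interpolates from any n surviving (point, value) pairs.
P = 2**31 - 1   # a convenient Mersenne prime, purely illustrative

def lagrange_eval(points, values, x):
    """Value at x of the unique degree len(points)-1 polynomial through
    (points[i], values[i]), working mod P."""
    total = 0
    for i, (xi, yi) in enumerate(zip(points, values)):
        num, den = 1, 1
        for j, xj in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P   # den^-1 via Fermat
    return total

def disperse(chunks, k):
    """chunks: n message chunks (values at 0..n-1).  Return k (point, value)
    pieces: the n originals plus k-n redundant evaluations at n..k-1."""
    n = len(chunks)
    extra = [lagrange_eval(range(n), chunks, x) for x in range(n, k)]
    return list(enumerate(list(chunks) + extra))

def recover(pieces, n):
    """pieces: any n of the (point, value) pairs.  Return chunks 0..n-1."""
    pts = [p for p, _ in pieces]
    vals = [v for _, v in pieces]
    return [lagrange_eval(pts, vals, x) for x in range(n)]

message = [12, 345, 6789, 1011]                            # n = 4 chunks
pieces = disperse(message, k=6)                            # 2 redundant pieces
survivors = [pieces[1], pieces[3], pieces[4], pieces[5]]   # pieces 0 and 2 lost
print(recover(survivors, n=4))                             # [12, 345, 6789, 1011]

In this sketch the "missing" points are simply the indices of the lost pieces;
each piece carries its evaluation point, so coefficients are never needed at
either end.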

Patience, persistence, truth,
Dr. mike

--

From: Paul Koning [EMAIL PROTECTED]
Subject: Re: Random Appearance
Date: Mon, 24 Jul 2000 10:00:43 -0400

Mack wrote:
 
 Joseph Ashwood wrote:
  That's not correct. OTP's are subject to known-plaintext
  attacks, ...
 
 No, they are not, since the only plaintext that is "recovered"
 is exactly the plaintext that was already known.  The point
 being that "One-Time Pad" in such a context means the
 theoretically secure system, not a potentially flawed
 attempt at implementation of such a system.
 
 
 
 The code book systems to which I was referring are exactly
 the same as OTPs.  The pad may be reused at the risk of
 a known-plaintext attack.  The words for messages are
 compiled into a list; each word has a different meaning.
 These pads are usually only a couple of pages and then
 compiled into a book.  Different 'chapters' are used depending
 on some information. This system was used during WW2
 if I am not mistaken.  Each 'chapter' is an OTP.  If a chapter is
 reused it is subject to known-plaintext attack.  Of course if the
 OTP is stolen it is also broken.  In actual use pads were used
 for a certain time period.

Clearly if they were used "for a certain time period" they
were nothing at all like OTP.

Your description sounds thoroughly muddled.  Can you point
to a reference that supports the "one time" notion of
these "code books"?  I find it hard to believe considering
the work involved in creating codes by hand.

What does sound familiar is code books of widely varying
sizes (from one page tactical codes to quite large code
books).  And of course all of these would be reissued
from time to t


Cryptography-Digest Digest #280, Volume #11   Wed, 8 Mar 00 16:13:02 EST

Contents:
  Re: Your Recommended Choice On Std Crypto Parts (Doug Stell)
  Re: cannot understand CFB mode code.. ("Steve A. Wagner Jr.")
  Re: IDEA question. ("Steve A. Wagner Jr.")
  Re: Your Recommended Choice On Std Crypto Parts (Terry Ritter)
  Re: CONFERENCE ON NATURALISM -- FINAL NOTICE (John Savard)
  Re: Can someone break this cipher? ("Steve A. Wagner Jr.")
  Re: are self-shredding files possible? (Michael Sierchio)
  Re: The Voynich manuscript (John Savard)
  Re: why xor?(look out,newbie question! :) (John Savard)
  Re: why xor?(look out,newbie question! :) (John Savard)
  Re: NIST, AES at RSA conference (Tim Tyler)
  Re: Where do I get it? ("Steve A. Wagner Jr.")



From: [EMAIL PROTECTED] (Doug Stell)
Subject: Re: Your Recommended Choice On Std Crypto Parts
Date: Wed, 08 Mar 2000 19:54:58 GMT

On Tue, 07 Mar 2000 19:06:45 GMT, [EMAIL PROTECTED] wrote:
Ben,

Here's a few answers and opinions. Questions best left to others have
been removed from this response.

If I want speed and good security for encrypting
a data stream, and I only want one choice of
crypt algorithm for each part, with no patent $
issues, what should I choose; more importantly,
what would You use?

The cost comes in two parts; patent cost for the algorithm and cost of
using someone's implementation. The algorithm patent cost depends in
part on where you are and where you intend to use the algorithm.

I do not want to support N different algorithms,
or key length choices, etc.
The data is not ultra-sensitive, i.e., not e-
commerce, or long-term secrets.

Being able to gracefully transition between algorithms in the future
is a good feature to consider in the protocol. By graceful I mean the
ability to transition without having to render clients unable to
communicate until all communicating clients are replaced.

 - Public Key? : D-H

For anonymous key agreement, D-H is fine.
For authenticated key agreement, KEA (a dual D-H) is a good choice.
Outside the USA, RSA is available royalty free.
Other choices involve ElGamal, DSA and hybrids, depending on what you
want to do.
Remember that many of these algorithms require that you have an
authenticatable public key to avoid impersonation attacks.
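
As a reminder of what anonymous (unauthenticated) key agreement looks like,
and why the warning above matters, here is a toy D-H exchange in Python.  The
64-bit modulus is far too small for real use and is chosen only so the example
runs instantly; nothing here authenticates the public values, so a
man-in-the-middle could impersonate either side:

# Toy anonymous Diffie-Hellman exchange -- demo parameters, not secure ones.
import secrets

P = 0xFFFFFFFFFFFFFFC5     # a 64-bit prime (2**64 - 59); far too small for real use
G = 5

def keypair():
    x = secrets.randbelow(P - 2) + 1       # private exponent in [1, P-2]
    return x, pow(G, x, P)                 # (private, public)

a_priv, a_pub = keypair()                  # "Alice"
b_priv, b_pub = keypair()                  # "Bob"

shared_a = pow(b_pub, a_priv, P)           # Alice's view of the shared secret
shared_b = pow(a_pub, b_priv, P)           # Bob's view

assert shared_a == shared_b
print(hex(shared_a))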

 - Cryptographic Hash? : SHA-1???
  - Is SHA1 $ free in all situations?

I believe so.

  - Given the data is not highly sensitive, is a weaker,
faster crypt hash acceptable?

MD4 is about the fastest hash you can get that isn't totally bad. MD4
has been shown to have weaknesses, and both MD5 and SHA are
improvements on MD4. SHA-1 is a better improvement than MD5, it turns
out. MD4 isn't used much any more, but it does offer a choice of high
speed versus lower security, if that is the tradeoff you wish to make.
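
The speed/security trade-off is easy to measure on your own hardware.  A rough
sketch using Python's hashlib (MD4 is only available if the underlying OpenSSL
build still provides it, hence the guarded lookup; throughput numbers are
machine-dependent):

# Rough throughput comparison of common hashes on one machine.
import hashlib
import time

def rate(name, data, rounds=200):
    try:
        hashlib.new(name)                   # availability check
    except ValueError:
        return None
    start = time.perf_counter()
    for _ in range(rounds):
        hashlib.new(name, data).digest()
    elapsed = time.perf_counter() - start
    return len(data) * rounds / elapsed / 1e6   # MB/s

data = b"\x00" * (1 << 20)                  # 1 MiB buffer
for name in ("md4", "md5", "sha1"):
    mbps = rate(name, data)
    label = "unavailable" if mbps is None else "%8.1f MB/s" % mbps
    print("%-5s: %s" % (name, label))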

doug


--

From: "Steve A. Wagner Jr." [EMAIL PROTECTED]
Subject: Re: cannot understand CFB mode code..
Date: Wed, 08 Mar 2000 16:32:27 -0800

Perhaps my implementation will be easier to understand... This was
derived from some code I downloaded from cryptography.org:

//BLOWFISH.H
#ifndef __BLOWFISH_H
#define __BLOWFISH_H

#ifndef uchar
#define uchar unsigned char
#endif
#ifndef ushort
#define ushort unsigned short
#endif
#ifndef ulong
#define ulong unsigned long
#endif

#ifndef ENCRYPT
#define ENCRYPT 1
#endif
#ifndef DECRYPT
#define DECRYPT 2
#endif

typedef struct {uchar ptx[8],ctx[8],pos;} bf_context;
void    bf_encrypt(ulong *xl, ulong *xr);
void    bf_decrypt(ulong *xl, ulong *xr);
void    bf_init(uchar *key, ulong keylen);
void    bf_cfb_init(bf_context *bf, uchar *key, ulong keylen, uchar riv[8]);
uchar   bf_cfb(bf_context *bf, uchar n, char direction);

#endif//__BLOWFISH_H

//BLOWFISH.C
#include "blowfish.h"

#define S(x,i) (bf_S[i][x.w.byte##i])
#define bf_F(x) (((S(x,0)+S(x,1))^S(x,2))+S(x,3))
#define bf_R(a,b,n) (a.word^=bf_F(b)^bf_P[n])
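/* bf_F is the Blowfish round function: the four S-box lookups combined as
   ((S0 + S1) ^ S2) + S3; bf_R applies one Feistel round, xoring the round
   output and subkey bf_P[n] into the other half. */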

union aword {
  ulong word;
  uchar byte[4];
  struct { ulong byte3:8, byte2:8, byte1:8, byte0:8; } w;
};

static ulong bf_P[16+2]=
{0x243F6A88,0x85A308D3,0x13198A2E,0x03707344,
 0xA4093822,0x299F31D0,0x082EFA98,0xEC4E6C89,
 0x452821E6,0x38D01377,0xBE5466CF,0x34E90C6C,
 0xC0AC29B7,0xC97C50DD,0x3F84D5B5,0xB5470917,
 0x9216D5D9,0x8979FB1B,
};

static ulong bf_S[4][256]=
{0xD1310BA6,0x98DFB5AC,0x2FFD72DB,0xD01ADFB7,
 0xB8E1AFED,0x6A267E96,0xBA7C9045,0xF12C7F99,
 0x24A19947,0xB3916CF7,0x0801F2E2,0x858EFC16,
 0x636920D8,0x71574E69,0xA458FEA3,0xF4933D7E,
 0x0D95748F,0x728EB658,0x718BCD58,0x82154AEE,
 0x7B54A41D,0xC25A59B5,0x9C30D539,0x2AF26013,
 0xC5D1B023,0x286085F0,0xCA417918,0xB8DB38EF,
 0x8E79DCB0,0x603A180E,0x6C9E0E8B,0xB01E8A3E,
 0xD71577C1,0xBD314B27,0x78AF2FDA,0x55605C60,
 0xE65525F3,0xAA55AB94,0x57489862,0x63E81440,
 0x55CA396A,0x2AAB10B6,0xB4CC5C34,0x1141E8CE,
 0xA15486AF,0x7C72E993,0xB3EE1411,0x636FBC2A,
 0x2BA9C55D,0x741831F6,0xCE5C3E16,0x9B87931E,
 0xAFD6B


Cryptography-Digest Digest #280, Volume #10  Mon, 20 Sep 99 14:13:03 EDT

Contents:
  Re: Okay "experts," how do you do it? (Sundial Services)
  Re: some information theory (Tim Tyler)
  Re: Large number arithmetic (Anton Stiglic)
  Re: Okay "experts," how do you do it? ([EMAIL PROTECTED])
  Re: Okay "experts," how do you do it? ([EMAIL PROTECTED])
  Re: Okay "experts," how do you do it? (Patrick Juola)
  Re: unix clippers that implement strong crypto. (SCOTT19U.ZIP_GUY)
  Re: Comments on ECC (DJohn37050)
  Re: FPGAs (Medical Electronics Lab)
  Re: Okay "experts," how do you do it? (Tom St Denis)
  AES finalists AMD Athlon performance? (David Crick)
  Re: Comments on ECC ("Joseph Ashwood")



Date: Mon, 20 Sep 1999 08:26:39 -0700
From: Sundial Services [EMAIL PROTECTED]
Reply-To: [EMAIL PROTECTED]
Subject: Re: Okay "experts," how do you do it?

Ahh yes, the NP-complete problem.

Okay, then, "for-GET how it works."  Let's look only at the input, the
output, and the key.  Let's pretend we cannot determine the algorithm.

What is it about an algorithm's input and output that enables us to say
that it is a "good" encryption algorithm?  What is it about the
algorithm's dependence upon its two inputs, 'p' and 'k', as realized in
the output 'c', that makes the algorithm a "good" one or a "bad" one?





 Patrick Juola wrote:

  There OUGHT to be an objective test-bed that we can plug these
 algorithms into, to test them.
 
 Actually, it's much harder than you think to come up with an algorithm
 for analyzing other algorithms.  In point of fact, that's one of the
 few things that's easy to prove *impossible* in computer science

--

From: Tim Tyler [EMAIL PROTECTED]
Subject: Re: some information theory
Reply-To: [EMAIL PROTECTED]
Date: Mon, 20 Sep 1999 15:46:33 GMT

Anti-Spam [EMAIL PROTECTED] wrote:
: Tim Tyler wrote:
: Anti-Spam [EMAIL PROTECTED] wrote:
: : Tim Tyler wrote:

: If you have a set of N target data strings in need of compressing, the
: optimum compression technique (in terms of size) is essentially to map
: these strings onto the integers from 1 to N.
: 
: Given a string taken at random from the starting set, the resulting
: "compressed file" will be indistinguishable from random.

: Let me paraphrase, to make sure I understand this assertion:

: Start with a set of N data strings.  Assume each data string is an
: encoding for a symbol S.
: I assume this set corresponds then to N symbols  ( S1, S2, S3 ... Sn )
: that will appear in messages (compressed data/files). Call this set A.

: Conjecture: the optimum compression (here defined in terms of size of
: the resulting compression-encoded data/file) is achieved when (1) the N
: symbols are encoded as integers 1 through N in the compressed data/file,
: irrespective of the frequency of occurrence of the Nth symbol in any
: particular message. 
: Furthermore, (2) the random assignment of the integer index value code
: for the resulting compression encoded data/file will pass all confidence
: tests for random bit strings of the length of the resulting compressed
: data/file. 

: Part (1):

: here is the set A: { S1, S2, S3 ... Sn }
: index into set A:0,  1 , 2  ... N-1   -  N integer values. 

: Assume the symbols S1 ... Sn do not occur equiprobably. The maximal
: entropy code (and thus minimal number of bits and thus minimal sized bit
: strings for messages composed of the symbols) requires 

: H = - (SUM over i = 1 to n )[ ( probability of Si ) * ( log(base2) of
: probability of Si) ] 

: where probability of Si = number of occurrences of the ith symbol / total
: number of symbols that occur. Occur where, how?  Look at a
: pre-compressed message M composed of Q occurrences of symbols found in
: the set A.  For some messages maybe all of the symbols in A occur. For
: others, maybe only a subset of the symbols in A occur.
: The frequencies of occurrence are a function of the particular message
: M.  

: So each symbol can be encoded as one out of 2 ^ H unique possible binary
: strings, with H varying from message M1, M2, M3...  (Static and adaptive
: huffman codes create strings shorter and longer than H bits such that
: the *average* number of bits per symbol tends to H bits, to handle the
: fractional bit values given by this formula for maximal entropy
: encoding.) 

: There are N symbols in the set A. An integer index into the set A
: requires ( log(base 2)of N ) bits to be represented. The use of an
: integer value (index) to code for the symbol in the compressed data/file
: results in the minimal size compressed data/file iff:

:   H =  ( log(base 2)of N ).  

: i.e. this is only true if the probability of occurrence for each Si is the
: same for all symbols. All symbols as they appear in a specific message
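
The entropy bound under discussion is easy to check numerically.  The sketch
below uses a made-up four-symbol distribution, chosen only for illustration;
it computes H = -sum( p_i * log2(p_i) ) and compares it with log2(N), which
agree exactly when the symbols are equiprobable and otherwise leave H below
log2(N):

# Numerical check: H <= log2(N), with equality iff all symbols are equiprobable.
from collections import Counter
from math import log2

def entropy(message):
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

uniform = "ABCD" * 100                            # 4 symbols, equiprobable
skewed  = "A" * 70 + "B" * 20 + "C" * 9 + "D"     # 4 symbols, very uneven

for name, msg in (("uniform", uniform), ("skewed", skewed)):
    print("%-7s  H = %.3f bits/symbol   log2(N) = %.3f"
          % (name, entropy(msg), log2(len(set(msg)))))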