Cryptography-Digest Digest #510, Volume #13      Sun, 21 Jan 01 02:13:01 EST

Contents:
  Re: Dynamic Transposition Revisited (long) (gcone)
  Re: using AES finalists in series? (Terry Ritter)
  Re: using AES finalists in series? (Terry Ritter)
  How to pronounce Vigenere (Richard John Cavell)
  Re: using AES finalists in series? (Terry Ritter)
  Re: using AES finalists in series? ("Douglas A. Gwyn")
  Re: Differential Analysis (Richard Heathfield)
  Re: Transposition code (Richard Heathfield)
  Re: Kooks (was: NSA and Linux Security) (Greggy)
  Re: ECC Domain Generation ("Michael Scott")
  JPEG infidelity for crypto (wtshaw)

----------------------------------------------------------------------------

From: gcone <[EMAIL PROTECTED]>
Subject: Re: Dynamic Transposition Revisited (long)
Date: Sat, 20 Jan 2001 22:10:10 -0800


Terry Ritter wrote:
[snip]
> 
> A Dynamic Transposition cipher is conceptually very simple:
> 
>    (1) We collect plaintext data in bit-balanced (or almost
>        bit-balanced) blocks.
> 
>    (2) We shuffle the bits in those blocks under the
>        control of a keyed pseudorandom sequence.
> 

The strength of dynamic transposition rests on these two points. 

The benefit of bit-balancing is explained as follows in the original
post:

> 
> When every plaintext block is exactly bit-balanced, any
> possible plaintext block is some valid bit-permutation of
> any ciphertext block.  So, even if an opponent could
> exhaustively un-permute a ciphertext block, the result
> would just be every possible plaintext block.  

and a concrete means of bit-shuffling (a.k.a. pseudorandom permutation)
is offered as

> 
> .)  The usual solution is the well-known algorithm by
> Durstenfeld, called "Shuffle," which Knuth II calls
> "Algorithm P (Shuffling)," although any valid permutation
> generator would be acceptable.
> 
[snip]
> 
> If we shuffle each block just once, an opponent who somehow
> knows the correct resulting permutation can use that
> information to reproduce the shuffling RNG sequence, and
> thus start to attack the RNG.  And even though we think
> such an event impossible (since the correct permutation is
> hidden by a plethora of different bit-permutations that
> each produce exactly the same ciphertext from exactly the
> same plaintext), eliminating that possibility (by shuffling
> each block twice) is probably worthwhile.  This does not
> produce more permutations, it just hides shuffling sequence.

Algorithm P cannot generate more than M distinct permutations when a
linear congruential generator of modulus M is used as the PRNG per
Knuth, Vol. 2, section 3.4.2 Random Sampling and Shuffling. (Page 145 in
the Third Edition.)  

In fact, Algorithm P cannot possibly generate more than M distinct
permutations when the PRNG is of the form X(n+1) = f(X(n)) such that
X(n) can take on only M distinct values. (Knuth, same reference.)

Consider a simple multiplicative congruential generator on Z*_p, such as
X(n+1) = g * X(n) mod p, where p is prime, g is a generator of Z*_p, and
the seed S, 0 < S < p, is the initial value X(0).  Then the nth value of
the generator is X(n) = ( S * g^n ) mod p, which takes on p-1 distinct
values.  Since the entire output sequence is fixed by the seed, driving
Algorithm P with X(n) yields at most p-1 distinct permutations (one per
choice of seed).
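
To make the counting concrete, here is a small Python sketch (my own
illustration, not anything from the original post; the parameters
p = 101, g = 2 and the 5-element block are chosen only to keep the
search tiny) that drives Algorithm P with such a generator and counts
how many distinct permutations can ever come out:

from math import factorial

def mcg_stream(seed, g=2, p=101):
    # Multiplicative congruential generator X(n+1) = g*X(n) mod p;
    # 2 is a primitive root mod 101, so the state runs through Z*_p.
    x = seed
    while True:
        x = (g * x) % p
        yield x

def algorithm_p(n, rng):
    # Durstenfeld shuffle ("Algorithm P") of 0..n-1, drawing from rng.
    # The modulo reduction stands in for a uniform draw on 0..i.
    perm = list(range(n))
    for i in range(n - 1, 0, -1):
        j = next(rng) % (i + 1)
        perm[i], perm[j] = perm[j], perm[i]
    return tuple(perm)

n, p = 5, 101
perms = set(algorithm_p(n, mcg_stream(s, p=p)) for s in range(1, p))
print(len(perms), "distinct permutations out of", factorial(n))

However the parameters are chosen, the count can never exceed the
p-1 = 100 possible seeds, out of 5! = 120 possible permutations, which
is Knuth's point in miniature.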

The number of possible permutations of N values is N!  ( N factorial,
not "surprise" :-) )  A block of N bits therefore has N! possible
permutations.  Now use Algorithm P, driven by a multiplicative
congruential generator X(n), to permute the N bits.  Algorithm P can
generate every possible permutation only if  p-1  >=  N!, i.e.
p >= N! + 1.

For 8 bits, a prime modulus p >= 8! + 1 = 40321 is needed before
Algorithm P can even possibly generate every permutation of the 8 bits.

For 16 bits, a prime modulus p >= 16! + 1, on the order of 2.092 x
10^13, is needed before Algorithm P can possibly generate every
permutation of the 16 bits.

For 512-bit blocks, a prime modulus p >= 512! + 1 (a very big number
indeed) is needed before Algorithm P can possibly generate every
permutation of the 512-bit block.
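
Put another way (a small side calculation of my own), the PRNG must
carry at least log2(N!) bits of state before every permutation of an
N-bit block is even indexable, and that requirement grows far faster
than N itself:

import math

# Bits of PRNG state needed just to index N! permutations: log2(N!).
for n in (8, 16, 64, 512):
    bits = math.lgamma(n + 1) / math.log(2)   # lgamma(n+1) = ln(n!)
    print(n, "bit blocks need about", round(bits), "bits of PRNG state")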

The dynamic transposition cipher relies on every permutation of the N
bits in the block being possible (and equally probable).  But here we
see that the prime modulus p of a simple PRNG driving Algorithm P must
grow factorially with the number of bits to permute if every
permutation of the N bits is even to be possible.  In practice, speedy
encryption forces the modulus to stay far smaller than N! + 1 as N
increases, so the fraction of the N! permutations actually reachable
shrinks as N grows.  Certain permutations of the N bits will never
occur, and Knuth indicates the excluded permutations are given by a
"fairly simple mathematical rule such as a lattice structure."  That
will help cryptanalysis.

This is only a first step in exploring the strength of the dynamic
transposition cipher.  I can't say whether or how this result
generalizes to shuffling algorithms other than Knuth's Algorithm P, for
example.  Is the reachable fraction of the N! permutations still
"enough" to preclude brute-force attacks?  And can one predict
characteristics of the permutations a simple LCG will produce through
Algorithm P, such as the presence of certain cycles, or dependencies
between the cycles of the permutations used for successive blocks?
(The Knuth comment on lattice structure seems to point to that
possibility.)

But it's a start :-)


John A. Malley
[EMAIL PROTECTED]

P.S.  This flavor of PRNG is, in any case, cryptographically insecure.

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: using AES finalists in series?
Date: Sun, 21 Jan 2001 06:11:47 GMT


On Fri, 19 Jan 2001 15:39:56 GMT, in <[EMAIL PROTECTED]>,
in sci.crypt "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:

>Bryan Olson wrote:
>> Yes, the series of all five finalists together should be at
>> least as secure as any one.  Yes, it probably has a lower risk
>> of cryptologic failure.  A five hundred cipher chain should be
>> safer still.
>
>It should be noted that according to Kerchhoff's principle,
>no matter how many components are chained together the
>enemy should be assumed to know all about that.  

Well, yes and no.  

In particular, the ciphering structure itself could be keyed.  One
might use key bits to select from among the possible ciphers at each
level, and more key bits to select how many levels of ciphering will
be used.  
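
As a toy sketch (mine alone, with a made-up cipher menu), key material
might first select the depth of the stack and the cipher used at each
level, with the remaining key material keying the ciphers themselves:

# Hypothetical cipher menu, for illustration only.
CIPHER_MENU = ["rijndael", "twofish", "serpent", "rc6", "mars"]

def keyed_structure(structure_key, max_levels=8):
    # First key byte picks the number of levels; the following bytes
    # pick one cipher per level.
    levels = 1 + structure_key[0] % max_levels
    return [CIPHER_MENU[structure_key[1 + i] % len(CIPHER_MENU)]
            for i in range(levels)]

print(keyed_structure(bytes([5, 0, 3, 1, 4, 2, 0, 1, 3])))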

When the ciphering structure is itself keyed, Kerckhoffs' principle
means that the opponents know that this is possible, but they don't
know the resulting structure.  Opponents are not assumed to know the
results of keyed selections.


>As usual,
>all the keying material together constitutes the message
>key.  If you want to use a particular number of key bits,
>e.g. 128, then the question you ought to be asking is what
>is the most effective way to use them?  Dividing them
>among several factor stages is certainly not it.

Well, no and yes.

The idea that we need "key efficiency" represents a time now long
gone.  In the context of modern communications, why should anyone be
anxious about sending additional keying material?  Do we really worry
about sending another 256 bits, or 1024 bits, or whatever?  This is
message key material, random and endless; we can take all we need, and
send it at almost no cost.  

There is no need to divide a single minimum keyspace into use by
multiple ciphers.  Just "bite the bullet" and send a key for each
cipher!  Make each random, independent, and as large as it needs to be
to give the associated cipher stand-alone strength.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: using AES finalists in series?
Date: Sun, 21 Jan 2001 06:17:36 GMT


On Thu, 18 Jan 2001 22:29:38 GMT, in <[EMAIL PROTECTED]>,
in sci.crypt "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>> AES is of course an attempt to limit cipher development, ...
>
>No, it's an attempt to control the cost of implementation and
>operation of commercial encryption by promoting interoperability.
>
>It's similar to the function of standards for screw threads,
>programming languages, etc.

The two statements are not necessarily contradictory:  One could be
the cover story; the other addresses what could be an internal agenda.

The only standard needed for ciphering is a standard cipher
*interface*, not a standard cipher.  We have to deliver keys; we might
as well deliver the name of the desired cipher, or actual running
code.  If we had a standard interface, we could change ciphers easily
if something went wrong.  
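
A minimal sketch of what I mean (a toy illustration of my own, with a
placeholder cipher standing in for real ones): callers code to the
interface, the cipher name travels along with the keys, and swapping a
broken cipher touches nothing but the registry:

class XorToyCipher:
    # Placeholder only; a real registry would hold vetted ciphers.
    name = "xor-toy"
    def encrypt(self, key, block):
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(block))
    decrypt = encrypt

REGISTRY = {c.name: c for c in (XorToyCipher(),)}

def encrypt_message(cipher_name, key, block):
    # The negotiated name selects the implementation behind one interface.
    return REGISTRY[cipher_name].encrypt(key, block)

print(encrypt_message("xor-toy", b"k3y", b"hello"))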

But what we get is a standard cipher, and *not* a standard interface.  Odd.

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: Richard John Cavell <[EMAIL PROTECTED]>
Subject: How to pronounce Vigenere
Date: Sun, 21 Jan 2001 17:19:47 +1100

No idea.  Can anyone help?

=============================================================
Richard Cavell - [EMAIL PROTECTED]

Newsgroups - Please keep any discussion on the group, and copy your
replies to me via email. (Server problems).  Sending me bulk email
guarantees a nasty response.

Judge Thomas Penfield Jackson on Bill Gates: "He has a Napoleonic concept
of himself and his company, an arrogance that derives from power"
=============================================================


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: using AES finalists in series?
Date: Sun, 21 Jan 2001 06:25:08 GMT


On Sat, 20 Jan 2001 15:11:14 -0000, in
<TVha6.2821$eI2.65510@NewsReader>, in sci.crypt "Gary Watson"
<[EMAIL PROTECTED]> wrote:

>"Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> Bryan Olson wrote:
>> > Yes, the series of all five finalists together should be at
>> > least as secure as any one.  Yes, it probably has a lower risk
>> > of cryptologic failure.  A five hundred cipher chain should be
>> > safer still.
>>
>> It should be noted that according to Kerchhoff's principle,
>> no matter how many components are chained together the
>> enemy should be assumed to know all about that.  As usual,
>> all the keying material together constitutes the message
>> key.  If you want to use a particular number of key bits,
>> e.g. 128, then the question you ought to be asking is what
>> is the most effective way to use them?  Dividing them
>> among several factor stages is certainly not it.
>
>======= Hypothetical Example ============
>
>Let's say that I am Director of Information Services for Islamic Jihad, and
>the head mullah has assigned me to set up a secure videoconferencing link
>from our headquarters in Damascus, to our field commands in Beirut and
>Bagdad, and to our terrorist cells in Tel Aviv, New York, Tokyo and the city
>where Tom St. D lives.  We have access to lots of money, computer
>scientists, and electronic engineers, but don't necessarily have a crippie
>we can trust.  
>
>After following AES for the last two years, we assess the
>possibility that the proposers of each algorithm are shills for NSA at about
>5%, and 10% for Rijndael since it was actually chosen by NIST.  

I know it is just an example, but it seems to be the wrong way to
think about the problem.  It is not necessary for somebody to be a
shill for NSA to create a cipher which has unnoticed faults.  That
means the probability of having a weak cipher is higher than the
probability of being a shill, to an extent which we can neither
estimate nor quantify.  


>We assess
>the likelihood that NSA can break one of these algorithms if it was developed
>in good faith at 0% as long as we are careful about keying data and Tempest
>attacks.  

No.  The issue is whether unknown weakness exists.  Because the
weakness is unknown, we don't know about it.  

We don't care whether a cipher is weak to us, or to the academics who
will talk; we only care whether our opponents consider the cipher weak,
and that is something we fundamentally cannot know.  

But if weakness does exist, it is not at all unreasonable to imagine
that NSA could and would exploit it, along -- unfortunately -- with
other technically sophisticated opponents.  


>The main threats to our cryptonet are NSA, Mossad, GCHQ, the
>French, and to a lesser extent  host countries like Syria, Iraq, and Saudi
>Arabia.  These are serious people but they can't brute force a 256 bit
>keyspace.  We have to assume that our TCP/IP traffic will be intercepted by
>all major spy agencies.  So what are the Chosen Faithful to do?
>
>If we decide to use Rijndael there is, by our reckoning, a 10% chance that
>one night we are going to be captured and spirited away to a Mossad torture
>cell and subjected to very very loud Boyzone music for the rest of our days.
>Though it's only one chance in ten, such a gruesome fate is simply
>unthinkable.
>
>On the other hand, using a pair of  high-end Xilinx FPGAs, it's possible to
>put all five AES finalists in series, pipelining them such that there is no
>loss of throughput in a streaming video application.  The Great Satan has
>thoughtfully published the VHDL for all five finalists, so the R&D cycle
>would be negligible.  Using a static RAM based FPGA has the advantage that
>it can be quickly zeroized if capture seems imminent or it could even
>zeroize itself after a set period of time.

What is the point in erasing devices which contain public algorithms?
Only keys (and data) need be protected.  


>So, if the above guesstimates of the likelihood of the candidates being
>deliberately and covertly weakened are accurate, and assuming that separate
>randomly generated keying data were used for each cipher, the probability
>that all five systems are compromised and thus of the eternal Boyzone
>torture, is the product of the probabilities or 1 in 1.6 million, which
>frightening though it is, is within reason.

I think not.  First, for the result to be correct, each of the
individual probabilities would have to be independent, and I think we
do not know that.  When we have designs based on generally similar
concepts of ciphering, the possibility of new generally-applicable
attacks cannot be dismissed.  

Next, we cannot compute with values we cannot know, and we cannot know
the probability of weakness.  

On the other hand, the probability of weakness for a cipher on its own
is likely to be reduced by inclusion in a ciphering stack or sequence.
The reason for this is that the most powerful attacks benefit from
knowing both the plaintext and ciphertext for a particular cipher --
the input and output values for that one cipher.  But since the data
values between the ciphers are not exposed in multiple ciphering (we
at most have the original plaintext and the ciphertext after all
ciphers have been applied), attacks which might otherwise work may no
longer do so.  This is increased strength.  
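
A toy sketch of the point (mine, with XOR standing in for real
ciphers): each stage gets its own independent key, and the values
between stages never appear outside the stack:

import os

def toy_stage(key, data):
    # Stand-in for one real cipher; XOR is for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_stack(keys, plaintext):
    data = plaintext
    for k in keys:              # intermediate values stay in this loop
        data = toy_stage(k, data)
    return data

keys = [os.urandom(32) for _ in range(5)]    # independent 256-bit keys
ciphertext = encrypt_stack(keys, b"attack at dawn")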


>(I know that I'm glossing over whether these probabilities are independent,
>but this could cut both ways and besides maybe I'm hoping to be promoted to
>head mullah before anyone notices.)
>
>============ End of Hypothetical Example ============
>
>
>You people are the real experts in this topic, not I, but isn't this a
>reasonable way to look at the strength of the ciphers?  

It is absolutely reasonable to be aware that ciphers cannot be
certified as strong, and may, therefore, be weak.  Any cipher can have
undisclosed weakness, and if that happens, users may be exposing their
data and never know, until it is too late.  

Nevertheless, we cannot predict the probability of weakness which we
do not know.  Trying to compute probabilities on this basis is just
unworkable.  


>If only one of the
>five ciphers is secure, then the product of the five is necessarily secure
>given separate random keying data, right?  

Pretty much right.


>Even if it turned out that one
>cipher was the inverse of the other, as one previous poster had suggested as
>a counterexample, using separate random keying data would negate this
>weakness, would it not?

Right.


>Assuming all of the above is correct, another question I'm not competent to
>answer is how strong this cipher chain would be -- with each cipher at 256
>bits, is it on the order of (2^256)^5, or is it closer to (2^256) * 5  ?

Everything depends upon weakness which we do not know.  If we do not
know the weakness, we cannot compute the probability that it might
exist.  If we do know the weakness, there is no probability about it
at all: it is either there or not.  

Using additional ciphers should increase the keyspace additively.  But
that is not very interesting if each of the ciphers already had a
"large enough" keyspace.  There is no point.  

The advantage of a cipher stack is the ability to surmount a weakness
in any particular cipher, and also the ability to protect each cipher
from known-plaintext or defined-plaintext attack (on that individual
cipher).


>As you point out, dividing the 1280 keying bits* into convenient "factor
>stages" is bound to make an attack simpler, but is the resulting simplicity
>of any practical value to the analyst?  Is this like telling someone their
>life is easier since they merely have to count the grains of sand on Malibu
>beach instead of all of the beaches in the world?

I think the original comment assumed that nobody would ever think of
using a cipher which transported more than 128 bits of keying, since
that is already unsearchable.  

I am that nobody.  I routinely use message keys of 992 bits -- not
because I think any fewer could be brute-forced, but instead because
doing this can avoid an internal expansion stage which conceivably
might add weakness.  I may thus possibly avoid a form of technical
weakness, and I am quite willing to impose a 128-byte key burden to do
that.  The era of needing keys to be "efficient" is long gone.  

I have no problem at all with the idea of a modern communications
system transporting whatever amount of key each cipher needs, and
neither should anybody else.


>I guess what I'm trying to say is that chaining these five unrelated ciphers
>is one way to guard against an undiscovered or deliberate weakness coming to
>light in one or more of them.  

I would dispute that the ciphers are unrelated.  The very structure of
the competition and the pre-selection process tended to produce
ciphers of generally similar technology.  


>Even if all of them have some sort of
>weakness, wouldn't simply being buried in the middle of a cipher chain make
>the weaknesses harder to exploit?

Yes indeed it would.


>I know the above topics can be counter-intuitive, as I still haven't come to
>grips with why double DES is only worth about 58 bits or whatever instead of
>112 bits.  Several people have tried to explain this to me with no
>success...
>
>
>* that's five times 256.  I couldn't do it in my head, either.
>--
>
>Gary Watson
>[EMAIL PROTECTED]  (you should leave off the digit two for email)
>Speaking only for myself and not the company.

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: using AES finalists in series?
Date: Sun, 21 Jan 2001 06:31:33 GMT

Gary Watson wrote:

> ...  After following AES for the last two years, we assess the
> possibility that the proposers of each algorithm are shills for NSA at about
> 5%, and 10% for Rijndael since it was actually chosen by NIST.  We assess
> the likelihood that NSA can break one of these algorithms if it was developed
> in good faith at 0% as long as we are careful about keying data and Tempest
> attacks. ...

Unfortunately for your argument, you are in effect
using a model of independent stochastic events, but
in the real world if NSA (for example) can readily
break one of the AES candidates they can just as
readily break the others too, and ditto for a known
concatenation of them.  The "probabilities" here
aren't genuine probabilities at all and they don't
produce a composite through simple multiplication.

> Assuming all of the above is correct, another question I'm not competent to
> answer is how strong this cipher chain would be -- with each cipher at 256
> bits, is it on the order of (2^256)^5, or is it closer to (2^256) * 5  ?
> As you point out, dividing the 1280 keying bits* into convenient "factor
> stages" is bound to make an attack simpler, but is the resulting simplicity
> of any practical value to the analyst?  Is this like telling someone their
> life is easier since they merely have to count the grains of sand on Malibu
> beach instead of all of the beaches in the world?

That sort of analysis assumes that cryptanalysis is
done via a brute-force key search, but we know that
that is infeasible.  Therefore, if the cryptanalysis
*is* feasible for some agency, it is necessarily done
much more cleverly, so none of that arithmetic applies.
What is certain is that using each key bit only in a
limited, factorizable portion of the encryption
equations is no more secure than using it in the best
possible way in the entire set of equations.  Indeed,
the former is a strict subset of the latter, i.e. you
are imposing a constraint on the system design, which
runs the risk of eliminating better (more secure)
designs.  Thus my advice is to use all the key bits
in a single unified method instead of partitioning
them among several smaller-keyed independent methods.


------------------------------

Date: Sun, 21 Jan 2001 06:31:57 +0000
From: Richard Heathfield <[EMAIL PROTECTED]>
Subject: Re: Differential Analysis

Matt Timmermans wrote:
> 
> Ah, you forgot rule #253 -- always count to ten before clicking the
> send button.  And the corollary -- if you don't have time to count to ten,
> then you don't have time to post.
> 
> "Tom St Denis" <[EMAIL PROTECTED]> wrote in message
> news:94cm6a$tf5$[EMAIL PROTECTED]...
> >
> > Actually upon reflection I agree I have been a bit temperamental.  I am in
> > fact sorry.  I've just been pre-occupied with exams and can't be bothered
> to
> > humor silly posts.
> >
> > Advice taken, I think I will try to be a bit cooler :-)
> >

*UNPLONK* ... for now.

(Matt, thanks for quoting Tom in such detail.)


-- 
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton

------------------------------

Date: Sun, 21 Jan 2001 06:41:08 +0000
From: Richard Heathfield <[EMAIL PROTECTED]>
Subject: Re: Transposition code

Benjamin Goldberg wrote:
> 
> This only writes items, one row at a time, into columns in shuffled
> order.  I want to output the data reading down the columns.

Have you considered using a two-dimensional array? Or have I
misunderstood you?
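
If that is what you are after, here is a rough sketch of the idea
(untested, in Python, and only a guess at your intended layout): write
into a two-dimensional array row by row, then read down the columns in
the shuffled order.

def columnar_encrypt(msg, col_order):
    cols = len(col_order)
    rows = -(-len(msg) // cols)            # ceiling division
    msg = msg.ljust(rows * cols, 'X')      # pad out the last row
    grid = [list(msg[r * cols:(r + 1) * cols]) for r in range(rows)]
    # Read down each column, taking the columns in shuffled order.
    return ''.join(grid[r][c] for c in col_order for r in range(rows))

print(columnar_encrypt("WEAREDISCOVERED", [2, 0, 3, 1]))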

-- 
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton

------------------------------

From: Greggy <[EMAIL PROTECTED]>
Subject: Re: Kooks (was: NSA and Linux Security)
Date: Sun, 21 Jan 2001 06:41:37 GMT

I hope that you accept this as an example of why I won't bother with
this discussion any longer.


> > >   If one believes that TONA became part of the Constitution
> > >   merely because it was frequently published, one should
> > >   immediately mount an expedition to find Buss Island, a
> > >   "phantom" island in the North Atlantic which appeared on
> > >   maps from 1592 until 1856. See Donald S. Johnson, Phantom
> > >   Islands of the Atlantic 80 (1994). Buss Island had its own
> > >   conspiracy theorists; in 1770, an anonymous author accused
> > >   the Hudson's Bay Company of keeping its location a secret
> > >   in order to maintain financial control over it.
> >
> > I think everyone can see that you are desperate with such folly
> > parallels.
>
> I think everyone can see that you can't come up with any reason to
> distinguish the two.  If you believe that the "missing 13th
amendment"
> must be real because it was frequently published on the order of state
> legislatures who were intelligent people, etc., why do you refuse to
> accept the reality of Buss Island, which was frequently published by
> mapmakers who were intelligent people, etc.?


If I were a map maker, would I go see if a new island actually existed
BEFORE I copied someone else's charts to improve my own?  Of course not.

There is no parallel with those who already had intimate knowledge of
the 13th amendment in their days.

When you decide to say something of value I will respond.




>       On August 1, 1849, C. Robinson and J.M. Patton, who were
>       preparing a revised edition of the laws of Virginia, wrote to
>       William B. Preston, Secretary of the Navy, and noted that
>       although TONA was included in the Revised Code of 1819, "[w]e
>       are satisfied that this amendment was never adopted, though it
>       is difficult to account for the fact that it should have been
>       put into the Code of 1819 as an amendment which had been
>       adopted." The revised code noted that the previous
>       publication was in error.

Anyone can say that.  Jol Silversmith says that.  So what?



--
Jol Silversmith - I wasn't there so I cannot say why no one
protested within the Virginian legislature that day in 1819
not to include the 13th amendment in their publications, or to
require all 21 states to ratify the same.  But I am absolutely
certain I know more than they did back then what was really
going on all around them.         Boy, I'm good!


Sent via Deja.com
http://www.deja.com/

------------------------------

From: "Michael Scott" <[EMAIL PROTECTED]>
Subject: Re: ECC Domain Generation
Date: Sun, 21 Jan 2001 06:50:38 GMT

It's all to be found at

http://indigo.ie/~mscott

Mike Scott

"Splaat23" <[EMAIL PROTECTED]> wrote in message
news:94dq5e$p2k$[EMAIL PROTECTED]...
> Ok, this has really boggled me. To complete my practical knowledge of
> elliptic curve cryptosystems, I have been trying to code ECC domain
> parameter generation, and can do everything except for the one crucial
> step: determining the cardinality of the curve. I have found, read, and
> reread a bunch of papers on the subject, but have been completely
> unable to piece together a workable algorithm from the math.
>
> Someone out here _must_ actually know how to do this. Yes, I trust the
> ECC domain parameter validation algorithms - I have studied the math
> enough to know the logic is sound - but I have some moral problem just
> using NIST's recommended curves.
>
> As far as I can tell, the best method known for solving cardinality is
> to find cardinality modulo primes and use CRT to construct the final
> cardinality. It is supposedly an O(log(n)^5) algorithm overall. None of
> the papers I have found detail the generation of these candidates. Any
> help?
>
> - Andrew
>
>
> Sent via Deja.com
> http://www.deja.com/
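
For the final CRT step described above, here is a bare sketch (an
illustration only, not code from the URL; computing the trace residues
is the hard part and is not shown, and the product of the moduli must
exceed 4*sqrt(q) for the fold into the Hasse interval to be unique):

from functools import reduce
from math import isqrt

def crt(residues, moduli):
    # Solve x = r_i (mod m_i) for pairwise coprime moduli.
    M = reduce(lambda a, b: a * b, moduli)
    x = sum(r * (M // m) * pow(M // m, -1, m)
            for r, m in zip(residues, moduli))
    return x % M, M

def curve_order(q, trace_residues, moduli):
    # Combine the Frobenius trace t modulo small primes, fold it into
    # the Hasse interval |t| <= 2*sqrt(q), then #E(F_q) = q + 1 - t.
    t, M = crt(trace_residues, moduli)
    if t > 2 * isqrt(q):
        t -= M
    return q + 1 - t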



------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: JPEG infidelity for crypto
Date: Sun, 21 Jan 2001 00:12:05 -0600

To demonstrate the problems the JPEG process introduces into images,
compare the relatively sharp 8bitcolor.GIF with 8bitcolor.jpg at
www.radiofreetexas.com/wts/pix

Notice the fine artifacts introduced into solid-color areas, and the
colorization of outlines that should be solid black.  

Along with GIFs, bitmaps on the PC and PICTs on the Mac are among the
acceptable formats for faithful bit representation, within the
available resolution of the monitor, of course.
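
A quick way to check a format for yourself (a sketch that assumes the
Python Imaging Library is available; the filenames are the ones above):
save a copy and compare the pixels after reloading.

from PIL import Image

def roundtrip_faithful(path_in, path_out, fmt):
    # True only if every pixel survives a save/reload in that format.
    im = Image.open(path_in).convert("RGB")
    im.save(path_out, format=fmt)
    out = Image.open(path_out).convert("RGB")
    return list(im.getdata()) == list(out.getdata())

print("BMP :", roundtrip_faithful("8bitcolor.GIF", "copy.bmp", "BMP"))
print("JPEG:", roundtrip_faithful("8bitcolor.GIF", "copy.jpg", "JPEG"))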
-- 
Some people say what they think will impress you, but ultimately
do as they please.  If their past shows this, don't expect a change.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
