Cryptography-Digest Digest #490, Volume #12 Sun, 20 Aug 00 10:13:01 EDT
Contents:
Re: Breaking Simple XOR Encryption (Simon Johnson)
Newbie question on VIM encryption key ("Wenjie Zhao")
Re: How Many? (John Savard)
Re: How Many? (John Savard)
Re: How Many? (John Savard)
Re: OTP using BBS generator? (Mok-Kong Shen)
Re: How Many? (Mok-Kong Shen)
Re: How Many? (Mok-Kong Shen)
Re: How Many? (Mok-Kong Shen)
Question on Decorrelation Theory ([EMAIL PROTECTED])
Re: How Many? ("Trevor L. Jackson, III")
Re: about Openssl (Paul Schlyter)
Simple Stream Cipher ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: Simon Johnson <[EMAIL PROTECTED]>
Subject: Re: Breaking Simple XOR Encryption
Date: Sun, 20 Aug 2000 06:43:09 GMT
In article <[EMAIL PROTECTED]>,
"Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> Peter wrote:
> > I would appreciate an explanation of the attack that is
> > used against simple XOR "encryption" schemes.
>
> The first thing to understand is that XOR is *not* an
> encryption scheme, nor a class of such schemes; it is
> just a primitive Boolean-logic operation that has many
> uses, including as a *component* in some encryption
> schemes (but not in others). Knowing that XOR was used
> in a scheme does not tell one nearly enough to make use
> of that information.
>
> Different cryptanalytic attacks are applied against
> different encryption schemes. I don't have a copy of
> Schneier's book at hand at the moment and thus I don't
> know which *specific* method he was talking about. For
> learning cryptanalysis in general, there are better
> texts than Schneier's.
>
Although you are perfectly correct, you knew exactly what he meant by
the 'XOR' system of encryption, as do I, as does everyone else in the
world. I would therefore say it's perfectly alright (but not brilliant)
to call XOR an encryption algorithm.
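What posters here mean by the 'XOR' system is usually repeating-key XOR, and the standard attack is frequency analysis on each key-position slice. A minimal sketch, assuming the key length is already known and the plaintext is English-like text whose most common byte is the space character (both assumptions; real attacks also recover the key length, e.g. via coincidence counting):

```python
# Breaking repeating-key XOR by frequency analysis: a toy sketch.
# Assumes the key length is known and the plaintext is English-like
# text in which the space character is the most common byte.

from collections import Counter

def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

def recover_key(ciphertext: bytes, key_len: int) -> bytes:
    key = bytearray()
    for i in range(key_len):
        slice_i = ciphertext[i::key_len]          # bytes enciphered with key[i]
        most_common = Counter(slice_i).most_common(1)[0][0]
        key.append(most_common ^ ord(' '))        # guess: commonest plaintext byte is ' '
    return bytes(key)

plaintext = (b"the quick brown fox jumps over the lazy dog and then "
             b"runs back home to sleep for the rest of the day ") * 20
key = b"KEY"
ct = xor_encrypt(plaintext, key)
assert recover_key(ct, len(key)) == key
```

Decryption is the same operation as encryption, which is exactly why knowing a stretch of plaintext leaks the key directly.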
--
Hi, I'm the signature virus;
help me spread by copying me into your signature file.
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: "Wenjie Zhao" <[EMAIL PROTECTED]>
Subject: Newbie question on VIM encryption key
Date: Sun, 20 Aug 2000 17:16:50 +0800
Reply-To: "Wenjie Zhao" <[EMAIL PROTECTED]>
Hi, there
Sorry to post to this crowded newsgroup with such a subject.
Could somebody kindly provide me with a pointer or
source code (on SunOS) to break the VIM encryption
key? I forgot my key and I cannot get my text back.
Best Regards,
--
Wenjie ZHAO
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 10:48:30 GMT
On Sat, 19 Aug 2000 18:21:27 -0400, Future Beacon <[EMAIL PROTECTED]>
wrote, in part:
>Does anybody view any other principle or element as essential to
>encryption?
It isn't even clear that those two are essential: ciphers, even
somewhat secure ones, have been based on transposition alone (and
transposition can be viewed as a kind of linear transformation).
Basically, Shannon's paper, "Communication Theory of Secrecy Systems",
identified _confusion_ and _diffusion_ as the two essential components
of a cipher, and they roughly correspond to substitution and
transposition. On my web site, I've pointed out that there are a
number of other commonly used constructs that contribute to the
security of some ciphers, so I compiled a somewhat longer and more
detailed list of about five primitives, to promote more creative
thinking among cipher designers. (Shannon wasn't wrong to say that
everything reduces to means of achieving the _goals_ of confusion and
diffusion, but others are wrong to see substitution and transposition
as the only _means_ of doing so.)
Thus, on
http://home.ecn.ab.ca/~jsavard/crypto/pp0106.htm
I note that there are three other goals, subgoals of confusion and
diffusion, which I call indirection, convolution, and alternation,
that are useful in thinking about the design of a cipher system. Then
I list simple cipher methods which correspond to them.
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 10:40:10 GMT
On Sat, 19 Aug 2000 21:57:03 -0400, "Douglas A. Gwyn"
<[EMAIL PROTECTED]> wrote, in part:
>That's nonsense. One could equally well (or better) say that
>the only "real" primitive is NAND, since the entire digital
>system could be constructed using just that one operator.
>However, this sort of microanalytical approach is doomed,
>because in most cases the real *meaning* can only be dealt
>with at higher levels of structure. One should work at an
>appropriate level for best effect.
Well, it's clear that NAND is not an appropriate level.
A lot of block ciphers - SAFER comes particularly to mind - are
constructed as if "substitutions and linear transforms" really are the
building blocks. Even then, the repetition does ensure the whole is
stronger than the parts.
The mind-set behind this owes something to Shannon's identification of
confusion and diffusion as fundamental.
On *my* web site, I've tried (in the conclusions section of the first
chapter, on paper-and-pencil ciphers) to increase the number of basic
primitives from two to...five.
Not because I thought the primitives had any kind of key to
cryptanalysis - and I think that's your point here, that they don't,
and you're quite right - but to encourage a bit more 'imagination' in
cipher design.
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 10:56:30 GMT
On Sat, 19 Aug 2000 22:02:50 -0400, "Douglas A. Gwyn"
<[EMAIL PROTECTED]> wrote, in part:
>Well, no, in general transposition hinders cryptanalysis.
>Would you rather cryptanalyze system A or system T*A where
>T is an unknown transposition?
This is true in general.
One particular proposal for a cipher, where two ECB applications of a
block cipher are accompanied, in between, by an _unkeyed_
transposition (varied, based on the table of byte frequencies which
the transposition does not affect) was shown to be weak because of a
chosen-plaintext attack involving enciphering a message consisting of
the same block repeated many times. (In this way, there were only
eight types of byte floating around, giving a probability of duplicate
blocks in the output.)
This is very interesting and informative. It shows a class of attacks
that need to be avoided. But what disturbs me is that some people seem
to draw the conclusion, from this and similar attacks, that adding
complications to a cipher is so dangerous that it should be avoided
completely.
Instead, I would think that these attacks are simply examples that
help to illustrate the need to follow certain "rules of thumb" when
adding complications that prevent them from making things worse.
Complication is the very *source* of security, and being stingy with
complication is as bad as being stingy with the size of the key.
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: OTP using BBS generator?
Date: Sun, 20 Aug 2000 14:06:31 +0200
"Douglas A. Gwyn" wrote:
>
> Mok-Kong Shen wrote:
> > In statistical tests, one tests in essence the presence of
> > certain amount of patterns (orderliness). If that amount
> > is lower than a threshold, it means the sequence is
> > sufficiently random (with respect to the type of patterns
> > examined).
>
> As has been observed before, it is not the *data* that might
> be random, but the *process* that generates it. It is thus
> better to consider the property of "randomness" as meaning
> agreement with a specific source model. Then the test merely
> measures the degree of deviation from the model behavior.
>
> Some small fraction of the time, even a perfectly functioning
> true random generator *will* produce a highly patterned
> stretch of output data. When it does, the statistical test
> will flag the deviation; such occasional misdiagnosis is
> unavoidable. There is a trade-off between minimizing the
> chance that a valid source is rejected, and minimizing the
> chance that a malfunctioning source is accepted. Tests
> should be designed to make an appropriate trade-off.
Right. Thanks for elaborating the proper interpretation of the
tests, which I have pleaded should always be carried out in practice.
>
> > A test gives for one sequence only one value (result), the
> > test statistic. If we let the source generate a large
> > amount of such sequences, we can see whether the distribution
> > of these values sufficiently well correspond to that of an
> > ideal random source.
>
> That is merely a different ("batched") kind of single test.
For that test (with its corresponding pattern being tested),
it shows how the source deviates from an ideal source.
Since it is a single type of test, it is not sufficient;
consequently we need more, in fact a more or less complete
spectrum of these. See also below.
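The "batched" idea can be made concrete: run one simple test on many sequences and examine the distribution of the resulting statistics. A minimal sketch, using the monobit (frequency) test and Python's seeded PRNG as a stand-in for the source under test (both choices are illustrative assumptions):

```python
# A "batched" statistical test: run the monobit test on many sequences
# and look at the distribution of the test statistic, which for an
# ideal source should be approximately standard normal.
# random.Random stands in for the source under test (an assumption).

import math
import random

def monobit_z(bits):
    """z-statistic of the monobit test: (#ones - n/2) / sqrt(n/4)."""
    n = len(bits)
    ones = sum(bits)
    return (ones - n / 2) / math.sqrt(n / 4)

rng = random.Random(12345)
zs = [monobit_z([rng.getrandbits(1) for _ in range(1000)]) for _ in range(200)]

mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / len(zs)
# For a good source the batch of z-values should have mean near 0 and
# variance near 1; a biased source would shift the whole distribution.
assert abs(mean) < 0.3 and 0.5 < var < 1.5
```

A biased source (say, ones with probability 0.55) would shift every z upward, which the batched view detects even when any single sequence might pass.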
>
> > If we do the same for all statistical tests available, we
> > can have indeed a quite good feeling of the quality of the
> > source in my humble view.
>
> "All" tests available is an essentially unbounded number.
> It would be best to use tests designed specially for
> detecting the most likely kinds of deviation from the model,
> which for the most part requires analyzing the structure of
> the generator and its likely failure modes.
I meant by 'all available' what one knows and can afford
to do (always with respect to economy, necessity, etc.),
not 'all' that exist or will ever be invented in this world.
There is of course an issue of intelligent choice of tests,
our work being limited by the available resources, as is
always the case in life. In fact one is involved in a
'game' here. For, if the opponent has better resources
(tests) and is hence able to detect non-randomness that
we can't, then we lose. If we are confident that he is
not very 'rich', we could even opt to do less.
>
> This entire business is a well-established branch of
> statistics, and there are professionals who are paid to
> set up such tests. It's best to employ an expert than to
> try to "wing it" and end up with unwarranted confidence in
> a defective system due to poor test design.
That the tests employed should be proper ones established in
mathematical statistics, and that in serious applications
trained statisticians should be engaged to do the job, seems
self-evident. It is true, though, that these points are
sometimes neglected in practice by project managers.
Once again I would like to stress the need for practical tests,
which should not simply be 'exempted' on account of the 'fame'
of some mathematical 'proofs' (in particular those that do
not in fact claim perfect security on a par with the
ideal OTP) in connection with the sources being used.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 15:06:37 +0200
John Savard wrote:
>
> One particular proposal for a cipher, where two ECB applications of a
> block cipher are accompanied, in between, by an _unkeyed_
> transposition (varied, based on the table of byte frequencies which
> the transposition does not affect) was shown to be weak because of a
> chosen-plaintext attack involving enciphering a message consisting of
> the same block repeated many times. (In this way, there were only
> eight types of byte floating around, giving a probability of duplicate
> blocks in the output.)
>
> This is very interesting and informative. It shows a class of attacks
> that need to be avoided. But what disturbs me is that some people seem
> to draw the conclusion, from this and similar attacks, that adding
> complications to a cipher is so dangerous that it should be avoided
> completely.
I tend to think of adding complications, in general, as like a
situation where, visiting a city for the first time, a
person drives me from one point to another. After he has
made several turns left and right, my chance of
finding the way back myself greatly diminishes. (In fact,
adding more rounds to a block cipher is the simplest
instance of adding complications.) A friend of mine claims,
though, that he has little problem with that.
>
> Instead, I would think that these attacks are simply examples that
> help to illustrate the need to follow certain "rules of thumb" when
> adding complications that prevent them from making things worse.
> Complication is the very *source* of security, and being stingy with
> complication is as bad as being stingy with the size of the key.
However, there are often hard limitations imposed on the
designs. For example, the AES candidates were required to
run well on low-end hardware too; I believe it would have
been much better had the specifications allowed the user a
choice of strength levels depending on his actual needs and
the resources available.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 15:06:42 +0200
John Savard wrote:
>
> Thus, on
>
> http://home.ecn.ab.ca/~jsavard/crypto/pp0106.htm
>
> I note that there are three other goals, subgoals of confusion and
> diffusion, which I call indirection, convolution, and alternation,
> that are useful in thinking about the design of a cipher system. Then
> I list simple cipher methods which correspond to them.
I think that, when considering diffusion/confusion
(permutation/substitution), there is the issue of
granularity (bits, bytes, words, etc.). The two
mechanisms need not always work at the same granularity
and in one single algorithm different parts/levels can
operate at different granularities.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 15:06:33 +0200
John Savard wrote:
>
> A lot of block ciphers - SAFER comes particularly to mind - are
> constructed as if "substitutions and linear transforms" really are the
> building blocks. Even then, the repetition does ensure the whole is
> stronger than the parts.
[snip]
I suppose the reason is that the result (the 'linearity'
notwithstanding) is a much larger transformation, which
creates greater difficulty for the opponent.
[snip]
> Not because I thought the primitives had any kind of key to
> cryptanalysis - and I think that's your point here, that they don't,
> and you're quite right - but to encourage a bit more 'imagination' in
> cipher design.
I agree. It is the 'right' viewpoint that helps. An object
invariably has to be viewed from different angles
depending on the purpose of observation. The view of a car
taken by the engineer designing it may not be appropriate
for the driver, or for the policeman regulating the traffic.
M. K. Shen
------------------------------
From: [EMAIL PROTECTED]
Subject: Question on Decorrelation Theory
Date: Sun, 20 Aug 2000 12:56:55 GMT
When a function is 1-wise decorrelated, such as F(x) = x + k, you need
only one known input/output pair to begin a simple attack on it.
When something is pair-wise decorrelated, such as F(x) = ax + b, does
that mean you have to look at pairs of inputs/outputs?
Then is F(x) = (ax + b)/c 3-wise decorrelated?
Hmm....
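The two cases can be checked directly with known input/output pairs over a small prime field (the numbers below are illustrative, not secure):

```python
# Recovering affine keys from known input/output pairs over GF(p).
# One pair determines k in F(x) = x + k; two pairs determine (a, b)
# in F(x) = a*x + b, via a = (y2 - y1) / (x2 - x1). Toy field p = 257.

p = 257

# 1-wise case: a single pair (x1, y1) leaks k directly.
k = 123
x1 = 42
y1 = (x1 + k) % p
assert (y1 - x1) % p == k

# pair-wise case: two pairs leak (a, b), while any single pair is
# consistent with every value of a -- hence "pair-wise" decorrelated.
a, b = 17, 99
x1, x2 = 5, 200
y1, y2 = (a * x1 + b) % p, (a * x2 + b) % p
a_rec = ((y2 - y1) * pow(x2 - x1, -1, p)) % p
b_rec = (y1 - a_rec * x1) % p
assert (a_rec, b_rec) == (a, b)
```

Note that dividing by a constant c just rescales a, so (ax + b)/c is still an affine map with only two unknown parameters; as I understand it, that would not give 3-wise decorrelation.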
Thanks in advance for any help,
Tom
------------------------------
Date: Sun, 20 Aug 2000 09:14:09 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: How Many?
"John A. Malley" wrote:
> Future Beacon wrote:
> >
> > I don't know whether to call them principles or elements. How many
> > are essential to encryption? I see transposition as a special case
> > of substitution.
>
> A transposition alters dependencies between consecutive symbols in the
> plaintext. Transposition helps mask correlations between plaintext
> symbols so these correlations attenuate in the ciphertext. It's a
> permutation on the positions of the plaintext. It doesn't care what
> plaintext symbol is at that position in the plaintext sequence to
> permute. Transposition preserves the frequencies of occurrence of
> plaintext symbols in the ciphertext. And that helps cryptanalysis.
>
> A substitution maps plaintext symbols to ciphertext symbols. A
> substitution can be a permutation on the alphabet of plaintext symbols.
> For simple monoalphabetic substitution the same plaintext symbol gets
> replaced with the same ciphertext symbol no matter where it occurs in
> the plaintext - but that's not true for a periodic transposition. The
> frequency of occurrence of a plaintext symbol doesn't match the
> frequency of occurrence for that symbol in the ciphertext. But in
> general, substitutions preserve correlations between plaintext symbols
> so these correlations appear between ciphertext symbols. And that helps
> cryptanalysis. ( I acknowledge polygram substitutions, polyalphabetic
> substitutions and aperiodic substitutions make cryptanalysis progressively
> harder, but attacks on these systems take advantage of correlations in
> ciphertext.)
>
> Combining these two cryptographic primitives together is the heart and
> magic of good cryptography. Transposition alters the order of plaintext
> symbols. Substitution alters the symbols used to represent the
> plaintext. Successive transposition ( also called permutation) and
> substitution on the same plaintext block, a round, is at the core of
> DES and other block ciphers.
While these operations are conceptually distinct, their effects are not.
Given that there are (at least) four levels at which these operations can be
applied (bit, char, block, message), the effect of a transposition at a lower
level is similar to that of a substitution at a higher level.
>
>
> > Both binary operations and transformations could
> > be viewed as look-up tables. I seem to view every element of
> > encryption as look-up table formation, look-up table use or random
> > number sources.
>
> These look like implementations of math or logic functions. As such one
> can implement permutations and substitutions.
> >
> > Does anybody view any other principle or element as essential to
> > encryption?
>
> Well, algorithmically transposition and substitution seem to cover
> cryptography. Even public key ciphers. For example, take RSA. (Out on a
> limb here, asbestos suit ON. Check.)
Neither of them covers masking with noise (steganography).
>
>
> Look at M^e mod n = C as C = (M + K) mod n. K is a secret key. The key
> value K = (C - M ) mod n. M is substituted to a new number, C, with an
> additive substitution modulo n. To decrypt, subtract K from the
> ciphertext (the number C ) modulo n as M = C^d mod n = (C - K) mod n =
> M. The mathematical operations of exponentiation to e to encrypt and
> to d to decrypt could be done exactly the same with a secret key K
> added to M modulo n and subtracted from C modulo n.
>
> So yes, implementation here is done with modulo arithmetic, but we can
> describe RSA as a substitution cipher taking plaintext represented as a
> numerical value on a field and substituting a new value on that same
> field for it.
>
> John A. Malley
> [EMAIL PROTECTED]
> >
> > If they are few in number, it might be instructive to study them
> > rather than studying a compendium of techniques that each contain
> > more than one of them.
> >
> > Jim Trek
> > Future Beacon Technology
> > http://eznet.net/~progress
> > [EMAIL PROTECTED]
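The defensible core of Malley's analogy, that RSA encryption is a (huge) substitution table on the residues mod n, can be checked directly for a toy modulus. The parameters below are illustrative only, not secure:

```python
# Checking the substitution view of RSA on a toy modulus: encryption
# x -> x^e mod n is a permutation of Z_n, i.e. one fixed substitution
# table. Parameters n = 33, e = 3, d = 7 are illustrative only.

n, e, d = 33, 3, 7        # 33 = 3 * 11, phi = 20, and 3*7 = 21 = 1 (mod 20)

table = [pow(m, e, n) for m in range(n)]

# Encryption is a bijection on Z_n -- a substitution on the residues.
assert sorted(table) == list(range(n))

# Decryption inverts the substitution.
assert all(pow(c, d, n) == m for m, c in enumerate(table))
```

(The additive rewrite C = (M + K) mod n gives a different K for each M, so the bijection above seems to be the part of the analogy that holds uniformly.)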
------------------------------
From: [EMAIL PROTECTED] (Paul Schlyter)
Subject: Re: about Openssl
Date: 20 Aug 2000 14:21:28 +0200
In article <w9Dn5.12682$[EMAIL PROTECTED]>,
Dave Fortunato <[EMAIL PROTECTED]> wrote:
> Dave
>
> haifeng <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
>
>> Hello,
>> need help.
>>
>> Who knows OpenSSL 0.9.x?
>> May I use it to get a "CA"? May I use it to do "Certificate Validation"?
>> How do I do it?
>
> On openbsd check
> http://openbsd.documenta.net/issue1/openssl.html
Also check out: http://www.openssl.org
--
================================================================
Paul Schlyter, Swedish Amateur Astronomer's Society (SAAF)
Grev Turegatan 40, S-114 38 Stockholm, SWEDEN
e-mail: pausch at saaf dot se or paul.schlyter at ausys dot se
WWW: http://hotel04.ausys.se/pausch http://welcome.to/pausch
------------------------------
From: [EMAIL PROTECTED]
Subject: Simple Stream Cipher
Date: Sun, 20 Aug 2000 13:21:21 GMT
Instead of just adding (via XOR) the key stream, why not do something
like the pair-wise decorrelation given as F(x) = a.x + b, where (a,b)
are made on the fly from some LFSR or LFG?
It wouldn't be the fastest cipher in the world, but it is really
simple. On byte-sized words you could even do the calculation in Z(257).
Or am I crazy?
Tom
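A minimal sketch of the proposal, with a small LCG standing in for the LFSR/LFG (an assumption; any keystream of pairs would do) and a forced nonzero so each per-byte map is invertible. One caveat worth noting: mod 257 a ciphertext value of 256 can occur, which does not fit in a single byte, so the sketch keeps symbols as ints:

```python
# Sketch of an affine stream cipher: each byte x is enciphered as
# y = (a*x + b) mod 257, with a fresh (a, b) per byte from a keystream
# generator. A toy LCG stands in for the LFSR/LFG (an assumption).

P = 257

def keystream(seed: int):
    """Toy LCG producing (a, b) pairs; a stand-in for a real generator."""
    s = seed
    while True:
        s = (s * 1103515245 + 12345) % (2 ** 31)
        a = s % (P - 1) + 1               # a in 1..256, never 0 mod 257
        s = (s * 1103515245 + 12345) % (2 ** 31)
        b = s % P
        yield a, b

def encrypt(data, seed):
    ks = keystream(seed)
    return [(a * x + b) % P for x, (a, b) in zip(data, ks)]

def decrypt(cipher, seed):
    ks = keystream(seed)
    return [((y - b) * pow(a, -1, P)) % P for y, (a, b) in zip(cipher, ks)]

msg = list(b"attack at dawn")
assert decrypt(encrypt(msg, 42), 42) == msg
```

Since 257 is prime, every nonzero a has an inverse, so decryption always works; the security of the whole thing of course rests entirely on the generator supplying (a, b).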
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************