Cryptography-Digest Digest #639, Volume #13       Tue, 6 Feb 01 06:13:01 EST

Contents:
  Re: A story of distrust .... my ex-mother, Eeva Nuora, Varkaus, Finland 
  Redundancy in algorithms ([EMAIL PROTECTED])
  Re: Do you like playing with numbers? ("Michael Brown")
  Re: lypanov ? (Anh Vu Tran)
  Different cipher type ("Michael Brown")
  Re: On combining permutations and substitutions in encryption (Terry Ritter)
  Re: OverWrite freeware completely removes unwanted files from hard drive (Anthony 
Stephen Szopa)
  Re: OverWrite freeware completely removes unwanted files from hard drive (Anthony 
Stephen Szopa)
  Re: steganography by random phase carrier convolution (Benjamin Goldberg)
  Re: Mathematical concepts ("Kurt Fleißig")
  Re: OverWrite freeware completely removes unwanted files from hard drive (Anthony 
Stephen Szopa)
  Re: On combining permutations and substitutions in encryption (Benjamin Goldberg)

----------------------------------------------------------------------------

Reply-To: [EMAIL PROTECTED]
Crossposted-To: alt.2600,alt.security,comp.security
Subject: Re: A story of distrust .... my ex-mother, Eeva Nuora, Varkaus, Finland 
Date: Fri, 02 Feb 2001 16:51:01 GMT



"Markku J. Saarelainen" wrote:
> 
> Basically, this is the story of distrust.
> 
<snip>


Is this relevant to any of the groups you posted it to?

Fascinating story, should I care?

StanMann

------------------------------

From: [EMAIL PROTECTED]
Subject: Redundancy in algorithms
Date: Tue, 06 Feb 2001 08:30:03 GMT

Some algorithms are described as being non-redundant. What does that
mean?

Does it refer to not using the same procedure several rounds?

Could it mean that the system using the algorithm doesn't check for
errors and therefore doesn't re-send data?

Could someone point out some known algorithms belonging to either
group, please?

Cheers,
/Pteppic


Sent via Deja.com
http://www.deja.com/

------------------------------

From: "Michael Brown" <[EMAIL PROTECTED]>
Subject: Re: Do you like playing with numbers?
Date: Tue, 6 Feb 2001 21:50:45 +1300


"John Savard" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> On Sat, 03 Feb 2001 21:44:09 GMT, Tom St Denis <[EMAIL PROTECTED]>
> wrote, in part:
>
> >Who cares?
>
> Presumably, whoever wants to crack the copy protection on the software
> for which those are serial numbers and activation keys...
Still, it's a pretty dumb activation algorithm that has this kind of
expansion. 4 bits to 9 bits. Geez.

>
> at least, that's what these tables tend to appear like, even if this
> time it really is a challenge puzzle, as claimed.
>
> John Savard
> http://home.ecn.ab.ca/~jsavard/crypto.htm

Michael
--
Code snippet 1 : Fibonacci fill
Stats:
  In  : esi = destination address, ecx = number of numbers / 2
  Out : esi,eax,ebx,ecx destroyed. [esi] = 1,2,3,5,8...
  Time: 2.5 clocks per Fibonacci number + 1 clock initialisation
Code (replace ";" with newline):
  mov eax,1;mov ebx,1;L1:mov [esi],eax;add ebx,eax;add esi,4;
  mov [esi],ebx;add eax,ebx;add esi,4;dec ecx;jnz L1
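For readers who don't read x86, the fill in the signature above is equivalent to this Python sketch (hypothetical; it returns the values rather than writing them to the destination address in esi):

```python
def fib_fill(count_pairs):
    """Return the sequence the asm writes: 1, 2, 3, 5, 8, ...
    Two Fibonacci numbers are produced per loop iteration,
    mirroring the eax/ebx pair in the assembly (ecx = pairs)."""
    out = []
    a, b = 1, 1          # mov eax,1 / mov ebx,1
    for _ in range(count_pairs):
        out.append(a)    # mov [esi],eax
        b += a           # add ebx,eax
        out.append(b)    # mov [esi],ebx
        a += b           # add eax,ebx
    return out
```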



------------------------------

From: Anh Vu Tran <[EMAIL PROTECTED]>
Subject: Re: lypanov ?
Date: Tue, 06 Feb 2001 09:43:23 +0100

Thank you very much, that's exactly what I'm looking for.

John Savard wrote :
> 
> On Mon, 05 Feb 2001 10:45:22 +0100, Anh Vu Tran
> <[EMAIL PROTECTED]> wrote, in part:
> 
> >I would like information about lypanov; I don't know anything about it.
> >It is said to decrypt ciphers containing a and b like this:
> 
> >abbab aaabb baabb....
> 
> >Am I right ?
> 
> Lyapunov?
> 
> http://spanky.triumf.ca/www/fractint/lyapunov_type.html
> 
> has something about patterns of a's and b's being used to produce
> designs called Lyapunov fractals.
> 
> http://wilson.simplenet.com/chaos/lyapunov/
> 
> has a bit more about it.
> 
> So I don't think it will really help in cracking a Baconian cipher...
> 
> John Savard
> http://home.ecn.ab.ca/~jsavard/crypto.htm
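As an aside, five-symbol groups of a's and b's are the signature of a Baconian cipher, mentioned above. A minimal decoder sketch (hypothetical; it uses the simple 26-letter variant, whereas Bacon's classical 24-letter alphabet merges I/J and U/V):

```python
def bacon_decode(text):
    """Decode a Baconian cipher: each letter is a 5-symbol
    group of 'a'/'b' read as a binary number (a=0, b=1),
    26-letter variant (A=aaaaa ... Z=bbaab)."""
    groups = text.replace(" ", "")
    letters = []
    for i in range(0, len(groups), 5):
        chunk = groups[i:i + 5]
        if len(chunk) < 5:
            break  # ignore a trailing partial group
        index = int(chunk.replace("a", "0").replace("b", "1"), 2)
        letters.append(chr(ord("A") + index))
    return "".join(letters)
```

Applied to the groups in the original question, "abbab aaabb baabb" decodes to "NDT".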

------------------------------

From: "Michael Brown" <[EMAIL PROTECTED]>
Subject: Different cipher type
Date: Tue, 6 Feb 2001 22:16:10 +1300

Hi there,

I mentioned this a while ago, but never really followed up on it. The idea
was to have a cipher where instead of each bit in a key representing a
number (bad description, but it's hard to cover DES and RSA keys under one
word :), it represented an instruction. For example (almost certainly
insecure or otherwise flawed, but you'll get the idea), use 3 bits to
select one of 8 "instructions", and, say, 5 bits for the "parameter"
to that instruction. These instructions could be (with n being the
parameter):
0 = xor with <n> preceding plaintext bytes
1 = xor with <n> preceding ciphertext bytes
2 = rotate left n bits
3 = rotate right n bits
4 = add <n> preceding plaintext bytes
5 = add <n> preceding ciphertext bytes
6 = negate
7 = nop

The last two are stupid, but you get the idea. After a certain number of
bytes (even maybe 1) the key would be mashed somehow (either by encrypting
it with itself, or rotating the whole key left 1 bit, or ...). The other
thing that would be needed would be, in this case, 256 bits of information
to go before the start of the main file. This is so that the encryption
would have time to "warm up", so to speak. Hey, you could even use the key
itself (or a derivative) as the warm-up code.
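A minimal sketch of the idea (hypothetical Python; it implements only the xor and rotate instructions from the list, reads "xor with <n> preceding bytes" as "xor with the byte n positions back", and stands in for the key "mash" with a simple wrap-around, all assumptions):

```python
def toy_encrypt(key_bits, plaintext):
    """Key-as-program cipher sketch: the key is consumed as
    (3-bit opcode, 5-bit parameter) pairs, one pair per byte.
    key_bits is a '0'/'1' string whose length is a multiple
    of 8, so a pair never straddles the wrap-around point."""
    assert len(key_bits) % 8 == 0
    ct, pos = [], 0
    for i, p in enumerate(plaintext):
        op = int(key_bits[pos:pos + 3], 2)
        n = int(key_bits[pos + 3:pos + 8], 2)
        pos = (pos + 8) % len(key_bits)  # stand-in for the key "mash"
        c = p
        if op == 0 and 0 < n <= i:      # xor with plaintext byte n back
            c ^= plaintext[i - n]
        elif op == 1 and 0 < n <= i:    # xor with ciphertext byte n back
            c ^= ct[i - n]
        elif op == 2 and n % 8:         # rotate left n bits
            r = n % 8
            c = ((c << r) | (c >> (8 - r))) & 0xFF
        elif op == 3 and n % 8:         # rotate right n bits
            r = n % 8
            c = ((c >> r) | (c << (8 - r))) & 0xFF
        # opcodes 4-7 (add, negate, nop) left out of the sketch
        ct.append(c)
    return bytes(ct)
```

With this reading, a known plaintext/ciphertext pair lets an attacker test opcode/parameter guesses one byte at a time, which is one concrete form of the known-plaintext problem.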

Anyhow, how secure would this sort of thing be, and how the heck would you
attack it (besides brute force, of course :)? Known plaintext could be a
problem.

Just an idea that returned to my head from places unknown.

Thanks in advance,
Michael

--
Fibonacci fill
Stats:
  In  : esi = destination address, ecx = number of numbers / 2
  Out : esi,eax,ebx,ecx destroyed. [esi] = 1,2,3,5,8...
  Time: 2.5 clocks per Fibonacci number + 1 clock initialisation
Code (replace ";" with newline):
  mov eax,1;mov ebx,1;L1:mov [esi],eax;add ebx,eax;add esi,4;
  mov [esi],ebx;add eax,ebx;add esi,4;dec ecx;jnz L1



------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: On combining permutations and substitutions in encryption
Date: Tue, 06 Feb 2001 09:19:42 GMT


On Sun, 04 Feb 2001 23:08:05 +0100, in
<[EMAIL PROTECTED]>, in sci.crypt Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>[...]
>It is interesting that you view whole file processing as
>variable-size block ciphers (in view of the fact that
>you have some patents on variable-size block ciphers,
>if I remember what you wrote previously correctly).

The cipher design is what it is.  The patent is what it is.  This is
not really a viewpoint issue.  


>[...]
>As far as I have seen, nobody has argued against any claim
>of the sort 'attacking DT is very complex'. So the many
>disputes about DT in a number of threads hitherto have
>constituted 'much ado about nothing', or have there been some
>real causes that render disputes unavoidable?

I guess that depends on your idea of a "real cause."

What I said was distorted and exaggerated, and then addressed as
though that was what I actually said.  That would seem to be ample
cause for dispute.

Then there was the peculiar situation of people telling me what I
really meant, and criticizing that.


The Proof

As one issue, there was considerable dismay at my statement that
ciphering structures could exist for which one might find mathematical
proofs that certain attacks could not work in practice.  

Various lengthy, strong, and particularly condescending statements
were made to the effect that anyone with any knowledge of modern
cryptography would know that such a thing had been proven impossible.

But proofs do not come without assumptions, and recognizing and
meeting such limits is part of the ordinary business of mathematics.
The inability to know when a proof does not apply presumably speaks
volumes.  

In this case I was fortunate to find a contradiction which showed that
no such proof could apply to the discussion.  But if I had not found a
contradiction, my claims still would have been valid.  Yet the
improper use of mathematical proof probably would have convinced
almost everybody of something which is now known to be false.  I would
call that "cause for dispute."  

We now know that it *is* possible to have mathematical proofs which
show that particular ciphers are not vulnerable to certain types of
attack in practice.  Since some ciphers will support such proof better
than others, there would seem to be ample reason for cryptographers to
design a wide range of fundamentally new ciphering structures.


The OTP

Then we had some strange horror over the realization that, in
practice, an "OTP" can be insecure.  There was some sort of statement
about it being "widely accepted" that a physical RNG could not be
predicted, which is just twaddle: ample possibilities for error exist
at multiple levels in most physically-random generators.  

The classic "OTP" has the structure of the weakest possible stream
cipher, and so is trivially vulnerable to sequence defects.  A good
argument can be made that, in practice, a cipher using a modern
combiner which does not expose the keying sequence is less risky than
a classic "OTP."

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: Anthony Stephen Szopa <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto,alt.hacker,alt.conspiracy
Subject: Re: OverWrite freeware completely removes unwanted files from hard drive
Date: Tue, 06 Feb 2001 02:31:12 -0800

Tom St Denis wrote:
> 
> In article <[EMAIL PROTECTED]>,
>   Anthony Stephen Szopa <[EMAIL PROTECTED]> wrote:
> > OverWrite freeware completely removes unwanted files from hard drive
> >
> > OverWrite Program: incorporates the latest recommended file
> > overwriting techniques. State-of-the-art detection technology and
> > the subtleties of hard drive technology have made most overwritten
> > and deleted data on magnetic media recoverable. Simply overwriting
> > a file a few times is just not good enough.
> 
> I would argue on super-dense HDs that simply writing FF to the file is
> enough.  Not a lot of snoopers have the time to break out the ol'
> electron microscope and read bits "The HardWay (tm)".  If I overwrite
> the file with FF the OS doesn't keep a backup (or shouldn't), thus
> mission accomplished: the file is wiped.
> 
> Tom
> 
> Sent via Deja.com
> http://www.deja.com/


Your point is well made, except that there continue to be 
vulnerable tracking variations even on modern hard drives.

And they do not use electron microscopes for this purpose.

Pointing out these facts should lead the average person of even 
common intelligence to question your grasp of the facts and 
conclusions as I have.

------------------------------

From: Anthony Stephen Szopa <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto,alt.hacker,alt.conspiracy
Subject: Re: OverWrite freeware completely removes unwanted files from hard drive
Date: Tue, 06 Feb 2001 02:33:28 -0800

Daniel wrote:
> 
> On Mon, 05 Feb 2001 16:57:13 GMT, Tom St Denis <[EMAIL PROTECTED]>
> wrote:
> 
> >
> >I would argue on super-dense HDs that simply writing FF to the file is
> >enough.  Not a lot of snoopers have the time to break out the ol'
> >electron microscope and read bits "The HardWay (tm)".  If I overwrite
> >the file with FF the OS doesn't keep a backup (or shouldn't), thus
> >mission accomplished: the file is wiped.
> >
> >Tom
> >
> Let us not forget what it would cost to have a HardDisk scanned up to
> 11 layers deep.  Usually, those HD which contained "critical
> information" but are no longer used are destroyed (mechanical + heat).
> That's the only assuring way :)
> 
> best regards,
> 
> Daniel


What are you talking about: "11 layers deep"?

Don't be ridiculous.

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: steganography by random phase carrier convolution
Date: Tue, 06 Feb 2001 10:37:50 GMT

Splaat23 wrote:
> In article <[EMAIL PROTECTED]>,
>   [EMAIL PROTECTED] (John Bailey) wrote:
> > http://www.rochesterbusinessnews.com/
> > The Rochester (NY) Democrat and Chronicle mentioned tech briefs to
> > be put on the web by Kodak, "The first installment, released
> > yesterday, highlights Kodak's work to make digital imaging
> > copyrights more secure and in developing new screens for consumer
> > electronics." Following a Byzantine trail of links I finally came to
> > this article (pdf) which may be of interest to this newsgroup.
> >
> > Data Embedding Using Phase Dispersion
> > Chris Honsinger and Majid Rabbani
> >
> > Abstract
> > A method of data embedding based on the convolution of message data
> > with a random phase carrier is presented. The theory behind this
> > method is reviewed and it is shown that the technique can be used to
> > hide both pictorial and non-pictorial data. The details of the
> > procedures used for carrier design, message template optimization,
> > message extraction optimization, block synchronization, and rotation
> > and scale correction are discussed. Finally, the algorithm's
> > benchmark results using Stirmark are presented.
> > Full text at:
> > http://www.kodak.com/US/plugins/acrobat/en/corp/
> >     researchDevelopment/dataEmbedding.pdf
> >
> 
> It doesn't appear to be that spectacular (from my first look). It's
> just watermarking with some resistance to filtering. It doesn't solve
> the one problem with (image) stenography - if you can get the original
> image, you can find that a difference exists and detect the
> transmission of data.
> 
> As with all watermarking solutions that run on computers under the
> control of the user, it will not function as a rights management
> package. The same data that would allow the information to be retrieved
> can be used to erase or mangle the watermark enough to disable it.
> 
> As far as I can tell, this is not that interesting.

If removing the watermark degrades the quality of the transfer medium
(the image) below what is acceptable, then it is indeed interesting.

If that is so (and I don't know whether or not it is), then you would
have a kind of watermark that "can't be removed" and thus might be
usable as a form of copy protection -- you sell each user data with
different watermarked info, and if the user pirates it and resells it or
puts it up for free, then you can identify which user did it and take
legal action.

-- 
A solution in hand is worth two in the book.
Who cares about birds and bushes?

------------------------------

From: "Kurt Fleißig" <[EMAIL PROTECTED]>
Subject: Re: Mathematical concepts
Date: Tue, 06 Feb 2001 10:34:10 GMT


Joris Vankerschaver ha scritto nel messaggio
<91ps3c$m10$[EMAIL PROTECTED]>...
>Paul Rubin ([EMAIL PROTECTED]) wrote:
>
>: Better look at it before you buy it.  It is a graduate level text.
>: If you've just had some undergraduate courses in algebra and number
>: theory, Cohen's book will be very difficult.  But it is a great book.
>
>OK, what about the Koblitz book then (Algebraic Aspects of Cryptography)?
>
>Joris

To avoid both copyright violations and subjective evaluations, here is a
contents summary:

Chapter 1 - Cryptography
Cryptography: the beginning, Public Key Systems, RSA, Diffie-Hellman...
Signature, Secret Sharing... Coin flipping, Passwords..., Practical
CryptoSystems....

Chapter 2 - Computational Complexity
Big O notation, Length of Numbers..., Time..., P=NP?..., Random and
complexity...,...

Chapter 3 - Algebra
Fields and finite fields, Euclid's Algo..., P-Rings, Gröbner...

Chapter 4 - Hidden Monomial CrySys
Matsumoto, Dragon and others

Chapter 5 - Combinatorial Algebraic CrySys
Brassard, Concrete... basic... ideal, attacks with linear algebra, secure
sys

Chapter 6 - Elliptic and HyperEll. CrySys

Appendix


All in 201 pages with exercises.

Ciao
KF




------------------------------

From: Anthony Stephen Szopa <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto,alt.hacker,alt.conspiracy
Subject: Re: OverWrite freeware completely removes unwanted files from hard drive
Date: Tue, 06 Feb 2001 02:45:17 -0800

Joseph Ashwood wrote:
> 
> Not to address Szopa (because he's not worth it), but instead to address his
> misinformation.
> 
> It has been proven many a time, that if you overwrite with known patterns,
> those patterns can be stripped away, since Szopa has been so kind as to make
> all the information about what he overwrites with available, those patterns
> can be very easily, very systematically removed, and the original file can
> be recovered. It's not a cheap process, but they can be recovered.
> 
> What is actually needed is to detect the type of media, be it high-density
> magnetic media, optical media or RAM-based media and treat them accordingly.
> Each of these have different requirements for proper erasure. For all these
> media types obviously massive temperatures are the optimal solution, however
> that has a strong tendency to destroy the media in the process. Beyond that
> extreme level they vary greatly.
> 
> Magnetic media is the usual target so I will begin with that. To fully erase
> something off of magnetic media requires the successive overwriting of it
> with random data (as noted above all known patterns can be stripped away
> unless extreme temperatures have been applied). Once this has been done the
> odds of recovery of each bit drop to roughly 1/2^n where n is the number
> of overwrites, the primary change comes from the difference in data density
> of various drives (an old 5 MB 6 foot wide single platter drive will be far
> more recoverable than IBMs 30 GB minidrive).
> 
> Optical media comes in two very different types with differing needs. Both
> actually have very similar behavior for secure deletion. Because they
> commonly lack sufficient density to justify the precision needed to properly
> overwrite, and because an overwrite partially destroys the disk (making an
> overwrite-based secure delete destructive), I strongly recommend the
> excessive-temperature methodology.
> 
> RAM media is its own interesting problem. It is commonly believed that RAM
> cannot be dependably recovered. This is not true, however it is by far the
> most difficult to recover, and also the most expensive. It also disobeys the
> rules the other two obey. Most importantly the ability to recover is
> affected by the amount of time the value has been in the RAM, and how long
> since it was removed. This stems from the fact that the presence of the
> magnetic field created by charge in the RAM causes stresses in the
> surrounding environment, with different stresses applying to different
> values. Because of this the idea is to destroy evidence of the stress in the
> RAM. To do this what is required is far different. It is still required that
> the numbers be truly random, otherwise the stresses can again be observed
> with some amount of transparency. However, instead of writing different
> values as fast as possible to erase it as quickly as possible, you
> need to write and wait for the stress to affect the environment. This stress
> becomes unreadable much faster than the magnetic media becomes unreadable,
> with the stress dropping away in under a second from detectable to virtually
> undetectable. It's also worth noting that although the magnetic media
> overwrite can be relatively easily negated if the values are known, with
> the RAM modules it creates less than a 50% time difference.
> 
> These differences should be of extreme importance to anyone who places their
> lives in the hands of a secure delete program.
>                                     Joe
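The multi-pass random overwrite described in the quoted post can be sketched roughly as follows (a simplified, hypothetical Python sketch; real tools must also deal with filesystem journaling, slack space, wear levelling, and remapped sectors, none of which this touches):

```python
import os

def overwrite_file(path, passes=3):
    """Overwrite a file in place with random data for several
    passes, sync each pass to the device, then truncate and
    delete it. Only a sketch: copies may survive elsewhere
    (journal, remapped sectors, backups)."""
    length = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = length
            while remaining > 0:
                chunk = min(remaining, 64 * 1024)
                f.write(os.urandom(chunk))  # random, not a fixed pattern
                remaining -= chunk
            f.flush()
            os.fsync(f.fileno())  # push this pass to the device
        f.seek(0)
        f.truncate()
    os.remove(path)
```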


You are nuts.

"...those patterns can be stripped away..."

How?  With cleanser and lots of elbow grease.  What are you talking
about here?  What utter BS.

I suppose we have your good word on this.

Then tell us, what are the forces involved in placing the original
magnetic orientation to the patch of hard disk space with the 
original data bit?  What was the amount of force used to place the 
bit or bits previous to this? What are the forces involved in 
writing 27 other bits of data over this original and all previous 
data bits written before the overwrite was begun?  What are the
variations of this force with each write operation?

I don't expect you have such answers because you are just winging it
with all your BS.

Let's say that the bit in question was the 100th bit written to this
small space on the hard drive since it was manufactured.  Then the
original data sought after was then written.  Now it is overwritten
with 27 other bits and let's say you know exactly what these last 27
bits were.

Tell us, give us an order of magnitude value for the residual 
magnetic force from the original bit in question and how does this
magnetic force compare to the 99 previous bit residual magnetic 
fields and the subsequent 27?

I have just got to hear this!

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: On combining permutations and substitutions in encryption
Date: Tue, 06 Feb 2001 10:56:03 GMT

Bryan Olson wrote:
> 
> Benjamin Goldberg wrote:
> > David Wagner wrote:
> [...]
> > > To be more precise, you need to give me a family of ciphers {C_k},
> > > one for each value of k.  Think of k as a security parameter: The
> > > running time of C_k must be polynomial in k, whereas the security
> > > condition is that breaking C_k should be super-polynomial in k.
> > > If there exists a family of ciphers where these two conditions
> > > are met, then P != NP.

Here's something I've just thought of:  Suppose the family of ciphers is
Rijndael, with k being the number of key bits.

> >
> > What if k is the number of bits in the key?  Are you saying
> > that if P=NP, then a system which enciphers with a k bit key
> > must either take superpolynomial time (in terms of k) to encrypt
> > or else take polynomial time (in terms of k) to break?
> >
> > Somehow, I doubt that.
> 
> Well, it's certainly not an obvious result.
> 
> > Let's suppose that we have an encryption system with a size
> > k key, and unicity distance u.  Further suppose that we have
> > a way of converting a plaintext/ciphertext pair into a 3SAT
> > problem -- and that we have u pairs.
> >
> > I'm sure you'll agree that if P!=NP, then it takes
> > superpolynomial time (with respect to the number of 3SAT
> > terms) to solve this 3SAT problem,
> 
> No.  If P!=NP then the worst case of 3SAT is super-polynomial.
> That does not imply the subset of 3SAT produced by reduction
> from the cipher input/output will be super-polynomial.

My mistake.

> > and if P=NP, then it takes polynomial time.
> 
> Right.
> 
> > Furthermore, our conversion algorithm might be made to take
> > polynomial time (with respect to the number of 3SAT terms)
> [...]
> > But there's one [big] problem with all this.  How many terms are
> > in the 3SAT problem?  If the number of terms is superpolynomial
> > wrt k
> 
> The reduction time is polynomial in the size of input.

What do you mean the size of the input?

Or, to put it another way, suppose you have AES with a 192 bit key. 
Write the relationship between the key bits, the plaintext bits, and the
ciphertext bits as a 3SAT problem.  Does this conversion take polynomial
in terms of 192, or superpolynomial in terms of 192?  (I think you've
said the conversion takes polynomial time in terms of 192, but I'm not
sure.)  If it takes polynomial time to convert, then there cannot be
more than polynomial terms in the 3SAT problem.  If it takes
superpolynomial time to convert, then there may [or may not] be a
superpolynomial number of terms.

If conversion takes polynomial time, and P=NP, [solving the 3SAT takes
polynomial time], then breaking any block cipher takes polynomial time
in terms of the keysize.
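For intuition on why the conversion cannot blow up: a Tseitin-style reduction emits a constant number of clauses per gate, so a circuit with polynomially many gates (any polynomial-time cipher) yields polynomially many 3SAT terms. A sketch for a single AND gate (hypothetical helper; literals are signed integers, negative meaning negated):

```python
def and_gate_clauses(a, b, out):
    """Tseitin encoding of out = (a AND b) as CNF clauses.
    Each clause is a tuple of literals. A circuit of g gates
    yields O(g) clauses, so both the conversion time and the
    formula size are linear in the circuit size."""
    return [(-a, -b, out), (a, -out), (b, -out)]
```

Repeating this per gate of, say, the AES circuit gives a formula whose size, and hence construction time, is polynomial in the key and block size.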

-- 
A solution in hand is worth two in the book.
Who cares about birds and bushes?

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
