Cryptography-Digest Digest #65, Volume #11        Mon, 7 Feb 00 17:13:01 EST

Contents:
  Re: Prior art in science (wtshaw)
  Re: Prior art in science (Mok-Kong Shen)
  Re: How secure is this method? (Mok-Kong Shen)
  Message to SCOTT19U.ZIP_GUY ([EMAIL PROTECTED])
  Re: Hill Climbing ("Tony T. Warnock")
  Re: Combining LFSR's ("Trevor Jackson, III")
  Re: Factorization ("Tony T. Warnock")
  Anti-crack (CJ)
  Re: Prior art in science (Mok-Kong Shen)
  Re: Merkle hash tree patent expired (Anton Stiglic)
  Re: Anti-crack (Arthur Dardia)
  Re: Need help for security program (Arthur Dardia)
  Re: NIST, AES at RSA conference (Terry Ritter)
  Re: NIST, AES at RSA conference (Terry Ritter)
  Re: NIST, AES at RSA conference (Terry Ritter)
  Re: NIST, AES at RSA conference (Terry Ritter)
  Re: Prior art in science (Terry Ritter)
  Re: NIST, AES at RSA conference ("Douglas A. Gwyn")
  Re: Anti-crack ("Kurt Van Nuggat")
  Re: Hill Climbing ("Douglas A. Gwyn")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Prior art in science
Date: Mon, 07 Feb 2000 12:53:47 -0600

In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

....
> (That an employee at the patent office may not have the incentive or
> energy to perform similar tasks is well understandable.) So in some 
> sense it appears paradoxically that, while we acquire more knowledge
> every day, we know less at the same time.
> 

Those that pretend to be protected from the internet revolution simply
because it is not convenient for them are not to be sheltered. 

To count as published only those journals sitting in some dusty
warehouse, which many do not have access to, is simply to qualify
yourself as inept; that, to varying degrees, is what the internet
revolution has done to many of us.

Things that might be getting ahead of us are best attacked with methods
from the same technology, search engines, whose capabilities become by
definition those of their users.
-- 
Life is full of upturns and downturns, with varying periods of 
stability mixed in.  It is a fool's errand to assume that what is 
happening on any one day predicts a constant future.

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Prior art in science
Date: Mon, 07 Feb 2000 21:12:25 +0100

Paul Koning wrote:
> 
> Mok-Kong Shen wrote:
> >
> > The issue of 'prior art' is not only relevant in patent applications
> > but also of interest by itself in general in science, I suppose.
> > Recently in a thread it has been pointed out that what has been
> > published in newsgroups and similar forums possibly may not
> > qualify as 'prior art' because of limited possibilities of being
> > found in searches.
> 
> That's nonsense.
> 
> First of all, of course you can find it in searches.  But
> whether or not it's easy to find doesn't change whether it
> is "prior art".

Perhaps the cited opinion (not mine!) could nevertheless be
defended somewhat.  If some information is very troublesome to
retrieve, then it will normally not be found, either because the
searcher becomes impatient or because his resources (time, etc.)
are exhausted.  It is known, for example, that decades ago lots of
Russian scientific results were barely known to western scientists,
either because the journals concerned were simply not subscribed to
by the libraries or because the scientists couldn't read them.
Nowadays the situation has changed to such an extent that one friend
of mine once remarked that there are far too many translations of
Russian journals (meaning that even the second-class ones get
translated into English).

> The patent office may not spot it, of course; these days,
> almost anything seems to be accepted as a valid patent application,
> in spite of the "non-obvious and novel" requirement.  :-(

In Germany the situation is a little bit better.  There is a
public review period for all patent applications.  However, only
large firms that can afford to maintain special patent departments
regularly keep an eye on the patent office bulletins that solicit
public reviews, and the public itself is unfortunately generally
much less attentive to the materials contained in such publications.

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: How secure is this method?
Date: Mon, 07 Feb 2000 21:12:33 +0100

Sandy Harris wrote:
> 
> [EMAIL PROTECTED] (Erik) spake thus:

> >and is the linear congruence algorithm sufficient for this purpose?
> 
> Absolutely not, even if you combine outputs from several of them.

Incidentally, a couple of years ago I published on my web page a
compound PRNG scheme with a (special-case) implementation that
employed exclusively LPRNGs as constituent generators.  If someone
has 'concrete' (as opposed to theoretically postulated) ideas of how
to attack that generator effectively, I should be very grateful
to learn of them.
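
For concreteness, here is a generic Python sketch of the compounding
idea (it is not the exact scheme on the page above; the LCG constants
are the well-known Numerical Recipes and Borland parameters, and the
combining step is purely illustrative):

    class LCG:
        """Ordinary linear congruential generator: x -> (a*x + c) mod m."""
        def __init__(self, seed, a, c, m):
            self.x, self.a, self.c, self.m = seed, a, c, m
        def next(self):
            self.x = (self.a * self.x + self.c) % self.m
            return self.x

    def compound(gens):
        """Combine the constituent generators by summing outputs mod 2**32."""
        while True:
            yield sum(g.next() for g in gens) % 2**32

    gens = [LCG(12345, 1664525, 1013904223, 2**32),   # Numerical Recipes constants
            LCG(67890, 22695477, 1, 2**32)]           # Borland-style constants
    stream = compound(gens)
    print([next(stream) for _ in range(4)])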

M. K. Shen
=============================
http://home.t-online.de/home/mok-kong.shen

------------------------------

From: [EMAIL PROTECTED]
Subject: Message to SCOTT19U.ZIP_GUY
Date: Mon, 07 Feb 2000 20:10:40 GMT

Hi...

I sent you an email about your Screaming Method....no reply....Hope you
are still around....???

Perhaps you can explain your 2-compression, 3-encryption method...sounds
very complicated and CPU-consuming....

Steve


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: Hill Climbing
Date: Mon, 07 Feb 2000 13:25:26 -0700
Reply-To: [EMAIL PROTECTED]

In cryptography one hopes that the natural topology, based for example
on add, subtract, multiply, divide, shift, XOR, AND, OR, etc., does not
reflect what is really happening.  For example, in DES there is no
obvious relationship between what happens to a given plaintext under
keys that differ in one bit.
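
As a quick numerical illustration (assuming the pycryptodome package,
i.e. pip install pycryptodome): flip one key bit and count how many
ciphertext bits change for the same plaintext block.

    from Crypto.Cipher import DES

    def hamming(a, b):
        return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

    plaintext = b"8BYTEMSG"
    key1 = bytearray(b"SECRET_K")
    key2 = bytearray(key1)
    key2[0] ^= 0x02            # flip one real (non-parity) key bit

    c1 = DES.new(bytes(key1), DES.MODE_ECB).encrypt(plaintext)
    c2 = DES.new(bytes(key2), DES.MODE_ECB).encrypt(plaintext)
    print(hamming(c1, c2), "of 64 ciphertext bits differ")   # typically about 32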


------------------------------

Date: Mon, 07 Feb 2000 15:33:06 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Combining LFSR's

Ben Curley wrote:

> Hi all,
>
> This is probably a stupid question, but here goes...
>
> I am attempting to combine the output of two LFSR's to produce a repeatable
> key stream when the start state of LFSR 1 is random. Is this even possible?

Yes it is possible.

If the LFSRs are the same size and structure, then there are only 2^WIDTH
possible output cycles of the combined registers.  For an initial state of R1
there is a corresponding state for R2 that produces the original combined
output stream.

For more information on calculating the initial value of a register from its
output, look up the Berlekamp-Massey algorithm.
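
A small Python sketch of why this works when the combining function is
XOR (the register width, taps, and seeds below are just an example):
because both registers satisfy the same linear recurrence, the XOR of
their output streams is itself the output of the same LFSR seeded with
the XOR of the two initial states -- which is also why Berlekamp-Massey
can recover the whole thing from a short stretch of output.

    def lfsr(state, taps, n):
        """Fibonacci LFSR: output the last bit, shift in the XOR of the taps."""
        state = list(state)
        out = []
        for _ in range(n):
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[t]
            state = [fb] + state[:-1]
        return out

    taps = [0, 3]                           # example width-4 register
    s1, s2 = [1, 0, 0, 1], [0, 1, 1, 1]
    combined = [a ^ b for a, b in zip(lfsr(s1, taps, 20), lfsr(s2, taps, 20))]

    s3 = [a ^ b for a, b in zip(s1, s2)]    # the "corresponding state"
    assert combined == lfsr(s3, taps, 20)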



------------------------------

From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: Factorization
Date: Mon, 07 Feb 2000 13:26:36 -0700
Reply-To: [EMAIL PROTECTED]

Notice how slow Pollard rho is. (The one I ran.)
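
For anyone who wants to time it themselves, here is a bare-bones
Pollard rho in Python (Floyd cycle-finding with a random polynomial
constant; not tuned, and it only returns one factor of a composite):

    from math import gcd
    import random

    def pollard_rho(n):
        """Return a nontrivial factor of the composite n."""
        if n % 2 == 0:
            return 2
        while True:
            x = y = random.randrange(2, n)
            c = random.randrange(1, n)
            d = 1
            while d == 1:
                x = (x * x + c) % n          # tortoise: one step
                y = (y * y + c) % n
                y = (y * y + c) % n          # hare: two steps
                d = gcd(abs(x - y), n)
            if d != n:
                return d

    print(pollard_rho(8051))    # 8051 = 83 * 97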


------------------------------

From: [EMAIL PROTECTED] (CJ)
Subject: Anti-crack
Date: Mon, 07 Feb 2000 20:18:03 GMT

Has anyone researched means of protecting
programs from being cracked with encryption?

I'm not an expert in either area, but what I understand
of cracking is that ultimately you are looking for the
machine instruction in the executable which compares
password/serial number etc. to some given
value. 

So I was thinking one could maybe encrypt this piece
of the executable and decrypt on the fly when the application
starts. You might be able to trace the decryption and try to
spot the key used, but that would be more difficult (esp.
as you wouldn't know what algorithm is used).

I'm not sure one can prevent direct tracing of the executable
code once it has been decrypted however. (I was thinking
maybe having it in a DLL, but this is maybe traceable too.)
Are there any better ways?
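
In miniature, the idea looks something like this (Python for clarity,
with a toy single-byte XOR standing in for a real cipher and a made-up
serial check; a real implementation would decrypt native code sections
in place, and tracing the code after decryption is still possible):

    import marshal

    KEY = 0x5A                      # toy single-byte XOR "key", illustrative only

    def xor_bytes(data, key):
        return bytes(b ^ key for b in data)

    # Build time: compile the sensitive check and store it "encrypted".
    # (marshal output is Python-version specific; fine for a demo.)
    src = "def check(serial):\n    return serial == 'ABCD-1234'\n"
    blob = xor_bytes(marshal.dumps(compile(src, "<protected>", "exec")), KEY)

    # Run time: decrypt on the fly, execute, and call the recovered routine.
    ns = {}
    exec(marshal.loads(xor_bytes(blob, KEY)), ns)
    print(ns["check"]("ABCD-1234"))     # True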

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Prior art in science
Date: Mon, 07 Feb 2000 21:47:22 +0100

wtshaw wrote:
> 
> Things that might be getting ahead of us are best attacked with methods
> from the same technology, search engines, whose capabilities become by
> definition those of their users.

Search engines by themselves do not constitute an appropriate
solution in the long term, in my humble opinion, for their efficiency
naturally falls as the total volume of information to be searched
increases (over-exponentially) with time.  Concerning materials
posted in internet forums, I conjecture that one useful solution
could be for the initiators of some (apparently fruitful) threads to
write, at the end of the discussion periods, summaries together with
classification numbers/keywords of the contents (possibly
modified/finalized with the help of certain participants) and to have
these summaries archived at certain locations for purposes of search
(or at least for a preliminary run of such a search).  That way, the
retrieval efficiency could evidently be much enhanced.  Conversely,
it would be very good if it could be arranged that patent applicants
announce their submissions to the relevant newsgroups and mailing
lists, so that the applications would not escape the
attention/scrutiny of those people who have a regular interest in the
scientific fields concerned.

M. K. Shen

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Merkle hash tree patent expired
Date: Mon, 07 Feb 2000 15:45:40 -0500

Paul Rubin wrote:

> Hey, nobody seems to have mentioned it here, but apparently US patent
> 4,309,569 expired on September 5, 1999.  This gives a somewhat clunky
> way of doing digital signatures using only conventional hash functions,
> no modular exponentiations or elliptic curves or other fancy math.

I guess the next big one is RSA, September 20, 2000.
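
For reference, the hash-tree part of the construction is tiny; a
generic Python sketch (using SHA-256, not whatever the patent text
specifies) of computing a Merkle root over a list of leaves:

    import hashlib

    def merkle_root(leaves):
        """Hash the leaves, then hash adjacent pairs repeatedly up to one root."""
        level = [hashlib.sha256(x).digest() for x in leaves]
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])         # duplicate the last node if odd
            level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                     for i in range(0, len(level), 2)]
        return level[0]

    print(merkle_root([b"msg0", b"msg1", b"msg2", b"msg3"]).hex())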

Anton


------------------------------

From: Arthur Dardia <[EMAIL PROTECTED]>
Subject: Re: Anti-crack
Date: Mon, 07 Feb 2000 16:04:41 -0500

CJ wrote:

> Has anyone researched means of protecting
> programs from being cracked with encryption?
>
> I'm not an expert in either area, but what I understand
> of cracking is that ultimately you are looking for the
> machine instruction in the executable which compares
> password/serial number etc. to some given
> value.
>
> So I was thinking one could maybe encrypt this piece
> of the executable and decrypt on the fly when the application
> starts. You might be able to trace the decryption and try to
> spot the key used, but that would be more difficult (esp.
> as you wouldn't know what algorithm is used).
>
> I'm not sure one can prevent direct tracing of the executable
> code once it has been decrypted however. (I was thinking
> maybe having it in a DLL, but this is maybe traceable too.)
> Are there any better ways?

Rather than tracing through the encryption/decryption routine, the
cracker could just write a jump command.  At some point you're going to
have to try the key against the encrypted serial number.  By tracing
through the disassembled code, the cracker will be able to see the
routine you used; he can then write a keygen using one of the engines,
taking your encryption code directly from the disassembly file.

To sum it up, every program is crackable; it just takes time, and time
isn't really a factor: you still won't get the registration money, and
the cracker will then want to share the wealth of his efforts with more
people (bragging rights for completing a tough crack).

--
Arthur Dardia      Wayne Hills High School      [EMAIL PROTECTED]
 PGP 6.5.1 Public Key    http://www.webspan.net/~ahdiii/ahdiii.asc



------------------------------

From: Arthur Dardia <[EMAIL PROTECTED]>
Subject: Re: Need help for security program
Date: Mon, 07 Feb 2000 16:07:18 -0500

etbear wrote:

> I'm working on a security program for my computing project, and I need a way
> in which Windows will alert me when my file is being accessed for reading or
> writing. I know there must be some way to do it...but just couldn't find it.
> Can any one give me a brief idea how to do it, so that I can start
> researching on that?
> Thanks

I used to use SoftICE a lot, and I have long forgotten some of the Windows API
calls, such as GetDlgItemA, etc.  Try searching for a list of these on a
cracker's site or asking in a cracking channel.

Once you get this information, just write a program that will run on startup
and do whatever you need any time your file is opened.  Shouldn't be too tough.
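
If API hooking is more than you need, a crude polling loop gets part of
the way there (it only notices writes, since merely reading a file does
not reliably change anything you can poll; the filename below is a
placeholder):

    import os, time

    def watch(path, poll_seconds=1.0):
        """Print a message whenever the file's size or modification time changes."""
        last = os.stat(path)
        while True:
            time.sleep(poll_seconds)
            cur = os.stat(path)
            if (cur.st_mtime, cur.st_size) != (last.st_mtime, last.st_size):
                print(path, "was modified")
                last = cur

    # watch("secret.txt")      # placeholder filename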


--
Arthur Dardia      Wayne Hills High School      [EMAIL PROTECTED]
 PGP 6.5.1 Public Key    http://www.webspan.net/~ahdiii/ahdiii.asc



------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: NIST, AES at RSA conference
Date: Mon, 07 Feb 2000 21:36:37 GMT


This is the 3rd time I have tried to post this message...

On 6 Feb 2000 14:22:13 -0800, in
<87ks6l$esd$[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (David Wagner) wrote:

>In article <[EMAIL PROTECTED]>, Terry Ritter <[EMAIL PROTECTED]> wrote:
>> My statement was and is correct.
>
>I'll repeat my request for a proof, then...

I think I have been more than forthcoming.


>> 1. If the current best attack on an individual cipher is some sort of
>> known-plaintext or defined-plaintext (as is quite likely), and that
>> attack is prevented by multi-ciphering, does that *not* increase the
>> effective strength of the cipher?
>
>Yes, if there are no other attacks that are as good as the one prevented

That was the assumption following my "if."

>(and as long as the modification doesn't introduce any new attacks).
>But how do you prove that these conditions hold?

With respect to known-plaintext and defined-plaintext not being the
easiest attacks, it is hard to imagine how having *less* information
about a particular transformation (a keyed cipher) could possibly
produce a *better* attack on that transformation.  I suppose it
might be possible for some other attack to have the same power, or
even slightly less but practically the same, but I would like to see
some examples.  But this might be a more favorable proof area than the
silly "additional computation" approach.   

It also might be interesting to know how many examples exist where the
best published attack on any particular block cipher is *not* a
known-plaintext or defined-plaintext attack.  


>> 2. If we add a second ciphering action after a first cipher, is it not
>> clear that -- even in the worst possible case of the opponent knowing
>> the key to that cipher -- this will increase the effective strength of
>> the system by the effort involved in deciphering the second cipher?  
>
>No.  It is not at all clear that this will increase the strength one whit.
>See the previously posted counterexample, using XOR.  

First of all, XOR is not a cipher.  

Then, if the cipher in the so-called "counterexample" was intended to
be a one-time pad, the only reason a second level does not increase
strength is that there is no more strength left.  But I find it hard
to take any use of the OTP in these arguments very seriously.  

And if the cipher in the so-called "counterexample" was intended to be
a stream cipher, the argument is simply faulty:  Stream ciphers do
not, in general, form a group.  

So the argument was lame, yet you insisted that it disproves my point.



>More generally, any
>cipher that forms a group is an obvious counterexample.

OK, that's right.  So we need to use ciphers which do not form a
group.  Please consider that assumption added to those following my
"if."  

What percentage of ciphers would you say that leaves out?  


>Once again, I think this point also needs proof, because in its fully
>general form I find it unbelievable, given the apparent counterexamples.

The original counterexample -- which made it unbelievable for you --
was wrong, but it convinced you anyway.  So why bother with "proof"?  


>Again, if these things are so clear, it should be straightforward to
>encode them in a formal (or at least convincing) proof.  But I don't think
>they are clear at all.  That's why I'm challenging anyone who cares to try
>to prove them.

Fine.  Why don't you sketch out two or three approaches which don't
work, and we'll see what can be done to extend them.

This is an issue that apparently concerns you far more than it does
me:  My proposed proof of "increased strength due to multi-ciphering"
represents only the effort involved in executing the cipher.  I still
think this can be shown, but the result is meaningless in practice
anyway. 

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: NIST, AES at RSA conference
Date: Mon, 07 Feb 2000 21:36:46 GMT


On 6 Feb 2000 23:45:40 -0800, in
<87lt74$jgj$[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (David Wagner) wrote:

>In article <[EMAIL PROTECTED]>, Terry Ritter <[EMAIL PROTECTED]> wrote:
>> The provable "increase in strength" to which I refer in the sub-issue
>> is simply the effort involved in executing the additional cipher, even
>> if that is broken or the key known.  Even if the transformation from
>> ciphertext to plaintext for the second cipher is "easy," it still must
>> take place, and that is strength, even if hardly any at all.  
>
>But this is *wrong*, as is implicitly demonstrated by the counterexamples.
>Consider double-encryption with Vigenere (a group).  Deciphering this
>takes *precisely* as much work as deciphering single-Vigenere.
>
>You claimed that the cryptanalyst must take the time to execute both
>components of the multi-cipher to read the traffic.  This is wrong:
>because double-encryption is here equivalent to single-encryption with
>a different key, the cryptanalyst will just do a single decryption,
>executing the cipher only once to read the traffic.  You failed to
>consider the possibility of shortcut attacks that bypass the second
>cipher.

Right.  For provable added strength, the ciphers must not have
"groupiness."  But if we have that, we may be able to say more about
added strength than just an execution time.  
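
To make the Vigenere point concrete, a few lines of Python verify that
two Vigenere passes with equal-length keys collapse into a single pass
under the letterwise sum of the keys (keys and plaintext below are
arbitrary):

    import string

    A = string.ascii_uppercase

    def vigenere(text, key, sign=+1):
        """Encrypt (sign=+1) or decrypt (sign=-1) with a repeating-key Vigenere."""
        return "".join(A[(A.index(ch) + sign * A.index(key[i % len(key)])) % 26]
                       for i, ch in enumerate(text))

    k1, k2 = "LEMON", "WAGER"
    pt = "ATTACKATDAWN"
    double = vigenere(vigenere(pt, k1), k2)

    k3 = "".join(A[(A.index(a) + A.index(b)) % 26] for a, b in zip(k1, k2))
    assert double == vigenere(pt, k3)    # double-Vigenere = single-Vigenere
    print(double, "==", vigenere(pt, k3))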


>One might conjecture that this phenomenon only occurs when the cipher
>is a group, but such a claim would remain unproven at best.

OK, then:  Sketch out one or two approaches which do not develop a
proof and we can see what if anything can be done to fix them up.  I
suppose if you had a contradiction you would say so.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: NIST, AES at RSA conference
Date: Mon, 07 Feb 2000 21:36:52 GMT


On Mon, 07 Feb 2000 13:57:42 GMT, in <[EMAIL PROTECTED]>,
in sci.crypt [EMAIL PROTECTED] (John Savard) wrote:

>On 6 Feb 2000 23:45:40 -0800, [EMAIL PROTECTED]
>(David Wagner) wrote, in part:
>
>>One might conjecture that this phenomenon only occurs when the cipher
>>is a group, but such a claim would remain unproven at best.
>
>Of course it can occur when the two ciphers are only "close" to being
>in the same group. 

I suppose you are taking "group" in the full mathematical sense.  But
if two ciphers have group-like operations for any reasonable fraction
of messages, that counts.  Lacking a better term, I would call this
property "groupiness."  

On the other hand, it appears to me that ciphers which have this
property *must* be constructed fairly similarly.  The reason for this
is that modern block ciphers are just tiny simulations of random
permutations, since each covers only a tiny part of the space of
possible permutations.  In practice this means that the probability of
any two different ciphers covering a substantial fraction of the same
keyed transformations (the groupiness property) is tiny to the point
of unbelievability.  

Perhaps the real issue is to get some sort of handle on what it means
for ciphers to be "different."  But it seems to me that ciphers are
not black boxes, and practical ciphers which function differently have
little or no chance of producing similar transformations.  
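
To put a rough number on the "tiny simulation" point: a 128-bit block
has (2^128)! possible permutations, while even a 256-bit key selects at
most 2^256 of them.  In Python:

    from math import lgamma, log

    block_bits, key_bits = 128, 256
    n = 2.0 ** block_bits
    log2_permutations = lgamma(n + 1) / log(2)     # log2((2^128)!)
    print("log2(#permutations) ~ %.3g   versus   log2(#keys) = %d"
          % (log2_permutations, key_bits))         # ~4.3e40 versus 256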


>However, as a practical matter, it is usually true
>that with unrelated ciphers, shortcut attacks are not likely to be
>known.
>
>Again, this comes down to multi-ciphering being a _practical_ measure,
>but being recommended as a remedy to a problem whose existence is only
>shown in _theoretical_ terms.

Since there is no proof, we have very real worries.  The fact that the
problem is "theoretical" does not mean that the consequences are also
"theoretical."  


>Either some practical reason for regarding the AES candidates as too
>weak is needed, or it is enough to cast the argument in this light:
>since no cipher can be proven secure, this is enough reason to want to
>take all the precautions one can, and to be particularly suspicious of
>a single cipher based on a single round structure. I suppose that does
>make the argument weaker, but not unreasonable even so.

Right.  

But we have a whole subset of people here for whom proof is
everything, despite the fact that 50 years of mathematical cryptography
have yet to produce a proof of strength for any cipher in practice.  

Obviously there is something fundamentally wrong with concentrating on
proofs of strength in this environment.  That concentration has not
solved the problem, and seems quite unlikely to do so.  

But these same people say things like "since we cannot prove any
cipher, nothing we can do will provide such proof, so nothing is worth
doing."  Even though that is the basis for the argument of the side
which loves proofs, they have yet to prove such logic, which does seem
more than a little ironic.  

Of course, that logic cannot be proven.  But one might think that
would be more of a problem for those who insist on proof.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: NIST, AES at RSA conference
Date: Mon, 07 Feb 2000 21:36:57 GMT


On 7 Feb 2000 11:31:16 -0800, in
<87n6i4$k2k$[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (David Wagner) wrote:

>In article <[EMAIL PROTECTED]>,
>John Savard <[EMAIL PROTECTED]> wrote:
>> since no cipher can be proven secure, this is enough reason to want to
>> take all the precautions one can, and to be particularly suspicious of
>> a single cipher based on a single round structure.
>
>But this line of reasoning still does not provide a satisfactory answer
>to Brian Olson's question: How do you know when to stop?  

I have answered that question in great detail.  The question is a
red herring which Olson created.  I have never claimed that
multi-ciphering will provide a proof of strength.  Olson apparently
argues that unless something provides a proof of strength it is not
worth using, in which case, of course, we might as well not use
ciphers at all.  And yet we do.  


>If lack of proof
>were the only reason to triple a cipher, then there would be no end to the
>tripling.  After all, tripling doesn't help one whit with the problem
>that we lack proofs -- there is no tripled cipher that we can prove secure,
>nor can we prove that tripling increases security even the slightest bit.

Given a restricted set of ciphers used in restricted ways it may be
possible to show that any attack on a particular cipher must be more
difficult if that cipher is in a multi-cipher stack.  


>In practice, I expect that the reason why we triple ciphers is because
>of subjective concerns about the security of our ciphers, not because of
>the lack of proofs.  

I would say that if we had a proof of strength, that would be
sufficient.  Consequently, it is the failure of academic mathematical
cryptography to provide a proof of strength which has generated the
current situation.  

Since we do *not* have a proof of strength, we are necessarily forced
into *non-proof* -- but still scientific and logical -- comparisons
and reasoning.  


>(That's a fine reason, by the way -- but let's be clear
>about our reasons.)  The lack of proofs, IMHO, seems to be a red herring.

That's fine, but it is essentially *your* red herring, since you keep
returning to it.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Prior art in science
Date: Mon, 07 Feb 2000 21:37:13 GMT


On Mon, 07 Feb 2000 13:12:11 -0500, in <[EMAIL PROTECTED]>,
in sci.crypt Paul Koning <[EMAIL PROTECTED]> wrote:

>Mok-Kong Shen wrote:
>> 
>> The issue of 'prior art' is not only relevant in patent applications
>> but also of interest by itself in general in science, I suppose.
>> Recently in a thread it has been pointed out that what has been
>> published in newsgroups and similar forums possibly may not
>> qualify as 'prior art' because of limited possibilities of being
>> found in searches.
>
>That's nonsense.

I agree.


>First of all, of course you can find it in searches.  But
>whether or not it's easy to find doesn't change whether it
>is "prior art".  

Indeed, it may be far easier to search for and find something on line
than in a library of individual books and magazine issues.  I have
personally conducted a grueling manual search through individual
patent records which took two full man-weeks of time.  Nowadays it's
easier, of course.  

Moreover, this is the sort of search one can conduct in many public
libraries, which is the traditional measure of whether the information
was indeed available to the public.  


>The patent office may not spot it, of course; these days,
>almost anything seems to be accepted as a valid patent application,
>in spite of the "non-obvious and novel" requirement.  :-(

Yes, those are the requirements, but after they get filtered through
an ancient tradition of law and convoluted reasoning, what these
requirements actually mean is difficult to even discuss unless we have
experienced that context.  Words simply don't have the same meanings
or implications or relations in patent law as in normal life.  

However, I can report that I *have* in fact presented Usenet sci.crypt
messages as the state of the art for one or two of my issued patents.


---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: NIST, AES at RSA conference
Date: Mon, 07 Feb 2000 21:57:22 GMT

Terry Ritter wrote:
> But these same people say things like "since we cannot prove any
> cipher, nothing we can do will provide such proof, so nothing is
> worth doing."

I think *most* of the counter-responses were just objecting to the
claim that composing multiple encipherments *provably* increased
"strength".  When one says something is provable, he is in danger
of being asked to prove it!

------------------------------

From: "Kurt Van Nuggat" <[EMAIL PROTECTED]>
Subject: Re: Anti-crack
Date: Mon, 7 Feb 2000 15:00:09 -0700

Absolutely correct!!

Software protection may slow a cracker down but will not stop him.

Anyone who can write uncrackable software protection code will be very rich
indeed, because every shareware author in the world will beat a path to his
door.

JK  http://www.crak.com



Arthur Dardia <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> CJ wrote:
>
> > [...]
>
> Rather than tracing through the encryption/decryption routine, the
> cracker could just write a jump command.  At some point you're going to
> have to try the key against the encrypted serial number.  By tracing
> through the disassembled code, the cracker will be able to see the
> routine you used; he can then write a keygen using one of the engines,
> taking your encryption code directly from the disassembly file.
>
> To sum it up, every program is crackable; it just takes time, and time
> isn't really a factor: you still won't get the registration money, and
> the cracker will then want to share the wealth of his efforts with more
> people (bragging rights for completing a tough crack).
>
> --
> Arthur Dardia      Wayne Hills High School      [EMAIL PROTECTED]
>  PGP 6.5.1 Public Key    http://www.webspan.net/~ahdiii/ahdiii.asc
>
>



------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Hill Climbing
Date: Mon, 07 Feb 2000 22:01:45 GMT

G Winstanley wrote:
> As far as scoring goes...the Index of Coincidence has proven not very
> effective in my Playfair solving, and I finally went with trigraph
> frequencies.

The main trick is for the scoring function to give "part credit"
when one has a "partial solution".  Trigraph frequencies work in
this case because they span multiple encryption units.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
