Cryptography-Digest Digest #222, Volume #10 Sat, 11 Sep 99 18:13:02 EDT
Contents:
Factoring numbers too big to represent on a computer? ("Stush")
Re: compression and encryption (Anthony Stephen Szopa)
Re: Factoring numbers too big to represent on a computer? (Bob Silverman)
Re: compression and encryption (SCOTT19U.ZIP_GUY)
Re: compression and encryption (SCOTT19U.ZIP_GUY)
Re: "Posting Anonymously is the Sign of a Coward" (Barrett Richardson)
----------------------------------------------------------------------------
From: "Stush" <[EMAIL PROTECTED]>
Crossposted-To: sci.math
Subject: Factoring numbers too big to represent on a computer?
Date: Sat, 11 Sep 1999 15:08:22 -0400
How would one go about finding factors of a number too big to represent on a
computer?
For example 5^d -1 where d is > 10^50.
I'm not looking for a complete factorization. Maybe all factors less than
some bound such as 10^10.
Any help appreciated.
Thanks
------------------------------
From: Anthony Stephen Szopa <[EMAIL PROTECTED]>
Subject: Re: compression and encryption
Date: Sat, 11 Sep 1999 12:13:09 -0700
Reply-To: [EMAIL PROTECTED]
"SCOTT19U.ZIP_GUY" wrote:
> In article <7r5jp2$[EMAIL PROTECTED]>, "Shaun Wilde"
><[EMAIL PROTECTED]> wrote:
> >
> >should I compress my data before or after encryption? (binary data - with
> >possibly repeated blocks i.e .exe etc)
> >
>1) If I compress before encryption the final data block is small.
> >2) If I compress after encryption the data block is much larger (hardly any
> >saving as the encryption removes any repetitiveness
> >that exists in the original data.)
> >
> >From the above I would say go for the 1st option, however I have a concern
> >and it is as follows.
> >
> >If someone was trying to break the encryption all they would have to do is
> >
> >a) try a key
> >b) try to decompress
> > if decompression works - no errors - then the odds are on that they have
> >broken the code
> This is true if you use most compression methods. But if you use
> a "one to one" compressor, any file can be the compressed result of
> another file. Therefore every file that could result from guessing a wrong key
> would still decompress, giving the attacker no error to test for. See
> http://members.xoom.com/ecil/compress.htm
> If you are like me, you may have wondered why PGP was not designed with
> this type of compression. I feel that a weak compressor can be used as
> a back door to help with the breaking of encryption.
> > else repeat
> >
> >Which would lead to an automated attack, whereas the second approach would,
> >in my opinion, require a more
> >interactive approach - as you would need to know what sort of data exists in
> >the original to know whether you
>have decrypted successfully.
> The second approach is far worse since the enemy would have to uncompress
> only once.
> >
> >Do I have a right to be concerned or am I completely off track?
> Yes you have the right to be worried. Most books put out by the
> experts fail to cover this topic. It is most likely not covered on purpose.
> If you notice, Mr. Bruce or Mr. Wagner will not even touch the topic since
> it is a likely back door to such methods as PGP. And people such as
> them are afraid to make trouble for the NSA.
>
> David A. Scott
> --
> SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
> http://www.jim.com/jamesd/Kong/scott19u.zip
> http://members.xoom.com/ecil/index.htm
> NOTE EMAIL address is for SPAMERS
My 2 cents worth: It certainly depends on the strength of the
encryption method.
Using OAP-L3 with a sufficiently strong key, it really does not
matter. You are no more likely to decrypt the compressed file than you
are to decrypt the UNcompressed file.
If you are using an encryption method that may be compromised by
compressing the file first then encrypting it, then perhaps you
should seek a stronger encryption method.
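The brute-force loop described in the quoted post (try a key, attempt to decompress, treat a clean decompress as a hit) is easy to demonstrate. The sketch below uses zlib and a toy one-byte XOR "cipher" purely as a stand-in for a real cipher; the point is only that a standard compressor acts as an oracle, because wrong keys almost never yield a valid compressed stream.

```python
import zlib

def xor_crypt(data, key):
    # Toy one-byte XOR "cipher" -- a stand-in for a real cipher,
    # used only to make the oracle loop concrete.
    return bytes(b ^ key for b in data)

def oracle_attack(ciphertext, candidate_keys):
    # Wrong keys yield garbage that zlib rejects (bad header or
    # checksum), so a clean decompress flags the likely key.
    hits = []
    for key in candidate_keys:
        try:
            zlib.decompress(xor_crypt(ciphertext, key))
            hits.append(key)
        except zlib.error:
            pass
    return hits

compressed = zlib.compress(b"attack at dawn " * 10)
ciphertext = xor_crypt(compressed, 0x5A)       # "encrypt" with key 0x5A
found = oracle_attack(ciphertext, range(256))  # recovers [0x5A]
```

With a bijective ("one to one") compressor, the `try/except` test above would never fail, which is exactly the property the quoted post argues for.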
------------------------------
From: Bob Silverman <[EMAIL PROTECTED]>
Crossposted-To: sci.math
Subject: Re: Factoring numbers too big to represent on a computer?
Date: Sat, 11 Sep 1999 20:41:30 GMT
In article <gDxC3.1204$[EMAIL PROTECTED]>,
"Stush" <[EMAIL PROTECTED]> wrote:
> How would one go about finding factors of a number too big to represent on a
> computer?
>
> For example 5^d -1 where d is > 10^50.
>
> I'm not looking for a complete factorization. Maybe all factors less than
> some bound such as 10^10.
You can find algebraic factors of 5^d -1 by factoring d.
5^(ab) - 1 is divisible by both 5^a - 1 and 5^b - 1.
When ab is even 5^ab - 1 is the difference of two squares.
You then look for factors of 5^a - 1 and 5^b - 1.
If d is prime and d > 10^50, then the only factor of 5^d - 1 that
is less than 10^10 is the factor 4. There can be no others. Indeed,
even if d isn't prime, and d > 10^50, the only factors less than 10^10
will be from algebraic factorizations of 5^d - 1 such as shown above.
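One practical note on the original question: to test whether a small prime p divides 5^d - 1, you never need the huge number itself, since p divides 5^d - 1 exactly when 5^d ≡ 1 (mod p), and fast modular exponentiation computes that directly. A rough sketch (the helper names are mine, not from any standard library):

```python
def primes_below(n):
    # Sieve of Eratosthenes.
    sieve = bytearray([1]) * n
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = b"\x00" * len(sieve[p * p::p])
    return [p for p in range(n) if sieve[p]]

def small_factors(base, d, bound):
    # p divides base**d - 1 exactly when base**d == 1 (mod p);
    # pow(base, d, p) computes that without ever forming base**d.
    return [p for p in primes_below(bound) if pow(base, d, p) == 1]

# All prime factors below 20 of 5**(10**50) - 1, found quickly even
# though the number itself has about 7 * 10**49 digits.
factors = small_factors(5, 10**50, 20)
```

This agrees with the algebraic analysis above: a small prime appears only when the multiplicative order of 5 mod p divides d.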
--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: compression and encryption
Date: Sat, 11 Sep 1999 22:12:32 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>"SCOTT19U.ZIP_GUY" wrote:
>
>> In article <7r5jp2$[EMAIL PROTECTED]>, "Shaun Wilde"
> <[EMAIL PROTECTED]> wrote:
>> >
>> >should I compress my data before or after encryption? (binary data - with
>> >possibly repeated blocks i.e .exe etc)
>> >
>> >1) If I compress before encryption the final data block is small.
>> >2) If I compress after encryption the data block is much larger (hardly any
>> >saving as the encryption removes any repetitiveness
>> >that exists in the original data.)
>> >
>> >From the above I would say go for the 1st option, however I have a concern
>> >and it is as follows.
>> >
>> >If someone was trying to break the encryption all they would have to do is
>> >
>> >a) try a key
>> >b) try to decompress
>> > if decompression works - no errors - then the odds are on that they have
>> >broken the code
>> This is true if you use most compression methods. But if you use
>> a "one to one" compressor, any file can be the compressed result of
>> another file. Therefore every file that could result from guessing a wrong key
>> would still decompress, giving the attacker no error to test for. See
>> http://members.xoom.com/ecil/compress.htm
>> If you are like me, you may have wondered why PGP was not designed with
>> this type of compression. I feel that a weak compressor can be used as
>> a back door to help with the breaking of encryption.
>> > else repeat
>> >
>> >Which would lead to an automated attack, whereas the second approach would,
>> >in my opinion, require a more
>> >interactive approach - as you would need to know what sort of data exists in
>> >the original to know whether you
>> >have decrypted successfully.
>> The second approach is far worse since the enemy would have to
> uncompress
>> only once.
>> >
>> >Do I have a right to be concerned or am I completely off track?
>> Yes you have the right to be worried. Most books put out by the
>> experts fail to cover this topic. It is most likely not covered on purpose.
>> If you notice, Mr. Bruce or Mr. Wagner will not even touch the topic since
>> it is a likely back door to such methods as PGP. And people such as
>> them are afraid to make trouble for the NSA.
>>
>> David A. Scott
>> --
>> SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
>> http://www.jim.com/jamesd/Kong/scott19u.zip
>> http://members.xoom.com/ecil/index.htm
>> NOTE EMAIL address is for SPAMERS
>
>My 2 cents worth: It certainly depends on the strength of the
>encryption method.
>
>Using OAP-L3 with a sufficiently strong key, it really does not
>matter. You are no more likely to decrypt the compressed file than you
>are to decrypt the UNcompressed file.
 It is faulty reasoning like this that makes things easier to break.
One should always use the strongest encryption possible. That
said, I think you're stuck on the feeling that your method OAP-L3
is so strong that it makes no difference. Well, if you believe that,
there is no reason to go further.
 But if your method is not the work of GOD, then it might make
a difference whether compression can aid or hinder an attack
against it. You cannot say with any honesty that compression
makes zero difference. The fact is compression makes a lot of
difference, and a poor compression method can actually aid in
the breaking of an encryption and the recovery of a message.
But if you're too blinded by the brilliance of your method and receiving
too many pats on the back, then maybe you can't think very
straight.
 Even Bruce, in his book that I recommend no one read, on Page 222
recognizes this from the diagram in his book (it must be an oversight
of his, since I am sure he really does not wish to help people):
he says to use compression, then encryption, and then error correction as
a separate step. Of course his book never goes deeply into the
subject of why. And he never talks in his book about what kind of
info the compression method itself can add to aid the
attacker, and what one should consider; maybe his
handlers don't want him to. But it is interesting that
at least on this page he also states that error correction
should be done outside of the encryption phase. Yet I don't see
him taking a negative stand against the NSA-approved chaining
methods that are so great because of error recovery.
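The ordering argued over above (compress, then encrypt, with error correction kept outside the cipher) can be illustrated with a toy pipeline. The seeded-PRNG "keystream" below is a stand-in for a real stream cipher and is not secure in any way; it only shows why compressing after encryption gains nothing:

```python
import random
import zlib

def keystream(key, n):
    # Seeded-PRNG keystream: a toy stand-in for a stream cipher.
    rng = random.Random(key)
    return bytes(rng.randrange(256) for _ in range(n))

def encrypt(data, key):
    # XOR with the keystream; the same call also decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

plain = b"highly repetitive plaintext " * 64

# Recommended order: compress first, then encrypt.
good = encrypt(zlib.compress(plain), 1234)

# Reversed order: the ciphertext looks random, so compressing it
# afterwards saves nothing (zlib may even add a little overhead).
bad = zlib.compress(encrypt(plain, 1234))
```

Here `good` is far smaller than `bad`, which is the practical reason everyone compresses before encrypting; the dispute in this thread is only about what the compressor's redundancy then hands to an attacker.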
>
>If you are using an encryption method that may be compromised by
>compressing the file first then encrypting it, then perhaps you
>should seek a stronger encryption method.
 Think again: no short-keyed, short-block method can be proven
secure. To think otherwise shows great stupidity. Again, use your
brain if possible. Use the best form of compression and the
strongest encryption.
My 5 cents worth.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: compression and encryption
Date: Sat, 11 Sep 1999 22:19:02 GMT
In article <7regkh$hvq$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
wrote:
>In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>>"SCOTT19U.ZIP_GUY" wrote:
>>
>>> In article <7r5jp2$[EMAIL PROTECTED]>, "Shaun Wilde"
>> <[EMAIL PROTECTED]> wrote:
>>> >
>>> >should I compress my data before or after encryption? (binary data - with
>>> >possibly repeated blocks i.e .exe etc)
>>> >
>>> >1) If I compress before encryption the final data block is small.
>>> >2) If I compress after encryption the data block is much larger (hardly any
>>> >saving as the encryption removes any repetitiveness
>>> >that exists in the original data.)
>>> >
>>> >From the above I would say go for the 1st option, however I have a concern
>>> >and it is as follows.
>>> >
>>> >If someone was trying to break the encryption all they would have to do is
>>> >
>>> >a) try a key
>>> >b) try to decompress
>>> > if decompression works - no errors - then the odds are on that they
> have
>>> >broken the code
>>> This is true if you use most compression methods. But if you use
>>> a "one to one" compressor, any file can be the compressed result of
>>> another file. Therefore every file that could result from guessing a wrong
> key
>>> would still decompress, giving the attacker no error to test for. See
>>> http://members.xoom.com/ecil/compress.htm
>>> If you are like me, you may have wondered why PGP was not designed with
>>> this type of compression. I feel that a weak compressor can be used as
>>> a back door to help with the breaking of encryption.
>>> > else repeat
>>> >
>>> >Which would lead to an automated attack, whereas the second approach would,
>>> >in my opinion, require a more
>>> >interactive approach - as you would need to know what sort of data exists
> in
>>> >the original to know whether you
>>> >have decrypted successfully.
>>> The second approach is far worse since the enemy would have to
>> uncompress
>>> only once.
>>> >
>>> >Do I have a right to be concerned or am I completely off track?
>>> Yes you have the right to be worried. Most books put out by the
>>> experts fail to cover this topic. It is most likely not covered on purpose.
>>> If you notice, Mr. Bruce or Mr. Wagner will not even touch the topic since
>>> it is a likely back door to such methods as PGP. And people such as
>>> them are afraid to make trouble for the NSA.
>>>
>>> David A. Scott
>>> --
>>> SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
>>> http://www.jim.com/jamesd/Kong/scott19u.zip
>>> http://members.xoom.com/ecil/index.htm
>>> NOTE EMAIL address is for SPAMERS
>>
>>My 2 cents worth: It certainly depends on the strength of the
>>encryption method.
>>
>>Using OAP-L3 with a sufficiently strong key, it really does not
>>matter. You are no more likely to decrypt the compressed file than you
>>are to decrypt the UNcompressed file.
> It is faulty reasoning like this that makes things easier to break.
>One should always use the strongest encryption possible. That
>said, I think you're stuck on the feeling that your method OAP-L3
>is so strong that it makes no difference. Well, if you believe that,
>there is no reason to go further.
> But if your method is not the work of GOD, then it might make
>a difference whether compression can aid or hinder an attack
>against it. You cannot say with any honesty that compression
>makes zero difference. The fact is compression makes a lot of
>difference, and a poor compression method can actually aid in
>the breaking of an encryption and the recovery of a message.
>But if you're too blinded by the brilliance of your method and receiving
>too many pats on the back, then maybe you can't think very
>straight.
> Even Bruce, in his book that I recommend no one read, on Page 222
Sorry, typo (can you believe that): page 226. I saw it in a library
but would never buy it, since there is little real info in it that I could find;
also, I don't buy from arrogant spammers.
>recognizes this from the diagram in his book (it must be an oversight
>of his, since I am sure he really does not wish to help people):
>he says to use compression, then encryption, and then error correction as
>a separate step. Of course his book never goes deeply into the
>subject of why. And he never talks in his book about what kind of
>info the compression method itself can add to aid the
>attacker, and what one should consider; maybe his
>handlers don't want him to. But it is interesting that
>at least on this page he also states that error correction
>should be done outside of the encryption phase. Yet I don't see
>him taking a negative stand against the NSA-approved chaining
>methods that are so great because of error recovery.
>
>>
>>If you are using an encryption method that may be compromised by
>>compressing the file first then encrypting it, then perhaps you
>>should seek a stronger encryption method.
>
> Think again: no short-keyed, short-block method can be proven
>secure. To think otherwise shows great stupidity. Again, use your
>brain if possible. Use the best form of compression and the
>strongest encryption.
>
>My 5 cents worth.
>
>
>David A. Scott
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: Barrett Richardson <[EMAIL PROTECTED]>
Crossposted-To:
alt.fan.gburnore,alt.usenet.kooks,alt.privacy.anon-server,alt.privacy,alt.cypherpunks
Subject: Re: "Posting Anonymously is the Sign of a Coward"
Date: Sat, 11 Sep 1999 17:32:36 -0400
On 11 Sep 1999, Charlie Comsec wrote:
> Barrett Richardson <[EMAIL PROTECTED]> wrote:
>
> > On 9 Sep 1999, Charlie Comsec wrote:
> >
> > > Barrett Richardson <[EMAIL PROTECTED]> wrote:
> > >
> > > > I don't argue that the concept of "anonymity" is bad. I
> > > > argue that in its current implementation in usenet it
> > > > is likely to become a casualty of its own shortcomings.
> > > > It needs to evolve into something that can be abused
> > > > less openly.
> > >
> > > Please feel free to provide suggestions as to how you'd like to see
> > > "anonymity" change. Can you do it and still have it be *TRUE*
> > > anonymity, and not the usual snake oil where identity is "escrowed"
> > > by some supposedly "trustworthy" third party? That, BTW, is NOT
> > > true anonymity.
> >
> > On the technical side of things, I would like to have my server
> > generate RSA or elliptic curve key pairs and use the public key
> > as a "usenet ID" of sorts for a user. When a user authenticates
> > to the server the associated public/private key pairs get assigned
> > to the user's session. Generate an MD5 hash of the message contents
> > and encrypt it with the private key. Attach the "usenet ID" (just
> > the public key) and the MD5 hash to the bottom of the message.
> > An elliptic curve key would be less obtrusive for this purpose
> > as it is smaller. The MD5 hash and "usenet ID" can be used
> > to validate the message. Messages are archived and made available
> > on the server. Posting history by abusive users can then be identified
> > and verified to be theirs. "Designer abuse" will be (somewhat) more
> > identifiable by the usenet community as the posting history for
> > a "usenet ID" is available. Usenet IDs are not reused when accounts
> > are cancelled. Have a unique From: address so usenet participants
> > that killfile annoying or abusive users don't killfile your
> > entire userbase. The usual steps to secure information in the
> > message header are employed.
>
> This sounds like a major overhaul of the usenet/NNTP RFCs.
>
Just include the ID and signature in the message body, like a PGP
signed message that also has the public key. Requests for the existence
of a "usenet ID" can be sent to a listserver.
> However, how does it address the concern about anonymity, especially
> where anonymous usenet posts start out as anonymous *E-MAIL*
> messages routed to a mail-to-news gateway where they are converted
> into usenet posts? If you implemented this at the gateway news
> server, for example, the only ID on an anonymous post would be that
> of the remailer used to send the message to the gateway.
>
Well, it doesn't address a mail gateway at all. As an initial idea,
it would be an integral part of news services for my customers if
implemented. I would be exercising some editorial control and
would want to be able to release information to the authorities
should a court action require it.
How is information secured in transit with the mail-to-news
gateway?
> The other problem is that it still does not facilitate a complete
> posting history on any REAL PERSON because there is no way to ensure
> that the person is only using one ID to post with.
For my own customers, each user gets a usenet ID. I wouldn't
want random activity from the internet funneled through my
server with the initial draft. My own local users just get an
alternate newsreader to access if they want anonymity, and
they must authenticate to it.
>
> Also, you said that "Messages are archived and made available
> on the server". Given the current popularity of concealing posts
> using the X-No-Archive header, do you now propose to make archiving
> mandatory?
>
Well, possibly. It *is* public information. It depends on
how feasible it is to identify a pattern of abuse without it.
Shouldn't inhibit responsible users. Another question is how
available the archive should be publicly.
> --
> Charlie Comsec <[EMAIL PROTECTED]>
>
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************