Cryptography-Digest Digest #805, Volume #12       Sun, 1 Oct 00 03:13:01 EDT

Contents:
  Re: RSA and Chinese Remainder Theorem (Tony Lauck)
  Re: NIST Statistical Test Suite ("Paul Pires")
  Re: Yet another LFSR idea. (Benjamin Goldberg)
  Re: Question on biases in random-numbers & decompression (Benjamin Goldberg)
  Re: Josh MacDonald's library for adaptive Huffman encoding (Benjamin Goldberg)
  Re: NIST Statistical Test Suite (Benjamin Goldberg)
  Re: Choice of public exponent in RSA signatures (Roger Schlafly)
  Re: How to get Certificate content after HTTPS Authentication ("Joyce")
  Re: Josh MacDonald's library for adaptive Huffman encoding (SCOTT19U.ZIP_GUY)

----------------------------------------------------------------------------

From: Tony Lauck <[EMAIL PROTECTED]>
Subject: Re: RSA and Chinese Remainder Theorem
Date: Sat, 30 Sep 2000 22:23:47 -0400

Thank you. Alternatively,  "Right On!"

Tony Lauck
(An occasional lurker on sci.crypt.)

Roger Schlafly wrote:
> 
> Bob Silverman wrote:
> > A minor nit.  System of congruences. Not "system of equations".
> 
> You wouldn't be so insufferable if you were at least correct
> with these trivial nits of yours.
> 
> They are congruences over the integers, or equations over
> the quotient ring. Either usage is correct.
> 
> If you want to score a trivial nit, correct his spelling of the
> name of the other BS man. <g>

------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Crossposted-To: sci.crypt.random-numbers
Subject: Re: NIST Statistical Test Suite
Date: Sat, 30 Sep 2000 19:39:12 -0700


bubba <[EMAIL PROTECTED]> wrote in message
news:3DvB5.12875$[EMAIL PROTECTED]...
> I built it last night using Microsoft VC6.0 for x86.
> I had to dummy out erfc() and erf(),

I hope I am not misunderstanding you. The test suite lists erfc
(in the glossary up front) as:

"Complementary error function.  See erfc"
"...is related to the normal edf."

Perhaps it is a needed, defined function?

If so, I would think that when the code
looks for that, it needs it, as it is a component
in the tests and the analysis of results.

Don't shoot... I'm trying to help.

Paul

>as those are
> not standard in the PC world. I wish they would
> have released PC-compatible source; more than
> a few of us use those nowadays.
>
> I got plenty of warnings. Some suggest Sun's
> compiler missed questionable code.
>
> The executable runs, but I fell asleep downloading
> the sample data. Maybe I will fool with it again later.
>
>
>
> "Mok-Kong Shen" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> >
> > While all the current attention of our groups in the
> > direction of NIST is apparently on AES, I believe that it is
> > barely known that NIST has just contributed something also of
> > essential interest to us. In
> >
> >      http://csrc.nist.gov/rng/
> >
> > there is now available for download an apparently
> > fairly good statistical test suite. A technical problem
> > may be, however, that the stuff is in UNIX tar files.
> >
> > I hope that this news is of value to those interested
> > in random numbers. If someone gains practical experience
> > with the test suite, it would be nice if he would give
> > a report on that to us.
> >
> > M. K. Shen
> > -------------------------
> > http://home.t-online.de/home/mok-kong.shen
>
>





------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: Yet another LFSR idea.
Date: Sun, 01 Oct 2000 03:56:31 GMT

David Wagner wrote:
> 
> Benjamin Goldberg  wrote:
> > Create a circular array of N bytes, and fill it randomly.
> > To update, combine the current byte with the sum of a selection of
> > the "previous" N-1 bytes, using a primitive polynomial as the
> > selector, and output it.  Do a circular shift left by 1 bit after
> > adding.  This brings the nonlinear top bit to the bottom.
>
> You might want to look at mod n attacks.
> 
> If you look at the byte entries mod 255, the result is
> almost entirely linear.  Why?  First, x<<<1 = 2x mod 255.
> Second, (x mod 255) + (y mod 255) is either (x+y mod 255)
> or (x+y-1 mod 255) (with prob. 1/2).  Thus, a significant
> bias remains; your cipher is "almost linear" (where here
> I mean linear with respect to addition mod 255, not to xor).

I'm not certain I get it...  The idea for my cipher is:
tmp = state[i-tap0] + state[i-tap1] + ... + state[i-tapN];
state[i] = tmp <<< 1;
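In Python-ish terms (the tap positions below are made up for illustration; N and the real taps would come from a primitive polynomial), the update I have in mind is:

```python
def rotl8(x):
    """Rotate an 8-bit value left by one bit (the <<< 1 above)."""
    return ((x << 1) | (x >> 7)) & 0xFF

def step(state, i, taps):
    """One update: sum the tapped previous bytes mod 256, then rotate.

    state is the circular array of N bytes; taps are offsets back from
    position i (hypothetical values here, not a real primitive polynomial).
    """
    n = len(state)
    tmp = sum(state[(i - t) % n] for t in taps) & 0xFF
    state[i % n] = rotl8(tmp)
    return state[i % n]
```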

I think you misread my idea to mean to do the rotate-left after each
individual add (I might be mistaken, though).

And if you were reading me right... well, I'm afraid I don't understand
your attack... possibly because my idea was so loosely described.

> For example, there is a simple attack with 2^{N lg N}
> workfactor needing N bytes of known keystream which works
> by guessing the carry bits at the addition at each update.
> You're then left with a LFSR over Z/255Z, and solving this
> is straightforward if you know the feedback taps.  (If the
> taps are unknown, this attack needs 2N bytes of keystream
> and 2^{2N lg N} work.)

I don't understand.  Maybe it's me, or maybe it's your description
[probably me], but could you make it clearer?  Perhaps with an example.

> Note that the above is substantially faster than the 2^{8N}
> workfactor of an exhaustive search.
> 
> This is only a simplistic example of an attack, and I
> strongly suspect there are much more powerful correlation
> attacks available.  However, I don't have time to check.
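At least the first identity in your description checks out numerically, if I take the rotate to act on 8-bit bytes:

```python
def rotl8(x):
    """Rotate an 8-bit value left by one bit."""
    return ((x << 1) | (x >> 7)) & 0xFF

# x <<< 1 == 2x (mod 255) for every byte: the wrapped-around top bit
# is exactly the carry that reduction mod 255 (= 2^8 - 1) folds back in.
assert all(rotl8(x) % 255 == (2 * x) % 255 for x in range(256))
```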

--
... perfection has been reached not when there is nothing left to
add, but when there is nothing left to take away. (from RFC 1925)


------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Crossposted-To: comp.compression
Subject: Re: Question on biases in random-numbers & decompression
Date: Sun, 01 Oct 2000 04:28:06 GMT

Herman Rubin wrote:
> 
> In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:
> >In sci.crypt Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> >: "D.A.Kopf" wrote:
> 
> >:> So the original poster was correct, the inverse of an arithmetic
> >:> compresser would be effective. Just "decompress" the random
> >:> bitstream into the needed bin size.
> 
> >: Could you give a reference to an efficient decompressor
> >: that works for arbitrary target range? Thanks.
> 
> >He means an ordinary arithmetic decompressor, with a small number of
> >symbols set up to occur with equal frequency.
> 
> This has been posted before.
> 
> A most bit efficient procedure to generate one of n equally
> likely options in {0, 1, ..., n-1} is as follows; using
> hardware features can improve the speed, but these are likely
> to be machine dependent.
[neat not-previously-posted-algorithm (saved and) snipped]

This is really great for random numbers, but it's not reversible.  Using
an arithmetic decompressor will allow me to reversibly turn my plaintext
bit-string into a base-3 plaintext suitable for NTRU.  I want to convert
my original plaintext base-2 stream into the shortest base-3 stream
possible, chop it into blocks, encrypt, transmit, decrypt, concatenate
the blocks, and change the resulting base-3 stream back into a base-2
stream.

Perhaps^H^H^H^H^H^H^H I should have stated this back in the post that
started the thread.

I didn't do so then, because I *also* need a source of truly random
base-3 numbers, as part of the encryption process.  I think at this
point that I'm simply going to use two separate algorithms: the
arithmetic decompressor for the plaintext, and one of the various
random-number-in-range algorithms for the 'obfuscator' component of
encryption.

I'd originally thought I could use a decompressor for each, one using
plaintext bits, one using random bits, since the criteria are very
similar, though slightly different, but it doesn't seem like that's
going to happen.

Just as an aside, what's needed for the plaintext stream is:
1) Minimum expansion of the information bitstream when going from base-2
to base-3.
2) The conversion should be reversible (lossless).
*) This implies maximum compression from base-3 to base-2.
What's needed for the 'obfuscator' stream is:
1) Bits from the underlying generator are used as efficiently as
possible, because the generator may be very slow.
*) This can be phrased as, "Minimum lossless expansion of the random
bitstream when going from base-2 to base-3."  This is why I thought to
use the same solution to both problems.
2) If all the previous values of the base-3 stream are known, there
should be no better than 1/3 probability of guessing the next symbol,
even if the underlying base-2 generator (but not the generator's state)
is known.
*) We can assume that if all of the generator's previous output is
known, and the generator's algorithm (but not its state) is known, the
opponent has no better than 1/2 probability of guessing the next output
bit...  In other words, the underlying bit generator is a 'good' one.
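For the plaintext direction, a crude block-based alternative to a full arithmetic decompressor would pack 19 bits into 12 trits (3**12 = 531441 >= 2**19 = 524288, within a fraction of a percent of the log2(3) optimum); the block sizes here are my own illustrative choice, not part of any standard:

```python
def bits_to_trits(bits):
    """Convert 19 bits (list of 0/1, most significant first) to 12 trits."""
    n = int("".join(map(str, bits)), 2)
    trits = []
    for _ in range(12):          # least significant trit first
        trits.append(n % 3)
        n //= 3
    return trits

def trits_to_bits(trits):
    """Inverse: 12 trits back to the original 19 bits."""
    n = 0
    for t in reversed(trits):    # most significant trit first
        n = n * 3 + t
    return [int(b) for b in format(n, "019b")]
```

Chop the plaintext into 19-bit blocks, convert each, and the mapping is lossless in both directions; a real arithmetic coder avoids the per-block slack, but the idea is the same.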

--
... perfection has been reached not when there is nothing left to
add, but when there is nothing left to take away. (from RFC 1925)

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Crossposted-To: comp.compression,comp.theory
Subject: Re: Josh MacDonald's library for adaptive Huffman encoding
Date: Sun, 01 Oct 2000 05:22:59 GMT

SCOTT19U.ZIP_GUY wrote:
>
> [EMAIL PROTECTED] (Mok-Kong Shen) wrote in
> <[EMAIL PROTECTED]>:
>
> >"SCOTT19U.ZIP_GUY" wrote:
> >>
> >> [EMAIL PROTECTED] (Mok-Kong Shen) wrote:
> >>
> >> >But you could at least publish it in a cs or crypto
> >> >scientific journal, since it would be a significant
> >> >contribution. But perhaps I could conjecture what
> >> >would be your answer: These journals have editors
> >> >that are all against a real scientist like you.
> >> >
> >>
> >>    I think most publishing is for a more or less closed
> >> group of people. I have had some of my work published by
> >> others when I worked for the government I expect nothing
> >> different know that I am retired. If you wish to publish
> >> it fell free to do so.
> >
> >If there is really good will to let your ideas put
> >to the public, then it shouldn't be a problem at all,
> >to spend effort, time and again, to put these clearly
> >und understandably to, e.g. internet groups. If one
> >doesn't do that, it plainly indicates lack of good
> >will or that the ideas are no likely to be no good.
> >
> 
>   That's where YOUR WRONG AGAIN. How dare you say
> it shouldn't be a problem at all. I have made it clear
> you can look at the code if your not so dam lazy.
> If you run into specific problems I would help but
> you have to do something. If you think I lack good
> will then F**K you.

That's an extremely rude and abusive attitude.  While you have some good
ideas (in terms of algorithms), your understanding of how to present
your ideas plainly sucks.  When someone asks for an explanation, they
don't want "look at the code if your not so dam lazy"; they want, "The
algorithm works as follows: <neatly written, machine-independent
pseudocode, possibly with some comments on what is happening in which
part of the algorithm, with no speed/memory[/whatever] optimizations
(unless the discussion is how to optimize speed/memory[/whatever]), and
minimal obfuscation>."  Working code might be a good implementation of an
algorithm, but it is not a specification of the algorithm.  Nor does it
tell *how* or *why* it works.  Furthermore, suppose that your code does
something *wrong* -- a logic bug -- when compared to your idea of what
is/should be happening.  This could result in something that looks
right, but isn't.  Since we don't know what the algorithm is supposed to
be (because you refused to tell us), we can't tell you there's a problem,
or help get the problem fixed.  If you post your algorithm here, to this
NG, we can look at it, write our own implementations, and compare the
results of what we got and what you got.

As an example, suppose you have an idea for a new compression scheme
that should get 47% compression on certain types of text.  You implement
it, and make a mistake, and it gets 45% compression.  It looks right,
but it isn't.  If our only reference is your code, all our versions can
possibly get is 45% compression.  If our reference is your algorithm,
posted to this NG, our implementations will get the desired 47%
compression.

Similarly, consider encryption.  Suppose your algorithm uses a Feistel
structure, and you have an F function which is supposed to be bijective
(but isn't in your code, due to a typo, perhaps & instead of ^).  This
property is something that would be mentioned as a matter of course when
describing an algorithm, but might not be mentioned when writing code.
Sure, the code will work, but it won't have the same security properties
that you desire.  Without a *separate* algorithm and reference
implementation, we can't tell that such-and-such function is correct.
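To make that concrete (the subkey value is arbitrary): xoring with a round subkey is a bijection on a byte, while the &-typo version collapses many inputs onto few outputs.

```python
K = 0xB7  # arbitrary example subkey

good = {x ^ K for x in range(256)}  # xor: a bijection, 256 distinct outputs
bad = {x & K for x in range(256)}   # and: only 2**popcount(K) outputs

assert len(good) == 256
assert len(bad) < 256  # here 64, since 0xB7 has six bits set
```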

Consider Tom St. Denis's tc5 cipher.  If he had just put code on his web
site, and no analysis, and no description of the design rationale, we
wouldn't be nearly as appreciative as we are.  On the other hand, if he
had put just the paper that's on his site, and no code... well, we could
write some code for it ourselves, since the paper describes the
algorithm.  Without his paper, and just the code, we don't have an
analysis.

You have a tendency to present poorly commented, structurally irregular
code, with no analysis, and the only documentation you'll have is a
difficult-to-read, grammatically incorrect description of the rationale,
and nothing remotely like a proof.

And last but not least, consider people who, like myself, have a modem
and one phone line.  I dial in, download my mail and news, and
disconnect, so as to not prevent phone calls from coming in.  I
generally only bother to look at web pages if I am quite certain that it
will have information useful to me.  Maybe if it's past midnight, I'll
browse the web, but not otherwise.

Plus there's the extra annoyance of searching for your page, and then
searching through your pages for the new thing added.  MOST people
who've just added something to their web site will post a URL to that
new document for our convenience, but I guess you think you're too good
for that, just like you're too good to run a spellchecker before posting
things, or putting them on the web.  FYI, the word "dam" means either
"mother" (when talking of equines), or a man- (or animal-) made thing
which blocks a stream or river, or the act of blocking a stream or
river.  The word you wanted in the post I replied to was "damn", which
means to condemn to Hell, or to be destined for Hell.

Since your problems stem primarily from stupidity, not malice, I doubt
that you are damned.  However, I *would* like to dam you, or at least
your mouth.  That would make this place a quieter, more pleasant NG.

Sadly for us, not only aren't you damned, but you aren't dammed, either.

--
... perfection has been reached not when there is nothing left to
add, but when there is nothing left to take away. (from RFC 1925)

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Crossposted-To: sci.crypt.random-numbers
Subject: Re: NIST Statistical Test Suite
Date: Sun, 01 Oct 2000 05:51:33 GMT

Paul Pires wrote:
[snip]
> I hope I am not misunderstanding you. The test suite lists erfc
> (in the glossary up front) as:
> 
> "Complementary error function.  See erfc"
> "...is related to the normal edf."

Hmm... "You are such a wonderful, stupendous person, I hate telling you
there's an error, but..."  That kind of complimentary?

--
... perfection has been reached not when there is nothing left to
add, but when there is nothing left to take away. (from RFC 1925)

------------------------------

From: Roger Schlafly <[EMAIL PROTECTED]>
Subject: Re: Choice of public exponent in RSA signatures
Date: Sat, 30 Sep 2000 23:00:47 -0700

Francois Grieu wrote:
> "There is evidence that the equivalence (..of factoring and
>  RSA..) does not hold if the (..public exponent is small..)."
> 
> What is this evidence ? Does it apply to properly formatted
> RSA signatures ?

Have you seen this paper?

Breaking RSA may be easier than factoring
http://crypto.stanford.edu/~dabo/papers/no_rsa_red.pdf

The use of the word "evidence" seems to be in a non-mathematical
sense. It might be meaningless.

------------------------------

From: "Joyce" <[EMAIL PROTECTED]>
Subject: Re: How to get Certificate content after HTTPS Authentication
Date: Sun, 1 Oct 2000 14:35:33 +0800

In Apache httpd.conf file,
SSLOptions +FakeBasicAuth +ExportCertData +CompatEnvVars +StrictRequire

I tried to get the Certificate content in three ways; all returned null.
(1) Get attribute of javax.net.ssl.cipher_suite, and
javax.net.ssl.peer_certificates.

String cipherSuite =
    (String) request.getAttribute("javax.net.ssl.cipher_suite");
X509Cert[] certChain =
    (X509Cert[]) request.getAttribute("javax.net.ssl.peer_certificates");

(2) Get Certificate by calling getUserPrincipal().
java.security.Principal pl = request.getUserPrincipal();

(3) Get value by passing header "SSL_CLIENT_CERT".
String sl = request.getHeader("SSL_CLIENT_CERT");


Please tell me what's wrong (server setting or program code)?
I would greatly appreciate an example.

Best Regards,
Joyce

"Paul Rubin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> "Joyce" <[EMAIL PROTECTED]> writes:
>
> > Dear all,
> >
> > Would you tell me how to get Client Certificate content (eg. Signature
> > Algorithm, Issuer information, Subject information, Public key and etc)
> > after HTTPS Client Authentication is successful ?
> >
> > My program is run under
> > Environment: apache + openSSL + tomcat
> > Platform: NT
>
> Sci.crypt really isn't the right place for this type of question.
> It's better to ask someplace like comp.infosystems.www.servers.unix.
> But here goes anyway.
>
> OpenSSL is just the SSL library--you also have to say what Apache/OpenSSL
> interface you're using.  Mod_ssl is the most popular one.  It normally
> doesn't bother providing the client certificate, in order to save
> environment space.  If you want the certificate, put
>
>   SSLOptions ExportCertData
>
> in your SSL host configuration in httpd.conf.  That will put the
> client cert contents in the SSL_CLIENT_CERT cgi environment variable.
> However, the stuff inside the certificate may not be visible.  You
> may have to pass the certificate to OpenSSL from your application,
> if you want to read its contents.



------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Crossposted-To: comp.compression,comp.theory
Subject: Re: Josh MacDonald's library for adaptive Huffman encoding
Date: 1 Oct 2000 06:36:09 GMT

[EMAIL PROTECTED] (Benjamin Goldberg) wrote in 
<[EMAIL PROTECTED]>:

>SCOTT19U.ZIP_GUY wrote:
>>
>> [EMAIL PROTECTED] (Mok-Kong Shen) wrote in
>> <[EMAIL PROTECTED]>:
>>
>> >"SCOTT19U.ZIP_GUY" wrote:
>> >>
>> >> [EMAIL PROTECTED] (Mok-Kong Shen) wrote:
>> >>
>> >> >But you could at least publish it in a cs or crypto
>> >> >scientific journal, since it would be a significant
>> >> >contribution. But perhaps I could conjecture what
>> >> >would be your answer: These journals have editors
>> >> >that are all against a real scientist like you.
>> >> >
>> >>
>> >>    I think most publishing is for a more or less closed
>> >> group of people. I have had some of my work published by
>> >> others when I worked for the government I expect nothing
>> >> different know that I am retired. If you wish to publish
>> >> it fell free to do so.
>> >
>> >If there is really good will to let your ideas put
>> >to the public, then it shouldn't be a problem at all,
>> >to spend effort, time and again, to put these clearly
>> >und understandably to, e.g. internet groups. If one
>> >doesn't do that, it plainly indicates lack of good
>> >will or that the ideas are no likely to be no good.
>> >
>> 
>>   That's where YOUR WRONG AGAIN. How dare you say
>> it shouldn't be a problem at all. I have made it clear
>> you can look at the code if your not so dam lazy.
>> If you run into specific problems I would help but
>> you have to do something. If you think I lack good
>> will then F**K you.
>
>That's an extremely rude and abusive attitude.  While you have some good
>ideas (in terms of algorithms), your understanding of how to present
>your ideas plainly sucks.  When someone asks for an explanation, they
>don't want "look at the code if your not so dam lazy"; they want, "The
>algorithm works as follows: <neatly written, machine-independent
>pseudocode, possibly with some comments on what is happening in which
>part of the algorithm, with no speed/memory[/whatever] optimizations
>(unless the discussion is how to optimize speed/memory[/whatever]), and
>minimal obfuscation>."  Working code might be a good implementation of an
>algorithm, but it is not a specification of the algorithm.  Nor does it
>tell *how* or *why* it works.  Furthermore, suppose that your code does
>something *wrong* -- a logic bug -- when compared to your idea of what
>is/should be happening.  This could result in something that looks
>right, but isn't. 

   Your premise is not possible: it either works or it doesn't; the
optimal adaptive Huffman coding is black and white.  Mok should
know this; we used to exchange letters as well as notes here.  He
just never learns, and that is why I get pissed off at him.  Maybe
you will too after a few years.


   That's how I do it.  It's black magic.

   Actually it's quite simple: my basic huffman compression is such
that for any file X, decompress( compress(X) ) = X
and compress( decompress(X) ) = X.  That is what it does; it
is easy to test.  Someone found an error in h2com less than a
year ago; it was fixed.  It is documented in the source code.  Any
more and you have to play with it.
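That round-trip test is easy to automate; since h2com itself isn't reproduced here, a trivial stand-in bijection (bytewise xor with a constant) takes its place in this sketch:

```python
import os

def compress(data):
    """Stand-in for a bijective compressor (h2com is not shown here)."""
    return bytes(b ^ 0x55 for b in data)

def decompress(data):
    """Stand-in inverse transform."""
    return bytes(b ^ 0x55 for b in data)

# Both round trips must be the identity on every byte string.
for _ in range(100):
    x = os.urandom(32)
    assert decompress(compress(x)) == x
    assert compress(decompress(x)) == x
```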



>You have a tendency to present poorly commented, structurally irregular
>code, with no analysis, and the only documentation you'll have is a
>difficult to read, grammatically incorrect description of the rationale,
>and nothing remotely like a proof.

   I don't write notes.  Even when I worked for the government,
my job was to write code to keep the stuff in the air.  Comments
are never correct anyway.  I always made the algorithms do what
seems natural.  If the words don't fit, who cares?  John likes to
write speculation; I wrote code that works.
 
>
>And last but not least, consider people who, like myself, have a modem
>and one phone line.  I dial in, download my mail and news, and
>disconnect, so as to not prevent phone calls from coming in.  I
>generally only bother to look at web pages if I am quite certain that it
>will have information useful to me.  Maybe if it's past midnight, I'll
>browse the web, but not otherwise.
>
>Plus there's the extra annoyance of searching for your page, and then
>though your pages for the new thing added.  MOST people who've just
>added something to their web site will post a URL to that new document
>for our convenience, but I guess you think you're too good for that,
>just like you're too good to run a spellchecker before posting things,
>
    Not too good for a spell checker, just not enough time.

But I thought I usually include a pointer to new stuff.  At least
I wish to, and will try to, make pointers to new stuff.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
        http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website **now all allowed**
        http://members.xoom.com/ecil/index.htm
Scott LATEST UPDATED source for scott*u.zip
        http://radiusnet.net/crypto/  then look for
  sub directory scott after pressing CRYPTO
Scott famous Compression Page
        http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
