Cryptography-Digest Digest #482, Volume #13      Wed, 17 Jan 01 13:13:00 EST

Contents:
  Re: Entirely encrypted operative system or file system (Michael Schmidt)
  Q: split keys (Andy Resnick)
  Re: NSA and Linux Security (Eric Lee Green)
  Re: Comparison of ECDLP vs. DLP (DJohn37050)
  Re: future trends in asymmetric cryptography ([EMAIL PROTECTED])
  Re: multiple anagramming? (John Savard)
  Re: future trends in asymmetric cryptography (Anne & Lynn Wheeler)
  block algorithm on variable length without padding? ("N. Weicher")
  Re: RSA sign in 40ms on a DSP ? ("Pedro Félix")
  RC5 on 16 word? ("N. Weicher")
  Re: NSA and Linux Security (Mok-Kong Shen)
  SAC question (Benjamin Goldberg)
  Key schedule question (Benjamin Goldberg)

----------------------------------------------------------------------------

Date: Wed, 17 Jan 2001 12:24:48 +0100
From: Michael Schmidt <[EMAIL PROTECTED]>
Subject: Re: Entirely encrypted operative system or file system

Check out Utimaco's "SafeGuard Easy" (www.utimaco.com). They perform
partition-based encryption, which also works for the OS boot partition.
You are prompted for your password before the OS boots.


Michael

-- 
===================================================
Michael Schmidt
===================================================
Institute for Data Communications Systems
University of Siegen, Germany
www.nue.et-inf.uni-siegen.de
===================================================
http://www.nue.et-inf.uni-siegen.de/~schmidt
e-mail:  [EMAIL PROTECTED]
phone:   +49 271 740-2332   fax:   +49 271 740-2536
mobile:  +49 173 3789349
===================================================
###      Siegen - The Arctic Rain Forest        ###
===================================================


alberto65 wrote:
> 
> I mean, the computer boots, and then asks for a passphrase.
> Then it begins to load the kernel of the OS, decrypting it as it goes.
> 
> Or, more simply just decrypt the encrypted file system....
> 
> Do you know if someone has made something like that?
> 
> Thanks.

------------------------------

From: Andy Resnick <[EMAIL PROTECTED]>
Subject: Q: split keys
Date: Tue, 16 Jan 2001 06:46:57 -0500

As many of you may know, Newsweek had an article about the history of
cryptography, and had (for me, anyway) an unusually clear explanation of
how public/private key encryption works.  I'm no expert.  My question
is, does the use of two keys imply that there is more than one
transformation to properly decode an encrypted signal? That is, I
receive a signal encoded with (for example) PGP.  Now, I'm too lazy to
get the public key but I have infinite computing power (hey, this is a
thought experiment!).  It seems that I will find *two* keys to decrypt
the message, and I have a hunch that they will be based on the two
primes that factor a large number.

Am I somewhat on the right track here?
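
For concreteness, here is a toy sketch of the RSA case (textbook-sized primes, illustrative only, wildly insecure): the private exponent is computed from the two prime factors, which is why factoring the public modulus breaks the system.

```python
# Toy RSA with tiny textbook primes -- illustrative only, not secure.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # computing this needs the factors p and q
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent = e^{-1} mod phi
m = 42
c = pow(m, e, n)               # encrypt with the public key (n, e)
assert pow(c, d, n) == m       # decrypt with the private key (n, d)
```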

--
Andy Resnick, Ph.D.
Optical Physicist
Logicon Federal Data



------------------------------

From: [EMAIL PROTECTED] (Eric Lee Green)
Subject: Re: NSA and Linux Security
Reply-To: [EMAIL PROTECTED]
Date: Wed, 17 Jan 2001 14:41:55 GMT

On Tue, 16 Jan 2001 19:45:59 -0700, Shawn Willden <[EMAIL PROTECTED]> wrote:
>Greggy wrote:
>> After
>> the war, the great depression took place and (if you study history) FDR
>> and the congress technically, legally declared the citizens of the US
>> enemies of the US 
>
>This is quite a statement.  Can you provide a reference?

Rush Limbaugh (or G. Gordon Liddy) said it once, so it must be
true. :-)

Baaa! Baaa! Baaaa!

Oh my, I notice I have the words "subversives" and "conspiracy" both
in my posting, guess that means the NSA has my phone tapped (grin).
(Whoops, sorry, that's Louie Feebee's FBI who does the illegal
domestic phone taps, hi Louie, isn't my mom long-winded?! And aren't
those telemarketers a gas? :-).

For those who haven't figured it out, everything above is so
tongue-in-cheek it's a wonder my cheek hasn't exploded. This whole
thread needs to go to talk.politics.crypto.

-- 
Eric Lee Green     Linux Subversives
[EMAIL PROTECTED]    http://www.badtux.org

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Date: 17 Jan 2001 14:44:35 GMT
Subject: Re: Comparison of ECDLP vs. DLP

Wei Dai wrote:
"In article <[EMAIL PROTECTED]>, djohn37050
@aol.com says...
> There are many more ways a candidate RSA public key MIGHT be invalid: if the
> modulus is easy to factor, it is easy to invert.  Or if it is hard to factor,
> it means that the (encryption) operation (for example) cannot be inverted by
> anyone, including the owner.

This seems to be a completely different issue from public-key 
validation, which is used to detect keys that may allow an attacker to 
obtain information about someone else's private key. What you're 
talking about here are keys that either don't protect the plaintext or 
make the ciphertext impossible to decrypt.

There are potential problems in elliptic curve key generation that are 
similar to the ones you give for RSA. They are also not possible to 
detect using the public key alone. For example if the private key has 
low entropy because of an RNG failure, it would be easy for anyone to 
decrypt. Or if there is a bit flip in the elliptic curve point 
multiplication process, the public key may still be a valid point in 
the appropriate subgroup, but the owner can't decrypt.

If you do have access to the private key, then what we're talking about 
is private-key validation. The situation here isn't very different 
between EC and RSA. There is nothing that prevents you from doing lots 
and lots of tests with an RSA private key. You can run the primes 
through a bunch of different primality tests, encrypt and decrypt a 
bunch of messages, etc."

Here is the definition of public key validation that is emerging in standards.  It is
the testing of the arithmetic validity of the components of a public key to help
ensure they conform to the requirements of a standard.

Another note, PKV is the COMPLEMENT of POP, proof of possession.  Doing both
provides high levels of assurance.  

So the bit flip in the ECC private key, such that the public key no longer
corresponds but is still valid, is addressed by POP.  A bit flip in an ECC
public key is almost certain to make it invalid, hence it SHOULD fail PKV; even
if it does not, doing POP will detect that the private key and public key are
out of sync for some reason and therefore in error.

PKV is not a substitute in any way for POP, it complements POP, both are
useful.

Also, while PKV is used to detect keys that might attack your private key when
used in DH, there are many other potential concerns with an invalid public key.
If using an invalid key, the encryption may not be invertible (and hence not
recoverable by anyone), or it may be invertible by anyone (and hence
recoverable by anyone).  And here is a crucial point: if a public key is
invalid, EVEN IF A SIGNATURE VERIFIES, the signature should be considered
invalid.  This is because the game is not being played in the intended sandbox
and hence all bets are off.
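
For concreteness, a toy sketch of the kind of arithmetic checks PKV performs in the ECC case (tiny made-up curve, illustrative only; the actual standards spell out the exact test list):

```python
# Toy elliptic-curve public-key validation (PKV) sketch.
# Curve: y^2 = x^3 + 2x + 3 over GF(97); G = (3, 6) lies on it.
p, a, b = 97, 2, 3
G = (3, 6)

def on_curve(P):
    if P is None:                       # None is the point at infinity
        return True
    x, y = P
    return (y * y - (x ** 3 + a * x + b)) % p == 0

def add(P, Q):                          # affine point addition
    if P is None: return Q
    if Q is None: return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                     # P + (-P) = infinity
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, p - 2, p) % p
    else:
        s = (y2 - y1) * pow(x2 - x1, p - 2, p) % p
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def mul(k, P):                          # double-and-add scalar multiply
    Q = None
    while k:
        if k & 1:
            Q = add(Q, P)
        P = add(P, P)
        k >>= 1
    return Q

n, Q = 1, G                             # brute-force the order of G
while Q is not None:
    Q = add(Q, G)
    n += 1

def validate(P):
    # PKV checks: not infinity, coordinates in range, satisfies the
    # curve equation, and lies in the right subgroup (n * P = infinity)
    if P is None: return False
    x, y = P
    if not (0 <= x < p and 0 <= y < p): return False
    if not on_curve(P): return False
    return mul(n, P) is None

assert validate(G)
assert not validate((1, 1))             # not on the curve -> fails PKV
```

Note that validate() says nothing about whether anyone actually holds the matching private key; that is exactly the gap POP fills.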
Don Johnson

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: future trends in asymmetric cryptography
Date: Wed, 17 Jan 2001 15:11:22 GMT

Mr Ashwood/Joe,

Thank you.

In article <uCT8n1$fAHA.281@cpmsnbbsa07>,
  "Joseph Ashwood" <[EMAIL PROTECTED]> wrote:
>We have the beginnings of this process now, with
> the questioning of X.509 by several researchers over the last few
>years.

sir, I would like to read more on this. could you provide any links to
papers?

<snip>
> do not expect anyone to prove the strength of cryptography within the
> foreseeable years, even though I plan on (I should probably say hope
to)
> etching some progress in that direction.

me 2.

regards,
rasane_s


Sent via Deja.com
http://www.deja.com/

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: multiple anagramming?
Date: Wed, 17 Jan 2001 15:03:03 GMT

On Mon, 15 Jan 2001 20:46:04 GMT, Benjamin Goldberg
<[EMAIL PROTECTED]> wrote, in part:

><flame type=impersonal>
>It's utterly astounding how often requests for online references result
>in people responding with book references!
>
>Call me an idiot, or cheap and impatient, but I don't see how something
>which will cost quite a bit of money, and take a week or more to get,
>and which has only one or two things I want, out of an entire book, is
>supposed to help me on something I want help with now, in the next day
>or so (and preferably without spending money).
>
>When I ask for an online reference, there's a reason for it.
></flame>

I could flame right back.

The problem isn't so much that you are "cheap" or "impatient". There's
nothing wrong with wanting information at minimum cost, and maximum
convenience.

Instead, the problem is that you are unrealistic and unimaginative.

Unrealistic: Because the web is the home to mostly free content, the
quality of what you find there is often limited. There are good
references to some subjects, but they will often be incomplete.

Unimaginative: Books don't always have to cost a lot of money.
Sometimes you can find a recommended book in your local public
library.

In any case, I noted Gaines - a book from 1939 - to address the
question of the technique of multiple anagramming being classified
until recently.

John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

Subject: Re: future trends in asymmetric cryptography
Reply-To: Anne & Lynn Wheeler <[EMAIL PROTECTED]>
From: Anne & Lynn Wheeler <[EMAIL PROTECTED]>
Date: Wed, 17 Jan 2001 16:01:44 GMT


[EMAIL PROTECTED] writes:

> Mr Ashwood/Joe,
> 
> Thankyou.
> 
> In article <uCT8n1$fAHA.281@cpmsnbbsa07>,
>   "Joseph Ashwood" <[EMAIL PROTECTED]> wrote:
> >We have the beginnings of this process now, with
> > the questioning of X.509 by several researchers over the last few
> >years.
> 
> sir, would like to read more on this. could you provide any links to
> papers...

x.509 isn't so much about asymmetric cryptography but about key
distribution and information binding between parties that have no
prior business relationship, especially in an offline environment. for
centuries, businesses have used account records for information
binding ... especially for timely and aggregated information.  x.509
identity certificates can also represent a significant privacy issue,
especially with respect to retail financial transactions.

several process scenarios have been examined in which, once two entities
establish some sort of (business or other) relationship, certificates for
the purpose of establishing some level of reliance between the two parties
become superfluous and redundant ... especially with respect to online
electronic transactions that might involve timely and/or aggregated
information (as compared to the stale &/or privacy-sensitive information
bound into certificates).

there has been a mapping of the recently passed account-based secure
payment objects standard to an account-based public key authentication
process (as opposed to a certificate-based public key authentication
process).

misc. refs at

http://www.garlic.com/~lynn/
http://www.garlic.com/~lynn/8583flow.htm
http://lists.commerce.net/archives/ansi-epay/200101/msg00001.html

-- 
Anne & Lynn Wheeler | [EMAIL PROTECTED] - http://www.garlic.com/~lynn/

------------------------------

Reply-To: "N. Weicher" <[EMAIL PROTECTED]>
From: "N. Weicher" <[EMAIL PROTECTED]>
Subject: block algorithm on variable length without padding?
Date: Wed, 17 Jan 2001 16:17:20 GMT

Is it possible to use a block algorithm (such as Blowfish or DES) to encrypt
plaintext where the length is not a multiple of eight bytes?  I know about
padding, but what if padding is not an option, i.e., the encrypted data must
be the exact same length as the plaintext data?  Is this feasible?  If so,
how is it done?
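
For what it is worth, one standard length-preserving construction is to run the block cipher in counter (CTR) mode, XORing the plaintext with successive E_K(counter) blocks; ciphertext stealing is another option. A sketch of CTR, with SHA-256 standing in for the block cipher purely so the example is self-contained (an assumption, not a real cipher configuration):

```python
import hashlib

def keystream(key, nbytes):
    # counter-mode keystream; SHA-256 stands in for E_K(counter)
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:nbytes])

def ctr_crypt(key, data):
    # XOR with the keystream; encryption and decryption are identical
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

msg = b"thirteen byte"                  # 13 bytes, not a multiple of 8
ct = ctr_crypt(b"secret key", msg)
assert len(ct) == len(msg)              # ciphertext length == plaintext length
assert ctr_crypt(b"secret key", ct) == msg
```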

Thanks for any feedback.

Neil

PS, if responding by email remove REMOVE




------------------------------

From: "Pedro Félix" <[EMAIL PROTECTED]>
Subject: Re: RSA sign in 40ms on a DSP ?
Date: Wed, 17 Jan 2001 16:29:36 -0000

As mentioned before, "A cryptographic library for the Motorola 56000" by
M. Dusse and B. Kaliski is a good reference on the subject.

The main idea is to compute the Montgomery multiplication using convolution
sums, which are well suited for the MAC structures of DSPs.

You can find more information about this subject in [C. K. Koc, T. Acar, and
B. S. Kaliski Jr. Analyzing and comparing Montgomery multiplication
algorithms. IEEE Micro, 16(3):26-33, June 1996.] available at
http://www.security.ece.orst.edu/publications.html .

For the minimization of multiplications in an exponentiation see [C. K. Koc.
High-Speed RSA Implementation. TR 201, RSA Laboratories, 73 pages, November
1994] also available at http://www.security.ece.orst.edu/publications.html .
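
For concreteness, the whole-integer form of the Montgomery reduction (REDC) underlying these methods can be sketched as follows; real DSP implementations compute the same thing word by word via the convolution sums mentioned above (toy modulus, illustrative only):

```python
# Whole-integer Montgomery reduction (REDC) sketch -- toy modulus.
N = 97                         # odd modulus
k = 7
R = 1 << k                     # R = 2^k > N, gcd(R, N) = 1
N_neg_inv = (-pow(N, -1, R)) % R   # N' = -N^{-1} mod R (precomputed once)

def redc(T):
    # returns T * R^{-1} mod N, for 0 <= T < R * N
    m = (T * N_neg_inv) & (R - 1)      # reduction mod 2^k is a mask
    t = (T + m * N) >> k               # division by 2^k is a shift
    return t - N if t >= N else t

def mont_mul(aR, bR):
    # multiply two values already in Montgomery form (x*R mod N)
    return redc(aR * bR)

a, b = 5, 7
aR, bR = a * R % N, b * R % N          # convert into Montgomery form
assert redc(mont_mul(aR, bR)) == (a * b) % N   # convert back, check result
```

The point is that both inner reductions are by a power of two (a mask and a shift), so no trial division by N is ever needed, which maps well onto DSP multiply-accumulate hardware.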

P. Felix




------------------------------

Reply-To: "N. Weicher" <[EMAIL PROTECTED]>
From: "N. Weicher" <[EMAIL PROTECTED]>
Subject: RC5 on 16 word?
Date: Wed, 17 Jan 2001 17:02:56 GMT

Can RC5 be used on a 16-bit word?  If so, how would that affect the
security?
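
Yes: RC5 is parameterized as RC5-w/r/b, and w = 16 is one of the word sizes given in Rivest's paper (with magic constants P16 = 0xB7E1, Q16 = 0x9E37). The main security cost is that the block is then only 32 bits, making codebook and birthday-style attacks far easier, and the 16-bit version is usually described as being for experimentation rather than serious use. A minimal sketch (assuming r = 12 rounds; not checked against published test vectors, so treat it as illustrative):

```python
# RC5 with 16-bit words (RC5-16/12/b) -- an illustrative sketch.
W, MASK = 16, 0xFFFF
P16, Q16 = 0xB7E1, 0x9E37      # magic constants for w = 16
R = 12                          # rounds (assumed here)

def rol(x, s):
    s %= W
    return ((x << s) | (x >> (W - s))) & MASK

def ror(x, s):
    s %= W
    return ((x >> s) | (x << (W - s))) & MASK

def expand(key):
    u = W // 8                              # bytes per word
    c = max(1, (len(key) + u - 1) // u)
    L = [0] * c
    for i, byte in enumerate(key):          # load key bytes little-endian
        L[i // u] |= byte << (8 * (i % u))
    t = 2 * (R + 1)
    S = [(P16 + i * Q16) & MASK for i in range(t)]
    A = B = i = j = 0
    for _ in range(3 * max(t, c)):          # standard three-pass mixing
        A = S[i] = rol((S[i] + A + B) & MASK, 3)
        B = L[j] = rol((L[j] + A + B) & MASK, A + B)
        i, j = (i + 1) % t, (j + 1) % c
    return S

def encrypt(pt, S):
    A, B = (pt[0] + S[0]) & MASK, (pt[1] + S[1]) & MASK
    for i in range(1, R + 1):
        A = (rol(A ^ B, B) + S[2 * i]) & MASK
        B = (rol(B ^ A, A) + S[2 * i + 1]) & MASK
    return A, B

def decrypt(ct, S):
    A, B = ct
    for i in range(R, 0, -1):
        B = ror((B - S[2 * i + 1]) & MASK, A) ^ A
        A = ror((A - S[2 * i]) & MASK, B) ^ B
    return (A - S[0]) & MASK, (B - S[1]) & MASK

S = expand(b"some 16-bit key")
assert decrypt(encrypt((0x1234, 0xABCD), S), S) == (0x1234, 0xABCD)
```

Note also that data-dependent rotations only take 4 bits of the rotate amount when w = 16, which is another reason the smaller word size is weaker per round.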

Thanks.





------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: NSA and Linux Security
Date: Wed, 17 Jan 2001 18:18:19 +0100


Since a few follow-ups in this thread have touched on 
Echelon-like projects, I would like to reproduce (in the original) 
the following interview of Wirtschaftsinformatik, a German CS 
periodical, with Kevin McCurley, President of IACR, published 
in its recent issue 42(2000), p.548.  In my humble view, McCurley's 
answer reflects a very neutral standpoint of scientists towards such 
otherwise often temperamentally argued stuff.

WI: Is it true that government efforts for national security,
e.g. the Echelon network, can be abused for criminal actions,
e.g. espionage?

McCurley: I have no knowledge of the alleged Echelon network,
although it is natural to assume that systems of this type
exist. Espionage has always existed, and there is no reason
to expect it to go away. It has been argued that an informed
government tends to make better decisions than an ignorant
one, but this is a matter of debate. I'm sure it's very 
tempting to listen to e-mail, since it's a lot easier for a 
computer to read and index than a phone conversation is!

M. K. Shen
=============================
http://home.t-online.de/home/mok-kong.shen

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: SAC question
Date: Wed, 17 Jan 2001 17:20:26 GMT

In the normal definition of the Strict Avalanche Criterion (SAC), changing any input 
bit, or selection of input bits, should change each output bit with probability 1/2.

        However, for an invertible function, there *must* be some bias. In the entry 
for SAC in Terry Ritter's glossary, he gives the example that for a 2 bit table, if 
one entry is the original value, there are only 3 "changed" values, so an input 
difference can cause each output bit to change with probability 2/3, not 1/2.

        If I were to consider an N bit table, then, if the table is as close to SAC as 
possible, the probability of an output bit changing when the input changes should be 
1-(2^(N-1)-1)/(2^N-1).
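
That figure can be checked exactly for small N. Averaged over all ordered pairs of distinct inputs, any bijective table sends each output bit through a change with probability exactly 2^(N-1)/(2^N-1), which is the same as 1-(2^(N-1)-1)/(2^N-1). A sketch for N = 4 (where the value is 8/15):

```python
# Exact "almost SAC" bias for an invertible N-bit table, averaged over
# all ordered pairs of distinct inputs.  Any bijection gives the same answer.
from fractions import Fraction
import random

N = 4
size = 1 << N
table = list(range(size))
random.shuffle(table)          # an arbitrary bijection

for bit in range(N):
    changed = sum(1 for x in range(size) for y in range(size)
                  if x != y and ((table[x] ^ table[y]) >> bit) & 1)
    # exact bias: 2^(N-1) / (2^N - 1) = 8/15 for N = 4
    assert Fraction(changed, size * (size - 1)) == Fraction(1 << (N - 1), (1 << N) - 1)
```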

        Is there any particular term for this type of "Almost SAC?"

        For instance, if a 128 bit cipher fulfills the property that if an input bit 
changes, then each of the output bits change with probability 1-(2^127-1)/(2^128-1), 
what do you call that property?

        Also, is there any term for calculating SAC on larger units than single bits?  
Maybe "bytewise SAC," or "wordwise SAC?"

        For instance, if there is some input difference to a function, any selection 
of x bits of the output should change with probability (2^x-1)/(2^x), if the strict 
definition of SAC is fulfilled.  With x=8, this is 255/256.

        Combining the two, if the function is invertible, and the blocksize is N, then 
the probability of an x bit selection changing is 1-(2^(N-x)-1)/(2^N-1), if the 
"Almost SAC" property is fulfilled.  With x=8 and N=128, this is 1-(2^120-1)/(2^128-1).

        For those curious about why I would want to use/invent such terms and 
properties, it came about when I tried to calculate the rate of avalanche for a toy 
version of my hypercrypt cipher, using 8 one-bit bytes.
        Ignoring the bias, any output bit of a round changes with probability 1/8 when 
exactly one input bit is changed.  In addition, assuming the mixers 
produced unbiased SAC lowered the odds of an output difference with each and every 
round!
        Taking the bias into account, if one bit is changed going into a round, each 
output bit changes with probability 8/27, and each subsequent round brings the 
probability of each output bit changing closer to 1/2.  That's odd, shouldn't it be 
approaching 128/255?
        Hmm.  Each round has 3 layers, each layer has 4 mixings of 2 bits, each bit 
going into the mixing has probability p of a nonzero difference, and each bit coming 
from the mixing has changed with probability (1-(1-p)^2)*2/3.  After 16 rounds, 
starting at 8/27, this approaches 1/2.
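
A numerical sanity check of that recurrence (a sketch): the only nonzero fixed point of p' = (2p - p^2) * 2/3 is p = 1/2, so iterating from 8/27 has to converge there. One plausible suspect for the discrepancy is that multiplying per-layer probabilities treats the bit differences as independent, which throws away the invertibility constraint that produces the 128/255 figure in the first place.

```python
# Iterate the per-layer recurrence from the post: p' = (1 - (1 - p)^2) * 2/3,
# starting from the exact first-round value 8/27, for 16 rounds x 3 layers.
p = 8 / 27
for _ in range(16 * 3):
    p = (1 - (1 - p) ** 2) * 2 / 3
assert abs(p - 0.5) < 1e-6     # converges to the fixed point 1/2, not 128/255
```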

Aaargh, someone tell me what I've done wrong!  What do I need to do to make it 
approach 128/255?

PS, the first round is calculated differently, since it's known that there's *exactly* 
a one bit input difference.  Propagating through the 3 layers of the network reduces 
that bit difference probability by a factor for each layer -- to be more specific, 
(2/3)^3 = 8/27.

PPS, if I've done any math wrong anywhere else, somebody please tell me!

-- 
Most scientific discoveries do not begin with "Eureka!"  They begin with
"That's odd.  I wonder why that happened?"



------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Key schedule question
Date: Wed, 17 Jan 2001 17:20:32 GMT

I'm trying to understand part of the key schedule in Tom's TC5 cipher, which I'm using 
as a basis for part of hypercrypt's key schedule.

In TC5, he does something like the following:
        memcpy( temp, ukey, 32 );
        for (i = 32; i < 512+32; i++)
                key[i - 32] = temp[i] =
                        temp[i-32] ^ temp[i-31] ^ temp[i-21] ^
                        temp[i-20] ^ temp[i-16] ^ temp[i-15] ^
                        temp[i- 6] ^ temp[i- 3] ^ temp[i- 1];

Whereas I do something like this:
        memcpy( key, ukey, 32 );
        for( i = 32; i < EXPANDEDKEY; ++i )
                key[i] = 
                        key[i-32] ^ key[i-31] ^ key[i-21] ^
                        key[i-20] ^ key[i-16] ^ key[i-15] ^
                        key[i- 6] ^ key[i- 3] ^ key[i- 1];

In both cases this linear stretching step is followed by nonlinear mixing.  Does Tom's 
way give any extra strength?  Am I weakening the stretching by having the stretched 
key start off with the user key?

Also, what kind of polynomial is this?  A complete order-31 poly, or all the taps but 
one of an order 32 polynomial?  If I want to pick my own poly for this step, what do I 
need?

Clearly what we're doing is essentially filling the "key" array with LFSR output of 
the user key.  It makes sense to me that we would want a maximum period stream, even 
if we only use the beginning of it, which means a primitive polynomial.

However, with 9 taps, there are 10 terms.  IIRC, you can't have an order 32 primitive 
polynomial with an even number of terms.  If it's not primitive, why that selection?

You could easily have a 13 term (12 tap) LFSR poly, so why doesn't this?
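
The recollection about term counts is right, and it is quick to check why: a primitive polynomial over GF(2) cannot have x = 1 as a root, so it must have an odd number of terms. Reading the recurrence above as a Fibonacci LFSR (a sketch; each exponent is 32 minus a tap offset):

```python
# Characteristic polynomial of the key-stretching recurrence, read as a
# Fibonacci LFSR: x^32 plus x^(32 - j) for each tap offset j.
offsets = [1, 3, 6, 15, 16, 20, 21, 31, 32]
exponents = [32] + [32 - j for j in offsets]
# p(1) over GF(2) is the parity of the term count: an even count means
# p(1) = 0, so (x + 1) divides p -- reducible, hence not primitive.
assert len(exponents) == 10
assert len(exponents) % 2 == 0          # even -> reducible -> not primitive
```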

-- 
Unofficial member of the Procrastinator's Club of America.  I haven't applied for my 
membership card yet, but I'll get around to it.  Really I will.  Really!



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
