Cryptography-Digest Digest #182, Volume #12       Sun, 9 Jul 00 07:13:00 EDT

Contents:
  Re: Using CRC's to pre-process keys (Mack)
  Re: computer program:  extract consonants/vowels from input ("Douglas A. Gwyn")
  Re: Using CRC's to pre-process keys (Mack)
  Re: A thought on OTPs (Bryan Olson)
  Re: Using CRC's to pre-process keys (Mack)
  Re: Using CRC's to pre-process keys (Mack)
  Re: MD of large data-sets (Bryan Olson)
  Re: Using CRC's to pre-process keys (Simon Johnson)
  Re: Using CRC's to pre-process keys (Simon Johnson)
  Re: Concepts of STRONG encryption using variable base http://www.edepot.com/phl.html 
([EMAIL PROTECTED])
  Re: Proposal of some processor instructions for cryptographical  (Mok-Kong Shen)
  Re: A thought on OTPs (Mok-Kong Shen)
  Re: Proposal of some processor instructions for cryptographical  (Mok-Kong Shen)
  Re: Proposal of some processor instructions for cryptographical  (Mok-Kong Shen)
  Re: Proposal of some processor instructions for cryptographical  (Mok-Kong Shen)
  Re: Proposal of some processor instructions for cryptographical  (Mok-Kong Shen)
  Re: Concepts of STRONG encryption using variable base  (Mok-Kong Shen)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Mack)
Subject: Re: Using CRC's to pre-process keys
Date: 09 Jul 2000 07:48:32 GMT

>SHA-1 is a 160-bit hash function.  I don't understand what you mean by
>"isn't likely to contain the full 128 bits, due to collisions".  I assume
>you are referring to 128 bits of entropy.  And the part about collisions
>doesn't make any sense, because if SHA-1 suffers from it, then CRC will too.
>There are collisions in both cases.  Any time you take N bits and compute a
>hash value of M bits, and N is greater than M there will always be
>collisions.  And in some cases depending on the hash function there will be
>collisions even when N is less than or equal to M.  Also, if what you say
>about a 128-bit CRC of a 128-bit value having the same entropy is true, then
>there is no reason to use a CRC, since you aren't gaining anything by doing it.
>
>- Adam
>
>

I believe what he was trying to say is that if you take SHA-1, for example,
and produce a 128-bit value from a 128-bit input (pick your method of
reducing it), you will have some collisions; by the birthday paradox,
roughly one collision after 2^64 inputs.  With a CRC over a fixed 128-bit
input length you will never get a collision unless you reuse an input.
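
A tiny sketch of that fixed-length point (my own illustration, in Python;
the toy CRC-8 polynomial 0x07 is an arbitrary choice, not part of the
proposal):

# A CRC over GF(2) is an invertible linear map when the input length is
# fixed, so distinct inputs give distinct outputs.  Here a toy CRC-8 is
# applied to every possible 1-byte input; no two inputs collide.
def crc8(data, poly=0x07):
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

outputs = {crc8(bytes([b])) for b in range(256)}
assert len(outputs) == 256   # bijective on fixed-length input: no collisions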

Is this any clearer?


Mack
Remove njunk123 from name to reply by e-mail

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: computer program:  extract consonants/vowels from input
Date: Sun, 09 Jul 2000 03:50:37 -0400

Darren New wrote:
> > Whether a letter is more correctly classified as a consonant or
> > as a vowel is, as you observe, context dependent.
> I want to know whether the "P" in "pneumonia" is a consonant or a
> vowel.

Usually it is said to be "silent", but it's still a consonant.

------------------------------

From: [EMAIL PROTECTED] (Mack)
Subject: Re: Using CRC's to pre-process keys
Date: 09 Jul 2000 08:11:27 GMT

>Mack <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> Normally keys are preprocessed with MD5 or SHA-1
>I assume that you mean, "when translating from a human memorizable key
>phrase into a key for a private key encryption system with a fixed key
>length, normally MD5, SHA-1 or some other secure hash is used".
>

It is designed for ASCII keys, not necessarily keys input by humans at
the stage where it is used.

>There are other ways to generate private keys (eg. through a public key
>mechanism) where the issues are quite different.  And, of course, there are
>private key systems (eg. RC4) that can accept quite long keys directly, and
>so there's little point in doing a hash first.

RC4 leaves residual correlations that require discarding the first 512
bytes of output if it is used with ASCII text.  Not perfect, but
respectable.  Of course the original design did not include this.  As for
public key mechanisms, they generally don't involve ASCII text.

>
>> These tend to be a bit slow. And also a bit of
>> overkill if the cipher is secure.
>Slowness is not an issue.  Look at the problem statement: we input the key
>phrase from the user and then convert it.  The first part (inputting from
>the human) is so slow that the few microseconds taken to do a secure hash is
>irrelevant.

See above.

>
>And, some people (including me) like massive overkill if it is cheap enough.
>
How cheap is cheap enough???

>>
>> I am proposing pre-processing with an appropriate
>> length CRC.  ie. CRC-64, CRC-128 or CRC-256
>> depending on the required key.  This effectively
>> reduces the key length of an ASCII key and
>> provides balanced output.
>The obvious problem with a CRC is that collisions are easy to compute -- it
>would be trivial to compute two key phrases that correspond to the same
>private key.  Now, how that can be used by an attacker is not at all
>obvious.  However, that is a potential weakness that the secure hash does
>not share.
>
>The whole point of the translating step is to convert the entire key phrase
>into a key in such a way that as much entropy as possible in the key phrase
>is present within the secret key.  The ideal is that every possible key
>phrase should correspond to a separate secret key.  Practically speaking, a
>secure hash does meet that ideal, in that collisions do exist, but we have
>no way of finding them.  A CRC does not.
>
>>
>> Another proposal involves a fixed non-linear table
>> substitution as the first step.  An 8-bit bijective table
>> with the properties that it has good avalanche
>> characteristics and at least some non-linearity.
>>
>> The second step would be to run the output bytes
>> in a CRC-like manner. ie. the bytes are
>> processed so that each bit location within a byte
>> is CRCed with the corresponding bits in the other bytes.
>>
>> For both of these methods the key is limited to 255 bytes.
>> The byte 0xff and the length are then prepended to the key
>> before processing. Then the final key is complemented.
>What's the point to complementing the key?

The complement is a residual from a previous version I was
playing with.  It is really extraneous.
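
For concreteness, here is a rough sketch of the pre-processing described
above (my own reconstruction, not reference code; the CRC-64 polynomial is
an arbitrary choice, since no polynomial is named, and the final complement
is kept only for fidelity to the original description):

CRC64_POLY = 0x42F0E1EBA9EA3693       # assumed polynomial, for illustration
MASK64 = (1 << 64) - 1

def crc64(data):
    crc = 0
    for byte in data:
        crc ^= byte << 56
        for _ in range(8):
            if crc & (1 << 63):
                crc = ((crc << 1) ^ CRC64_POLY) & MASK64
            else:
                crc = (crc << 1) & MASK64
    return crc

def preprocess_key(ascii_key):
    assert len(ascii_key) <= 255                    # key limited to 255 bytes
    framed = bytes([0xFF, len(ascii_key)]) + ascii_key
    return crc64(framed) ^ MASK64                   # complement the final key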

>
>> For the second method the length would be substituted but
>> the byte 0xff would not.
>>
>> A more technical description will be released if response
>> is positive.
>>
>> Benefits of this method over a true hash are that it allows
>> mapping of weak keys to the true input and the speed
>> gained.
>A better way to avoid weak keys: choose an encryption algorithm with no
>(known) weak keys.  They do exist (for example, all of the AES finalists).
>
This can't be proven.  It may be quite some time before someone finds
a class of keys that are 'weak' against some future unknown attack.

>
>--
>poncho
>
>


Mack
Remove njunk123 from name to reply by e-mail

------------------------------

From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: A thought on OTPs
Date: Sun, 09 Jul 2000 08:05:58 GMT

Mok-Kong Shen wrote:

> Bryan Olson wrote:
> > The answer again: given two black-box sources
> > we can in many cases reliably refute independence, but
> > cannot reliably establish independence where it exists.
>
> In many cases to refute the hypothesis of independence, of course,
> namely when the two sources are found to be correlated.

And in many other cases.

> But in the present context we want a test that is
> GENERALLY applicable for investigating independence, in
> view of the troubling fact that zero correlation
> does not imply independence.

So the answer yet again: there is no test that will
always find dependence when it exists.
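
The troubling fact quoted above is easy to exhibit (my own numbers, not
from the thread): take X uniform on {-1, 0, 1} and Y = X^2.  Then
cov(X, Y) = E[X^3] - E[X]E[X^2] = 0, yet Y is completely determined by X.

import random

xs = [random.choice((-1, 0, 1)) for _ in range(100000)]
ys = [x * x for x in xs]

def mean(v):
    return sum(v) / len(v)

cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(cov)   # close to 0, although X and Y are clearly dependent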


--Bryan
--
email: bolson at certicom dot com


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (Mack)
Subject: Re: Using CRC's to pre-process keys
Date: 09 Jul 2000 08:22:04 GMT

>In any case, we know that none of this funny business can happen if
>we use a secure hash, so unless we think we understand the phenomena
>here really well, the conservative choice seems to be to just hash
>everything.  No?
>
>Am I missing something?  Maybe I've just got unjustified paranoia on
>the brain.  If so, I hope you'll let me know.
>
>

The attack is valid for ciphers subject to related-key attacks.
The method originally described was for ASCII keys, but that does not
make the attack any less valid.


Mack
Remove njunk123 from name to reply by e-mail

------------------------------

From: [EMAIL PROTECTED] (Mack)
Subject: Re: Using CRC's to pre-process keys
Date: 09 Jul 2000 08:35:28 GMT

[EMAIL PROTECTED]  (David A. Wagner) wrote:
>In article <[EMAIL PROTECTED]>,
>Mack <[EMAIL PROTECTED]> wrote:
>> Normally keys are preprocessed with MD5 or SHA-1
>> These tend to be a bit slow.
>
>You preprocess enough keys that the speed of SHA1 is problematic?
>Are you sure?  I would be very surprised.

This is not a serious problem.  It is simply a search for something leaner
and still useful.

>
>> And also a bit of overkill if the cipher is secure.
>
>That's not at all clear.  Ciphers are analyzed under a model where
>the key is chosen uniformly at random.  If you take non-uniform keying
>material and pass it directly to the cipher (without pre-hashing), the
>security warranty has been voided, and all bets are off.
>

An ideal cipher is not subject to related-key attacks.  Theoretically
that is part of the security warranty.  Ideally, given keys differing in
some manner (say by one bit), we should not be able to readily
determine that this is the case (that only one bit differs) under chosen
plaintext.  Nor should we be able to determine anything about any bits
if we can cause one bit to be flipped, not even that bit's original
state.


Mack
Remove njunk123 from name to reply by e-mail

------------------------------

From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: MD of large data-sets
Date: Sun, 09 Jul 2000 08:39:46 GMT

Efthymios Ntasis wrote:
> I want to ensure the integrity of a data set whose zipped size
> is between 25 and 50 Mbytes.
> In practice, when a digital signature scheme is used, do I have to
> feed the whole data set to a message digest
> algorithm such as MD5 or SHA?  Isn't this gonna take a long time?
> If there are any papers on this subject could you please
> recommend them to me?

If you use a hash tree, you can verify a signature
against the data you actually use in time proportional
to the length of data you use plus the logarithm of
the size of the entire data set.

The state-of-the-art in hash trees is from Kobbi
Nissim and Moni Naor, as described in their paper from
the USENIX Security Symposium in 1998.  The tree can
implement a dynamic data set supporting insert, delete
and search so that each operation runs in logarithmic
time.  The updates re-sign the entire data set, and
the searches verify the result.  They present the
method for a specific application (certificate
revocation) but an authenticated dynamic dictionary
is clearly useful in many other areas.
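
A bare-bones hash-tree sketch (my own illustration of the general idea,
not the Naor-Nissim construction; SHA-1 and the eight dummy blocks are
arbitrary choices): only the root is signed, and any single block is
verified against it with an authentication path of logarithmic length.

import hashlib

def h(data):
    return hashlib.sha1(data).digest()

def build_tree(blocks):
    # level 0 holds the leaf hashes; the last level holds the root
    level = [h(b) for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]        # duplicate last node if odd
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def auth_path(levels, index):
    # sibling hashes needed to recompute the root from leaf `index`
    path = []
    for level in levels[:-1]:
        sibling = index ^ 1
        path.append(level[sibling] if sibling < len(level) else level[index])
        index //= 2
    return path

def verify(block, index, path, root):
    node = h(block)
    for sibling in path:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

blocks = [bytes([i]) * 1024 for i in range(8)]   # eight dummy 1 KB blocks
levels = build_tree(blocks)
root = levels[-1][0]                             # this is what gets signed
assert verify(blocks[3], 3, auth_path(levels, 3), root)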


--Bryan
--
email: bolson at certicom dot com


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

Subject: Re: Using CRC's to pre-process keys
From: Simon Johnson <[EMAIL PROTECTED]>
Date: Sun, 09 Jul 2000 02:48:19 -0700

[EMAIL PROTECTED] (S. T. L.) wrote:
><<I.e. the SHA-1 hash of a 128 bit key isn't likely
>to contain the full 128 bits, due to collisions.>>
>
>Show me a SHA-1 collision, lamer.

No, labeling him a lamer in this case is totally incorrect.
I think he means that it is likely that SHA-1 will produce
collisions even when the size of the input is below 128 bits.

I'll reply to your challenge with this:

Show me that SHA-1 doesn't have collisions, lamer :D



------------------------------

Subject: Re: Using CRC's to pre-process keys
From: Simon Johnson <[EMAIL PROTECTED]>
Date: Sun, 09 Jul 2000 02:55:59 -0700

> Slowness is not an issue.  Look at the problem statement: we input the
> key phrase from the user and then convert it.  The first part (inputting
> from the human) is so slow that the few microseconds taken to do a
> secure hash is irrelevant.

Yes, but on top of this, preprocessing a user-supplied key makes
a dictionary attack take that little bit longer.  Of course, the
user shouldn't be supplying poor keys like that, but that's
another story :D
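
The usual way to make each dictionary guess cost more is to iterate the
hash (key stretching).  A minimal sketch (mine, not from the thread; the
iteration count is an arbitrary assumption, and real systems would use a
standardized password-based KDF):

import hashlib

def stretch_key(passphrase, salt, iterations=100000):
    state = salt + passphrase
    for _ in range(iterations):
        state = hashlib.sha1(state).digest()
    return state[:16]                 # 128-bit key for the cipher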




------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Concepts of STRONG encryption using variable base 
http://www.edepot.com/phl.html
Date: Sun, 09 Jul 2000 10:07:56 GMT

In article <[EMAIL PROTECTED]>,
  "Trevor L. Jackson, III" <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:
>
> > We all know that encryption these days is weak.  Weak in the sense
> > that it is static and can be brute-force searched by stepping
> > through the keyspace of the encryption key.
>
> No they cannot.  Use your virtual calculator to figure out how long it
> would take you to _count_ up to 2^256.  Then figure out how much slower
> trial decryption is compared to counting.  A 256-bit key cannot be
> searched.

This is actually a very interesting statement.  Any static key length
can be searched through.  The variable you are more concerned with
is the time required to do it.  To answer that one, I give you
two URLs...

http://www.distributed.net
http://www.edepot.com/baseencryption.html
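
To put a number on "the time required" (my own figures; the trial rate of
10^18 keys per second is a deliberately generous assumption):

keys = 2 ** 256
rate = 10 ** 18                        # assumed trial decryptions per second
seconds_per_year = 3600 * 24 * 365
print(keys / rate / seconds_per_year)  # roughly 3.7e51 years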



>
> >
> >
> > One of the most revolutionary concepts of encryption that I have
> > come up with is dynamic encryption and the use of dynamic algorithm
> > and "keys".
>
> This concept is older than some written languages.

Actually, if you do a patent search, here is what you will come up
with...


>
> >
> >
> > Using the concepts of dynamic encryption as well as dynamic bases,
> > one can achieve one-time-pad security without the inconveniences
> > of using it.
>
> No.

Yes!

>
> >
> >
> > For more information on BASE Encryption, read it up
> > here http://www.edepot.com/phl.html
>
> Massive confusion site.  Now, if you created a diffusion site, people
> could use your web sites as cryptographic primitives.


Actually, you have hit upon some aspects of advanced encryption.
You see, many people are not able to think outside of the box.
They are limited by what they know and are accustomed to.  It's
like the old story about the young elephant tied with a
weak chain, who later in life thinks he can't break it (even
though he IS able to as an adult).

The first rejection of new concepts usually starts with violent
and aggressive attacks.  This is the customary reflex response to
protect the familiar.  It is an instinct inborn in most animals
against predators.  For example, most fish swim in schools.  Most
animals (with the exception of primarily hunters like lions) gather
in groups.  When something out of the ordinary (a newly encountered
species) happens upon the group, they will all retreat or attack.  This is
in response to the group's need to survive (we are surviving okay without
this new species, so as a group we want to keep things the same, and
this new species is interrupting this balanced stability).  Think
of the Spanish Inquisition, and the death of many astronomers.

In a similar sense, if you are able to think outside of the box,
you will realize that what you reject most is usually what you
are most afraid of, or have least experience in.  It is the ego
of the mind to assure itself that the paradigm that it has built
up to label and categorize the world is correct.  Once a person
has spent a long time and effort on something, it would hurt to find
that what they studied was basically wrong or useless.  In psychology
there is a term for this, but I leave it up to you to decide whether
this interests you.

As for Base Encryption, for those who have read it, it simply
blew your mind.  You are afraid to admit that it is true.  People
started asking for source code.  They could not find a fault with it.
They simply attacked, with no coherent reason for rejecting it
other than that it was different, and anything different
can't be right... can it?  Maybe you can give it a shot...  Rather
than saying no, or you are wrong, etc., why not simply provide
a scholarly comment on it?  Provide references for the comments you make.
I have provided the whole shebang at http://www.edepot.com/phl.html
I have even provided a tool to allow you to test it.  I have yet
to get a solid response from this community that finds errors
or problems with it.

As for your comment about using my site as an encryption primitive...
You hit upon something that I have thought about and will document
it here... (I do this to show you how to think outside of the box)...

You see, the computer industry began with mainframes
and black boxes that you put things into and get results out of.  This
established a many-to-one relationship between user and
machine.  As more people wanted access to the machine, they either
formed queues or placed their computational inquiries into tickets
to be batch processed by the mainframes.  This eventually led to
dumb clients and smart servers.  But as things go, some people
wanted more power in their clients (why do I always need to
connect to the mainframe?).  This led to client-server
architectures, and the boom of Solaris and Windows NT.
(but usually the servers just host the filesystem or some
critical database).  As things started picking up and people started
using their POWERFUL clients (PCs), the internet exploded.  Basically
the powerful servers that the clients communicated with got
replaced by websites.  Now everything is stored on the internet
and your PC became a dumb client again (just a browser to browse
the internet).  Well, you know how things go... what goes around
comes around.  History repeats itself.  So the browsers (dumb clients)
became more powerful and gained the ability to run applications
(Java applets and ActiveX controls) and do client-side scripting
(JavaScript, VBScript).  Standards like XML are popping up
to allow you to do client-side manipulation and rendering.
Do you know what this trend leads to?
Yes, you guessed it...  client-side webservers.  There will be more
powerful clients, to the point where anyone with an internet connection
can use their browser to host data dynamically (without registering
with DNS, or with a dynamic way to register on the fly).  This is
basically to balance out the powerful webservers.  Yes, you heard it
here first... and you will see it happen in the near future (just
like dynamic keys and algorithms will be the norm in the future).
But to keep this interaction lively, remember this post when it
does eventually come to light.

But I digress.  Using a website as a cryptographic
primitive has been tried before.  We already have webservers
dedicated to servicing authentication (VeriSign is one).  There
are also websites that provide Kerberos authentication (built
into most NT machines these days).

But what you are most interested in is... Is it possible to use
webpages in hierarchical or maybe random order as a cryptographic
primitive?  Absolutely.  Most computers communicate
over TCP/IP through ports.  There are special ports
that are dedicated to serving certain types of traffic.  They
are simply numbered...  It happens that telnet uses one port
and HTTP (the browser protocol) uses another.  There are thousands
of these ports.  You can simply make one up and communicate through
it using any protocol you want.  Well, why not use HTTP
as the default protocol?  Why not?  The data can be in static
webpages or dynamically generated via ASP, Perl, or CGI programs.

The order of the links between webpages (if you want to use
W3C XHTML-conforming pages) can serve as indices to other
algorithms or to segments of the ciphertext (Base Encryption
can utilize this concept, mind you).  So in the future,
when people start using dynamically generated website content as
a cryptographic primitive, remember you heard it here first.
There are too many ways this can be approached, so let your
imagination run wild (as soon as you lose that ego, of course).
To get you started, think: why does the cipher block need to be on
my local machine?  Why do I need to use a cipher block?  Why
not utilize webservers?  Why not use multiple webservers?
Why not utilize the whole internet as my algorithm?  As you can
see, Base Encryption is suitable for this advancement as well.
Dynamic content = dynamic algorithms.

Don't forget to visit http://www.edepot.com/phl.html
to read up on base encryption and other neato things I
created!



Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.arch
Subject: Re: Proposal of some processor instructions for cryptographical 
Date: Sun, 09 Jul 2000 12:35:47 +0200



"Douglas A. Gwyn" wrote:

> Mok-Kong Shen wrote:
> > Transposition is one of the basic operations in cryptography.
> > However, it is in my view poorly supported currently by processor
> > instructions at the bit level ...
>
> To the contrary, a general bit-set transformation requires a large
> number of bits to describe it, which might as well be organized in
> a lookup table:  transformed = table[input];  // very speedy.
> This method is used in virtually every fast implementation of block
> ciphers.  Limiting the architectural support to simple shuffles of
> the bits rather than general transformations seems pointless.

How large is that table if I want to do a certain (arbitrary) permutation
on the 32 bits of a word?  If I change my permutation at runtime, I'll have
to redo that table, and that is inefficient.  One may wish to do much
larger permutations, but currently there is not even an instruction to do
a 32-bit permutation, and one has to do that very clumsily and inefficiently.

That was my point.
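
To illustrate what the software fallback looks like today (my own sketch;
the convention that perm[i] gives the target position of source bit i is
an assumption):

def permute32(word, perm):
    # arbitrary, runtime-chosen permutation of the 32 bits of `word`,
    # done one bit at a time, which is exactly the clumsiness at issue
    out = 0
    for src, dst in enumerate(perm):
        out |= ((word >> src) & 1) << dst
    return out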

M. K. Shen



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: A thought on OTPs
Date: Sun, 09 Jul 2000 12:35:30 +0200



Bryan Olson wrote:

> So the answer yet again: there is no test that will
> always find dependence when it exists.

My point has been that it is an interesting fact that, while there is no
generally applicable test for independence in practice (a correlation test
can be considered a pre-test to screen out unnecessary test candidates),
one nonetheless assumes independence of random variables in quite a number
of places in applied statistics (and deduces results which are certainly
theoretically interesting, but .....).

M. K. Shen





------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.arch
Subject: Re: Proposal of some processor instructions for cryptographical 
Date: Sun, 09 Jul 2000 12:35:53 +0200



Nicol So wrote:

> Mok-Kong Shen wrote:
> >
> > Transposition is one of the basic operations in cryptography.
> > However, it is in my view poorly supported currently by processor
> > instructions at the bit level...
> >
> > [Proposal for two new instructions: "swapping" & "mirroring", omitted]
>
> Instead of adding specialized instructions to the instruction set that
> don't have much use outside of a small class of programs, a better
> approach would be to couple a processor with a reconfigurable device,
> something akin to an FPGA.

While I vaguely remember having read about a forthcoming
architecture with properties akin to those provided by FPGAs, it
would be desirable for common computers, which are what people
normally have access to, to have enhanced capabilities for encryption
processing.

M. K. Shen



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.arch
Subject: Re: Proposal of some processor instructions for cryptographical 
Date: Sun, 09 Jul 2000 12:35:43 +0200



Terje Mathisen wrote:

> Since future crypto algorithms will work with a minimum block size of
> 128 bits, this instruction would at the very minimum be capable of
> working with half that size, i.e. 64-bit registers. A generalized
> bit-shuffle operation would then need something like 64 * 6 = 384 bits
> of shuffle index data. (This could theoretically be limited to the
> number of bits needed to encode 64!, but I would not like to try to
> dynamically split this at runtime. :-()

I think that 64-bit PCs are coming, and the price of 64-bit
workstations is going down enough to be affordable to those with serious
encryption jobs that justify higher expenses.  For a 128-bit algorithm, a
64-bit permutation is not too bad, I suppose, noting that for a Feistel
cipher one splits the block into two halves.  Could you please explain
why you need 384 bits of shuffle data for a 128-bit algorithm?  Thanks.

M. K. Shen



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.arch
Subject: Re: Proposal of some processor instructions for cryptographical 
Date: Sun, 09 Jul 2000 12:35:56 +0200



"Trevor L. Jackson, III" wrote:

> Mok-Kong Shen wrote:
>
> > Transposition is one of the basic operations in cryptography.
> > However, it is in my view poorly supported currently by processor
> > instructions at the bit level at which all modern block ciphers
> > operate. For, while there are AND, OR and SHIFT/ROTATE
> > instructions to realize any arbitrary permutations of the bits
> > of a computer word, it can be very cumbersome and hence
> > inefficient to do so. Thus I like to suggest that future
> > processors will have an instruction to facilitate implementation
> > of encryption algorithms that employ arbitrary, eventually
> > dynamically determined, permutations of bits. Such an
> > instruction will naturally need two operands, one referencing
> > either a register or memory word and the other an arrary of
> > bytes/words that specify the target positions of the individual
> > bits. Since this very general instruction may be comparatively
> > costly in execution time, I think that the following two
> > special instructions could be desirable in addition:
> >
> > (1) Swapping. This instruction needs an operand that specifies
> > the level at which the swapping is to be done. At the first
> > level, the word (register or memory) is divided into two halves
> > that are exchanged in positions. At the second level, the
> > swapping is done separately on each half of the word.
> > Analogously for the higher levels.
>
> Many CPUs have these kinds of instructions.

If one has something in a 32-bit register, I am not aware of an instruction
that swaps the first 4 bits with the second 4 bits, the third 4 bits with
the fourth 4 bits, and so on.
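
In software the proposed swapping instruction would look something like
this (my own sketch; the level numbering follows the description quoted
above):

def swap_level(word, level, width=32):
    # level 1 exchanges the two 16-bit halves, level 2 exchanges the 8-bit
    # quarters within each half, level 3 is the nibble swap mentioned above
    size = width >> level
    mask = 0
    for i in range(0, width, 2 * size):
        mask |= ((1 << size) - 1) << i     # select the low piece of each pair
    return ((word & mask) << size) | ((word >> size) & mask)

assert swap_level(0x12345678, 1) == 0x56781234   # halves
assert swap_level(0x12345678, 3) == 0x21436587   # nibbles within bytes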

> > (2) Mirroring. This also has levels similar to swapping. At the
> > first level, the bits of the word referenced are exchanged by
> > mirroring about the central axis. At the second level, the
> > mirroring is done separately on each half of the word.
> > Analogously for the higher levels.
>
> This is called bit reversal.  It is common in digital signal processing.

I implicitly assume a common processor, i.e. what people normally
have access to, such as a PC.

> All of these transforms can be efficiently implemented as lookup tables
> (LUT).  Given a LUT you can perform arbitrary and data-dependent
> transformations (because a LUT can expressed the full potential of unique
> mappings from input to output).  Given Ritter's Dynamic Substitution you can
> customize the transformation such that each byte (or other element) of a
> stream is uniquely transformed.

The problem is the size of the look-up table, and quite probably also the
inefficiency incurred when the permutation is dynamically determined, so
that one has to recompute the look-up table frequently.
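
To make the size question concrete (my own sketch of the table-lookup
approach for an arbitrary 32-bit bit permutation; perm[i] is assumed to
give the target position of source bit i):

def build_tables(perm):
    # four 256-entry tables, one per input byte (about 4 KB in total),
    # all of which must be rebuilt whenever the permutation changes
    tables = [[0] * 256 for _ in range(4)]
    for byte_idx in range(4):
        for value in range(256):
            out = 0
            for bit in range(8):
                if (value >> bit) & 1:
                    out |= 1 << perm[8 * byte_idx + bit]
            tables[byte_idx][value] = out
    return tables

def permute_with_tables(word, tables):
    return (tables[0][word & 0xFF] | tables[1][(word >> 8) & 0xFF]
            | tables[2][(word >> 16) & 0xFF] | tables[3][word >> 24])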


> > Another processor instruction that I think is desirable is
> > to obtain the parity of groups of bits (e.g. 4, 8, etc.) from
> > consecutive words in memory and accumulate these into a word.
> > This could be useful to so to say distill the entropy out of
> > a given bit sequence.
>
> This is a kind of convolution.  It can be efficiently expressed as a finite
> impulse response (FIR) filter.

Perhaps I misunderstood you. But it is not a matter of formal formulation,
it is a matter of implementation (with processor instructions).
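
What I have in mind, expressed in software (my own sketch; the group size
of 8 bits and the 32-bit word size are assumptions):

def gather_parities(words, group_bits=8, word_bits=32):
    # take the parity of each group of bits in consecutive words and pack
    # the resulting parity bits into one accumulator (an arbitrary-precision
    # int here, for simplicity)
    out, pos = 0, 0
    for w in words:
        for g in range(0, word_bits, group_bits):
            group = (w >> g) & ((1 << group_bits) - 1)
            out |= (bin(group).count("1") & 1) << pos
            pos += 1
    return out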

M. K. Shen



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.arch
Subject: Re: Proposal of some processor instructions for cryptographical 
Date: Sun, 09 Jul 2000 12:35:39 +0200



Skipper Smith wrote:

> Have you looked at the AltiVec instructions contained in the MPC7400?  You
> can download the manual from:
> www.mot.com/SPS/PowerPC/teksupport/teklibrary

It would be nice if you would give a list of the instructions of that
processor that are particularly relevant for crypto.

M. K. Shen



------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Concepts of STRONG encryption using variable base 
Date: Sun, 09 Jul 2000 12:55:22 +0200



[EMAIL PROTECTED] wrote:

> One of the most revolutionary concepts of encryption that I have
> come up with is dynamic encryption and the use of dynamic algorithm
> and "keys".

Employing variable keys and schemes that can vary through dynamic
change of parameters during the encryption process is something I have
suggested several times in this group over quite some time, under what I
call the principle of variability.  Nothing is as revolutionary as you
imagine.  Make sure that you do it right if you exploit variability,
though.  Otherwise it might be better to remain conservative.

M. K. Shen



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
