Cryptography-Digest Digest #767, Volume #12 Mon, 25 Sep 00 08:13:01 EDT
Contents:
Re: LFSR as a passkey hashing function? (Bryan Olson)
Re: LFSR as a passkey hashing function? (Bryan Olson)
Re: HAC DES Test Vectors ("kihdip")
Re: Question on biases in random-numbers & decompression (Roger Schlafly)
Re: Big CRC polynomials? (Bryan Olson)
Re: RSA?? (Albert Yang)
A New (?) Use for Chi (John Savard)
Re: Tying Up Loose Ends - Correction (Mok-Kong Shen)
Re: Why is TwoFish better than Blowfish? (Albert Yang)
Re: Software patents are evil. (Vernon Schryver)
Re: Question on biases in random numbers & decompression (Mok-Kong Shen)
Re: What make a cipher resistent to Differential Cryptanalysis? (Mok-Kong Shen)
Re: What make a cipher resistent to Differential Cryptanalysis? (Mok-Kong Shen)
Re: Questions about how to run a contest (Sylvain Martinez)
Triple DES CBC test vectors ("MVJuhl")
Re: Tying Up Loose Ends - Correction (Tim Tyler)
Re: Why is TwoFish better than Blowfish? (Tom St Denis)
Re: What make a cipher resistent to Differential Cryptanalysis? (Tom St Denis)
Re: 128-bit Secure LFSR (Whoops) (Tom St Denis)
Re: Tying Up Loose Ends - Correction (Tim Tyler)
Re: Question on biases in random numbers & decompression (Tim Tyler)
----------------------------------------------------------------------------
From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: LFSR as a passkey hashing function?
Date: Mon, 25 Sep 2000 06:40:46 GMT
Tom St Denis wrote:
> Try this.
>
> 1. Feed the state with 'n-bits' of userkey
> 2. Run the LFSR for 2n times thru a self-shrinking generator
> 3. Take the final state as the hash
Self-shrinking doesn't change the state sequence; it
only operates on the output. The above is just as
reversible as the original.
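To illustrate the point, here is a minimal C sketch (a 16-bit register with the textbook tap set {0, 2, 3, 5}, standing in for the 128-bit case) showing that each clocking of a Fibonacci LFSR can be undone exactly, so running it 2n times leaves the seed fully recoverable; a self-shrinking output stage changes none of this, since it never touches the register contents:
    #include <stdio.h>
    #include <stdint.h>
    #define N    16
    #define TAPS 0x002Du                          /* taps at bits 0, 2, 3, 5 */
    static unsigned parity16(uint16_t x) {
        unsigned p = 0;
        while (x) { p ^= 1; x &= x - 1; }
        return p;
    }
    static uint16_t step(uint16_t s) {            /* clock forward once */
        uint16_t fb = parity16(s & TAPS);
        return (uint16_t)((s >> 1) | (fb << (N - 1)));
    }
    static uint16_t unstep(uint16_t s) {          /* clock backward once */
        uint16_t fb   = (uint16_t)(s >> (N - 1)); /* bit that was fed back in */
        uint16_t high = (uint16_t)(s << 1);       /* old bits 1..15; bit 0 unknown */
        uint16_t b0   = fb ^ parity16(high & TAPS);
        return (uint16_t)(high | b0);
    }
    int main(void) {
        uint16_t key = 0xACE1, s = key;
        int i;
        for (i = 0; i < 2 * N; i++) s = step(s);    /* "hash" the key */
        for (i = 0; i < 2 * N; i++) s = unstep(s);  /* anyone can run it back */
        printf("recovered %04X (started from %04X)\n", s, key);
        return 0;
    }
The same unwinding works for any tap set that includes bit 0.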
--Bryan
--
email: bolson at certicom dot com
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: LFSR as a passkey hashing function?
Date: Mon, 25 Sep 2000 07:01:14 GMT
Simon Johnson wrote:
> Say we take a 128-bit primitive polynomial mod 2 and
> convert it to an LFSR. If we want to store a 128-bit
> passkey for a 128-bit key encryption algorithm, for
> example, we would enter the key as the initial state
> of the register. We then clock it 128 times, to
> clear out the passkey. Then we clock it a further
> 128 times, and record this bit sequence as the
> hash. Any problems with this design?
It's overly complicated as a way to store a 128-bit
quantity, which is the only stated requirement. As
Tom noted, the state transform is not one-way. But
before we consider the merits of solutions, we need
to define the problem. What security properties
are you looking for?
--Bryan
--
email: bolson at certicom dot com
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: "kihdip" <[EMAIL PROTECTED]>
Subject: Re: HAC DES Test Vectors
Date: Mon, 25 Sep 2000 09:12:36 +0200
Martin Wolters wrote in message <8ql3dr$1n8$[EMAIL PROTECTED]>...
>Hi,
>
>In the Handbook of Applied Cryptography,
>the following DES-Test Vectors are shown:
>
>Key: 0123456789ABCDEF
>
>P: 4E6F772069732074 68652074696D6520 666F7220616C6C20
>C: 3FA40E8A984D4815 6A271787AB8883F9 893D51EC4B563B53
>
>My Implementation returns:
>
>C: 5d1b45d8c23190cd 09b1c144bf6ff01b 603c73811b87f13b
>
>I once checked my implementation against an HTML file that generates test
>vectors, and it seemed to be correct. Are the test vectors in the HAC wrong?
Nope - the test vectors in the Handbook of Applied Cryptography (p. 256) are
100% right.
For other test vectors, try NIST Special Publication 800-17 at
http://csrc.nist.gov/nistpubs/800-17.pdf
(page 124 onward)
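For a quick sanity check against the first ciphertext block, here is a minimal sketch using OpenSSL's DES routines (an assumption of the example; link with -lcrypto, and note that older OpenSSL/libdes spells these names in lowercase des_):
    #include <stdio.h>
    #include <openssl/des.h>
    int main(void) {
        DES_cblock key = {0x01,0x23,0x45,0x67,0x89,0xAB,0xCD,0xEF};
        DES_cblock pt  = {'N','o','w',' ','i','s',' ','t'};  /* 4E6F772069732074 */
        DES_cblock ct;
        DES_key_schedule ks;
        int i;
        DES_set_key_unchecked(&key, &ks);
        DES_ecb_encrypt((const_DES_cblock *)&pt, &ct, &ks, DES_ENCRYPT);
        for (i = 0; i < 8; i++)
            printf("%02X", ct[i]);    /* should print 3FA40E8A984D4815 */
        printf("\n");
        return 0;
    }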
Kim
------------------------------
From: Roger Schlafly <[EMAIL PROTECTED]>
Crossposted-To: comp.compression,sci.crypt.random-numbers
Subject: Re: Question on biases in random-numbers & decompression
Date: Mon, 25 Sep 2000 00:42:09 -0700
Benjamin Goldberg wrote:
> I've been looking for a way to convert a stream of unbiased random bits
> into an unbiased stream of random numbers in an arbitrary range, and I
> think I've hit on an idea that hasn't been suggested before:
>
> Take an arithmetic decoder, and initialize it to believe that the
> values 0, 1, 2 are equiprobable, and no other values will occur. Then
> feed it the stream of random bits. This should result in an unbiased
> stream of random numbers in the range 0..2. However, I don't understand
> arithmetic decompression well enough to know for certain if this will
> work as I've suggested.
Yes, that will work, but why not just use the bits directly?
Ie, if you have 64 bits, convert to an unsigned 64 bit integer,
and divide by 2^64 to get a real number in the range from 0 to 1.
For other intervals, multiply by an appropriate scale factor.
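As a minimal C sketch of that suggestion (the constant is 2^64 written out; note the cast to double can round the very largest inputs up to exactly 1.0):
    #include <stdint.h>
    /* Map 64 unbiased random bits to [0,1), then scale to [lo,hi). */
    double bits_to_uniform(uint64_t bits, double lo, double hi) {
        double u = (double)bits / 18446744073709551616.0;  /* divide by 2^64 */
        return lo + u * (hi - lo);
    }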
------------------------------
From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: Big CRC polynomials?
Date: Mon, 25 Sep 2000 08:04:54 GMT
Runu Knips wrote:
> > <[EMAIL PROTECTED]> wrote:
> > > I'm seeking good 128- and 256-bit CRC polynomials
[...]
> > > for the purpose of uniquely identifying large
> > > numbers (millions) of binary files;
[...]
> IMHO a cryptographically hard hashsum is a little bit
> too much for this purpose.
Too much what? At these sizes I'd expect a
cryptographic hash to be competitive with the CRC
in terms of speed, space, and code size. If it
provides more security than we need, so what?
There's no such thing as a quality over-run or
cost deficiency.
If we use a CRC here, it means that anyone who
provides data for the files can induce collisions
to ruin the identification scheme. Are we sure
that no attacker can influence file content, or
that the effect of duplicate ID's will always be
innocuous?
When in doubt, defend against adversaries.
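As an illustration of how little code a cryptographic fingerprint costs, here is a minimal sketch using OpenSSL's SHA-1 routines (the choice of library and hash is an assumption of the example; link with -lcrypto):
    #include <stdio.h>
    #include <openssl/sha.h>
    /* Returns 0 on success and fills digest with the file's 20-byte SHA-1. */
    int hash_file(const char *path, unsigned char digest[SHA_DIGEST_LENGTH]) {
        unsigned char buf[8192];
        size_t n;
        SHA_CTX ctx;
        FILE *f = fopen(path, "rb");
        if (!f) return -1;
        SHA1_Init(&ctx);
        while ((n = fread(buf, 1, sizeof buf, f)) > 0)
            SHA1_Update(&ctx, buf, n);
        fclose(f);
        SHA1_Final(digest, &ctx);
        return 0;
    }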
--Bryan
--
email: bolson at certicom dot com
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Albert Yang <[EMAIL PROTECTED]>
Subject: Re: RSA??
Date: Mon, 25 Sep 2000 08:40:07 GMT
"Douglas A. Gwyn" wrote:
>
> Runu Knips wrote:
> > ... Mathematicans believe that RSA is as hard as factoring, ...
>
> Last I heard, it was known that certain classes of keys
> enabled cracking of RSA with dramatically less resources
> than general factoring algorithms would require, but no
> such short-cuts were known for other keys. So the worst
> case seems to be "as hard as factoring", and it is up to
> the key selector to ensure that "weak keys" are avoided.
First, let's address a few things. RSA can be cracked; it's just a matter
of the level of difficulty. RSA is 100% crackable given enough resources
thrown at the problem. 100-bit RSA? 7 minutes, done. RSA at 512 bits? I
think the government has enough resources to crack it. RSA at 1024 bits?
Now you are cooking with gas!
Keep a few things in mind: if I were the government, it's easier for me
to have someone sneak into your house and put a video camera over your
keyboard to record your password than it is for me to try to crack
RSA. The "old" math problems on which most crypto relies have proven
to be pretty good. It's WAY cheaper to hold a knife to your throat or
the throat of a loved one than it is to crack a 1024-bit RSA
implementation.
Now, let me address a few of the other problems. If your private key
sits on your HD in the open, then cracking RSA is as easy as reading a
file. So, under optimum conditions, RSA should be "secure enough".
For example, a 2048-bit RSA key should yield enough security that all
the computers in the world working on cracking that one key
shouldn't be able to succeed.
As for "weak keys", I don't think you are talking about "weak keys" in
RSA, but rather about weak choices for the "e" part of the RSA
algorithm. Coppersmith discovered that you can throw out quite a bit
(about 75%) of the probable key space when the exponent is 3. Of
course, that means only a 2-bit loss in security, not a big deal.
There are white papers on this.
Albert
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: A New (?) Use for Chi
Date: Mon, 25 Sep 2000 08:44:13 GMT
Perhaps because it is not practical by hand, or because it is not
needed, this hasn't been mentioned in the references I've seen - which
don't include everything publicly available, of course - but it
occurred to me that part of the information in a contact chart for
either a monoalphabetic substitution - or for one of the alphabets in a
mixed-alphabet Vigenere - could be summarized in the following way:
The letters found following some given letter constitute a
distribution. So the extent to which one letter has only a few
contacts, or a wide variety of them, could be represented by the chi of
the letters that follow it, and separately the chi of the letters that
precede it. Thus, in addition to its frequency, each letter has two
other simple numbers which might be useful in distinguishing it from
other letters.
I suspect this sort of thing has been thought of before, even in the
open literature; or perhaps, since all these distributions will be
further from flat than the overall distribution of the language itself,
and they will involve small numbers of letters, the test might lack
sensitivity.
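A minimal C sketch of the bookkeeping, assuming "chi" here is read as the sum of squared relative frequencies of the successor distribution (the predecessor figure would be computed the same way from the transposed counts):
    #include <stdio.h>
    #include <ctype.h>
    int main(void) {
        long pair[26][26] = {{0}};
        long total[26] = {0};
        int c, i, j, prev = -1;
        while ((c = getchar()) != EOF) {      /* build the contact counts */
            if (!isalpha(c)) continue;
            c = toupper(c) - 'A';
            if (prev >= 0) { pair[prev][c]++; total[prev]++; }
            prev = c;
        }
        for (i = 0; i < 26; i++) {
            double chi = 0.0;
            if (total[i] == 0) continue;
            for (j = 0; j < 26; j++) {        /* roughness of the successors of i */
                double p = (double)pair[i][j] / total[i];
                chi += p * p;
            }
            printf("%c: successors=%ld chi=%.4f\n", 'A' + i, total[i], chi);
        }
        return 0;
    }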
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Tying Up Loose Ends - Correction
Date: Mon, 25 Sep 2000 11:12:54 +0200
"SCOTT19U.ZIP_GUY" wrote:
>
> [EMAIL PROTECTED] (Mok-Kong Shen) wrote in
> >"SCOTT19U.ZIP_GUY" wrote:
> >>
> >> [EMAIL PROTECTED] (Mok-Kong Shen) wrote:
> >> >
> >> >"SCOTT19U.ZIP_GUY" wrote:
> >> >> If your "STATIC HUFFMAN TREE IS SECRECT" then having
> >> >> a EOF symbol still sucks. I am not saying finding the tree is
> >> >> easy it may be very hard. But still the EOF symbol is likely
> >> >> to be the longest symbol and the last symbol. Why use it at
> >> >> all. But if you can't see a reason then by all means you can
> >> >> use it.
> >> >
> >> >Since the whole tree is unknown, how does the opponent
> >> >identify the eof, even if he knows it is longer than
> >> >the rest?
> >>
> >> Gee I guess he looks at the end of file for a clue
> >
> >What kind of clue?? Please show an example.
> >
> >
> >>
> >> >BTW, does your program deals with also word or block
> >> >boundary in addition to byte boundary?
> >
> >> Check my site out. I doubt you would belive me if I told
> >> you so I will not.
> >
> >If you wouldn't provide that simple information, it
> >means either the answer is no or you yourself don't
> >know that exactly.
> >
>
> Then your to stupid to read. Cause that ain't what it
> means. It means jerk look at the damn webpage. I can't
> spoon feed and burp you for ever. Grow up MOK.
I would rather consider writing long sentences, instead of
simply answering Yes or No, to be very stupid.
M. K. Shen
------------------------------
From: Albert Yang <[EMAIL PROTECTED]>
Subject: Re: Why is TwoFish better than Blowfish?
Date: Mon, 25 Sep 2000 09:09:25 GMT
"David C. Barber" wrote:
>
> TwoFish is newer, and I would think better, than BlowFish (unless the AES
> requirements required a worse cipher), but I've never seen a list of reasons
> just why. While I expect that Bruce is the logical person to answer this
> (btw, is my 2nd edition Applied Cryptography still the current version?),
> even a pointer to a web-page would be fine.
>
> *David Barber*
I personally trust Blowfish a bit more than Twofish. Twofish is overly
complex, and complexity is a bad thing in the crypto world. Also,
Blowfish has 9 years of cryptanalysis, the real "stuff" that makes an
algorithm safe. There is a class of weak keys in Blowfish, but not
that big of a deal IMHO.
The one I trust? Serpent! :-)
Albert
------------------------------
From: [EMAIL PROTECTED] (Vernon Schryver)
Subject: Re: Software patents are evil.
Date: 24 Sep 2000 17:39:25 -0600
In article <8qlr47$5j7$[EMAIL PROTECTED]>,
Bill Unruh <[EMAIL PROTECTED]> wrote:
> ...
>It would be nice if there were a bureaucratic way of challenging a patent
>before having to take it to court. Ie, get the patent office to
>reconsider...
You mean the sort of review and reconsideration that was given to the LZW
patent in about 1993 and that resulted in something like a tripling of
the number of its claims?
(Never mind that I think that the LZW patent was about a cool idea that
should have been recognized somehow, although I'm not sure who should have
received the recognition (e.g. IBM), for how long (e.g. for 5 years), or
in what form.)
Vernon Schryver [EMAIL PROTECTED]
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: comp.compression
Subject: Re: Question on biases in random numbers & decompression
Date: Mon, 25 Sep 2000 11:56:11 +0200
Benjamin Goldberg wrote:
>
> nono, I'm not asking to PRODUCE "random numbers from a bitstream." I've
> GOT a random bitstream, which means random 0s and 1s. What I want from
> this is to get random 0s, 1s, and 2s. I want the arithmetic encoder to
> convert from one form to the other.
You have a conversion from a bit stream to a stream of
numbers in the range [0,2]. That's in fact producing
a stream of random numbers, if you are lucky enough
(good bit stream and good conversion). Whether the
device you mentioned (I have no experience with it)
does a good job, and whether it helps much to render
a predictable source into an unpredictable one (never
absolutely, I suppose), must be subjected to tests and
detailed investigation and can't be settled from
'first principles', I believe.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: What make a cipher resistent to Differential Cryptanalysis?
Date: Mon, 25 Sep 2000 11:56:22 +0200
Tom St Denis wrote:
>
> My point is that given a proper design random sboxes are almost always
> ok.
>
> Take a CAST variant... take a 4x4 MDS and four random 8x8 sboxes...
> that cipher will most likely be secure (given say 16 rounds) from all
> forms of statistical attacks because the diffusion is balanced and
> sufficient confusion is given by the 8x8's. Now replace the MDS with a
> 2PHT, the cipher could be weak against say truncated differentials...
Sorry, I thought you were against random S-boxes.
I believe a sufficient number of rounds helps as well.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: What make a cipher resistent to Differential Cryptanalysis?
Date: Mon, 25 Sep 2000 11:56:18 +0200
"Paul L. Hodgkinson" wrote:
>
> "Mok-Kong Shen" wrote
>
> > No practical cipher is absolutely secure.
>
> Which isn't entirely true.
> A one-time pad can be shown to be theoretically unbreakable, which requires
> a random key of the same length as the message.
> Such a pad was used in the Cold War to encrypt communications between Moscow
> and Washington, sent by secure courier.
The ideal OTP is 'theoretical' and, as you said,
'theoretically' unbreakable. No practical 'ideal' OTP
exists (or can be known to be ideal, if one is given),
however.
M. K. Shen
------------------------------
From: Sylvain Martinez <[EMAIL PROTECTED]>
Subject: Re: Questions about how to run a contest
Date: Mon, 25 Sep 2000 10:36:12 GMT
In article <[EMAIL PROTECTED]>,
"Simon Dainty" <[EMAIL PROTECTED]> wrote:
>
> Sylvain Martinez <[EMAIL PROTECTED]> wrote in message
> news:8q57db$ee6$[EMAIL PROTECTED]...
>
> > > If you want to test your cipher's strength, you should supply the plaintext
> > > and the ciphertext and ask for the key. - This gives the attackers bigger
> > > opportunities.
> >
> > Ok, so if I want to be serious about it I should give the original clear
> > text...
>
> Ideally, you would supply many ciphertexts and many plaintexts
> and then ask for the key. Asking someone - anyone - to attempt
> to produce a key when in possession of only a couple of plaintexts
> and ciphertexts is asking quite a lot.
>
In my new contest I give the 2 plain texts and crypted texts.
If nobody manages to find the key within a few months, I am thinking of
creating an application with the key hardcoded in it; then you could
automatically crypt any text with the "secret contest key".
Even if someone managed to disassemble the application and find the key,
it wouldn't really matter, as to win the contest you need to find a
generic solution to decrypt any text crypted with this key.
That's just an idea; I'll probably do that in a few months.
> > Ok, then I should give the original clear text for the 2 cipher files
> > provided for this contest.
>
> Up the limit. You've got nothing to lose. (Well, except £50. ;-)
>
you're right :O)
And all I really want is to know how good (or bad) this algorithm is.
> > Do we agree that if after that nobody can give me the key used to crypt
> > these 2 texts, that would probably mean this algorithm is secure?
> > (or at least not too bad!)
>
> If you can prove that it can never be cracked then yes, it's secure. Alas,
> that's rarely ever the case. All you can hope to claim is that "...it
> hasn't been cracked yet."
yes and so far I can only claim this...
(Well I guess it's better than nothing ;o)
Regards,
Sylvain.
--
---
Unix security administrator
BUGS crypto project: http://www.bcrypt.com
http://www.encryptsolutions.com
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: "MVJuhl" <[EMAIL PROTECTED]>
Subject: Triple DES CBC test vectors
Date: Mon, 25 Sep 2000 13:02:44 +0200
I'm looking for test vectors for Triple DES in CBC mode.
I have looked at NIST Special Publication 800-20, but it doesn't contain the
tests I'm looking for.
What I would like is encryption/decryption on 2 or 3 blocks of known
plaintext, so I can verify that the chaining I've implemented is working.
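One way to get multi-block reference values is to round-trip known plaintext through a well-tested library and compare against your own chaining. Here is a minimal sketch, assuming OpenSSL's DES routines and purely illustrative keys and IV (link with -lcrypto; older OpenSSL/libdes uses lowercase des_ names):
    #include <stdio.h>
    #include <string.h>
    #include <openssl/des.h>
    int main(void) {
        DES_cblock k1 = {0x01,0x23,0x45,0x67,0x89,0xAB,0xCD,0xEF};
        DES_cblock k2 = {0x23,0x45,0x67,0x89,0xAB,0xCD,0xEF,0x01};
        DES_cblock k3 = {0x45,0x67,0x89,0xAB,0xCD,0xEF,0x01,0x23};
        DES_cblock iv_enc = {0}, iv_dec = {0};            /* all-zero IV, illustrative */
        DES_key_schedule ks1, ks2, ks3;
        unsigned char pt[] = "Now is the time for all ";  /* 3 blocks (24 bytes) */
        unsigned char ct[24], back[24];
        int i;
        DES_set_key_unchecked(&k1, &ks1);
        DES_set_key_unchecked(&k2, &ks2);
        DES_set_key_unchecked(&k3, &ks3);
        /* CBC-encrypt three blocks, then decrypt and compare */
        DES_ede3_cbc_encrypt(pt, ct, 24, &ks1, &ks2, &ks3, &iv_enc, DES_ENCRYPT);
        DES_ede3_cbc_encrypt(ct, back, 24, &ks1, &ks2, &ks3, &iv_dec, DES_DECRYPT);
        for (i = 0; i < 24; i++) printf("%02X", ct[i]);
        printf("\n%s\n", memcmp(pt, back, 24) ? "round trip FAILED" : "round trip OK");
        return 0;
    }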
Thanks
M V Juhl
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Tying Up Loose Ends - Correction
Reply-To: [EMAIL PROTECTED]
Date: Mon, 25 Sep 2000 10:56:26 GMT
Bryan Olson <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> David Hopwood wrote:
:> : Tim Tyler wrote:
:> :> ... David is discussing the effect of adding an additional section
:> :> of known plaintext to the end of the file. This normally has the
:> :> effect of decreasing the keyspace by almost exactly five bits [...]
:>
:> : With all due respect, this is complete nonsense.
:>
:> : When we talk about "reducing the keyspace", that means reducing the
:> : size of the set of keys that need to be considered at all; it does
:> : not mean finding a test that will eliminate keys by testing them one
:> : by one.
:>
:> I try to use the term "effective keyspace", when discussing this type
:> of rapid elimination of whole classes of keys - I used the word
:> "effective" in the section quoted above - though I see that Icould
:> have used it again where I did not, and gained some clarity.
:>
:> The technique can reduce the time taken to deal with specific keys,
:> and can /sometimes/ speed up the process of a keyspace search.
:>
:> I'm reasonably happy with describing the effect as "a reduction in the
:> effective keyspace" - despite the fact that the keys still exist, and
:> may still require /some/ consideration.
: It's still complete nonsense. With no attack better than exhaustive
: search you have no way to rapidly eliminate any large class of keys.
You might well have a method that works much faster than decrypting
blocks and analysing the plaintext for known characteristics.
The latter might require a number of blocks and take a non-trivial
volume of processing.
: The problem is a conceptual error and cannot be fixed by adjusting
: terminology.
I don't think so - the issue appears to be purely terminological.
: A trial decryption with one candidate key takes some minimum constant
: time. That time, multiplied by the number of effective keys puts a
: lower bound on the time for exhaustive search.
Generally speaking the idea that to get the total time required, you need
to multiply the time taken for a single decrypt by the number of keys is a
potentially serious conceptual error.
Different keys may be evaluated concurrently. Consequently, the lower bound
on the time for a search has nothing to do with taking the time for a
single decrypt and multiplying it by the number of keys.
This aside, you may gain the ability to reject a key after decrypting a
single block. The ability to do so would speed things up.
A test for plaintext characteristics itself might take up valuable space,
while (say) looking for a header of 0000 would be much more compact
(and have a reduced chance of making categorical errors). Saved silicon
area can be converted into speed by using parallelism.
You may be able to speed things up further by saving the area required to
decrypt half of that block: discard any information that only affects that
half at as early a stage as possible in the algorithm, and use the
remaining space to increase parallelism.
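To make the "reject after one block" point concrete, here is a minimal sketch; decrypt_block() and the 64-bit key range are hypothetical stand-ins for the cipher under attack, and the known 4-byte header is an assumption of the example. In hardware, many copies of this loop would run in parallel over disjoint key ranges.
    #include <stdint.h>
    #include <string.h>
    /* Hypothetical single-block decryption for the cipher under attack. */
    extern void decrypt_block(uint64_t key, const unsigned char in[8],
                              unsigned char out[8]);
    /* Returns the first key whose first decrypted block starts with the
       expected header; only those few survivors need a full plaintext check. */
    uint64_t search(const unsigned char first_ct_block[8],
                    const unsigned char expected_header[4],
                    uint64_t lo, uint64_t hi)
    {
        unsigned char pt[8];
        uint64_t k;
        for (k = lo; k < hi; k++) {
            decrypt_block(k, first_ct_block, pt);
            if (memcmp(pt, expected_header, 4) == 0)
                return k;           /* candidate: verify against more blocks */
        }
        return 0;
    }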
--
__________ http://alife.co.uk/ http://mandala.co.uk/
|im |yler [EMAIL PROTECTED] http://hex.org.uk/ http://atoms.org.uk/
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Why is TwoFish better than Blowfish?
Date: Mon, 25 Sep 2000 11:09:38 GMT
In article <[EMAIL PROTECTED]>,
Albert Yang <[EMAIL PROTECTED]> wrote:
>
>
> "David C. Barber" wrote:
> >
> > TwoFish is newer, and I would think better, than BlowFish (unless the AES
> > requirements required a worse cipher), but I've never seen a list of reasons
> > just why. While I expect that Bruce is the logical person to answer this
> > (btw, is my 2nd edition Applied Cryptography still the current version?),
> > even a pointer to a web-page would be fine.
> >
> > *David Barber*
>
> I personally trust Blowfish a bit more than Twofish. Twofish is overly
> complex, and complexity is a bad thing in the crypto world. Also,
> Blowfish has 9 years of cryptanalysis, the real "stuff" that makes an
> algorithm safe. There is a class of weak keys in Blowfish, but not
> that big of a deal IMHO.
Um Blowfish is only six years old, so 9 years is really good in that
time :)
Also the design of Twofish is very sound, although it is new. There
are ways to make Blowfish-like ciphers without the class of weak keys
(hence my TC4 and BF2 ciphers on my website) so it's not like Blowfish
is perfect.
> The one I trust me? Serpent! :-)
Why? Serpent is even newer!
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: What make a cipher resistent to Differential Cryptanalysis?
Date: Mon, 25 Sep 2000 11:13:55 GMT
In article <[EMAIL PROTECTED]>,
Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
>
>
> "Paul L. Hodgkinson" wrote:
> >
> > "Mok-Kong Shen" wrote
> >
> > > No practical cipher is absolutely secure.
> >
> > Which isn't entirely true.
> > A one-time pad can be shown to be theoretically unbreakable, which requires
> > a random key of the same length as the message.
> > Such a pad was used in the Cold War to encrypt communications between Moscow
> > and Washington, sent by secure courier.
>
> The ideal OTP is 'theoretical' and, as you said,
> 'theoretically' unbreakable. No practical 'ideal' OTP
> exists (or can be known to be ideal, if one is given),
> however.
Actually an OTP is unconditionally secure if you can't guess my secret
pad better than a probability of 1/2. If I make a CDROM full of random bits
(or two, one for either direction) and the CDs *are not* compromised
then the communication is *provably* secure until the pads run out.
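For what it's worth, the pad mechanics themselves are only a few lines of C. This sketch XORs stdin against a pad file from a given offset (XOR is its own inverse, so the same program decrypts); a real tool would, as an assumption left out here, also record the offset and never let pad bytes be reused:
    #include <stdio.h>
    #include <stdlib.h>
    int main(int argc, char **argv) {
        FILE *pad;
        int m, p;
        if (argc != 3) { fprintf(stderr, "usage: %s padfile offset\n", argv[0]); return 1; }
        pad = fopen(argv[1], "rb");
        if (!pad) { perror(argv[1]); return 1; }
        fseek(pad, atol(argv[2]), SEEK_SET);      /* skip already-used pad bytes */
        while ((m = getchar()) != EOF) {
            if ((p = fgetc(pad)) == EOF) { fprintf(stderr, "pad exhausted\n"); return 1; }
            putchar(m ^ p);                       /* one pad byte per message byte */
        }
        fclose(pad);
        return 0;
    }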
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: 128-bit Secure LFSR (Whoops)
Date: Mon, 25 Sep 2000 11:12:20 GMT
In article <39cebba7$0$62635$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Jeff Gonion) wrote:
> In article <8ql154$kau$[EMAIL PROTECTED]>, Tom St Denis
> <[EMAIL PROTECTED]> wrote:
>
> > Learn to read C code first my friend.
> >
> > >> } else
> > >> r = 0
> > >> m[0] = ...
> >
> > The m[0] is not part of the else clause.
> >
>
> You are, of course, correct.
> Now that's just a little embarrassing, isn't it...
That's ok, I often post the first thing that pops into my head too (and
often I am wrong).
It's all part of the posting experience. But it's much better than not
posting at all, I suppose.
Nice paper btw; I already knew about LFSRs, but perhaps someone else
who didn't will read it.
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Tying Up Loose Ends - Correction
Reply-To: [EMAIL PROTECTED]
Date: Mon, 25 Sep 2000 11:16:00 GMT
John Savard <[EMAIL PROTECTED]> wrote:
: Tim Tyler <[EMAIL PROTECTED]> wrote, in part:
:>Has this problem been studied? Are there padding schemes that do better
:>than (say) prepending a self-terminating length field to the start of
:>the message? This results in many "impossible" padded messages :-<
: If the message is to be padded to a multiple of 8 bits, insert a 3-bit
: length field, indicating the number of real bits in the last byte. No
: termination is required, and no message becomes impossible as a
: result.
There /are/ multiple ways of representing a single message, though.
If you have a deterministic padding program, you will likely be able
to distinguish a real padded file from some of the possible decrypts
by examining what comes after the real bits in the last byte.
To make the scheme into one where an attacker would always be unable to
distinguish real padded files, you would need to pad the last few bits
at the end of the message out with genuinely random bits.
I was trying to talk about arbitrary-length padding - not just the case of
< 8 bits.
The more padding there is, the more potential information you're giving
the attacker by using an "inefficient" representation of the length of the
padding.
I mentioned "self-terminating" representations of the length:
Such representations aren't anywhere near ideal - since, the
self-terminating symbol might not terminate - or might terminate
indicating a volume of padding larger than the entire message.
This seems to be "the wrong way", but I'm not sure what "the right way"
might look like...
--
__________ Lotus Artificial Life http://alife.co.uk/ [EMAIL PROTECTED]
|im |yler The Mandala Centre http://mandala.co.uk/ VIPAR GAMMA GUPPY.
------------------------------
Crossposted-To: comp.compression
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Question on biases in random numbers & decompression
Reply-To: [EMAIL PROTECTED]
Date: Mon, 25 Sep 2000 11:33:13 GMT
Benjamin Goldberg <[EMAIL PROTECTED]> wrote:
: I've GOT a random bitstream, which means random 0s and 1s. What I want
: from this is to get random 0s, 1s, and 2s. I want the arithmetic encoder to
: convert from one form to the other.
: Since I think that if I were *starting out* with random (equiprobable)
: 0s, 1s, and 2s, and *en*coded them with an arithmetic coder, I *should*
: get a random bitstream (with 0 and 1 equiprobable), I think that using
: an arithmetic *de*coder on a random bitstream should let me produce
: random 0s, 1s, and 2s.
: The reason I want to try this, is that if I take my random bitstream,
: and take two bits at a time, getting a number in the range 0, 1, 2, 3,
: and discard all 3s to get a number in the desired (0, 1, 2) range, I'm
: WASTING 25% of my random bits, AND I might end up taking an arbitrarily
: long time to get a single trit. Bleh. [...]
It sounds like using an arithmetic decompression routine will do
much of what you want.
I think you will find you might still wind up taking an arbitrarily long
time to get a single trit, though - I don't think there's any way around
that without introducing bias.
I suspect that your arithmetic compressor will need access to an unbounded
quantity of RAM as well in order to be guaranteed to operate correctly.
In all, the idea sounds a bit like converting a binary number to base 3
(or base n), with the MSBs coming first.
If you think about it this way, consider treating the stream as binary and
ternary (or n-ary) expansions of real numbers between zero and one.
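A cruder middle ground than a full arithmetic decoder (not the proposal above, just a named alternative) is to batch the conversion: read 8 bits, reject values of 243 or more, and peel off 5 base-3 digits from the rest. That wastes roughly 5% of the bits instead of 25%, though - as noted above - the worst-case running time is still unbounded. Here get_random_bit() is a hypothetical stand-in for the unbiased bit source:
    extern int get_random_bit(void);   /* hypothetical unbiased bit source */
    /* Emit 'count' unbiased trits via out(); 5 trits per accepted 8-bit group. */
    void emit_trits(long count, void (*out)(int trit)) {
        while (count > 0) {
            unsigned v = 0;
            int i;
            for (i = 0; i < 8; i++)
                v = (v << 1) | (unsigned)(get_random_bit() & 1);
            if (v >= 243)                     /* 243 = 3^5: reject and redraw */
                continue;
            for (i = 0; i < 5 && count > 0; i++, count--) {
                out((int)(v % 3));            /* conditioned on v < 243, the 5
                                                 ternary digits are unbiased */
                v /= 3;
            }
        }
    }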
--
__________ http://alife.co.uk/ http://mandala.co.uk/
|im |yler [EMAIL PROTECTED] http://hex.org.uk/ http://atoms.org.uk/
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************