Cryptography-Digest Digest #493, Volume #12 Mon, 21 Aug 00 01:13:00 EDT
Contents:
Re: News re annotated version of "Fire Upon the Deep" (Eric Lee Green)
Re: Bytes, octets, chars, and characters (Guy Macon)
Lehmann Primality Test (ChenNelson)
Re: Lehmann Primality Test ([EMAIL PROTECTED])
Re: How Many? (Future Beacon)
Re: Lehmann Primality Test (David A Molnar)
Does anyone want a coded message to decode? ([EMAIL PROTECTED])
Re: Lehmann Primality Test (Mack)
Re: Does anyone want a coded message to decode? (Mr. Ian LeKoy)
Re: Does anyone want a coded message to decode? ([EMAIL PROTECTED])
Re: blowfish problem (John Hascall)
Re: Lehmann Primality Test ([EMAIL PROTECTED])
Re: News re annotated version of "Fire Upon the Deep" (John Savard)
----------------------------------------------------------------------------
From: Eric Lee Green <[EMAIL PROTECTED]>
Crossposted-To: rec.arts.sf.written
Subject: Re: News re annotated version of "Fire Upon the Deep"
Date: Mon, 21 Aug 2000 01:23:36 GMT
phil hunt wrote:
> >files is available. A likelier candidate for Palm release is
> >peanutpress.com's format which is encrypted and better suited to books in
> >the first place (and there's peanutpress.com's infrastructure and
> >pre-existing publishing links, especially in the SF world); the encryption
> >uses the purchaser's name and credit card number, so I can't really imagine
> >people giving others copies with the necessary info to use them :-). I've
> >got no idea how hard it is to crack though - probably not since plaintext is
> >available and cyphertexts can be obtained in multiple versions where the
> >changes are known (different name and credit card number).
>
> I suspect it is not too hard to crack. Not being an experienced cryptanalyst,
> I'm crossposting this to sci.crypt.
It would not be hard to crack at all, especially on the Palm, which has
no memory protection. Basically, all you need to do is load the program
into one of the many available Palm emulators (a number of them run on
common PCs) and single-step through the program until you reach the
loop that decrypts bytes and displays them on the screen. Then you
intercept the part that displays them and instead save the bytes to
another area of memory; this is just a matter of putting in a single
'jmp' instruction that goes to your own code rather than to the display
code.
It actually gets rather more involved than the above, since the program
is likely heavily obfuscated (i.e., it decrypts itself as it runs,
etc.) and thus would require a long time to
single-step to the point where you have a usable program. But since the
program and all keys must reside on your computer, and since you have
full control over your computer, you don't have to "break" the
encryption... all you have to do is intercept the bytes right before
they're to be displayed.
I have to laugh at all the so-called "secure" digital media initiatives.
No matter how deeply you encrypt things, sooner or later your software
has to decrypt them in order to display them or shove them into the
music hardware. Unless we add decryption capability into the video and
sound hardware itself, all such efforts are doomed, and it will be a
long time before all the current (non-decrypting) hardware goes away.
For that matter, even if we add decryption capability into the video and
sound hardware itself, it's still possible to jam a high-speed A/D
converter on the output of the video and sound hardware and get a good
quality reproduction. It won't be "perfect", but it will certainly be
good enough for most people, and, like all digital reproductions, can be
distributed without any further loss of quality.
--
Eric Lee Green There is No Conspiracy
[EMAIL PROTECTED] http://www.badtux.org
------------------------------
From: [EMAIL PROTECTED] (Guy Macon)
Crossposted-To: comp.lang.c
Subject: Re: Bytes, octets, chars, and characters
Date: 21 Aug 2000 01:34:12 GMT
Trevor L. Jackson, III wrote:
>
>"Bruce G. Stewart" wrote:
>
>> David Hopwood wrote:
>>
>> > > And to show that this is not just crazy C programmers talking, see
>> > > this (admittedly non-normative) quote from the Jargon File:
>> > >
>> > > byte /bi:t/ n.
>> > >
>> > > [techspeak] A unit of memory or data equal to the amount used to
>> > > represent one character;
>> >
>> > But, this definition is wrong, because C 'char' != character in general.
>> > It's very common for 1, 2, 4, or variable numbers of octets or bytes to
>> > be used to represent a character (both in C programs, the terminology
>> > used in the C specs notwithstanding, and in other contexts). The use of
>> > the word "character" to mean anything other than a unit of text (such as
>> > a symbol, letter, etc., or possibly a control code), should be
>> > strenuously resisted; characters are *not* units of storage.
>>
>> Well, heck, it just says the amount to represent one character, not the
>> amount to represent any character, so variable-length mbcs codes could
>> get off the hook on a technicality.
>
>Not quite. We don't refer to multi-character character sequences, but
>multi-byte character sequences. The leading byte of an MBCS character is
>not a character. It is part of a character.
>
>Otherwise we're using Humpty Dumpty speech, and we can refer to the leading
>bit as a character ... in binary of course.
>
Just out of curiosity, is all of your experience with computers
post-1969? If so, you missed a major redefinition of terms.
Did you know that "Computer" was a job title long before the
"mechanical computer" and "electronic computer" were developed?
(In case anyone doesn't know, the definitive and canonical version of
the Jargon File is at [ http://pebbles.eps.mcgill.ca/jargon/ ]. There
are a LOT of outdated versions on the 'Net, so bookmark this one.)
Perhaps a better place to go would be Federal Standard 1037C:
Telecommunications: Glossary of Telecommunication Terms
[ http://www.its.bldrdoc.gov/fs-1037/fs-1037c.htm ].
Here are key definitions from these sources...
================================================================
>From the Jargon File:
byte /bi:t/ n.
[techspeak] A unit of memory or data equal to the amount used to
represent one character; on modern architectures this is usually
8 bits, but may be 9 on 36-bit machines. Some older architectures
used `byte' for quantities of 6 or 7 bits, and the PDP-10 supported
`bytes' that were actually bitfields of 1 to 36 bits! These usages
are now obsolete, and even 9-bit bytes have become rare in the
general trend toward power-of-2 word sizes.
Historical note: The term was coined by Werner Buchholz in 1956
during the early design phase for the IBM Stretch computer;
originally it was described as 1 to 6 bits (typical I/O equipment
of the period used 6-bit chunks of information). The move to an
8-bit byte happened in late 1956, and this size was later adopted
and promulgated as a standard by the System/360. The word was
coined by mutating the word `bite' so it would not be accidentally
misspelled as bit. See also nybble.
================================================================
>From the Jargon File:
nybble /nib'l/ (alt. `nibble') n.
[from v. `nibble' by analogy with `bite' => `byte'] Four bits;
one hex digit; a half-byte. Though `byte' is now techspeak, this
useful relative is still jargon. Compare byte; see also bit.
The more mundane spelling "nibble" is also commonly used.
Apparently the `nybble' spelling is uncommon in Commonwealth
Hackish, as British orthography would suggest the pronunciation
/ni:'bl/.
Following `bit', `byte' and `nybble' there have been quite a
few analogical attempts to construct unambiguous terms for bit
blocks of other sizes. All of these are strictly jargon, not
techspeak, and not very common jargon at that (most hackers
would recognize them in context but not use them spontaneously).
We collect them here for reference together with the ambiguous
techspeak terms `word', `half-word' and `double word'; some
(indicated) have substantial information in separate entries.
2 bits:
crumb, quad, quarter, tayste, tydbit
4 bits:
nybble
5 bits:
nickle
10 bits:
deckle
16 bits:
playte, chawmp (on a 32-bit machine), word (on a 16-bit machine),
half-word (on a 32-bit machine).
18 bits:
chawmp (on a 36-bit machine), half-word (on a 36-bit machine)
32 bits:
dynner, gawble (on a 32-bit machine), word (on a 32-bit machine),
longword (on a 16-bit machine).
36 bits:
word (on a 36-bit machine)
48 bits:
gawble (under circumstances that remain obscure)
64 bits:
double word (on a 32-bit machine)
The fundamental motivation for most of these jargon terms (aside
from the normal hackerly enjoyment of punning wordplay) is the
extreme ambiguity of the term `word' and its derivatives.
================================================================
>From Federal Standard 1037C:
byte (B): A sequence of adjacent bits (usually 8) considered as a unit.
Note: In pre-1970 literature, "byte" referred to a variable-length bit
string. Since that time the usage has changed so that now it almost
always refers to an 8-bit string. This usage predominates in computer
and data transmission literature; when so used, the term is synonymous
with "octet."
================================================================
>From Federal Standard 1037C:
character: 1. A letter, digit, or other symbol that is used as part
of the organization, control, or representation of data. 2. One of
the units of an alphabet.
================================================================
>From Federal Standard 1037C:
word: A character string or a bit string considered to be an
entity for some purpose. Note: In telegraph communications,
six character intervals are defined as a word when computing
traffic capacity in words per minute, which is computed by
multiplying the data signaling rate in baud by 10 and dividing
the resulting product by the number of unit intervals per character.
================================================================
------------------------------
From: [EMAIL PROTECTED] (ChenNelson)
Subject: Lehmann Primality Test
Date: 21 Aug 2000 01:45:06 GMT
=====BEGIN PGP SIGNED MESSAGE=====
Hash: SHA1
Does anyone know the actual average fraction of composites that will
falsely indicate "prime" with the Lehmann primality test?
Lehmann test: take a random a, compute a^((p-1)/2) mod p. If that
value is 1 or p-1, then output "prime."
I know the theoretical *upper* limit is 1/2, but a few trial runs on
my computer seem to indicate an average value much lower than this.
Later,
Nelson Chen
=====BEGIN PGP SIGNATURE=====
Version: PGP for Personal Privacy 5.5.2
Comment: For public key, go to key server with key ID 0xD28C0DD9
iQA/AwUBOaCK2G1ACZTSjA3ZEQL32gCfeM/vbf28wthti8l7h1VjMb2rMpkAn2rp
SSIxwo3J2NxXlvv1FkXUjoWB
=xiaB
=====END PGP SIGNATURE=====
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Lehmann Primality Test
Date: Mon, 21 Aug 2000 02:14:11 GMT
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (ChenNelson) wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> Does anyone know the actual average fraction of composites that will
> falsely indicate "prime" with the Lehmann primality test?
>
> Lehmann test: take a random a, compute a^((p-1)/2) mod p. If that
> value is 1 or p-1, then output "prime."
>
> I know the theoretical *upper* limit is 1/2, but a few trial runs on
> my computer seem to indicate an average value much lower than this.
If 'p' is prime (and 'a' is not a multiple of p), then a^(p-1) mod p
will be 1, so this is closely related to the Fermat test (squaring the
Lehmann result gives exactly the Fermat result). It is a variation of
Fermat's little theorem and not very reliable on its own.
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Future Beacon <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 22:20:06 -0400
The sort of elements I had in mind might include the factor
of shared data (original, secret, or derived from previous
transmissions). The list I had in mind should definitely
include random number files (or generators) also.
Jim Trek
Future Beacon Technology
http://eznet.net/~progress
[EMAIL PROTECTED]
------------------------------
From: David A Molnar <[EMAIL PROTECTED]>
Subject: Re: Lehmann Primality Test
Date: 21 Aug 2000 02:28:18 GMT
ChenNelson <[EMAIL PROTECTED]> wrote:
> I know the theoretical *upper* limit is 1/2, but a few trial runs on
> my computer seem to indicate an average value much lower than this.
I was recently at a talk which mentioned that Landrock, Damgård, and
Pomerance had investigated the average case for Rabin-Miller. Perhaps
that kind of analysis would shed light on the Lehmann case,
or perhaps the bibliography for the paper would have pointers.
Unfortunately I don't know the exact reference.
-David
------------------------------
From: [EMAIL PROTECTED]
Subject: Does anyone want a coded message to decode?
Date: Mon, 21 Aug 2000 02:57:40 GMT
I have a pair of routines that encode and decode. The encoded message
is a binary file. I can send you as much encoded text as you need.
Would anyone like to have a go at decoding? The encoding algorithm is
pretty simple, but novel.
I wasn't planning to send any plaintext, just the encoded text.
z
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (Mack)
Subject: Re: Lehmann Primality Test
Date: 21 Aug 2000 03:26:09 GMT
>In article <[EMAIL PROTECTED]>,
> [EMAIL PROTECTED] (ChenNelson) wrote:
>> -----BEGIN PGP SIGNED MESSAGE-----
>> Hash: SHA1
>>
>> Does anyone know the actual average fraction of composites that will
>> falsely indicate "prime" with the Lehmann primality test?
>>
>> Lehmann test: take a random a, compute a^((p-1)/2) mod p. If that
>> value is 1 or p-1, then output "prime."
>>
>> I know the theoretical *upper* limit is 1/2, but a few trial runs on
>> my computer seem to indicate an average value much lower than this.
>
>if 'p' is prime then a^(p-1) mod p will be 1. This is a similar test
>(actually the same test). This is also a variation of Fermats theorem
>and not very reliable.
Most indications are that the Fermat test is fairly good at distinguishing
composites. See Rabin's article "Finding Four Million Large Random Primes"
from Crypto '90. At least in that case no composites were found that
had no small divisors and passed the Fermat test. Note that the confirmation
of primality was via the Rabin-Miller test, so the results are iffy. But it
seems to indicate that numbers that pass the Fermat test and fail
the Rabin-Miller test are sparse.
I am sure there is more recent literature that confirms that point.
If you really need numbers that are provably prime, then you should
generate them with a method that guarantees primality.
>
>Tom
>
>
>Sent via Deja.com http://www.deja.com/
>Before you buy.
>
>
Mack
Remove njunk123 from name to reply by e-mail
------------------------------
From: [EMAIL PROTECTED] (Mr. Ian LeKoy)
Subject: Re: Does anyone want a coded message to decode?
Date: Mon, 21 Aug 2000 03:41:37 GMT
[EMAIL PROTECTED] wrote:
>I have a pair of routines that encode and decode. The encoded message
>is a binary file. I can send you as much encoded text as you need.
>Would anyone like to have a go at decoding? The encoding algorithm is
>pretty simple, but novel.
If you want to have your work evaluated, make available the source code and
a complete explanation of your routines, including why and how each step
contributes to security. DO NOT post gibberish here and challenge everyone
to make sense of it, because it just doesn't work like that. For more
information, see "2.3. How do I present a new encryption scheme in
sci.crypt?" in the Cryptography FAQ.
--
"Mr. Ian LeKoy" is actually 0276 439581 <[EMAIL PROTECTED]>.
01 234 56789 <- Use this key to decode my email address and name.
Play Five by Five Poker at http://www.5X5poker.com.
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Does anyone want a coded message to decode?
Date: Mon, 21 Aug 2000 03:37:18 GMT
In article <8nq5qt$1vi$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:
> I have a pair of routines that encode and decode. The encoded message
> is a binary file. I can send you as much encoded text as you need.
> Would anyone like to have a go at decoding? The encoding algorithm is
> pretty simple, but novel.
Then send the algorithm.
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (John Hascall)
Crossposted-To: comp.lang.c
Subject: Re: blowfish problem
Date: 21 Aug 2000 04:06:44 GMT
Trevor L. Jackson, III <[EMAIL PROTECTED]> wrote:
>Gergo Barany wrote:
>> Daniel Leonard <[EMAIL PROTECTED]> wrote:
>> > I do not want to be rude, but there are some "errors" in your code.
>> > On 17 Aug 2000, John Hascall wrote:
>> > > out = malloc(inLen * 2 + 1);
>> >
>> > shouldn't it be:
>> > out = malloc((inLen * 2 + 1) * sizeof(char));
>> > /* a char could be more than 1 byte */
>> No, a char is always one byte in C.
>No it's not. A character is the minimum unit of addressability, which may be
>other than a byte. Espcially on older machines, making this assumption is an
>error.
sizeof(char) is *DEFINED* to be 1. Period. Always. End of Story.
(inLen * 2 + 1) * sizeof(char);
is therefore defined to be:
(inLen * 2 + 1) * 1;
which one can write as:
(inLen * 2 + 1);
should one so desire.
John
--
John Hascall (__) Shut up, be happy.
Software Engineer, ,------(oo) The conveniences you demanded
Acropolis Project Manager, / |Moo U|\/ are now mandatory.
ISU Academic IT * ||----|| -- Jello Biafra
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Lehmann Primality Test
Date: Mon, 21 Aug 2000 04:28:35 GMT
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Mack) wrote:
> >In article <[EMAIL PROTECTED]>,
> > [EMAIL PROTECTED] (ChenNelson) wrote:
> >> -----BEGIN PGP SIGNED MESSAGE-----
> >> Hash: SHA1
> >>
> >> Does anyone know the actual average fraction of composites that will
> >> falsely indicate "prime" with the Lehmann primality test?
> >>
> >> Lehmann test: take a random a, compute a^((p-1)/2) mod p. If that
> >> value is 1 or p-1, then output "prime."
> >>
> >> I know the theoretical *upper* limit is 1/2, but a few trial runs on
> >> my computer seem to indicate an average value much lower than this.
> >
> >if 'p' is prime then a^(p-1) mod p will be 1. This is a similar test
> >(actually the same test). This is also a variation of Fermats theorem
> >and not very reliable.
>
> Most indications are that the fermat test is fairly good at
> distinguishing composites. See Rabins article "Finding Four Million
> Large Random Primes" from Crypto '90 At least in that case no
> composites were found that had no small divisors and passed the fermat
> test. Note that the confirmation of primality was via the Rabin-Miller
> test so the results are iffy. But it seems to indicate that numbers
> that pass the fermat test and fail the Rabin-Miller test are sparse.
Wasn't that paper by Rivest? And wasn't it shown that the Fermat
test errs with a higher probability?
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Crossposted-To: rec.arts.sf.written
Subject: Re: News re annotated version of "Fire Upon the Deep"
Date: Mon, 21 Aug 2000 04:47:39 GMT
On Mon, 21 Aug 2000 01:23:36 GMT, Eric Lee Green <[EMAIL PROTECTED]>
wrote, in part:
>Unless we add decryption capability into the video and
>sound hardware itself, all such efforts are doomed, and it will be a
>long time before all the current (non-decrypting) hardware goes away.
This is true. But the support chips for the Pentium III - the same
chipset that has the hardware RNG - are designed to provide a form of
encryption that would protect programs, so that one would need to
actually intercept bus signals to get at data going to sound cards and
video cards.
Also, if people needed to buy a new decrypting video card to watch
movies on their computer, they probably would - DVD playback
approaches this already - and such a card could handle sound as
well.
I think it is only a matter of time before some form of
content-protecting device becomes widely available. If it doesn't
cripple the computer it is installed in, or if it is a separate box,
it may even have some success.
For example, it isn't reasonable to expect that, any time soon, one
will be able to electronically browse the books at the public library
_without_ such a system in place. But there is, of course, no
guarantee that _with_ such a system, it will be used for anything
except controlled distribution of electronic content for which a high
price is charged. And in that case, for the same or nearly the same
money, people are likely to prefer buying physical media - including
books printed on paper as well as CDs and DVDs.
And the same privacy concerns that hampered DIVX would apply even to
renting movies this way - and, unlike DIVX, such a scheme would face
bandwidth problems for a while yet.
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************