Cryptography-Digest Digest #122, Volume #13 Wed, 8 Nov 00 14:13:01 EST
Contents:
Re: Whole file encryption (Mok-Kong Shen)
Re: algorithms before 1939 (Mok-Kong Shen)
Re: hardware RNG's (David Schwartz)
Re: Hardware RNGs (David Schwartz)
Re: Randomness from key presses and other user interaction (David Schwartz)
Re: Help Needed with Public Key Cryptography (Mike Rosing)
On the Limits of Digital Rights Management Systems in Consumer Market Contexts
(Peter Cassidy at Boston)
Re: hardware RNG's (Alan Rouse)
Re: hardware RNG's (David Schwartz)
Re: Updated XOR Software Utility (freeware) Version 1.1 from Ciphile (Richard
Heathfield)
----------------------------------------------------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Whole file encryption
Date: Wed, 08 Nov 2000 18:09:28 +0100
Tom St Denis wrote:
>
> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> >
> >
> > Benjamin Goldberg wrote:
> > >
> > > The following is a simple idea for whole file encryption.
> > > sbox is actually a keyed sbox.
> > >
> > > encrypt_r( data, length, sbox )
> > >     tmp1 = length, tmp2 = tmp1/2, tmp3 = tmp1-tmp2
> > >     ptr1 = &data[0], ptr2 = &data[tmp3]
> > >     for( i = 0; i < tmp2; ++i, ++ptr1, ++ptr2 )
> > >         *ptr1 += sbox[*ptr2]
> > >         *ptr2 += sbox[*ptr1]
> > >         *ptr1 += sbox[*ptr2]
> > >     if( tmp2 != tmp3 )
> > >         *ptr1 = sbox[*ptr1]
> > >     if( tmp2 > 0 )
> > >         encrypt_r( data, tmp2, sbox )
> > >     if( tmp3 > 0 )
> > >         encrypt_r( ptr1, tmp3, sbox )
> > >
> > > encrypt( data, length, sbox )
> > >     encrypt_r( data, length, sbox )
> > >     encrypt_r( data, length, sbox )
> >
> > I don't see why you have
> >
> > *ptr1 += sbox[*ptr2]
> > *ptr2 += sbox[*ptr1]
> > *ptr1 += sbox[*ptr2]
> >
> > and don't have simply the first two statements. (Compare
> > a normal Feistel cipher).
>
> Because the Luby-Rackoff proofs of construction only hold when there
> are at least three rounds.
In a common Feistel cipher there are 2 rounds in which each
half (corresponding, in a general sense, to an element of
the array in the present case) gets processed, and such a
'cycle' is repeated several times, e.g. 8 cycles for DES.
In the algorithm presented by Benjamin Goldberg, the main
function encrypt calls encrypt_r two times (I suppose it
should be called more times, or a user-specified number of
times -- a point still awaiting an answer from him). So I
don't think that it is necessary to have the third
statement in the above code.
> > Does
> >
> > encrypt( data, length, sbox )
> > encrypt_r( data, length, sbox )
> > encrypt_r( data, length, sbox )
> >
> > simply mean that you want to repeat encrypt_r exactly
> > two times? If yes, why?
> >
> > If I understand correctly, what you do in encrypt_r
> > is equivalent to doing a permutation of the elements
> > of the array data (or segments of that array in the
> > recursive calls) thru bringing those at some distance
> > away to become neighbours and then perform a normal
> > Feistel cycle. There is some similarity to an idea I
> > posted recently in the thread 'On block encryption
> > processing with intermediate permutations' (25th Sep),
> > the difference being that I use pseudo-random
> > permutations, while you employ special permutations,
> > if I don't err.
> >
> > It may be of interest to note that, if the file, i.e.
> > the array data, has 2^m elements, then one way of
> > applying a Feistel cipher is doing recursion, i.e.
> > first dividing the whole into two halves for Feistel
> > processing which is itself done by subdivision and so
> > on. See the thread 'On higher order Feistel schemes'
> > posted by me on 13th May.
>
> That was my design in TC5, and Matt Blaze's idea for Turtle; this is
> nothing new!!!!
There is actually nothing NEW under the sun. One of the
main attempts in natural science, as far as I understand,
is to find some general principles that can be applied in
as wide a context as possible and are formulated as simply
as possible.
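For concreteness, the quoted pseudocode rendered as real C might look
as follows. This is an untested sketch, not Goldberg's own code: the
byte arithmetic is mod 256 by virtue of unsigned char, and the two
recursion guards are changed from > 0 to > 1, because as quoted a
call on a length-1 segment would recurse on itself forever.

#include <stddef.h>

typedef unsigned char u8;

/* One recursive mixing pass; sbox is a keyed 256-entry table. */
static void encrypt_r(u8 *data, size_t length, const u8 sbox[256])
{
    size_t half = length / 2;          /* tmp2 in the pseudocode  */
    size_t rest = length - half;       /* tmp3 (half or half + 1) */
    u8 *p1 = data, *p2 = data + rest;
    size_t i;

    for (i = 0; i < half; i++, p1++, p2++) {
        *p1 += sbox[*p2];
        *p2 += sbox[*p1];
        *p1 += sbox[*p2];
    }
    if (half != rest)                  /* odd length: substitute  */
        *p1 = sbox[*p1];               /* the middle byte         */
    if (half > 1)                      /* > 1, not > 0: a call on */
        encrypt_r(data, half, sbox);   /* one byte never returns  */
    if (rest > 1)
        encrypt_r(data + half, rest, sbox);
}

void encrypt(u8 *data, size_t length, const u8 sbox[256])
{
    encrypt_r(data, length, sbox);     /* two passes, as posted */
    encrypt_r(data, length, sbox);
}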
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: algorithms before 1939
Date: Wed, 08 Nov 2000 18:11:53 +0100
Michal z Sopotu wrote:
>
> I'm looking for some internet pages, magazines, books (or other sources)
> of cipher/decipher algorithms used before 1939.
The URL www.bletchleypark.org.uk might be of some interest
to you.
M. K. Shen
------------------------------
From: David Schwartz <[EMAIL PROTECTED]>
Subject: Re: hardware RNG's
Date: Wed, 08 Nov 2000 09:17:47 -0800
Alan Rouse wrote:
>
> David Schwartz wrote:
>
> > ...it's not difficult to conclude that the variation is, at least
> > in part, due to the temperature change. That and the fact that nobody
> > else could measure that temperature change is all you need.
>
> Not necessarily. Actually, the question of who can measure the temp
> changes is completely irrelevant to the question of whether those
> changes are random (although it is relevant to usability of the
> randomness for cryptography).
I do grant that they are separate questions. Radioactive decay is as
random as anything, but if your attacker can measure the decay of the
same sample you can, then you're in trouble.
> In addition to what you stated, one needs to know something about the
> physics of temperature changes in a crystal oscillator, and the
> relationship between those changes and changes in oscillator
> frequency. If the oscillator frequency is determined by the
> temperature, then the randomness of the frequency is completely
> dependent on the randomness of the temperature. I really don't know
> that the temperature is not correlated with time. Actually I strongly
> suspect that the temperature at time n is correlated to the temperature
> at time n+1, at least for some sampling rates.
Of course it is, highly correlated. However, that really doesn't
matter, because it's the parts-per-billion of frequency that we care
about, which are influenced by the parts-per-billion of temperature. And
that cannot be predicted by any model I have ever seen. The only way to
determine the effect temperature has on a crystal oscillator from
moment-to-moment is to measure that frequency.
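One way to see this is to count TSC ticks against an independent
clock and watch the low-order digits wander from interval to
interval. A rough sketch, assuming a POSIX clock_gettime and an x86
compiler that provides the __rdtsc intrinsic (neither of which every
system has); the 1 ms interval is arbitrary:

#include <stdint.h>
#include <stdio.h>
#include <time.h>
#include <x86intrin.h>

/* Count CPU time-stamp-counter ticks in one millisecond as measured
   by an independent clock.  The count is dominated by the nominal
   ratio of the two oscillators; the jitter in its low-order digits
   reflects relative drift (temperature etc.) plus scheduling noise. */
static uint64_t ticks_per_ms(void)
{
    struct timespec t0, t1;
    uint64_t c0;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    c0 = __rdtsc();
    do {
        clock_gettime(CLOCK_MONOTONIC, &t1);
    } while ((t1.tv_sec - t0.tv_sec) * 1000000000L
             + (t1.tv_nsec - t0.tv_nsec) < 1000000L);
    return __rdtsc() - c0;
}

int main(void)
{
    int i;
    for (i = 0; i < 10; i++)
        printf("%llu\n", (unsigned long long)ticks_per_ms());
    return 0;
}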
DS
------------------------------
From: David Schwartz <[EMAIL PROTECTED]>
Subject: Re: Hardware RNGs
Date: Wed, 08 Nov 2000 09:24:59 -0800
Mack wrote:
>
> >
> >Mack wrote:
> >
> >> The LSB of the RDTSC are purely deterministic. It increments one for
> >> each clock tick. I may not be able to 'guess' the bits of the RDTSC
> >> but I can sure calculate them. In a multi-process environment this is
> >> a bit more difficult but not impossible. It is effectively measuring
> >> the combined process run times.
> >
> > You can't calculate them because the number of clock cycles it takes
> > the CPU to do many things is non-deterministic. For example, I go to
> > read a block from a disk controller. Which CPU clock cycle that read
> > comes in at is non-deterministic because it relies upon the exact ratio
> > of two real numbers (the CPU oscillator and the disk controller
> > oscillator).
> Hard drive access times are not the same as the RDTSC being random.
Yes, they are, because CPU operations will be delayed until data from
the hard drive is available.
> Many workstations don't have hard drives of their own. The network
> traffic is very easy to monitor. It can often be done from outside of the
> building. Tempest certainly works if the cables aren't shielded.
Monitoring the network traffic doesn't do you any good because you
don't know the exact instant the network card on the client machine will
notice the traffic. Knowing when it's put on the wire won't tell you the
billionth of a second that it will be noticed by the CPU.
> > You would need to know an awful lot of internal timing data from the
> > computer that would normally be completely inaccessible to an attacker.
> > And, of course, any attempt you made to measure the disk controller's
> > performance would change the very numbers you are measuring. The net
> > result is that there is certainly no practical way and arguably no
> > conceivable way to predict the LSB of the TSC.
> I agree that there is no practical way. But the argument was
> strictly whether the LSBs are random. By themselves, no.
What do you mean by "by themselves"?
> Measuring other sources of randomness using the LSB of
> the TSC will certainly give randomness. But that is
> far from the TSC being random.
That happens automatically on any realistic machine. Consider, for
example, a machine that provides random numbers for Internet gambling.
Each request that requires random numbers must come in from somewhere,
and the timing of that request measured to a billionth of a second has
some entropy in it. That entropy will be in the lsb of the TSC when the
code to process that request gets executed.
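A sketch of that harvesting step, assuming x86 and a compiler that
provides the __rdtsc intrinsic; the pool size and the notion of
calling this from a request handler are illustrative only, and the
pool must still be distilled with a strong hash before use:

#include <stdint.h>
#include <x86intrin.h>   /* GCC/Clang; MSVC uses <intrin.h> */

static uint8_t pool[64];
static unsigned pos;

/* Call on every external event (e.g. an arriving request): the
   event's timing, read at CPU-cycle resolution, lands in the pool
   through the counter's low bits. */
void stir_on_event(void)
{
    uint64_t tsc = __rdtsc();
    pool[pos] ^= (uint8_t)tsc;       /* keep only the LSBs */
    pos = (pos + 1) % sizeof pool;
}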
> On a side note there is also the matter of hard disk turbulence
> which produces a very slight amount of randomness. There are
> also misreads which happen occasionally.
>
> Does anyone know which IDE drives have independent internal
> clocks and which ones synchronize the clock to the system bus?
> This tends to be a serious issue in overclocked systems.
> I.e. if the bus is overclocked, the drive stops working.
They may synchronize their I/O clock to the system bus, but I doubt
they could synchronize other clocks. That would require that the
frequencies line up. If they do synchronize their CPU to the bus clock,
however, that would significantly reduce (in theory) the amount of
entropy available from disk reads.
I just looked through a stack of 6 drives on my desk. The SCSI ones
seem to have their own oscillators. The IDE ones seem to not.
DS
------------------------------
From: David Schwartz <[EMAIL PROTECTED]>
Subject: Re: Randomness from key presses and other user interaction
Date: Wed, 08 Nov 2000 09:26:26 -0800
Mack wrote:
> But the thread was about the user interaction. There was
> already a thread on oscillators.
If timing keystrokes mines oscillator entropy, then it's a good source
of entropy. If it doesn't, it may or may not be. You posted the original
message, read it again.
DS
------------------------------
From: Mike Rosing <[EMAIL PROTECTED]>
Subject: Re: Help Needed with Public Key Cryptography
Date: Wed, 08 Nov 2000 11:55:03 -0600
Lee Hasiuk wrote:
>
> I'm set on 64 bits because I want to allow the possibility of the
> registration key being used on a computer which is not connected to
> the Internet, where the user would type the code in manually, and that
> would be all that's needed to unlock the product. Maybe I shouldn't
> allow for that, but it seemed like something that might be easy to do.
> Now I'm not so sure.
With ECC you can get 64 bits of security with 130 bits transmitted. If
you only need 56 bits of security (DES equivalent) then you can transmit
113 bits. If you transmit 64 bits only, you have about 32 bits of security.
That's truly worthless for anyone who wants to attempt to crack it, but
for those who won't make the attempt it's perfectly secure.
So what are the odds that someone will attempt to crack your security
vs. the attempt to bypass your security? If bypassing is more likely,
you might as well go with less security since it won't make any difference
in terms of cracking anyway.
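(For anyone checking the arithmetic: these figures presumably come
from the square-root cost of generic discrete-log attacks such as
Pollard's rho. Against a curve group of order about 2^n, such an
attack takes roughly sqrt(2^n) = 2^(n/2) steps, so the security
level is about n/2 bits:

    130/2 = 65 (~64),   113/2 ~ 56,   64/2 = 32.)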
Patience, persistence, truth,
Dr. mike
------------------------------
From: Peter Cassidy at Boston
Subject: On the Limits of Digital Rights Management Systems in Consumer Market Contexts
Date: Wed, 8 Nov 2000 18:05:51 GMT
the bs- and buzzword-free journal of entertainment and technology
Digital Mogul Vol 3 Report 7
Page 7
Revolving Expert Series: The Forum for Rotating Gurus / Where those
who know tell you what they know.
"Will Fixation on Security Silence the Trumpets of Fame?"
by Scott Moskowitz and Peter Cassidy, Blue Spike, Inc.
Today, the entire culture is being forced to revisit questions of how
copyright is defined, applied, and enforced, something it hasn't had to
do, with this scope, for hundreds of years. Way Back When in England,
the big deal was about the control of heretical texts and, later, the
suppression of Scots publishers. Now it's about control of
profit-making intellectual property and suppression of renegade online
distributors; gods, scripture and technologies come and go, but we
keep getting into the same squabbles.
What has delivered us to our latest evolutionary threshold of
copyright is the inherent democratization that market-chasing
technological advances engender. Ease of copying and distribution by
technical illiterates has never been greater, pitting content owners
against their own customers. In this advance of electronic
communications, established media conglomerates
have sought and won legislation in the Digital Millennium Copyright Act
(DMCA) that criminalizes the act of bypassing anti-copying systems.
Technologies have been perfected that would allow creators and
distributors to force compliance with usage rules in ways that can
circumvent fair use entitlements built into copyright law and may
outlast copyright itself, ensuring a sort of copy-control in
perpetuity.
What is novel about our moment is that the new laws and technical
solutions being developed can effectively pre-empt and eliminate
consumers' legitimate abilities to manipulate digital works that they
have purchased or licensed before any breach of copyright has
transpired.
Strangely, but in context with our times, a military-intelligence
model of security has been applied to consumer goods, threatening the
flow of free samples that irrigates consumer markets. In a way, we may
well be reversing the gains made since modern copyright was born 300
years ago. We abhor piracy and appreciate the position of the media
companies. Without intellectual property, economies shrivel.
Yet, in the long-term, the new communications media will provide the
greatest boon to these established market makers, as has been the case
with radio, television and recordable media like VHS. And Internet
technologies will do that in much the same way as older technologies
by allowing consumers' love to blow the trumpets of fame and generate
recognition and demand for artists' work. Sometimes that promotional
dynamic is expressed as friends on a stoop with a boom box and a case
of discount beer, sometimes as the loaning of records between
impoverished enthusiasts and sometimes as swapping of MP3 files among
friends via e-mail. Who knows what else may be possible as the
technology evolves?
What we are faced with today, and this is the core of Blue Spike's
sense of trepidation, is the frantic development of technologies and
laws based on the a priori presumption that access restriction is the
only credible approach to securing copyright and protecting
intellectual property on the Internet. It is as if air transport
technologies were being built around the assumption that only
lighter-than-air vehicles can achieve flight.
The evolution of security and e-commerce instrumentation to mediate
legitimate trade in music and media objects is being severely
retarded, if not wholly precluded, by this assumption, creating a
dysfunctional cycle. Content owners and technologists fixate on access
restriction, so only access-restriction technologies are considered.
These fail to meet consumers' expectations, acclimated by decades of
free access, sending them scampering off to the contraband
bazaars of the Web. This only proves the content owners' theory that
consumers are a pack of feral thieves whose larcenous appetites must
be arrested by black-box content protection systems.
The threat, however, is manifold, potentially impinging on the culture
and tradition of the public domain that has been so hard won and
which has served our economies so well. It is worth reviewing how
media consumers' rights came to light and what benefits they have
bestowed before we run off and stock the shelves of our electronic
markets with black boxes. Copyright law never gave a hammerlock of
control to creators, nor did it give unlimited freedom of use to the
public or commercial sectors. In its baldest characterization,
copyright is an institutionalized compromise between copyright
holders' legal right to hold temporary monopolies on
information and the public's codified right to use that information
according to those prescriptions during and after the monopoly is in
force.
The concept of copyright in its modern manifestation cannot exist
outside of the context of public interest and its embodiment in the
legal entity called the "public domain." Four hundred and fifty years
ago, in England, copyright laws were established for printers, in
large part to control the dissemination of heretical or dangerous
texts that could threaten the social order. In 1710, however, the
Statute of Anne established copyrights for creators that could be
maintained up to twenty-eight years, after which their works passed
into the public domain. Creators got protection to
reward them for their innovative expression and the public was
enriched with new ideas and knowledge. (The politics of the Statute of
Anne were complex, driven not entirely by unalloyed beneficence. A
large part of its motivation was to get control of "pirate" publishers
in Scotland, then only recently incorporated into the U.K., who were
exporting high-quality texts, undercutting crown-licensed
contemporaries in London.)
Yet the concepts this statute embodied inspired others in Europe and
the U.S., and sired a number of wonderful precipitates. Among the
most important was a continuously invigorated consumer population
whose knowledge was indeed expanded and interests cultivated, not
incidentally creating markets for new works. Copyright's limitations
and leaks made consumers of people who might not have participated in
markets for literature, had books been locked away by permanent
monopolies or failed to achieve a diversity (inspired by copyright
protecting authors who could profit from their ingenuity) that could
satisfy all interests. In the context of making retail markets for
information goods, the public's grazing rights on the info-commons
engendered alert, informed and lusty consumer markets.
Since modern copyright laws have appeared, the success of new
technologies from radio to VHS to RIO has been secured by the rough
maintenance of the trade-offs prescribed in copyright law. Today, the
achievements of copyright and public domain hang in the balance.
Supra-legal solutions that bypass the fair use rights under copyright
laws could well reverse the enormous social and economic benefits they
have produced. Our sense is that the rush to secure copyrights with
black boxes to stave off the threats posed by new communications
technologies is actually an artifact of our times.
The Cold War, some of its technology (cryptography) and its binary
worldview, has been transferred to the Internet and the markets forming
on it: Eastern Bloc bad; Western nations good. Napster bad; 1024-bit
key-secured crypto-vault good.
The consumer has been cast as the Evil Doer or as an accomplice whose
machinations must be met with technologically superior armament,
seething yuppie lawyers and a civil law-enforcement posture that would
make 1980's South Korea look snugglesome. (Hint: If the police ever
arrive to search your server with a water cannon, it is not time to
break out the soap-on-a-rope.) This approach has been enormously
expensive in the opportunities lost in the pursuit of black-box
security systems for media assets such as Digital Rights Management
(DRM) systems that restrict usage according to predetermined rules
scripted by content owners and distributors. They are impressive
technologies but by design not appropriate for markets in which
consumers have to be lured and seduced. The guiding questions cast by
the media industries thus far have been focused on annihilating
the Napsters or MP3s of the world.
Our thinking about copyrights and the Web should, instead, be geared
to sculpting rational instrumentation for securing intellectual
property and mediating its honest trade in electronic markets. Part of
the solution may involve e-locks and e-keys but our sense is that the
total solution will be a lot more layered and nuanced, ultimately
defaulting toward easing consumers' usage burdens.
Technically speaking, the fixation with creating black boxes for media
assets has absorbed a lot of development time that could have been
applied to solutions that preserved the copyright balance that is in
place today and serve consumers' interests. The Secure Digital Music
Initiative (SDMI) is a good example. Charged with finding a solution to
online music piracy, the SDMI chose to use digital watermarking as an
instrument for a larger copy-control and playback-control mechanism.
SDMI's specification required a great deal of specialized knowledge in
signals processing and steganographic arts, disciplines that Blue
Spike first married in its patents around five years ago. Application
of all that technology in a machine-readable access control scheme has
produced systems that have all reportedly been hacked by the Princeton
group.
Sidestepping the issue of SDMI's design specification for open,
machine-readable watermarks and its inherent exposures, we ask,
what advances could have been made had all the engineering
experience, expertise and research embodied by SDMI been
dedicated to alternative proposals? What if some of that knowledge of
signals processing and steganography were invested in a standard for
fingerprinting and authenticating songs that would be cleared through
subscription-based online catalogues to direct and mediate payment to
artists whose work is passed through the system? Technically, the
pieces needed to assemble such a system are on the shelf today. Blue
Spike has been sitting on many of the techniques and technologies
required for such a scheme for years. Our contemporaries in the fields
of security, steganography and signals processing possess parts of the
technologies required for such a system as well. This is but one
example.
The bottom line is that a great many solutions with potential for
constructively animating authentic markets for digital media on the
Web are within reach, but can't get on the agenda because of the
institutionalized limitation of the technical imaginations being
brought to bear on the problem. Even if access control schemes can be
made to work, or consumer expectations can be lowered or changed, it may
not create a better world or bigger markets. Military-intelligence
style information security technologies applied for their own sake in
consumer media goods would eliminate the most useful promotional
aspect of the Internet, limiting consumers' ability to blow the
trumpets of fame that make headliners of aspirants and keep the stars
in the firmament.
Scott Moskowitz is CEO of Blue Spike, Inc., a media security company
that pioneered the development of key-based, secure digital watermarks
([EMAIL PROTECTED]); Peter Cassidy is Director of Communications at
Blue Spike ([EMAIL PROTECTED]).
------------------------------
From: Alan Rouse <[EMAIL PROTECTED]>
Subject: Re: hardware RNG's
Date: Wed, 08 Nov 2000 18:14:05 GMT
David Schwartz wrote:
> Of course it is, highly correlated. However, that really doesn't
> matter, because it's the parts-per-billion of frequency that we care
> about, which are influenced by the parts-per-billion of temperature. And
> that cannot be predicted by any model I have ever seen.
You seem to be equating an event's randomness with your ability to
predict that event. I think that is an inadequate definition of
randomness. An event that occurs with statistical bias is not random,
but it still might be extremely difficult to predict.
------------------------------
From: David Schwartz <[EMAIL PROTECTED]>
Subject: Re: hardware RNG's
Date: Wed, 08 Nov 2000 10:23:12 -0800
Alan Rouse wrote:
>
> David Schwartz wrote:
> > Of course it is, highly correlated. However, that really doesn't
> > matter, because it's the parts-per-billion of frequency that we care
> > about, which are influenced by the parts-per-billion of temperature. And
> > that cannot be predicted by any model I have ever seen.
>
> You seem to be equating an event's randomness with your ability to
> predict that event. I think that is an inadequate definition of
> randomness. An event that occurs with statistical bias is not random,
> but it still might be extremely difficult to predict.
That is total nonsense. If an event can only be described
statistically (bias or no) that means it's random. Random and
unpredictable are synonymous. The extent to which an event cannot be
predicted is the extent to which it is random. The extent to which you
can measure the unpredictable data (and others can't) is the amount of
entropy you will get out if you feed the data into a cryptographically
strong hash function (up to its limits, of course).
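Concretely, that distilling step might look like the following
sketch, assuming OpenSSL's SHA1 and a buffer of raw timing samples
collected elsewhere:

#include <stddef.h>
#include <openssl/sha.h>

/* If the samples contain at least 160 bits of entropy that an
   attacker cannot measure, the digest is (up to the strength of
   the hash) 160 bits of unpredictable key material, regardless
   of any statistical bias in the raw samples. */
void distill(const unsigned char *samples, size_t n,
             unsigned char digest[SHA_DIGEST_LENGTH])
{
    SHA1(samples, n, digest);
}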
DS
------------------------------
Date: Wed, 08 Nov 2000 18:32:30 +0000
From: Richard Heathfield <[EMAIL PROTECTED]>
Crossposted-To: alt.freespeech,talk.politics.misc,talk.politics.crypto
Subject: Re: Updated XOR Software Utility (freeware) Version 1.1 from Ciphile
Scott Craver wrote:
>
> Anthony Stephen Szopa <[EMAIL PROTECTED]> wrote:
> >"Trevor L. Jackson, III" wrote:
> >>
> >> Pointing out the limitations of your software is to amusement as Jerry Springer is
> >> to entertainment.
> >
> >You said it: so what are the limitations of the XOR software
> >utility?
>
> We just *told* you. It's a huge binary, not open, and only runs on a
> single platform.
>
> -S
Mr Szopa's program is 315392 bytes in size after decompression. No
source code is provided. I know I'm not the only one to think this to be
the height of lameness. So, perhaps not unnaturally, I wondered (purely
in the spirit of scientific enquiry, as befits a sci. newsgroup) if it
were possible to write an even lamer program. I tried hard. But did I
succeed? That's for the scientific community to judge. (I was going to
save this till April 1st, but the moment seems ripe.)
I'm afraid I can't challenge the original for file size, even though I
made every effort not to tell the compiler to optimise for size, and
even copied the OP's idea to bloat the binary by making the interface
graphical. I included some radio buttons and a progress bar, in a
desperate attempt to add even more bloat. I managed to scrape together
302592 bytes of binary - just 12800 bytes short of the target.
I chose C++ Builder as my development platform, and Borland are working
hard to make Builder portable to Linux, so although I match the original
on lack of actual portability, my program is - alas - potentially
portable.
Unfortunately, the full source code (C++ Builder files, I'm afraid) is
available at http://users.powernet.co.uk/eton/crypto/SNAsrc.zip (around
8 kilobytes zipped). This doesn't bode well for the title challenge.
On features, I think I have him. My program doesn't just do the XOR
thing. It also has two other settings - Vigenere (re-using the shorter
file to ensure that the longer file is fully encrypted) and Bit Flip -
the original SNA-Coil algorithm which some of you may remember.
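For scale, the core XOR-with-recycled-key operation fits in a handful
of lines of portable C. A rough sketch of the idea (not the actual
SNA-Coil source, and minus all the bloat that makes my entry a
contender):

#include <stdio.h>

/* XOR data with key, recycling the key file if it is shorter
   (the "Vigenere" setting described above). Returns 0 on success. */
int xor_files(const char *dataname, const char *keyname,
              const char *outname)
{
    FILE *in = fopen(dataname, "rb");
    FILE *key = fopen(keyname, "rb");
    FILE *out = fopen(outname, "wb");
    int c, k, rc = -1;

    if (in && key && out) {
        rc = 0;
        while ((c = getc(in)) != EOF) {
            if ((k = getc(key)) == EOF) {
                rewind(key);              /* recycle short key */
                if ((k = getc(key)) == EOF) {
                    rc = -1;              /* empty key file    */
                    break;
                }
            }
            putc(c ^ k, out);
        }
    }
    if (in) fclose(in);
    if (key) fclose(key);
    if (out) fclose(out);
    return rc;
}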
You can get the binary at
http://users.powernet.co.uk/eton/crypto/SNACoil.zip (around 150
kilobytes zipped), if you're the kind of person who runs Windows
binaries without having first personally built them from source code.
(You have my word, for whatever you think my word is worth, that the
program is non-destructive in all respects EXCEPT it will happily
overwrite an existing file if you specify it as an OUTPUT file. It never
modifies its input files.)
So, my question is: have I succeeded in outlaming the lamer? Only you
can decide.
--
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************