RE: Foibles of user "security" questions

2008-01-07 Thread Ian Farquhar (ifarquha)
I've been having this problem for years (my mother's maiden name is,
indeed, four characters long).  It's often rejected as too short, yet
I'm forced to enter it.  I do the workaround of entering it twice, but
then have to remember which sites I applied this hack for.

It's a typical dumb programmer mistake.  Data (password) vs. information
(mother's maiden name).  Character length contributes entropy to one,
but not to the other.  But on an even more fundamental level, it also
indicates a lack of attention to the input data, which could highlight
vulnerabilities in other areas too.
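
The data-vs-information distinction can be made concrete. A hypothetical sketch (the function names and the 6-character rule are illustrative, not taken from any real system):

```python
# Hypothetical sketch of the mistake: a minimum length adds entropy to a
# secret the user *chooses*, but a factual answer like a maiden name is
# fixed data, so rejecting short answers only locks out legitimate users
# (or trains them into enter-it-twice workarounds).

def validate_password(pw: str) -> bool:
    return len(pw) >= 6             # defensible: the user can pick a longer one

def validate_answer_badly(answer: str) -> bool:
    return len(answer) >= 6         # the bug: the user cannot pick a longer name

def validate_answer(answer: str) -> bool:
    return len(answer.strip()) > 0  # an answer is data to match, not a secret to strengthen

assert validate_answer("Lee")            # 4-character (or shorter) name accepted
assert not validate_answer_badly("Lee")  # the failure mode in the report
```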



I'm probably preaching to the choir here, and maybe it's a sign of
"grumpy old guy syndrome", but the average programmer seems to me to be
getting dumber every year.  I personally blame university courses that
have so divorced software development from any understanding of the
underlying OS, hardware or information theory that we've got a bunch of
people who think everyone programs in Java or C#, Microsoft is the only
OS vendor there is, and if your program runs slowly, you just need more
memory.



Ian.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Leichter, Jerry
Sent: Tuesday, 8 January 2008 4:14 AM
To: cryptography@metzdowd.com
Subject: Foibles of user "security" questions

Reported on Computerworld recently:  To "improve security", a system was
modified to ask one of a set of fixed-form questions after the password
was entered.  Users had to provide the answers up front to enroll.  One
question:  Mother's maiden name.  User provides the 4-character answer.
System refuses to accept it:  Answer must have at least 6 characters.

I can just see the day when someone's fingerprint is rejected as
"insufficiently complex".
-- Jerry

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to
[EMAIL PROTECTED]



Re: Death of antivirus software imminent

2008-01-07 Thread Adam Shostack
On Mon, Jan 07, 2008 at 10:35:00AM -0500, [EMAIL PROTECTED] wrote:
| 
| Jerry,
| 
| It is always possible that I misunderstand the McCabe
| score which may come from the fact that so many build
| environments compute it along with producing the binary,
| i.e., independent of human eyeballs.  If complexity
| scoring requires human eyeballs or the presence of the
| designer's flow charts, then will we ever get meaningful
| numbers (sans artificial intelligence) for code we did
| not write ourselves?  [...yes, this parallels the many
| arguments about how can you trust crypto code you didn't
| write, either...]
| 
| If McCabe scoring is your area, do you agree with the
| rule that a McCabe score of <10 is essential -- an argument
| that I am quoting from some NASA spec I read a while ago
| and can dig up again if that turns out to be necessary.

I'd question the description of "essential."  I've seen code (not at
my current employer) that was very successful in the marketplace that
likely scored in the tens of thousands.  The code had been unrolled
for performance reasons, and those responsible knew the cost they were
paying.

Adam



Foibles of user "security" questions

2008-01-07 Thread Leichter, Jerry

Reported on Computerworld recently:  To "improve security", a system
was modified to ask one of a set of fixed-form questions after the
password was entered.  Users had to provide the answers up front to
enroll.  One question:  Mother's maiden name.  User provides the
4-character answer.  System refuses to accept it:  Answer must have
at least 6 characters.

I can just see the day when someone's fingerprint is rejected as
"insufficiently complex".
-- Jerry



Re: Death of antivirus software imminent

2008-01-07 Thread Leichter, Jerry
| Jerry,
| 
| It is always possible that I misunderstand the McCabe
| score which may come from the fact that so many build
| environments compute it along with producing the binary,
| i.e., independent of human eyeballs.  If complexity
| scoring requires human eyeballs or the presence of the
| designer's flow charts, then will we ever get meaningful
| numbers (sans artificial intelligence) for code we did
| not write ourselves?  [...yes, this parallels the many
| arguments about how can you trust crypto code you didn't
| write, either...]
| 
| If McCabe scoring is your area, do you agree with the
| rule that a McCabe score of <10 is essential -- an argument
| that I am quoting from some NASA spec I read a while ago
| and can dig up again if that turns out to be necessary.
It's not really my area, sorry.  I'm just looking at this from
a very general point of view.  The point of McCabe and similar
measures is to point you to areas likely to contain bugs, or
that are likely to be particularly costly to implement.  Since
it's *humans* who actually implement (and produce bugs), a
meaningful measure can't depend on things that human beings
don't see.  This kind of analysis can be very powerful.  Everyone
has heard of Galileo's experiment dropping balls of different
weights and proving they hit the ground at the same time - but how
many people are aware of his theoretical argument that this must
be the case?  It's very simple:  Suppose a 2lb ball drops faster
than a 1lb ball.  Take the 2lb ball and pull it into a dumbell
shape, with (almost) 1lb at each end, but the ends very close
together.  Presumably, it still drops at the speed of a 2lb
ball.  Now pull the halves apart a bit at a time, gradually
thinning out the connecting segment.  Eventually, you have two
1lb balls connected by a thread.  Does that drop at the speed
of the individual 1lb balls, or at the speed of a 2lb ball?
Clearly, it has to be both - the 1lb and 2lb balls must drop
at the *same* speed!

I'm pretty sure the build environments that give you McCabe measures
automatically are pulling the information from the control
flow analysis in compiler front ends.  This is where basic
blocks and the edges connecting them are first extracted.
Computing McCabe is trivial at this point - and the structure
it is computed on will correspond pretty directly to what a
human being would have perceived.  As various optimizations
are applied, the structure will change - and there is no
reason to believe that the McCabe measure won't change along
the way, since preserving McCabe is hardly a goal of optimizing
transformations.

| Always ready for re-education, but wary of the best
| being the enemy of the good,
|
| --dan
-- Jerry



Re: Death of antivirus software imminent

2008-01-07 Thread dan

Jerry,

It is always possible that I misunderstand the McCabe
score which may come from the fact that so many build
environments compute it along with producing the binary,
i.e., independent of human eyeballs.  If complexity
scoring requires human eyeballs or the presence of the
designer's flow charts, then will we ever get meaningful
numbers (sans artificial intelligence) for code we did
not write ourselves?  [...yes, this parallels the many
arguments about how can you trust crypto code you didn't
write, either...]

If McCabe scoring is your area, do you agree with the
rule that a McCabe score of <10 is essential -- an argument
that I am quoting from some NASA spec I read a while ago
and can dig up again if that turns out to be necessary.

Always ready for re-education, but wary of the best
being the enemy of the good,

--dan



Re: Death of antivirus software imminent

2008-01-07 Thread Leichter, Jerry
| ...Taking as our metric the venerable McCabe score:
| 
|v(G) = e - n + 2
| 
| where e and n are the number of edges and nodes in the
| control flow graph, and where you are in trouble when
| v(G)>10 in a single module, the simplest patch adds two
| edges and one node, i.e., v'(G)=v(G)+1, and that minimum
| obtains only for patches with no conditional branches in
| the patch
While I agree with your general point, this particular argument
is a misuse of the McCabe score.  Replacing:

X
Y

with

X
goto L
Y
L':
...
L:  Y'
goto L'

*at the machine code level* should have absolutely no effect on the
complexity of the algorithm (beyond any delta between Y and Y').  If you
insist on computing your McCabe score from the generated code, and it
gives you a different answer, then the score you are deriving is
meaningless.

The whole point of measurements like McCabe is to measure the complexity
of the algorithm *as seen by a human being*.  Automated code transforma-
tions that no human being ever sees should not affect it.  Otherwise,
you're going to have to throw out all your optimizing compilers.  The
transformation above occurs not just in patching but in other contexts -
e.g., this might be advantageous, with Y and Y' semantically equivalent,
if Y contains a bunch of calls that can't be reached with short
call sequences when at their original location, but can be if they are
relocated to L.  Or there might be cache interference effects that are
avoided by relocating.

Roughly similar patterns are used in generating code for loops, where
the surface semantics might require two copies of a test (one at loop
entry, one at the bottom of the loop) but this transformation lets you
get by with a single copy.

As long as the algorithm developer's view is that control flows directly
from X to Y (and there are no incoming edges at Y), this is one node, no
matter how the compiler or patch generator decides to shake and bake it
into memory.
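
The argument can be checked numerically with a toy control-flow graph: relocating code via unconditional jumps merely splits a basic block, adding one node and one edge, which leaves v(G) = e - n + 2 unchanged. A minimal sketch (the graphs below are illustrative, mirroring the X/Y/L example):

```python
# Toy check: McCabe's v(G) = e - n + 2 over a control-flow graph.
# Relocating code with unconditional jumps only splits a basic block:
# one extra node and one extra edge, so v(G) is unchanged.

def mccabe(nodes, edges):
    return len(edges) - len(nodes) + 2

# Before patching: X falls straight through to Y.
nodes_before = ["X", "Y"]
edges_before = [("X", "Y")]

# After: X ends in "goto L", Y' sits at L, and "goto L'" returns to the join.
nodes_after = ["X", "Y'@L", "L'"]
edges_after = [("X", "Y'@L"), ("Y'@L", "L'")]

assert mccabe(nodes_before, edges_before) == 1
assert mccabe(nodes_after, edges_after) == 1  # unconditional jumps add nothing
```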
-- Jerry



BSF/DIMACS/DyDAn Workshop on Data Privacy

2008-01-07 Thread Linda Casals
*

BSF/DIMACS/DyDAn Workshop on Data Privacy

  February 4 - 7, 2008
  DIMACS/DyDAn Center, CoRE Building, Rutgers University

Organizers:
  Kobbi Nissim, Ben Gurion University, kobbi at cs.bgu.ac.il 
  Benny Pinkas, University of Haifa, benny at cs.haifa.ac.il 
  Rebecca Wright, Rutgers University, rebecca.wright at rutgers.edu 

Presented under the auspices of the DIMACS Special Focus on 
Communication Security and Information Privacy and 
the Center for Dynamic Data Analysis (DyDAn).



An ever-increasing amount of data is available in digital form, often
accessible via a network. Not surprisingly, this trend is accompanied
by an increase in public awareness of privacy issues and by
legislation of privacy laws. The interest in privacy, and the tension
between privacy and utility of data, is amplified by our growing
ability to collect and store large amounts of data, and our ability to
mine meaningful information from it. This workshop will view privacy
in a broad sense in order to facilitate interaction and discussion
between privacy-oriented researchers in different communities.

The study of "privacy" is inherently interdisciplinary, spanning a
range of applications and scenarios, such as analysis of census data,
detection and prevention of terrorist activity, and biomedical
research. There is a fundamental interplay between privacy and law,
security, economics, and the social sciences. This workshop will
foster interactions between researchers in these fields with those in
statistics and computer science, toward the goal of developing problem
formulations that can be translated into a technical mathematical
language that lends itself to a more rigorous study of privacy. The
workshop will contrast these formal definitions with more intuitive
notions of privacy from the social sciences, economics, philosophy and
law to determine the extent to which they capture the perceived
meaning of privacy in different settings.

Privacy-preserving technologies may soon become an integral part of
the basic infrastructure for the collection and dissemination of
official statistics, as well as for research in business, economics,
medical sciences, and social sciences. Functional solutions for
preserving privacy would therefore serve as a central part of the
infrastructure for those disciplines. This workshop will address a
variety of questions on algorithms for privacy-preserving analysis
such as:

  * To what extent can such techniques be applied to
    statistical data?
  * What are the consequences to privacy and confidentiality
    if such techniques are not used?
  * Are changes in statistical tools needed to make them
    compatible with such techniques?
  * Can the techniques be modified to allow use of standard
    statistical tools and practices?

**
Registration:

Pre-registration deadline: January 28, 2008

Please see website for registration information.

*
Information on participation, registration, accommodations, and travel
can be found at:

  http://dimacs.rutgers.edu/Workshops/DataPrivacy/

   **PLEASE BE SURE TO PRE-REGISTER EARLY**





Re: Question on export issues

2008-01-07 Thread Thierry Moreau


Thanks for this long and thoughtful reply. Some feedback below


Jon Callas wrote:



[...]

If you look at the basic components we have, the ciphers, hash  
functions, and so on, they're all secure enough that a major  government 
can't crack them. [...]


If you look at the medium-level functions, like HMAC, salted hashing,  
tweakable cipher modes, and so on, they are *more* secure. [...]


If you look at the protocols, like TLS, IPsec, OpenPGP, S/MIME, and  so 
on, they're also secure, because they assemble the reasonably  secure 
components together reasonably securely. [...]


All of these things are freely exportable. It's just a matter of  
filling out paperwork.




Indeed, there is no doubt that good algorithms and good protocols are 
implemented in exportable implementations.


I was referring mainly to key management and implementation correctness
for "hard things" in applied cryptography, e.g.:

	how to hide a secret on a computer system, e.g. from Trojan horse
attacks,

	how to distribute trust in a remote party's public key, given that
browsers and OSs allow easy tampering with the list of "trusted" CAs,

	how to make secret random generation reliable in the presence of
"enemy" software on the local system,

	how to provide traffic flow confidentiality,

	strength of the API between the (non-crypto) application and the
crypto-services layer,

	all of the above with a fool-proof user interface, including at
crypto-application installation time.



I don't have an example of a cryptosystem that I'd actually want to  use 
that is non-exportable. And I'm sure that if someone made  something 
that is custom, it's exportable. I have direct evidence of  this.




Agreed, if you are satisfied with the current state of IT security with
respect to issues such as the above, and if the extent of customization
does not include innovation in those areas.

Otherwise, the export control regime is still a nuisance.

Back in 1999, when we were at Counterpane together, John Kelsey and I  
created a set of incompatible Blowfish variants.


By itself, that's algorithm tweaking, remote from key management and
avoidance of implementation pitfalls.



- Thierry Moreau



Re: Death of antivirus software imminent

2008-01-07 Thread Ivan Krstić

On Jan 5, 2008, at 3:47 AM, Alexander Klimov wrote:

It sounds like: we cannot make secure OS because it is too large --
let us don't bother to make a smaller secure OS, just add some more
software and hardware to an existent system and then it will be
secure. Sounds like a fairytale


I don't think this is really being said. In fact, I've been pretty  
concretely saying "here's an OS not designed from scratch, but with  
certain pieces modified, that's likely to be extremely resistant to  
viruses, malware and other pests", in regard to the OS being designed  
for the OLPC. We're still implementing large chunks of the security  
system, but my spec[0] has been public for a year, our security  
working group contains a number of people from this list, and no one  
so far has claimed that this design won't successfully resist most --  
though not all -- classes of attacks we've seen or can presently  
imagine seeing in the desktop security realm.




[0] http://wiki.laptop.org/go/OLPC_Bitfrost

--
Ivan Krstić <[EMAIL PROTECTED]> | http://radian.org



Re: DRM for batteries

2008-01-07 Thread Peter Gutmann
Izaac <[EMAIL PROTECTED]> writes:

>The calculus here has little to do with unit profit margins.  It has
>everything to do with mitigating product liability suits, recalls, and bad
>press.  (You won't read a report of "Chiwonkistan Knock-off Battery Destroys
>WidgetCo Laptop".  You'll read the report "WidgetCo Laptop Catches Fire.")

The device is also sold for use in other high-profit-margin items like laser
printer toner cartridges and ink tanks.  While I can certainly see the point
of your argument as applied to batteries, I haven't heard of too many toner
cartridges exploding recently, and given the extensive bad press given to
dubious tactics like "starter cartridges" I don't think the printer
manufacturers are that worried about publicity issues (the printer ink/toner
market is definitely a racket, and has been documented in some detail
elsewhere).

Peter.



Re: Question on export issues

2008-01-07 Thread Jon Callas


On Jan 4, 2008, at 12:50 PM, Thierry Moreau wrote:




Jon Callas wrote:

They let strong crypto through all the time. I can't imagine what   
*technology* you couldn't get through.


Do you have an example of allowed strong crypto having good key  
management and not already widely-implemented/easily-implementable  
by competitors outside of the Wassenaar zone?


I'm sorry, but I don't understand the question. I've read it about  
ten times and don't know what you're asking. Let me try to answer by  
talking around it.


If you look at the basic components we have, the ciphers, hash  
functions, and so on, they're all secure enough that a major  
government can't crack them. Yes, we know that there are weaknesses  
in lots of hash functions, but by now, we have a pretty good handle  
on that. We know about how broken they are, and there are  
workarounds. Furthermore, if you look at the push to fix this --  
where is it coming from? NIST, NESSIE, etc.


If you look at the medium-level functions, like HMAC, salted hashing,  
tweakable cipher modes, and so on, they are *more* secure. For  
example, even if you don't like SHA-1, a SHA-1 HMAC is still  
considered secure.


If you look at the protocols, like TLS, IPsec, OpenPGP, S/MIME, and  
so on, they're also secure, because they assemble the reasonably  
secure components together reasonably securely. Yes, we can have  
discussions about some of them, but again, we know lots about their  
security, and can actually discuss it rationally. It was much harder  
to do that ten to twenty years ago.


All of these things are freely exportable. It's just a matter of  
filling out paperwork.


I don't have an example of a cryptosystem that I'd actually want to  
use that is non-exportable. And I'm sure that if someone made  
something that is custom, it's exportable. I have direct evidence of  
this.


Back in 1999, when we were at Counterpane together, John Kelsey and I  
created a set of incompatible Blowfish variants. We were going to use  
them in TLS so that Counterpane gear would have its own little walled  
garden. We could have used a family key, but this was fun, and also a  
test of the export regime. Blowfish, as you may or may not know, has  
some initialization constants that are hex digits of pi. These  
"colorfish" ciphers used different digits of pi for the  
initialization. I constructed the family of: Blackfish, Brownfish,  
Redfish, Orangefish, Yellowfish, Greenfish, Bluefish, Indigofish,  
Purplefish, Whitefish, Silverfish, Goldfish, Octarinefish, and  
Plaidfish. I sent them for export and there wasn't a peep. Nothing.  
These days, British Telecom owns them.
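
For the curious, the hex digits of pi that Blowfish draws its initialization constants from can be extracted directly with the Bailey-Borwein-Plouffe (BBP) formula; the first 32 bits after the point, 0x243f6a88, are exactly Blowfish's published P[0]. A sketch (how the colorfish variants actually chose their digit offsets is not stated above, so any offsets you pick with this are illustrative):

```python
# Sketch: extracting hex digits of pi with the Bailey-Borwein-Plouffe
# formula, the source of Blowfish's P-array and S-box constants.
# pi = sum over k of 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))

def pi_hex_digits(start: int, count: int) -> str:
    """Hex digits of pi's fractional part, starting at 0-based `start`."""
    def series(j: int, d: int) -> float:
        # Fractional part of sum over k of 16^(d-k) / (8k + j).
        s = 0.0
        for k in range(d + 1):
            s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
        k = d + 1
        while True:
            term = 16.0 ** (d - k) / (8 * k + j)
            if term < 1e-17:
                break
            s = (s + term) % 1.0
            k += 1
        return s

    out = []
    for d in range(start, start + count):
        x = (4 * series(1, d) - 2 * series(4, d)
             - series(5, d) - series(6, d)) % 1.0
        out.append("%x" % int(x * 16.0))
    return "".join(out)

# Blowfish's first two published P-array words are 0x243f6a88, 0x85a308d3.
assert pi_hex_digits(0, 8) == "243f6a88"
assert pi_hex_digits(8, 8) == "85a308d3"
```

A variant family along these lines would simply start each cipher's constant stream at a different offset.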


There are cryptosystems I know of from non-Wassenaar countries that I  
wouldn't go near. I don't think they're very good. I don't care if  
that's a matter of competence or malice; I'm not favorably impressed.  
I am, however, quite sure they're exportable.


I don't have an example of any crypto technology that I would think  
wouldn't be exportable.





Definitely, however, there are  *people* who couldn't get an  
export license because they've been bad  in the past.


If one were to look empirically at these *people*, is it possible
that, e.g. as if by chance, they would be designers of good crypto
having good key management and not widely-implemented/easily-implementable
by competitors outside the Wassenaar zone?


So the answer to your questions is that they're vetting who you  
are  far more than what you're exporting.


Do you mean that they judge whether you are competent to design
good crypto of the above type? Perhaps even they assess whether you  
are "organizationally unimpeded" to do so?


No, I don't mean that. Crypto is a "dual use" technology. Most things  
are dual use. There are obvious dual use things, like nuclear  
materials, but video games are also dual use, as are milling  
machines, laser diodes, navigation equipment, and so on. Basically,  
if it's fun, it's dual use.


You have to have export licenses for dual use items. Sometimes the  
license is very easy to get. In some cases, it is nothing more than  
giving them your web logs if they ask for them and there's nothing  
requiring you to keep them. (Most open source software falls here.)  
Other times, there's more. For some people, like Ivan at OLPC and me  
(at PGP), we jump through hoops we don't necessarily have to because  
we don't want to end up on the wrong side of things.


If you violate export rules, there can be legal and administrative  
penalties. The administrative penalties can be much worse, because  
they can essentially just decide you can't ever export anything.  
These days, I suspect this would also be a good way to end up on the
permanent no-fly list.


When I took a course on all of this, I was told about a guy in the  
import-export biz who was known for being able to get things into  
countries with sanctions. Eventually, he was caught and never  
prosecuted.

Re: DRM for batteries

2008-01-07 Thread Steven M. Bellovin
On Sun, 6 Jan 2008 17:23:56 -0800
Jon Callas <[EMAIL PROTECTED]> wrote:

> 
> On Jan 6, 2008, at 9:09 AM, Steven M. Bellovin wrote:
> 
> > On Sat, 5 Jan 2008 15:28:50 -0800
> > Stephan Somogyi <[EMAIL PROTECTED]> wrote:
> >
> >> At 16:38 +1300 04.01.2008, Peter Gutmann wrote:
> >>
> >>> At $1.40 each (at least in sub-1K quantities) you wonder whether
> >>> it's costing them more to add the DRM (spread over all battery
> >>> sales) than any marginal gain in preventing use of third-party
> >>> batteries by a small subset of users.
> >>
> >> I don't think I agree with the "DRM for batteries"
> >> characterization. It's not my data in that battery that they're
> >> preventing me from getting at.
> >
> > Correct.  In a similar case, Lexmark sued a maker of print
> > cartridges under the DMCA.  Lexmark lost in the Court of Appeals
> > and the Supreme Court declined to hear the case.  See
> > http://www.eff.org/cases/lexmark-v-static-control-case-archive and
> > http://www.scc-inc.com/SccVsLexmark/
> >
> 
> Also remember that there is a specific exemption in the DMCA for
> reverse engineering for the purpose of making compatible equipment.
> It is there precisely to protect people like the printer cartridge
> folks. That's why they lost.
> 
> Going back to the '60s, there was the Ampex case, where they made
> compatible tape drives for IBM mainframes. IBM sued, and lost in the
> Supreme Court. This is what gave us the plug-compatible peripheral
> biz. My memories of this say that some judge or other said that
> copyright is not intended to give a monopoly.
> 
> That doesn't mean that other companies can't pull crap and try to sue
> competition away. But they're wrong, and the effect may help the
> little guy, because by now, the big guys ought to be able to pay for
> lawyers smart enough to know the precedent.
> 
It's worth reading the actual opinion of the Appeals Court.  (Legal
note: this opinion is only binding in the Sixth (U.S.) Circuit; the
Supreme Court declined to hear Lexmark's appeal, so no national
precedent was set.)  The Court had many reasons for rejecting the DMCA
part of the argument.  Among other things, they held that the
copyrighted work -- the printer software -- wasn't protected against
copying, and that the purpose of the DMCA was to protect works against
*copying*.  They held that the copyrighted code in question -- the
firmware in the printer -- was "used" by people trying to print things,
rather than by the print cartridge, and that access to the protected
work was gained by buying a printer, not by buying a print cartridge.

There are a lot more nuances than that in the opinion; I suggest that
folks read it.  For now, let it suffice to say that the DMCA bars
circumvention of mechanisms that protect copyrighted material; it does
not bar circumvention of access control mechanisms that protect other
things.  (I also fear that a clever, technically-minded lawyer could
design a print cartridge that would work around the Court's ruling.)


--Steve Bellovin, http://www.cs.columbia.edu/~smb



Re: DRM for batteries

2008-01-07 Thread Jon Callas


On Jan 6, 2008, at 9:09 AM, Steven M. Bellovin wrote:


On Sat, 5 Jan 2008 15:28:50 -0800
Stephan Somogyi <[EMAIL PROTECTED]> wrote:


At 16:38 +1300 04.01.2008, Peter Gutmann wrote:


At $1.40 each (at least in sub-1K quantities) you wonder whether
it's costing them more to add the DRM (spread over all battery
sales) than any marginal gain in preventing use of third-party
batteries by a small subset of users.


I don't think I agree with the "DRM for batteries" characterization.
It's not my data in that battery that they're preventing me from
getting at.


Correct.  In a similar case, Lexmark sued a maker of print cartridges
under the DMCA.  Lexmark lost in the Court of Appeals and the Supreme
Court declined to hear the case.  See
http://www.eff.org/cases/lexmark-v-static-control-case-archive and
http://www.scc-inc.com/SccVsLexmark/



Also remember that there is a specific exemption in the DMCA for  
reverse engineering for the purpose of making compatible equipment.  
It is there precisely to protect people like the printer cartridge  
folks. That's why they lost.


Going back to the '60s, there was the Ampex case, where they made  
compatible tape drives for IBM mainframes. IBM sued, and lost in the  
Supreme Court. This is what gave us the plug-compatible peripheral  
biz. My memories of this say that some judge or other said that  
copyright is not intended to give a monopoly.


That doesn't mean that other companies can't pull crap and try to sue  
competition away. But they're wrong, and the effect may help the  
little guy, because by now, the big guys ought to be able to pay for  
lawyers smart enough to know the precedent.


Jon



Re: Death of antivirus software imminent

2008-01-07 Thread dan

Alexander Klimov writes, in part:
-+---
 | It sounds like: we cannot make secure OS because it is too
 | large -- let us don't bother to make a smaller secure OS,
 | just add some more software and hardware to an existent
 | system and then it will be secure. Sounds like a fairytale :-)


As yet another variation on the theme "complexity is the
enemy of security," consider patches.  Patches tend to
add complexity to the code they patch, viz., it is the
rare patch indeed that simply elides some non-working
feature.

Taking as our metric the venerable McCabe score:

   v(G) = e - n + 2

where e and n are the number of edges and nodes in the
control flow graph, and where you are in trouble when
v(G)>10 in a single module, the simplest patch adds two
edges and one node, i.e., v'(G)=v(G)+1, and that minimum
obtains only for patches with no conditional branches in
the patch.
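
That arithmetic in miniature (a sketch; the module sizes below are illustrative):

```python
# dan's arithmetic: with v(G) = e - n + 2, a branch-free patch adds one
# node (the patch body) and two edges (the jump out and the jump back),
# so v'(G) = (e + 2) - (n + 1) + 2 = v(G) + 1.

def v(e: int, n: int) -> int:
    return e - n + 2

e, n = 12, 10                            # a module already at v(G) = 4
assert v(e, n) == 4
assert v(e + 2, n + 1) == v(e, n) + 1    # the minimal, branch-free patch
```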

If someone wanted to have fun, it would be to examine
what fraction of patches are themselves re-patched at
a later date -- as in Fred Brooks' famous dictum in
_The Mythical Man Month_ where, in paraphrase, he said
that you should stop patching when the probability of
fixing a known problem is no longer substantially
greater than the probability of simultaneously creating
an unknown problem.

Yet security patches are a special case: no vendor can
obey Brooks' Law, and so they will inevitably over-run
the boundary condition where the known flaw the new
patch patches is no longer likely to be substantially
more probable than the inadvertent introduction of an
unknown flaw at the same time.  As such, I would guess
that the more often an application receives security
patches the less secure it is, at least at the limit.

--dan



Re: Question on export issues

2008-01-07 Thread James A. Donald

Peter Gutmann:
> > That's because there's nothing much to publish: In
> > the US, notify the BIS via email.

Ivan Krstic' wrote:
> Our outside counsel -- specializing in this area --
> thought this was insufficient.

You were probably asking your counsel the wrong
question.  Never ever ask the question "is this legal",
for you will always find that the answer to that is that
NOTHING is legal, and hearing your counsel tell you that
puts you in trouble.  The question should have been "Has
anyone got in trouble for doing this, and if so, how big
a trouble, and what attracted the attention of the man?"

Had you asked that question instead, you would have
heard the answer that no one ever gets in trouble for
minimalist compliance with the export laws - an answer
that cannot get you, or your counsel, in trouble.

Legislation and regulation, though never revealing what
is legal, frequently forbid and command all sorts of
things quite clearly, and yet ninety-nine percent of such
legislation and regulation is dead as a doornail.

It is never possible to know what is permitted, so you
must never ask your counsel about what is permitted,
since he is legally required to give answers that create
problems.  Legal uncertainty and capricious enforcement
of obscure, unclear  and incomprehensible laws is simply
a cost of doing business.  No legal counsel can reduce
this cost, and if asked to do so, is legally required to
give answers that increase, rather than reduce this
cost.

When you ask a counsel to provide legal certainty, you
ask him for what can never exist, and what he is
forbidden to provide.



Re: Death of antivirus software imminent

2008-01-07 Thread James A. Donald

Leichter, Jerry
> > Why not just require that the senders of malign
> > packets set the Evil Bit in their IP headers?
> >
> > How can you possibly require that encrypted traffic
> > *generated by the attackers* will allow itself to be
> > inspected?

Alex Alten wrote:
> You misunderstand me.  We can for the most part easily
> identify encrypted data, either it is using a standard
> like SSL or it is non-standard but can be identified
> by data payload characteristics (i.e. random bits).

Steganography will beat that.  If the government demands
non random bits, non random bits will be provided.

> If it is a standard (or even a defacto standard like
> Skype) we can require access under proper authority.
> If it is not (or access under authority is refused),
> then just simply block or drop the packets, there's no
> need to inspect them.

This means that only authorized, regulated, officially
registered data formats shall be permitted.  It will be
almost impossible, most likely completely impossible,
for *my* format to get registered even though it sends
data completely in the clear.  Skype will be
grandfathered in, but the next Skype will not be.

So I will do what the bad guys do - steganograph my
entirely innocuous application, which would not need
cryptography at all except to escape intrusive
regulation, forcing me to hide my actual data format
inside a registered and officially authorized data
format.



Re: Question on export issues

2008-01-07 Thread Sidney Markowitz

Ivan Krstić wrote, On 6/1/08 1:33 PM:

On Jan 3, 2008, at 10:47 PM, Peter Gutmann wrote:

That's because there's nothing much to publish:
In the US, notify the BIS via email.


Our outside counsel -- specializing in this area -- thought this was
insufficient.


That's the problem with using lawyers: they'll always give you a
conservative, cautious answer.  Unfortunately for people who don't use
them, sometimes those really are the correct, prudent answers.  When I
worked for a company that had to face this, we acted according to what
looked like the plain-language documentation from the BIS, concluding
that use of an existing open source package required just sending an
email.  We were never as high profile as OLPC and in fact never ended
up exporting anything, so our interpretation of the laws was not only
not made by qualified legal counsel, it also was never tested.


I do look forward to seeing what you discover were the considerations
that your outside counsel had.


 -- sidney
