"The Mythical Man-Month" is a great book, but it's almost 30 years old. Brooks considered OS/360 to be hopelessly bloated. My favorite quote (from Chapter 5, The Second System Effect, p. 56):

"For example, OS/360 devotes 26 bytes of the permanently resident date-turnover routine to the proper handling of December 31 on leap years (when it is Day 366). That might have been left to the operator."

Modern operating systems are two to three orders of magnitude larger than OS/360. They are far more reliable than OS/360 was in its early days and do not presume the availability of an on-site team of operators and system programmers. For the most part they are still maintained one bug at a time. The bug-fixing process has not reached Brooks's predicted crisis.

My other concern with the thesis that finding security holes is a bad idea is that it treats the Black Hats as a monolithic group. I would divide them into three categories: ego hackers, petty criminals, and high-threat attackers (terrorists, organized criminals, and evil governments). The high-threat attackers are likely accumulating vulnerabilities for later use. With the spread of programming knowledge to places where labor is cheap, one can imagine very dangerous systematic efforts to find security holes. In this context the mere ego hackers might be thought of as beta testers for IT security. We'd better keep fixing the bugs.


Arnold Reinhold




At 5:10 PM -0400 6/14/04, Steven M. Bellovin wrote:
In message <[EMAIL PROTECTED]>, Ben Laurie writes:


What you _may_ have shown is that there's an infinite number of bugs in any particular piece of s/w. I find that hard to believe, too :-)


Or rather, that the patch process introduces new bugs. Let me quote from Fred Brooks' "Mythical Man-Month", Chapter 11:

        The fundamental problem with program administration is that fixing
        a defect has a substantial (20-50 percent) chance of introducing
        another.  So the whole process is two steps forward and one step
        back.

        Why aren't defects fixed more cleanly?  First, even a subtle
        defect shows itself as a local failure of some kind.  In fact it
        often has system-wide ramifications, usually nonobvious.  Any
        attempt to fix it with minimum effort will repair the local and
        obvious, but unless the structure is pure or the documentation
        very fine, the far-reaching effects of the repair will be
        overlooked.  Second, the repairer is usually not the man who wrote
        the code, and often he is a junior programmer or trainee.

        As a consequence of the introduction of new bugs, program
        maintenance requires far more system testing per statement written
        than any other programming.

        ...

        Lehman and Belady have studied the history of successive releases
        in a large operating system.  They find that the total number of
        modules increases linearly with release number, but that the
        number of modules affected increases exponentially with release
        number.  All repairs tend to destroy the structure, to increase
        the entropy and disorder of the system.  Less and less effort is
        spent on fixing original design flaws; more and more is spent on
        fixing flaws introduced by earlier fixes.  As time passes, the
        system becomes less and less well-ordered.  Sooner or later the
        fixing ceases to gain any ground.  Each forward step is matched by
        a backward one.

Etc.  In other words, though the original code may not have had an
infinite number of bugs, the code over time will produce an infinite
series....

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]

