In message <[EMAIL PROTECTED]>, John Kelsey writes:
>
> Nor do I. But there's a related engineering question: Does
> it make sense to build large systems in which there's no way
> for humans to overrule the actions of programs once they're
> set in motion? *That* is the question I'm raising, not
> whether mathematicians and scientists should have tried to
> somehow suppress the research that has made this possible.
> It's clearly possible; that doesn't mean it's a good idea to
> design systems like this.
Yup. To many of you, the phrase "mine shaft gap" will provide a clear example
of what I'm talking about.
Of course, in a crypto context this is a very hot button -- one of the
arguments used for key escrow was that we should make sure that messages are
decryptable in case of dire need....
>
> To use a more common example, I believe there were some cars
> (maybe experimental, I don't know) which would simply refuse
> to start the ignition until all passengers had their
> seatbelts on. There's no doubt that it's possible to design
> such a car. But you couldn't sell them without making it
> illegal to buy any other car, and users would flock to
> mechanics to have the feature removed in droves, regardless
> of the law.
Circa 1976, U.S. Federal regulations required that cars implement a state machine
-- get in, close the door, buckle your seat belt, start the car, and keep the
belt buckled, or loud, obnoxious noises would sound. These cars were built and
sold -- and were so unpopular that Congress passed a law rescinding that
(administratively promulgated) regulation.
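As an aside, the interlock's enforced sequence can be sketched as a small state
machine. This is purely illustrative -- the states, transitions, and class names
below are my own assumptions, not the regulation's actual wording:

```python
# Illustrative sketch of a 1970s-style seat-belt ignition interlock.
# States and transition rules are assumptions for illustration only.

class SeatBeltInterlock:
    def __init__(self):
        # Driver starts outside the car with the door open.
        self.state = "door_open"

    def close_door(self):
        if self.state == "door_open":
            self.state = "door_closed"

    def buckle_belt(self):
        if self.state == "door_closed":
            self.state = "belt_buckled"

    def start_engine(self):
        # The interlock refuses ignition unless the belt was buckled first.
        if self.state != "belt_buckled":
            raise RuntimeError("interlock: buckle up before starting")
        self.state = "engine_running"

    def unbuckle_belt(self):
        # Unbuckling while the engine runs triggers the obnoxious alarm.
        if self.state == "engine_running":
            print("BUZZ! Alarm sounds until the belt is re-buckled.")
        else:
            self.state = "door_closed"
```

Following the mandated order (close door, buckle, start) succeeds; skipping the
buckling step leaves the car refusing to start -- exactly the behavior drivers
paid mechanics to remove.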
--Steve Bellovin