On Saturday, 4 October 2014 at 08:15:51 UTC, Walter Bright wrote:
On 10/3/2014 8:43 AM, Sean Kelly wrote:

In NO CASE does avionics software do anything after an assert but get shut down and physically isolated from what it controls.

I've explained this over and over. It baffles me how twisted up this simple concept becomes when repeated back to me.

AFAICT, you hold the right idea. Either the others in the discussion are arguing about a different thing and this is a communication problem, or they are on the wrong path.

Once a program detects an inconsistency in its own state, it quits guessing and shuts down, sometimes with minimal logging (if doing so is harmless).

An inconsistency is when a piece of software, or a system, finds itself in a situation it wasn't designed to face. That breaks the ultimate invariant of any critical program. The invariant is: "I know what I'm doing."

Yes, software might detect inconsistencies in the rest of the system and try to correct them *if it is designed to* (read: if it is made to know what it is doing). Majority voting, for example, is such a case, but there the basic hypothesis is that the sensors are not 100% reliable. And sometimes the majority voting is even hard-wired directly.
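A minimal sketch of the distinction, in Python rather than D. Handling a *designed-for* fault (disagreeing sensors, up to a point) is recovery; three mutually disagreeing sensors violate the design hypothesis itself, so the only honest move left is to raise, not to pick a value. The function name, tolerance, and readings are all illustrative assumptions, not anything from the thread:

```python
def majority_vote(readings, tolerance=0.5):
    """Return the value agreed on by at least two of three sensors.

    Disagreement of one sensor is a fault the design anticipates:
    we out-vote it. No majority at all violates the design hypothesis
    ("at most one sensor lies"), so we stop guessing and raise.
    """
    a, b, c = readings
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tolerance:
            return (x + y) / 2.0
    # No two sensors agree: the program no longer knows what it's doing.
    raise AssertionError("no majority among sensor readings: %r" % (readings,))
```

Calling `majority_vote((10.1, 10.2, 42.0))` out-votes the wild third sensor; `majority_vote((1.0, 5.0, 9.0))` raises, because continuing would mean inventing a value the design gives no grounds for.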

But a critical program (and, in the general case, any program) only goes as far as it believes it knows what it's doing. When that is no longer the case, it does not continue; it stops immediately.
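The "stop immediately" behaviour can be sketched as follows; the guard, the exit code, and the function itself are hypothetical, assumed only for illustration. The point is that on a violated internal invariant the code logs minimally and terminates, so a supervisor (or, in the avionics case above, a redundant unit) can take over; it never tries to "repair" the bad value:

```python
import sys

def set_speed(actuator_limit, requested):
    # Internal invariant: the requested speed was validated upstream.
    # If it is violated here, this program no longer knows what it's
    # doing: log minimally and stop. Do not clamp, guess, or continue.
    if not (0 <= requested <= actuator_limit):
        sys.stderr.write("invariant violated: requested=%r limit=%r\n"
                         % (requested, actuator_limit))
        sys.exit(70)  # EX_SOFTWARE: internal error; let a supervisor decide
    return requested
```

In-range requests pass through untouched; an out-of-range one terminates the process rather than handing a fabricated value to the actuator.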
