On 2017-05-16 02:23, Richard Gaskin via use-livecode wrote:
> Until we see some enforced security standards, I have no interest in
> "smart cars", "smart TVs" or "smart homes".  When I look at those
> products I just see one big botnet.

I'd at least hope that 'smart car' software is engineered to a much
higher standard than software elsewhere:

   https://en.wikipedia.org/wiki/MISRA_C

[ Of course, it is slightly worrying that Chrome thinks that www.misra-c.com
is 'Not Secure' ;) ]

Such standards, processes and tooling help to ensure that the code which is
written minimizes the chance of vulnerabilities *in the code itself* (e.g.
buffer overruns). Of course, they don't necessarily cover vulnerabilities in
design - e.g. in protocols.
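
By way of illustration, here is a small C sketch (the function and buffer
names are hypothetical) of the classic buffer overrun that such rules and
tools aim to prevent:

  #include <string.h>

  /* If `name` is longer than 15 characters plus the terminating NUL,
   * strcpy writes past the end of `buffer` - the classic exploitable
   * buffer overrun. */
  void greet(const char *name)
  {
      char buffer[16];
      strcpy(buffer, name);                 /* no bounds check - unsafe */
  }

  /* A MISRA-style / reviewed version bounds the copy explicitly. */
  void greet_safely(const char *name)
  {
      char buffer[16];
      strncpy(buffer, name, sizeof(buffer) - 1u);
      buffer[sizeof(buffer) - 1u] = '\0';   /* ensure NUL termination */
  }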

There are other methods to ensure this is the case (at least in terms of
the code):

  - adequate automated testing (100% coverage being the goal - i.e. the
    tests should exercise every line of code).

  - fuzz testing (feeding random and malformed inputs to every module in a
    system to make sure it handles any potential input gracefully; a minimal
    harness is sketched after this list).

  - extensive code review (i.e. ensuring that code does not make it into
    a system unless it has been thoroughly checked by as many people as
    possible).

  - static analysis (using tools such as Coverity).
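
As a concrete sketch of the fuzz testing point above, here is a minimal
libFuzzer-style harness in C (parse_message is a hypothetical function under
test; LLVMFuzzerTestOneInput is the entry point clang's -fsanitize=fuzzer
expects):

  #include <stddef.h>
  #include <stdint.h>

  /* Hypothetical module under test - in a real system this would be a
   * parser, protocol handler, decoder, etc. */
  extern int parse_message(const uint8_t *data, size_t size);

  /* The fuzzer calls this repeatedly with generated inputs, mutating
   * anything which reaches new code paths; crashes, hangs and sanitizer
   * reports become reproducible test cases. */
  int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
  {
      (void)parse_message(data, size);
      return 0;   /* always return 0; non-zero values are reserved */
  }

  /* Build with: clang -g -fsanitize=fuzzer,address harness.c parser.c */

Combined with AddressSanitizer, this catches exactly the class of memory
errors mentioned above.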

Of course, the principal issue is that most code used in critical systems (and
systems generally) tends to be written in C/C++ - or in something like Java,
where the VM and parts of the supporting libraries are written in C/C++ (e.g.
smart TVs, Blu-ray players, Android phones).

C and C++ are inherently unsafe languages - they let you do things which are
wrong. (Although it is becoming increasingly easy to write safe code in C++,
as long as you use it in a disciplined way, you cannot yet turn off the unsafe
aspects of the language, which means they can still be used.)

The reason they are unsafe is that their design makes static checking really
hard to do well (the depth of what Coverity does, for example, is quite
astounding, but it is by no means perfect) and impossible to do completely.
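
To illustrate why complete checking is impossible, here is a small
(hypothetical) C fragment in which whether the write is in bounds depends
entirely on data that only exists at run time:

  #include <stdio.h>
  #include <stdlib.h>

  int table[8];

  int main(int argc, char **argv)
  {
      /* The index comes from the user, so whether table[i] is in bounds
       * cannot be known until the program runs. A static analyser must
       * either reason about every possible input (undecidable in general)
       * or accept false positives and false negatives - which is why even
       * the best tools are incomplete. */
      int i = (argc > 1) ? atoi(argv[1]) : 0;
      table[i] = 42;        /* out of bounds whenever i < 0 or i > 7 */
      printf("%d\n", table[i]);
      return 0;
  }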

The solution, then, is to use *safe* languages - ones which don't let you
write code containing the kinds of exploitable vulnerability which allow
viruses to be written in the first place.

However, the reality is that the number of safe systems languages (ones in
which it is possible to write device drivers, kernels, etc.) in existence is,
well, negligible - Rust is probably the only one which has floated to the
surface of the dev community in recent years that I can think of. That,
combined with the fact that there are probably not just billions but trillions
of lines of C/C++ code in the world, means that things are probably not going
to change much soon - the cost to rewrite all of it in a language such as Rust
would probably be larger than the entire economic output of the world.

Warmest Regards,

Mark.

--
Mark Waddingham ~ m...@livecode.com ~ http://www.livecode.com/
LiveCode: Everyone can create apps

_______________________________________________
use-livecode mailing list
use-livecode@lists.runrev.com
Please visit this url to subscribe, unsubscribe and manage your subscription 
preferences:
http://lists.runrev.com/mailman/listinfo/use-livecode
