On Friday, 17 October 2014 at 10:30:14 UTC, eles wrote:
On Friday, 17 October 2014 at 09:46:49 UTC, Ola Fosheim Grøstad wrote:
The second thing I would change is to make whole program analysis mandatory so that you can deduce and constrain value ranges.

Nice idea, but how to persuade libraries to play that game?

1. Provide a meta-language for writing propositions that describe what foreign libraries do (pre/post conditions). It could be used for "asserts" too.
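A rough sketch of what such propositions could look like, here modelled as runtime-checked annotations in Python (the `contract` decorator and the `c_abs` wrapper are illustrative names, not an existing API; a real system would verify these statically):

```python
def contract(pre=None, post=None):
    """Attach pre/post propositions to a function and check them at call time.
    In the imagined language these would feed whole-program analysis instead."""
    def wrap(fn):
        def checked(*args):
            if pre is not None:
                assert pre(*args), "precondition violated"
            result = fn(*args)
            if post is not None:
                assert post(result, *args), "postcondition violated"
            return result
        return checked
    return wrap

# Proposition describing a foreign C-style abs(): the result is non-negative
# and has the same magnitude as the input.
@contract(pre=lambda x: x > -2**31,           # rule out INT_MIN overflow
          post=lambda r, x: r >= 0 and abs(x) == r)
def c_abs(x):
    return x if x >= 0 else -x

print(c_abs(-7))  # 7
```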

2. Provide a C compiler that compiles to the same internal representation as the new language, so you can run the same analysis on C code.

3. Remove plain int so that you have to specify the range, and make the typedefs local to the library.
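To illustrate point 3, here is a toy ranged-integer type in Python (a hypothetical sketch; in the imagined language the range propagation below would happen in the compiler, not at runtime):

```python
class Ranged:
    """An integer whose declared range [lo, hi] is part of its type."""
    def __init__(self, lo, hi, value):
        if not (lo <= value <= hi):
            raise ValueError(f"{value} outside [{lo}, {hi}]")
        self.lo, self.hi, self.value = lo, hi, value

    def __add__(self, other):
        # The result range is deducible from the operand ranges, so a
        # compiler could prove the sum fits without a runtime check.
        return Ranged(self.lo + other.lo, self.hi + other.hi,
                      self.value + other.value)

a = Ranged(0, 100, 42)
b = Ranged(0, 100, 58)
c = a + b
print(c.lo, c.hi, c.value)  # 0 200 100
```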

4. Provide the ability to specify additional constraints on library functions you use in your project or even probabilistic information.

Essentially it is a cultural thing, so the standard library has to be very well written.

Point 4 above could let you specify properties of the input to a sort function at the call site and let the compiler use that information for optimization. E.g. if one million values are evenly distributed over the range 0..100000, a quicksort could partition the data without sampling pivots. If the range is 0..1000, it could switch to an array of counters (counting sort). If the input is 99% sorted, it could switch to some insertion-sort-based scheme.
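The strategy selection could look something like this sketch, where caller-supplied range hints pick between counting sort and a general comparison sort (`sort_with_hints` and its parameters are made-up names; a compiler would do this dispatch statically):

```python
def sort_with_hints(xs, lo=None, hi=None):
    """Sort xs, exploiting a declared value range when one is given."""
    if lo is not None and hi is not None and hi - lo <= 1000:
        # Small declared range: an array of counters beats comparison sorting.
        counts = [0] * (hi - lo + 1)
        for x in xs:
            counts[x - lo] += 1
        out = []
        for v, n in enumerate(counts):
            out.extend([v + lo] * n)
        return out
    # No usable hint: fall back to a general comparison sort.
    return sorted(xs)
```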

If you allow both absolute and probabilistic meta-information, then the probabilistic information can be measured from a corpus of representative test data. You could run the algorithm within the "measured probable range" and switch to a slower algorithm when you detect values outside it.
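A minimal sketch of that idea, assuming the probable range is simply the min/max observed in a training corpus (`make_sorter` is an illustrative name): the fast counting-sort path is only valid while values stay inside the measured range, and the code falls back to a general sort the moment an outlier appears.

```python
def make_sorter(corpus):
    """Build a sorter specialised to the value range seen in the corpus."""
    lo, hi = min(corpus), max(corpus)

    def sorter(xs):
        if all(lo <= x <= hi for x in xs) and hi - lo <= 1000:
            # Fast path: values confirmed inside the measured probable range.
            counts = [0] * (hi - lo + 1)
            for x in xs:
                counts[x - lo] += 1
            return [v + lo for v, n in enumerate(counts) for _ in range(n)]
        # Outlier detected (or range too wide): slower general algorithm.
        return sorted(xs)
    return sorter
```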

Lots of opportunities for improving "state-of-the-art".
