On 25-Feb-1999, Carl R. Witty <[EMAIL PROTECTED]> wrote:
> Fergus Henderson <[EMAIL PROTECTED]> writes:
> 
> > Certainly a language with dependent types should define exactly what
> > types the type checker will infer.  But when generating code, the
> > compiler ought to be able to make use of more accurate type information,
> > if it has that information available, wouldn't you agree?
> > I think it would be most unfortunate if simply adding more accurate type
> > information could change a program's behaviour.
> 
> I'm not sure if that last sentence refers to 1) the compiler inferring
> more accurate type information or 2) the user adding a more accurate
> type declaration.

I meant it to refer to both.

> 1) There's a word for "optimizers" that change the meaning of a
> program (in ways not allowed by the language spec); that word is
> "buggy".

Sure.  But it would be a little unfortunate if compilers had to keep
around two sets of type information: one being the types that the
language specification requires them to infer, and the other being the
(potentially more accurate) set of types that they are able to infer
after inlining etc.  It would be nicer if compilers could just use the
most precise types and be guaranteed that the extra precision wouldn't
affect the semantics.
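
To make that concrete, here is a small Haskell sketch (my own
illustration, with made-up names like `size_of', not anything from this
thread).  The declared type of `size_of' is polymorphic; after inlining
it into `main', the compiler can see that the argument is in fact an
Int, and it would be nice if it could exploit that more precise type
directly, safe in the knowledge that doing so selects the same code
that the caller's dictionary would have supplied.

    class Size a where
        size :: a -> Int

    instance Size Int where
        size _ = 8              -- say, bytes for a boxed Int

    instance Size Bool where
        size _ = 1              -- a second instance, so dispatch matters

    -- The language spec infers the polymorphic type  Size a => a -> Int,
    -- so size_of must use whatever dictionary its caller passes in.
    size_of :: Size a => a -> Int
    size_of x = size x

    main :: IO ()
    main = print (size_of (42 :: Int))
    -- After inlining size_of into main, the compiler can see that the
    -- argument is an Int.  Resolving `size' at that more precise type is
    -- harmless only because it selects exactly the code that the Size Int
    -- dictionary contains; if the more precise type could select
    -- *different* code, the "optimisation" would change behaviour.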

However, this is a less important point than the one below.

> 2) Yes, I agree that the possibility that user-supplied type
> declarations can change the meaning of the program is a strike against
> the idea.
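
For what it's worth, Haskell's type classes plus defaulting already
give a mild version of this: what a program prints can depend on a type
declaration that the programmer adds.  A small self-contained example
of my own (nothing to do with dependent types as such, just the closest
thing Haskell has today):

    main :: IO ()
    main = do
        -- No declaration: defaulting resolves the literals to Double, and
        -- floating-point rounding makes this print False.
        print (0.1 + 0.2 == 0.3)
        -- Adding a type declaration changes the result: at Rational the
        -- literals are represented exactly, so this prints True.
        print (0.1 + 0.2 == (0.3 :: Rational))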

-- 
Fergus Henderson <[EMAIL PROTECTED]>  |  "Binaries may die
WWW: <http://www.cs.mu.oz.au/~fjh>  |   but source code lives forever"
PGP: finger [EMAIL PROTECTED]        |     -- leaked Microsoft memo.


