On 2 Oct 2006, at 17:48, Jonathan Lang wrote:
The examples I gave involved specific roles or routines being
forbidden from use in certain situations; my gut instinct is that if
you don't think that it's appropriate to use a particular role or
routine somewhere, you should simply not use it there; I can't see why
you'd want the compiler or runtime to enforce not using it there.

Some of the discussion you're referring to reminded me of 'final' in Java. Declaring a class 'final' means it can't be subclassed. In Java there are two reasons for that - optimisation and security.

java.lang.String is final so the compiler and runtime can optimise string operations without worrying about handling String subclasses - i.e. it allows knowledge of java.lang.String's interface to be hard-coded into the compiler / runtime.

The security angle is that it prevents the creation of subclasses that can circumvent security restrictions in the base class.
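As a minimal sketch of that security angle (the `Credential` class and its method names are hypothetical, not anything from this thread): marking the class final means no subclass can override the sensitive check.

```java
// Hypothetical example: 'final' stops a subclass from overriding
// a security-sensitive method in the base class.
final class Credential {
    private final String secret;

    Credential(String secret) {
        this.secret = secret;
    }

    // Without 'final' on the class, a subclass could override this
    // to always return true and bypass the check.
    boolean matches(String attempt) {
        return secret.equals(attempt);
    }
}

// class SpoofedCredential extends Credential { ... }
// -> compile error: cannot inherit from final class Credential
```

The commented-out subclass is the point: the compiler rejects it outright, so callers holding a `Credential` know exactly which `matches` implementation runs.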

I wonder if some of the debate here was informed by the perception that 'final' is a valuable feature in Java, when actually it's a hacky bodge to solve a couple of language design problems?

--
Andy Armstrong, hexten.net
