TSa <[EMAIL PROTECTED]> wrote:
> I strongly agree. Having a language that allows supertyping is novel.
> But I think that union is not there for completeness but as an
> integral part of defining a type lattice, which I still believe is
> the most practical approach to typing. This includes computed types,
> that is, "artificial" nodes in the lattice. These intermediate types
> are usually produced during type checking and automatic program
> reasoning. Think e.g. of the type of an Array:
>     my @array = (0,1,2);   # Array of Int
>     @array[3] = "three";   # Array of Int(&)Str
Actually, these would be something along the lines of "Array of Int"
and "Array of (Int, Int, Int, Str)", respectively. That is, each of
@array[0..2] would be of type "Int", while @array[3] would be of type
"Str". @array itself would be of type "Array" (which, without any
further qualifiers, is equivalent to "Array of Any"). If you must
force a more restrictive type on @array, go with "Array of (Int |
Str)" (yes, I do mean "|", not "(|)"; it's a type-matching issue, not
a container-construction issue.)
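The distinction being drawn here, a per-element tuple-style type versus a single union element type, has a close analogue in Python's type annotations. This is only an analogy in another language, not Perl 6 semantics:

```python
# "Array of (Int, Int, Int, Str)" is like a heterogeneous tuple type:
# each position has its own type.
# "Array of (Int | Str)" is like a list whose elements may each be
# either type, in any position.

from typing import List, Tuple, Union

exact: Tuple[int, int, int, str] = (0, 1, 2, "three")  # per-position types
loose: List[Union[int, str]] = [0, 1, 2, "three"]      # one union type

# The union typing admits further Ints or Strs anywhere; the exact
# tuple typing fixes both the length and the type of each position.
loose.append(4)
assert all(isinstance(x, (int, str)) for x in loose)
```

The "more restrictive type" Jonathan mentions corresponds to the `loose` annotation: it forgets which positions hold which type, but still excludes anything that is neither Int nor Str.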
> > And yes, this "roles as sets" paradigm would presumably mean that you
> > could examine roles using '⊂', '⊃', '∈', and so on.
> BTW, are the ASCII equivalents spelled (<), (>) and (in)?
I'd hope that they'd be something like '(<)', '(>)', and 'in'; only
use parentheses when necessary. Likewise, I'd want the "relative
complement" operator to be '-', not '(-)'.
Jonathan "Dataweaver" Lang