On Thu, Sep 4, 2014 at 11:45 AM, Jonathan S. Shapiro <[email protected]> wrote:
> I agree (and I said) that it's a challenge for type inference. While part of
> me likes the idea of reducing syntactic weight by using constraints, I
> probably agree that it's not a good idea in this case. The reason I went into
> this example is to illustrate that there is nothing magical about arrays or
> vectors that inherently requires them to be homogeneous in all situations,
> and that there is nothing magical about indexing syntax: indices need to be
> disjoint (but not necessarily drawn from Nat), and discriminators need to be
> disjoint (but not necessarily drawn from a Sort defined implicitly by the
> datatype definition).
>
> So I'm mainly trying to tease out the conceptual distinctions here as a way
> to start looking at what other kinds of discriminators are possible.

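For concreteness, here is a minimal Haskell sketch of the kind of thing
Jonathan is describing (Field, Key, and Record are invented names for the
example): a heterogeneous, record-like structure whose index set is an
ordinary datatype rather than Nat, and whose indexing operation returns a
type that depends on the index supplied.

{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Discriminators drawn from an ordinary datatype, not from Nat.
data Field = Name | Age | Active

-- A singleton key: each constructor fixes both the discriminator and
-- the element type it selects.
data Key (f :: Field) (t :: *) where
  KName   :: Key 'Name   String
  KAge    :: Key 'Age    Int
  KActive :: Key 'Active Bool

-- A heterogeneous, record-like "array" indexed by Field.
data Record = Record { name :: String, age :: Int, active :: Bool }

-- Indexing: the result type depends on which key is supplied.
(!) :: Record -> Key f t -> t
r ! KName   = name r
r ! KAge    = age r
r ! KActive = active r

-- (Record "pat" 41 True) ! KAge  :: Int
-- (Record "pat" 41 True) ! KName :: String
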
Habit's type system with type-level naturals (or the various hideous
versions of type-level naturals embedded in Scala / Haskell / etc.)
can do this (thanks for telling me to reread their spec).  However,
you have to accept full polyinstantiation over all the integer indices
considered.  That's easy to explain to a C++ programmer whose mind has
already been corrupted by the instantiation restriction (myself, some
of the time), but it will seem bizarrely crippled to everyone else.

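Here is a rough sketch of the sort of encoding I mean, using a
type-class-driven lookup in Haskell (all names invented for the example).
Every concrete index a program uses is resolved to its own chain of
instances at compile time, and indexing at a value that isn't statically
known is rejected outright; that is the polyinstantiation cost in
miniature.

{-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeOperators #-}
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}
{-# LANGUAGE FlexibleInstances, ScopedTypeVariables, UndecidableInstances #-}

import Data.Proxy (Proxy (..))

-- Type-level Peano naturals standing in for the integer indices.
data Nat = Z | S Nat

-- A heterogeneous vector; the element types are carried in a
-- type-level list.
data HVec (ts :: [*]) where
  HNil :: HVec '[]
  (:>) :: t -> HVec ts -> HVec (t ': ts)
infixr 5 :>

-- One instance per index step: each concrete index used anywhere in
-- the program is resolved to its own chain of instances at compile
-- time, i.e. it is fully polyinstantiated.
class Get (n :: Nat) (ts :: [*]) t | n ts -> t where
  get :: Proxy n -> HVec ts -> t

instance Get 'Z (t ': ts) t where
  get _ (x :> _) = x

instance Get n ts t => Get ('S n) (u ': ts) t where
  get _ (_ :> xs) = get (Proxy :: Proxy n) xs

example :: HVec '[Int, Bool, String]
example = 3 :> True :> "hi" :> HNil

-- get (Proxy :: Proxy 'Z)      example :: Int
-- get (Proxy :: Proxy ('S 'Z)) example :: Bool
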
Incidentally, the SMT refinement type paper I linked in the other
thread can also do such things, and has the advantage of decaying
nicely to runtime checks if the proof obligations can't be
established.

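To make the decay concrete, here is a trivial sketch of the fallback side
(this is plain Haskell, not the paper's mechanism): where a refinement
such as {i : Int | 0 <= i && i < length xs} on an index can't be
discharged statically, the compiler can insert an equivalent runtime
check instead of rejecting the program.

-- Runtime fallback when the bounds obligation isn't proved statically.
atChecked :: [a] -> Int -> Either String a
atChecked xs i
  | 0 <= i && i < length xs = Right (xs !! i)   -- obligation checked at runtime
  | otherwise               = Left ("index out of bounds: " ++ show i)
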
Geoffrey
_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
