On Tue, Feb 24, 2015 at 11:25 PM, Matt Oliveri <[email protected]> wrote:

> On Wed, Feb 25, 2015 at 12:28 AM, Jonathan S. Shapiro <[email protected]>
> wrote:
>
> > The thing is: application proceeds left-to-right. It follows that the
> deep
> > specialization is necessarily an inductive consequence of the
> left-to-right
> > shallow specialization. The difference between the two is that the
> shallow
> > specialization can be pursued before the program is fully concrete.
>
> You seem to be assuming that if we use deep arities, we wouldn't be
> able to shallow specialize without fully knowing the deep arity. I
> don't see why that must be. The tail of a list is a list, after all.
> (1 :: 'arity) is an arity.
>

I don't think I'm assuming that. In any case I do not intend to. I suppose
I just prefer to proceed inductively from the left and make progress as
things get instantiated and/or unified.


> But this doesn't explain why you said that argument positions can
> concretize as certain arities.
>

I think you are reading too much into my wording. Let's see if I can say it
a different way:

Once

fn 'arity 'a->'b->'c->'d->'r


unifies with

fn 1 'u->'v


we know that the result type 'v must be:

fn 'newArity 'b->'c->'d->'r


which tells us that the first function, when applied, will return a second
function. Mathematically you might be able to compose these two functions
to arrive at a third. Programmatically you can only perform their
applications in sequence.

So all I'm saying is that this result function has an arity-abstract type
that will (in turn) be unified and specialized, and will eventually turn
into an arity-concrete function type. The particular arities I gave were
wrong, because 'd could unify as a function type at some point.
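This peeling step can be sketched in a few lines of Python (my own toy representation, not BitC's actual inference machinery): unifying an arity-abstract function type against a concrete shallow application of arity k consumes the first k argument positions and leaves a residual type whose arity is a fresh variable, to be specialized by later unifications.

```python
from itertools import count

_fresh = count(1)

def fresh_arity():
    """Mint a fresh Nat-kinded arity variable (a 'newArity above)."""
    return f"'newArity{next(_fresh)}"

def peel(args, result, k):
    """Split an arity-abstract type (fn 'arity args -> result) at
    concrete shallow arity k. Returns (consumed, residual); the
    residual is itself arity-abstract."""
    assert 0 < k <= len(args)
    if k == len(args):
        return args, result                 # fully applied
    return args[:k], ("fn", fresh_arity(), args[k:], result)

consumed, residual = peel(["'a", "'b", "'c", "'d"], "'r", 1)
print(consumed)  # ["'a"]
print(residual)  # e.g. ('fn', "'newArity1", ["'b", "'c", "'d"], "'r")
```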

I suspect we are merely hung up on a notational issue. For convenience of
discussion, we have been assuming (in the example below) that 'r will not
end up being a function type, and therefore that:

fn 'arity 'a->'b->'c->'d->'r


might end up having deep concrete arity [1, 2, 1]. But of course no
concrete function will actually end up with arity [1, 2, 1], because all
concrete function arities have a depth of exactly 1. What we will end up
with instead in this case is *three* concrete functions. The first will
have arity 1, the second arity 2, and the third arity 1.
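Using a toy (fn, arity, args, result) tuple representation (my invention, purely illustrative), the way a deep arity such as [1, 2, 1] resolves into three nested shallow-concrete function types can be sketched as:

```python
def concretize(deep_arity, args, result):
    """Expand a deep arity (e.g. [1, 2, 1]) over a flat argument list
    into nested function types, each with a depth-1 (shallow) concrete
    arity."""
    if not deep_arity:
        return result
    k = deep_arity[0]
    return ("fn", k, args[:k], concretize(deep_arity[1:], args[k:], result))

t = concretize([1, 2, 1], ["'a", "'b", "'c", "'d"], "'r")
print(t)
# ('fn', 1, ["'a"], ('fn', 2, ["'b", "'c"], ('fn', 1, ["'d"], "'r")))
```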


> > My choice of notation for function types is not innocent. The [shallow]
> > concrete arity corresponds to the stuff up to, but not including, the
> return
> > type.
>
> This is not necessarily true. Well, it is if you interpret "return
> type" with care. Maybe that's what you're saying. In (fn 'arity
> 'a->'b->'c) we do not know what the return type is. It isn't
> necessarily 'c, because if 'arity is 1, then the return type is
> something like (fn 'newArity 'b->'c).
>

Indeed we do not, and that is precisely why we have written 'arity rather
than some natural number. Please note that in my sentence above I said "the
shallow **concrete** arity".


> > This leaves the return type to unify in abstract form, just as it does
> > when 'c unified with 'w->'x.
> >
> > The part of this that needs some care in explaining is how the Nat
> > constrains the unification. In particular it needs to account for why
> >
> >    fn 1 'a->'b
> >
> > cannot unify with
> >
> >   fn 2 'a->'b->'c
> >
> > because the number of [transitive] shallow argument positions cannot be
> > reconciled.
>
> This explanation doesn't seem to have anything to do with the axiom of
> choice, or how 'newArity got there.
>

You are right. I said "axiom of choice", but I was thinking about
permutations. Somehow I turned "n choose k" into "axiom of choice" in my
head for a moment. Please forget my axiom of choice brain-fart. Sorry about
that.


> >> There you go with argument positions again. Maybe I'm really missing
> >> something here. Or maybe you're smoking something funny. What's wrong
> >> with this specialization:
> >>
> >> fn 'arity Nat->Nat->Nat->Nat
> >> becomes
> >> fn 1 Nat -> fn 'newArity Nat->Nat->Nat
> >>
> >> Notice that the concreteness of the argument types has not forced the
> >> return type to be concrete.
> >
> > What's wrong with it is subtle. Generalizing your example, the type:
> >
> > fn 1 Nat -> fn 'newArity 'a 'b 'c
> >
> > is syntactically well-formed and semantically sensible, but it is not a
> type
> > that can feasibly be derived under the typing regime that I have
> proposed.
> >
> > The universe of derivable-by-inference function types is an inductive
> > construction over compositions of well-formed program fragments. If all
> > global function definitions have a native arity, then it is not possible
> to
> > compose program fragments in such a way that their types compose to
> produce
> > the type you propose.
>
> Well I'm not sure exactly what you mean, but I never claimed any of my
> example types were the types of programs. I don't see why you refuse
> to talk about them for that reason.


Matt: please calm down. I did not "refuse" to talk about them. You asked me
a question, and I am trying to answer. In fact, I said quite clearly above
that the type you gave is syntactically well-formed (provided we replace
Nat with a valid type). I also said that it was semantically sensible, but
I'm actually not convinced of that.

It is certainly possible to see an expression

f i j k


and infer

fn 'arity int->int->int->int


at an application, and then to partially instantiate that. But there is no
way within the type rules to get from there to:

fn 1 int -> fn 'newArity int->int->int


The reason this is impossible is that it would mean the function f has
the type

fn 1 int -> 'a


where 'a is unifiable with

fn 'newArity int->int->int


But there are only two ways that we can arrive at a function having type fn
1 int->'a:

1. f is underspecified, in which case the program cannot be run
2. f returns some global value g which is underspecified, in which case the
program cannot be run.

I'm not all that interested in the types of non-executable programs.


You keep asking me where the 'newArity comes from. The answer is that
'arity is not an arbitrary type variable. It's a type variable in a
specific position of the fn type constructor. As a consequence we know that
it is actually 'arity::Nat (that is: it has Nat kind), and we know that
we can in general rewrite

'SomeNat

as

1 + 'SomeOtherNat


so long as 'SomeNat > 0, which we in turn know structurally because we are
looking at an arrow type, and all arities are greater than zero.
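Here is a hedged sketch of that Peano rewrite, with arities encoded as toy Peano terms (my own encoding, not BitC's): a concrete successor simply sheds one application, while an abstract variable is instantiated to succ of a fresh variable, recorded as a substitution. That instantiation is exactly where a 'newArity is introduced.

```python
def decompose(arity, fresh_name):
    """Rewrite a known-positive arity as 1 + remainder.

    Arities are toy Peano terms: ("zero",), ("succ", t), ("var", name).
    A concrete successor sheds one application; an abstract variable is
    instantiated to succ of a fresh variable, returned as a substitution:
    this is where a 'newArity enters."""
    if arity[0] == "succ":
        return arity[1], {}
    if arity[0] == "var":
        rest = ("var", fresh_name)
        return rest, {arity[1]: ("succ", rest)}
    raise ValueError("arity 0 cannot belong to an arrow type")

# Concrete case: arity 2 sheds one application, leaving arity 1.
print(decompose(("succ", ("succ", ("zero",))), "'n"))
# (('succ', ('zero',)), {})

# Abstract case: 'arity becomes 1 + 'newArity.
print(decompose(("var", "'arity"), "'newArity"))
# (('var', "'newArity"), {"'arity": ('succ', ('var', "'newArity"))})
```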


> Indeed, we are interested in
> partially specialized types because they arise by specialization. Are
> you saying it's illegal to specialize the arity of the application in
> (f 2 3 5) + 7
> ?
>

I don't see any problem with that.


> I object anyway! I don't see why that needs to be verified, if
> well-formed types make sense whether or not they arise.
>

The types are syntactically well-formed, but not semantically valid. My
claim is that types like fn 1 int -> 'a can only appear in underspecified
programs.

However, I may have been wrong about my constructive argument, so let's set
it aside. Behind the scenes I was trying to answer how 'newArity got
introduced. I think the explanation I give above (Peano decomposition) is a
simpler answer.

Incidentally: the capacity for this substitution is why Nat kind is so
important more generally!

> >> Umm. Maybe that answered my question. What about that 'a1, 'a2, 'a3
> >> example I had? I guess we can talk about specifics more easily if you
> >> explain how to unify function types in general.
> >
> > My apologies, but can you re-issue the example? Somewhere in the last 150
> > messages I lost track. :-)
>
> Here:
> On Tue, Feb 24, 2015 at 9:31 PM, Matt Oliveri <[email protected]> wrote:
> > What makes you think that choose_one-like problems are happy if
> > only the shallow arity can be constrained?
> >
> > In particular, how do we know that choose_one's type
> > fn 3 (fn 'ar a->b->c->d)->(fn 'ar a->b->c->d)->Bool->(fn 'ar a->b->c->d)
> > doesn't unify with
> > fn 3 (fn 1 a->fn 'a1 b->c->d)->(fn 1 a->fn 'a2 b->c->d)->Bool->(fn 1
> a->fn 'a3 b->c->d)
> > aside from that it makes no semantic sense?
>
> I realize now that I used an old notation for arity-concrete function
> types in that email, so I edited the quote.


It CAN unify that way. But all of the original types had the same arity
type variable 'ar, so all of the decompositions of 'ar have to result in
the same arity type variable 'newArity. When the unification above is done,
the 'a1, 'a2, 'a3 variables will have unified, and the *result* type will
therefore be

fn 3 (fn 1 a->fn 'newArity b->c->d)
  -> (fn 1 a->fn 'newArity b->c->d)
  -> Bool
  -> (fn 1 a->fn 'newArity b->c->d)


The shallow-arity approach is deceptively powerful. The Peano-style
decomposition ensures that arity agreement is preserved over the entire
arity-abstract function type even though the instantiations are done
stepwise.
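The sharing argument can be illustrated with a toy substitution pass (my own representation, hypothetical names): because the three parameter types share the single variable 'ar, the one recorded split 'ar := 1 + 'newArity reaches every occurrence, so all residual arities come out identical.

```python
def substitute(ty, sub):
    """Apply an arity substitution through a (fn, arity, args, result)
    tree, replacing arity variables wherever they occur."""
    if isinstance(ty, tuple) and ty[0] == "fn":
        _, arity, args, res = ty
        return ("fn", sub.get(arity, arity), args, substitute(res, sub))
    return ty

param = ("fn", "'ar", ["a", "b", "c"], "d")   # the shared parameter type
sub = {"'ar": ("succ", "'newArity")}          # 'ar := 1 + 'newArity, once

specialized = [substitute(param, sub) for _ in range(3)]
arities = {t[1] for t in specialized}
print(arities)  # {('succ', "'newArity")}  -- all three agree
```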


shap
_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
