Happy pi day

On Sat, Mar 14, 2015 at 1:23 PM, Jonathan S. Shapiro <[email protected]> wrote:
> On Fri, Mar 13, 2015 at 2:14 PM, Matt Oliveri <[email protected]> wrote:
>>
>> On Fri, Mar 13, 2015 at 11:06 AM, Jonathan S. Shapiro <[email protected]>
>> wrote:
>> I was starting to think so. Your mixed arrow scheme is actually quite
>> different from my deep arity (cfn/afn) scheme then.
>
> I still don't understand your deep arity scheme.

I can explain anything about it; just ask.

>> Whatever kind of constraint it is, constraints are generally not
>> enough to guarantee abstract types will concretize the same. You can
>> get lucky sometimes, and you did get lucky, as I'll explain.
>
> Hey! choose_one was *your* example! :-)

It was William's example. Apparently, there are multiple ways to use
choose_one, and not all of them are good tests for joins in general.
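
For the record, here is the reading of choose_one I've been assuming,
written as a small OCaml sketch (the random selection is just a
stand-in on my part; only the shape of the type matters):

(* Assumed reading of choose_one: pick one of the three arguments at
   random and hand it back. OCaml infers 'a -> 'a -> 'a -> 'a, so the
   point is simply that all three arguments must unify at a single
   type -- that unification is the kind of join I mean. *)
let choose_one x y z =
  match Random.int 3 with
  | 0 -> x
  | 1 -> y
  | _ -> z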

>> > Now let's look at pseudo-code for choose_one:
>> >
...
>> > These two types are not compatible, so the application (correctly) fails
>> > to
>> > type check. In fact, this is what has puzzled me about the choose_one
>> > example; it has always seemed to me that the argument types unify and so
>> > there can be no failure to detect concrete incompatibility at
>> > specialization.
>>
>> Shap, this example is way too easy. Of course if you have
>> arity-concrete function types, it'll work out. That is not in any way
>> a test of your arity-abstract function types. Here's the example I
>> think you meant, which still happens to work, because you're lucky:
>>
>> The f's are unknown functions with arity-abstract types:
>> f1 : int=>int->int->int
>> f2 : int->int=>int->int
>> f3 : int->int->int=>int
>
> No. This is most definitely NOT the example that I meant, because this is
> (or should be) impossible in my scheme. All function definitions assign a
> concrete arity. Therefore, a passed function must, of necessity, have
> concrete arity. If that isn't true then my entire approach breaks down.

This is a weird position to take, although I think you said something
like it before. We need concrete types for code generation, but we
should be able to check code at abstract types without strange
restrictions like this. Anything we can express in the language, we
can also express with arity-abstract functions in scope. It doesn't
matter where f[123] came from. We have to be able to check this code.

> I want you to pause here, and try to explain to me how, in a language where
> all function definitions assign concrete arity, it is possible for f1, f2,
> f3 to end up with abstract function types in the way that you describe
> above. I can certainly see how this might occur while the functions are
> still abstract. But by the time they are concretized I think this should no
> longer be possible.

That is after type checking, though. During type checking itself, we
still have to handle arity-abstract functions.

Consider this code:

def fun_and_games f1 f2 f3 =
  print ((f1 1) 2 3 + (f2 1 2) 3 + f3 1 2 3);
  (choose_one f1 f2 f3) 42 666 1337
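
In case it helps, here is a rough OCaml transcription of that. OCaml's
arrows carry no arity information and application is fully curried, so
the parenthesization that the =>/-> distinction cares about is
invisible here; the sketch only models the shape of the checking
problem, with f1, f2, f3 arriving as opaque parameters:

(* Rough transcription of the pseudo-code above, reusing the
   choose_one sketch from earlier in this mail. OCaml infers
   int -> int -> int -> int for each of f1, f2, f3; the print line
   constrains each one, and the call through choose_one forces all
   three types to agree. *)
let fun_and_games f1 f2 f3 =
  Printf.printf "%d\n" ((f1 1) 2 3 + (f2 1 2) 3 + f3 1 2 3);
  (choose_one f1 f2 f3) 42 666 1337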

Again, this is not the example that breaks. But this is a situation
where we get f[123] like I assumed. I'm going to pause, like you
asked, and reply to the rest in another email.
_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
