On Tue, Feb 24, 2015 at 7:57 PM, Jonathan S. Shapiro <[email protected]> wrote:
> On Mon, Feb 23, 2015 at 1:55 PM, Matt Oliveri <[email protected]> wrote:
>> I can see why we should restrict to concrete arity for functions
>> exported to C. But probably we already knew we can't export
>> abstract-typed definitions to C. So that doesn't seem like a reason to
>> ban arity-abstract definitions overall.
>
> It isn't. The reason to ban arity-abstract definitions is driven by the
> "allocation must be explicit" rule. It may be that I have converged on a
> solution that constrains things more than it needs to. If so, it is easier
> to relax such a solution than it is to make it tighter if I have erred in
> the other direction. I feel (possibly wrongly) that we know how this
> approach will play out, and I don't [yet] feel that confidence about some of
> what Keean is saying. Not because I doubt him, but because I haven't got my
> head around his idea yet.

I feel very differently about Keean's proposal and my proposal for
arity-abstract definitions. I wasn't planning to try and convince you
to use Keean's proposal, since I don't currently see that it works
either.
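
For concreteness, here's roughly what I mean by an arity-abstract
definition, sketched in Haskell rather than anything resembling BitC
syntax (the names are made up):

    -- 'apply' has no fixed arity: how many arguments a use site ends up
    -- supplying depends on how the type variable 'b' is instantiated.
    apply :: (a -> b) -> a -> b
    apply f x = f x

    -- Here 'b' is Int, so the application saturates at two arguments.
    twoArgs :: Int
    twoArgs = apply negate 3

    -- Here 'b' is (Int -> Int), so the same definition sits inside an
    -- effectively three-argument application chain.
    threeArgs :: Int
    threeArgs = apply (+) 1 2

    main :: IO ()
    main = print (twoArgs, threeArgs)

That's the kind of definition I was asking about banning overall versus
only at the C boundary.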

>> And I don't see why it matters to the calling convention whether arity
>> is known before or after specialization. And if it does, why we're not
>> already doomed 'cause of function parameters.
>
> In principle, arity doesn't need to be fully known until after
> specialization. In practice, there are many types where distinct
> specializations produce identical code. E.g. list operations over object
> references. It is somewhat simpler if we know in advance that such
> specializations will all end up having the same arity and the same calling
> convention.

Hmm. Hopefully they'd still have the same set of possible arities.
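
Concretely, I take the list example to be something like this (Haskell
again, purely for illustration):

    -- One polymorphic definition; every specialization at a boxed element
    -- type takes a single list argument and returns an Int, so they all
    -- share the same arity and, presumably, the same calling convention.
    len :: [a] -> Int
    len []       = 0
    len (_ : xs) = 1 + len xs

    lenInts :: [Int] -> Int
    lenInts = len         -- specialization at Int

    lenNames :: [String] -> Int
    lenNames = len        -- specialization at String: same shape of code

    main :: IO ()
    main = print (lenInts [1, 2, 3], lenNames ["a", "b"])

If some instantiation could change the arity (say, by instantiating a
result type variable to a function type), that sharing would break, which
is why I'd hope the set of possible arities is at least pinned down.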

>> > The reason to infer arity at applications is that (1) it reduces the
>> > likelihood of library bifurcation for human reasons
>>
>> I don't see how. We can fully apply curried and uncurried definitions
>> the same way, but we can only partially apply curried definitions.
>> They're still different.
>
> I agree. But the bigger difference we were discussing at the time I made
> that statement was that arity-aware application would involve passing an
> argument tuple, with the consequence that the two types of application would
> have completely different surface syntaxes. That's a bigger inducer of
> disparity.

I see.
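
Spelling the difference out for myself (Haskell as a stand-in; plusC and
plusU are my own names, nothing from the current design):

    -- Curried definition: arguments supplied one at a time.
    plusC :: Int -> Int -> Int
    plusC x y = x + y

    -- Tuple-taking, arity-concrete definition: one argument, a pair.
    plusU :: (Int, Int) -> Int
    plusU (x, y) = x + y

    fullC, fullU :: Int
    fullC = plusC 1 2        -- full application, curried surface syntax
    fullU = plusU (1, 2)     -- full application, tuple surface syntax

    -- Only the curried form admits partial application.
    inc :: Int -> Int
    inc = plusC 1

    main :: IO ()
    main = print (fullC, fullU, inc 41)

The visible difference between 'plusC 1 2' and 'plusU (1, 2)' is the kind
of surface disparity you mean.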

>> > and (2) there are some
>> > useful constructs in functional languages that are syntactically very
>> > awkward when a concrete-arity-mandatory application syntax (as in
>> > Pascal) is
>> > adopted.
>>
>> If it's really only the syntax that makes one thing or another
>> awkward, then how is that relevant to specialization? Maybe it's
>> technically not just syntax, and you need to give an example.
>
> It isn't relevant, except that the spare syntax is what is introducing
> arity-abstract applications in the first place, which is why we need a story
> for how to specialize our way back to arity-concrete applications without
> damaging the surface syntax.

So using one syntax for curried and tupled application is your main
reason. In that case, maybe we shouldn't bother considering
application-driven specialization.
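
Just to check that I follow the specialization story, I take it the idea
for a saturated call site is something like the worker split below,
hand-written in Haskell purely as an illustration of what the compiler
would do (add3Worker is a made-up name, not anything you've proposed):

    -- Curried surface definition.
    add3 :: Int -> Int -> Int -> Int
    add3 a b c = a + b + c

    -- Arity-concrete worker that saturated calls could be specialized to:
    -- all three arguments arrive at once, with a fixed calling convention.
    add3Worker :: (Int, Int, Int) -> Int
    add3Worker (a, b, c) = a + b + c

    -- A saturated application like 'add3 1 2 3' can be rewritten into one
    -- worker call with no intermediate closures.
    saturated :: Int
    saturated = add3Worker (1, 2, 3)

    -- An unsaturated application cannot; it has to allocate a partial
    -- application, which is presumably where the "allocation must be
    -- explicit" rule bites.
    addOne :: Int -> Int -> Int
    addOne = add3 1

    main :: IO ()
    main = print (saturated, addOne 2 3)

If that's the shape of it, then the payoff really is all in keeping the
surface syntax uniform.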