On Fri, Feb 20, 2015 at 10:17 AM, Keean Schupke <[email protected]> wrote:
> I realise this is longer than it need be as I am thinking this through as I
> write it. You could skip straight to the conclusion underlined as "Abstract
> Arities" if you don't want the thinking out loud :-)
>
>
>
> f x y = x + y -- inferred type f :: fn 'a 'a -> 'a
> f x = \y . x + y -- inferred type f :: fn 'a -> fn 'a -> 'a
>
> These are types with concrete arities. Imported functions, and foreign
> functions get explicit concrete type signatures:
>
> import 'C' f :: fn 'a 'a -> 'a
>
> We can infer the arity of an application:
>
> (f 1) 2 -- inferred type f :: fn 'a -> fn 'a -> 'a
> (f 1 2) -- inferred type f :: fn 'a 'a -> 'a

This is what my "fake currying" proposal did, but not Shap's arity
specialization proposal, in which applications infer arity-abstract
function types. I realize you are proposing something different. I
will tell you why it doesn't work. :)

> Again each of these is a concrete type we infer.
>
> So far we have everything concrete, and can only apply functions that
> exactly match arities, so we should agree on everything so far?

I agree that so far you are requiring arities to exactly match.

> Now we would like to be able to call functions with some flexibility, but we
> have to consider that we do not want silent allocations. We cannot allow
> curried application of a non-curried function as that would require
> allocations, but it seems okay to allow non-curried use of a curried
> function. To achieve this all we need is a subtyping relationship defined on
> function types such that more curried is a subtype of less curried.

Yes. Very interesting. Since you're about to make a small error, let's
just clarify that
fn 'a -> fn 'a -> 'a
is more curried than
fn 'a 'a -> 'a
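To make the two shapes concrete, here is a Python sketch (Python standing in for the hypothetical language; the names are mine):

```python
# Curried vs. uncurried arity, modeled in Python (illustrative only).

def f_uncurried(x, y):      # models f :: fn 'a 'a -> 'a
    return x + y

def f_curried(x):           # models f :: fn 'a -> fn 'a -> 'a
    def partial(y):
        return x + y
    return partial          # returning partial allocates a closure over x

assert f_uncurried(1, 2) == 3
assert f_curried(1)(2) == 3   # two separate applications
```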

> for example:
>
> fn 'a 'a 'a -> 'a :< fn 'a 'a -> fn 'a -> 'a
> fn 'a 'a 'a -> 'a :< fn 'a -> fn 'a 'a -> 'a
> fn 'a 'a -> fn 'a -> 'a :< fn 'a -> fn 'a -> fn 'a -> 'a
> fn 'a -> fn 'a 'a -> 'a :< fn 'a -> fn 'a -> fn 'a -> 'a

Your relation looks backwards. You're saying (fn 'a 'a 'a -> 'a) is
the minimum, but it's the least curried, so by your own reasoning, it
should be the maximum. In other words, it's the least useful, easiest
to provide.

> Such hierarchies can be automatically generated by the compiler.

Yup.

> We are fine
> to just let applications type check, and simply generate the correct number
> of applications when emitting the code for the function as we always have
> the concrete arities from the definition/import and the application.

No. Here's where it all goes wrong. You _don't_ necessarily have the
concrete arity when you compile an application. The example we keep
giving you is a higher-order function calling its parameter.
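A Python sketch of that example (my own naming; `apply_pairwise` is hypothetical): a higher-order function compiles its call sites against a shape, not against the callee's definition, so the concrete arity is simply not available.

```python
# A higher-order function applies its parameter g at a fixed call shape;
# the concrete arity of whatever g turns out to be is unknown here.
def apply_pairwise(g, x):
    return g(g(x, x), x)        # compiled as an arity-2 application

def add(a, b):                  # concrete arity 2: matches
    return a + b

def add_curried(a):             # concrete arity 1, returning a function:
    return lambda b: a + b      # the same call shape would not match

assert apply_pairwise(add, 1) == 3
# apply_pairwise(add_curried, 1) raises TypeError: arity mismatch
```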

The cool thing is how close to right it would be to just compile the
application at whatever type it needs. This is ostensibly all wrong,
given the subtyping, but if we treat the subtyping as coercion
subtyping, it works. The only problem is that these coercions
introduce at least one closure allocation unless they all get inlined
right where the application finally occurs. Optimizing together all
the coercions and the application gets you exactly the specialized
application from Shap's scheme, I figure. This is an argument for
arity-concrete/abstract being the right way to do this. We want one
big jump from the original concrete type to the abstract type we
apply. We cannot afford to spread this out over multiple subsumptions
and a separate application.
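What such a coercion costs can be sketched in Python (`curry2` is my name for the generated wrapper): realizing the subtyping as an explicit coercion means every partial application builds a fresh closure, which is exactly the allocation that inlining would have to eliminate.

```python
# Coercion from the concrete uncurried type to its curried view.
def curry2(f):                  # fn 'a 'a -> 'a  ==>  fn 'a -> fn 'a -> 'a
    def outer(x):
        def inner(y):
            return f(x, y)
        return inner            # closure over x allocated per application
    return outer

def add(x, y):
    return x + y

g = curry2(add)
assert g(1)(2) == 3
# Each use of the coercion produces distinct heap objects:
assert curry2(add) is not curry2(add)
```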

> As an optimisation, if we have access to the function definition, we can
> generate 'uncurried' specialisations of function definitions to make calls
> more efficient.

This sounds like another way of looking at the dual arity
specialization where arity is chosen by the application.

> As we can always make 'uncurried' specialisations locally, and we always
> require type-signatures to export functions, we are safe to infer the
> most-curried version of all types for local use (still concrete) and just
> allow the export statement to create the necessary specialisation. So I
> would change the inferred type above to:
>
> g x y = x + y -- inferred type g :: fn 'a -> fn 'a -> 'a
>
> You could then export it as:
>
> export g :: fn 'a 'a -> 'a
>
> but locally you could use:
>
> ((g 1) 2)
>
> which would use a specialised definition of g by inserting a lambda in the
> definition (specialising it for the applied arity) with arity (1, 1) or
>
> (g 1 2)
>
> which would create an arity (2) specialised version.
>
> You could probably even allow multiple exports like:
>
> export g :: fn 'a 'a -> 'a
> export g :: fn 'a -> fn 'a -> 'a

I'm not sure why you'd want to do this stuff, but I think it would be
possible with dual arity specialization.
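The specialisation step Keean describes can be sketched in Python (hand-written here; in the proposal the compiler would generate `g_arity2` from g's body):

```python
# Most-curried local definition, plus a generated arity-(2) specialisation
# suitable for export: external callers pay no partial-application cost.
def g(x):                       # g :: fn 'a -> fn 'a -> 'a
    return lambda y: x + y

def g_arity2(x, y):             # specialised export: fn 'a 'a -> 'a
    return x + y                # body flattened; no intermediate closure

assert g(1)(2) == g_arity2(1, 2) == 3
```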

> In which case you could just treat them as overloads. This does introduce
> intersection types at the top level (g :: fn 'a 'a -> 'a /\ fn 'a -> fn 'a
> -> 'a), but in general top-level intersections do not introduce any of the
> problems that more general intersection types have.

It sounds like this is dealing with inference issues. I don't know
much about that, but Shap does.

> Abstract Arities
> --------------------
>
> So there are no abstract-arities in this description, and if I think of an
> abstract arity like a type class:
>
> Arity 2 => fn 'a 'a -> 'a -- It seems an abstract arity is simply the
> argument count of the function for its least-curried application?

I have no idea what that type is supposed to mean. I've never seen
that notation before.

> It allows for more-curried applications of less-curried functions, which is
> not allowed as it introduces allocations.

You mean Shap's proposal could introduce an allocation? I don't think
you're right. Can you give an example?

> It seems to me that a subtyping hierarchy more naturally fits, as we want a
> partial-order on function types, not an abstract arity which is unordered.

Well, I argued above that coercion subtyping doesn't work, because
isolated coercions allocate. True subsumption doesn't work, because
different arities are not natively compatible.

My arity-abstract type constraints version of arity specialization
essentially does have a partial order: the order of which
possibly-abstract types instantiate to which other possibly-abstract
types. Concrete types are minimal. Shap seems to think his arity
variable version is essentially the same, so I guess it has an
analogous order.
_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
