> Offhand: while I have a bit of trouble reading => to mean concrete 
> application because of System F (or System Fc), which most seem to be using 
> to declare constraints introduced by making arity concrete, as long as 
> people don't mix => and cfn I think it's clear enough.

Yes, I noticed that; I just wanted the types of “plain-old-functions” to be 
visually distinct from the types of other functions, hence the => arrow.

PKE

From: bitc-dev [mailto:[email protected]] On Behalf Of William ML 
Leslie
Sent: Tuesday, March 03, 2015 5:55 PM
To: Discussions about the BitC language
Subject: Re: [bitc-dev] Arity, Take N

On 4 March 2015 at 10:16, Pal Engstad <[email protected]> wrote:
Hi all,

Let’s define the problem. We want to translate all function definitions in our 
program from a language that allows applications to one that only uses function 
calls. This is a step towards creating assembly code, of course, and at some 
point we have to get down from the high-level language to a lower-level one.

Now, we are talking about different ways of translating our (lambda-like) input 
language into one that has no lambdas or applications, only let-bindings and 
function calls.

Yes yes yes.
This seems to be the most productive way to have this discussion.  Given 
datatypes representing the arity-abstract surface syntax for programs and 
types, provide an algorithm that generates values in a program datatype that 
has a concrete arity at each application.  Then we can discuss the soundness 
of the algorithm.
This is what Keenan did when he provided prolog code.
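
(For concreteness, a rough Haskell sketch of what such a pair of datatypes 
might look like; the constructor names here are invented for illustration, not 
BitC syntax or Keenan's code.)

    -- Hypothetical sketch: the surface syntax applies one argument at a
    -- time, while the target datatype fixes a concrete arity at every call.
    data Surface
      = SVar String
      | SLam String Surface          -- one-parameter lambda
      | SApp Surface Surface         -- one-argument application

    data Target
      = TVar String
      | TLet String Target Target    -- let-binding
      | TCall String [Target]        -- call with a concrete arity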

Notable pain points when doing this include not being able to rely on the types 
involved being fully concrete in the presence of mutual recursion, and the 
aliasing of type variables with possibly different arities.


What is not so clear to me is how to proceed when g is a value of type A -> B, 
for instance one provided as an argument to a top-level function. Or rather, 
the main problem is that the type of a parameter can be A -> B -> C -> … -> R, 
but we want to be able to call it with a concrete function f : [A B C … ] => R. 
In this case, you would have to coerce the value f to another f’ that is an 
applicative function (i.e. the tuple (a, e) above).

The Shapiro solution says: polyinstantiate the callee against the provided 
arity.  The arity of the provided function is a static parameter, resolved in 
an additional typing pass.

The Schupke solution says: the callee will mandate a specific arity, either 
because it appears in the elaborated type, or because in typing the application 
we already have an explicit arity within the constraints of the function or its 
return type.  That is, arity typing proceeds from use to def.
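
(As a very loose Haskell analogue of the Shapiro flavour, with typeclass 
instances standing in for static polyinstantiation; the names are invented for 
illustration only.)

    {-# LANGUAGE FlexibleInstances #-}

    -- The callee is generic over the arity shape of the function it is
    -- handed; each use site resolves a concrete instance statically.
    class Arity2 f where
      call2 :: f -> Int -> Int -> Int

    instance Arity2 (Int -> Int -> Int) where
      call2 f = f                     -- already a concrete two-argument form

    instance Arity2 ((Int, Int) -> Int) where
      call2 f x y = f (x, y)          -- one tupled argument

    useEither :: Arity2 f => f -> Int
    useEither f = call2 f 1 2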



So, we need to generalize from the simple applicative function values to more 
general applicative function values:

1) Term: g a0 a1 … aN
2) Type: A0 A1 … AN -> R
3) Value: A tuple (a, e) where a : [E A0 A1 .. AN] => R, e : E.
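
(An illustrative Haskell rendering of item 3, shown at arity 1 for brevity; the 
names are invented, not BitC syntax.)

    {-# LANGUAGE ExistentialQuantification #-}

    -- A general applicative function value: a code pointer whose first
    -- parameter is a hidden environment, packaged with that environment.
    data Applic a r = forall e. Applic (e -> a -> r) e

    applyA :: Applic a r -> a -> r
    applyA (Applic code env) x = code env x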

Note: I’m contradicting myself here, but yes, you *can* sensibly talk about the 
arity of a general applicative function – oops!

We can convert a primitive function value to a general applicative function 
value by auto-generating the function

    t : [f : [A*] => R] => (A* -> R)
    t f = (\args . f [Nil args], Nil)
    -- We might not *actually* have to generate this if the calling
    -- convention allows for a hidden environment function argument.
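
(In the same illustrative Haskell terms, at arity 1 and reusing the Applic pair 
sketched above, with () standing in for Nil, t would look roughly like this.)

    -- Sketch only: wrap a plain concrete-arity function into the
    -- (code, environment) pair, using () as the empty Nil environment.
    toApplic :: (a -> r) -> Applic a r
    toApplic f = Applic (\() x -> f x) ()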

The concern with doing this is that, if we want to be region explicit, these 
functions have different effect types.  So, if we add the region types back 
into the function arrows, given

    t0 : [X Y] => Z
    t2 : (X in %rx, Y in %ry) -> (Z in %rz) in static
    t1 : (X in %rx) -> ((Y in %ry) -> (Z in %rz) in %rt_partial) in static

we see that the region variable %rt_partial had to be introduced to hold the 
result of the partial application.
To the extent that this is a problem, ensuring that %rt_partial is relevant 
(or linear) could be a viable solution, but it's still getting very implicit.  
More likely, we must mandate that the region is passed explicitly.  And at that 
point, it's syntactically easier to use an operator to do the partial 
application and introduce the required region constraints for you.
---

I guess too, in a language where effect types are not required to be explicitly 
stated, there could be several real types behind X -> X -> X -> X:

    X -(e0)-> X -(e1)-> X -(e2)-> X

where each of e0, e1, and e2 may be empty; in particular, if the function was 
defined with arity 3, only e2 may be a nonempty set.  The problem is that while 
we don't write these explicitly, they are relevant to the correct functioning 
of our program.  If effects are seemingly randomly postponed because of 
automatic partial conversion, we will not be able to reason about the ordering 
of effects using applicative reduction semantics.
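
(A small Haskell illustration of that worry, with an invented function name: an 
arity-3 definition whose only effect sits on the last arrow, so only e2 is 
nonempty.  Partially applying two arguments performs no IO, and an automatic 
partial conversion postpones the effect to wherever the last argument finally 
arrives.)

    logSum3 :: Int -> Int -> Int -> IO Int
    logSum3 x y z = do
      putStrLn ("summing " ++ show [x, y, z])   -- the effect on the last arrow (e2)
      return (x + y + z)

    main :: IO ()
    main = do
      let pending = logSum3 1 2   -- no output here: the effect is postponed
      _ <- pending 3              -- "summing [1,2,3]" is printed only now
      return ()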

---
Offhand: while I have a bit of trouble reading => to mean concrete application 
because of System F (or System Fc), which most seem to be using to declare 
constraints introduced by making arity concrete, as long as people don't mix 
=> and cfn I think it's clear enough.

--
William Leslie
