Re: Reinstallable - base

2023-10-17 Thread Adam Gundry

Hi Simon,

Thanks for starting this discussion, it would be good to see progress in 
this direction. As it happens I was discussing this question with Ben 
and Matt over dinner last night, and unfortunately they explained to me 
that it is more difficult than I naively hoped, even once wired-in and 
known-key things are moved to ghc-internal.


The difficulty is that, as a normal Haskell library, ghc itself will be 
compiled against a particular version of base. Then when Template 
Haskell is used (with the internal interpreter), code will be 
dynamically loaded into a process that already has symbols for ghc's 
version of base, which means it is not safe for the code to depend on a 
different version of base. This is rather like the situation with TH and 
cross-compilers.


Adam



On 17/10/2023 11:08, Simon Peyton Jones wrote:

Dear GHC devs

Given the now-agreed split between ghc-internal and base 
<https://github.com/haskellfoundation/tech-proposals/pull/51>, what 
stands in the way of a "reinstallable base"?


Specifically, suppose that

  * GHC 9.8 comes out with base-4.9
  * The CLC decides to make some change to `base`, so we get base-4.10
  * Then GHC 9.10 comes out with base-4.10

I think we'd all like it if someone could use GHC 9.10 to compile a 
library L that depends on base-4.9, where either L doesn't work at all 
with base-4.10, or L's dependency bounds have not yet been adjusted to 
allow base-4.10.


We'd like to have a version of `base`, say `base-4.9.1`, that has 
exactly the same API as `base-4.9` but works with GHC 9.10.


Today, GHC 9.10 comes with a specific version of base, /and you can't 
change it/. The original reason for that was, I recall, that GHC knows 
the precise place where (say) the type Int is declared, and it'll get 
very confused if that data type definition moves around.


But now we have `ghc-internal`, all these "things that GHC magically 
knows" are in `ghc-internal`, not `base`.


*Hence my question: what (now) stops us making `base` behave like any 
other library*?  That would be a big step forward, because it would mean 
that a newer GHC could compile old libraries against their old dependencies.


(Some changes would still be difficult.  If, for example, we removed 
Monad and replaced it with classes Mo1 and Mo2, it might be hard to 
simulate the old `base` with a shim.  But getting 99% of the way there 
would still be fantastic.)


Simon


--
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/

Registered in England & Wales, OC335890
27 Old Gloucester Street, London WC1N 3AX, England

___
ghc-devs mailing list
ghc-devs@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/ghc-devs


Re: Understanding how warnings on unused functions work

2023-03-27 Thread Adam Gundry

Hi,

I think the function you are looking for is reportUnusedNames in 
GHC.Rename.Names [1], which calls findUses [2]. You may find the Note 
[Tracking unused binding and imports] in GHC.Tc.Types [3] helpful.


Roughly, GHC's renamer traverses the syntax tree to resolve names and 
along the way collects sets of defined/used names in tcg_dus. There is 
also a set of names used by the typechecker in tcg_keep. After 
typechecking, findUses takes the transitive closure of the set of uses 
and reportUnusedNames then issues unused-top-binds warnings for all 
names defined in the current module that are not in the set.
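To make the shape of this concrete, here is a much-simplified, 
hypothetical model of the computation: Strings stand in for GHC's Names, 
and the fixpoint below plays the role of findUses' transitive closure 
(the real DefUses and NameSet types in GHC.Types.Name.Set are richer, 
and GHC exploits the ordering of bindings rather than iterating):

```haskell
module Main where

import           Data.Set (Set)
import qualified Data.Set as Set

-- One entry per binding: the names it defines (Nothing means "always
-- live", e.g. instance declarations) and the names it uses.
type DefUse = (Maybe (Set String), Set String)

-- Simplified analogue of findUses: starting from the roots (exports
-- plus tcg_keep), repeatedly add the uses of any binding whose defined
-- names are already reachable, until a fixed point is reached.
findUses :: [DefUse] -> Set String -> Set String
findUses dus roots = go roots
  where
    go acc
      | acc' == acc = acc
      | otherwise   = go acc'
      where acc' = foldr step acc dus
    step (mdefs, uses) seen
      | maybe True (\defs -> not (Set.null (defs `Set.intersection` seen))) mdefs
      = seen `Set.union` uses
      | otherwise = seen

main :: IO ()
main = do
  let dus = [ (Just (Set.fromList ["helper"]),   Set.fromList ["base"])
            , (Just (Set.fromList ["exported"]), Set.fromList ["helper"])
            , (Just (Set.fromList ["dead"]),     Set.empty)
            ]
      used    = findUses dus (Set.fromList ["exported"])
      defined = Set.unions [d | (Just d, _) <- dus]
  -- Everything defined but unreachable from the roots is unused;
  -- this is roughly what reportUnusedNames warns about.
  print (Set.toList (defined `Set.difference` used))
```

Running this prints `["dead"]`: "helper" is kept because the exported 
binding uses it transitively, while "dead" is never reached.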


Hope this helps,

Adam


[1] 
https://gitlab.haskell.org/ghc/ghc/-/blob/e1fb56b24e2fe45a6f628f651bfc12b2b9743378/compiler/GHC/Rename/Names.hs#L1579


[2] 
https://gitlab.haskell.org/ghc/ghc/-/blob/e1fb56b24e2fe45a6f628f651bfc12b2b9743378/compiler/GHC/Types/Name/Set.hs#L203


[3] 
https://gitlab.haskell.org/ghc/ghc/-/blob/e1fb56b24e2fe45a6f628f651bfc12b2b9743378/compiler/GHC/Tc/Types.hs#L710



On 14/03/2023 17:13, Razetime wrote:
This message is related to a feature in the Futhark compiler 
<https://github.com/diku-dk/futhark/issues/550>; Futhark is a language 
with a module system similar to Haskell's.


I am trying to create a tool to detect unused top-level functions in 
Futhark, and I have been looking in the ghc codebase to find out how it 
does so. So far I have found these areas:


 1. warning flag

<https://gitlab.haskell.org/ghc/ghc/-/blob/master/compiler/GHC/Driver/Flags.hs#L556-557>
 2. WarnUnusedTopBinds

<https://gitlab.haskell.org/ghc/ghc/-/blob/master/compiler/GHC/Rename/Utils.hs#L355>

The project is a bit too big for my machine to handle, so HLS is not 
helping me find where the computation of unused bindings (specifically 
functions) is done.
I would like some help with finding where ghc does this computation, and 
maybe some help with the naming conventions of the code.


Thanks in advance.


--
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/

Registered in England & Wales, OC335890
27 Old Gloucester Street, London WC1N 3AX, England



Re: Preventing inert caching in TC plugin?

2023-02-04 Thread Adam Gundry

Hi Sandy,

I don't think you can easily prevent GHC from caching solutions to 
specific constraints, and even if you could, the simplifier feels free 
to common-up dictionaries with the same type and hence would potentially 
introduce unintended semantic changes to the program.


You might be able to address at least the second issue by using implicit 
parameters instead. But I think a more robust approach would be to 
introduce an extra type parameter:


class KnownAnnotations a where
  rawAnnotationsVal :: [Annotation]

Now each rawAnnotationsVal call site will introduce a fresh unification 
variable α and the wanted constraint `KnownAnnotations α`. Your plugin 
can spot this constraint and solve it, while leaving α uninstantiated. 
Hence each call site will have a different constraint type, and the 
problems should go away.
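For what it's worth, here is a plugin-free approximation of that idea. 
The catch-all instance below stands in for the plugin: in the real setup 
the instance would not exist, and the plugin would solve each wanted 
`KnownAnnotations α` itself (with call-site-specific evidence), leaving 
α uninstantiated. The annotation value here is a made-up stub:

```haskell
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE FlexibleInstances #-}
module Main where

type Annotation = String

-- The class with the suggested extra parameter. Since 'a' does not
-- appear in the method's type, AllowAmbiguousTypes is needed.
class KnownAnnotations a where
  rawAnnotationsVal :: [Annotation]

-- Stand-in for the plugin: solves `KnownAnnotations α` for any α.
instance KnownAnnotations a where
  rawAnnotationsVal = ["stub annotation"]

-- Each use site introduces a wanted `KnownAnnotations α` with a
-- fresh unification variable α, so the solved dictionaries cannot
-- be commoned up across call sites by the simplifier.
t2 :: [Annotation]
t2 = rawAnnotationsVal

main :: IO ()
main = print t2
```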


Perhaps you could even make the parameter invisible/inferred, so it 
wouldn't change the user-visible API. Something like this might work, 
but I haven't tested it:


type KnownAnnotations :: forall {a} . Constraint
class KnownAnnotations where
  rawAnnotationsVal :: [Annotation]

Hope this helps,

Adam



On 04/02/2023 09:29, Sandy Maguire via ghc-devs wrote:

Hi all,

I'm working on a little TC plugin that solves the following class:

```haskell
class KnownAnnotations where
   rawAnnotationsVal :: [Annotation]
```

Like `HasCallStack`, the result of `KnownAnnotations` is unique to each
callsite. Thus, I am trying to prevent my TC plugin from adding the
solved constraint to the inert set.

As an example, the following program:

```
t2 :: ([Annotation], Int)
t2 = (rawAnnotationsVal, test2)

t3 :: ([Annotation], Int)
t3 = (rawAnnotationsVal, test3)
```

dumps this output with `-ddump-cs-trace`:

```

Step 1[l:0,d:0] Kept as inert:
 [WD] $dKnownAnnotations_a18UB {0}:: KnownAnnotations
Step 2[l:0,d:0] Dict equal (keep inert):
 [WD] $dKnownAnnotations_a18UD {0}:: KnownAnnotations
```

Is there some trick to prevent the inert caching for these solved
wanteds?

Cheers,
Sandy



--
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/

Registered in England & Wales, OC335890
27 Old Gloucester Street, London WC1N 3AX, England



Re: Trying to create a regression test for #20261

2021-08-29 Thread Adam Gundry
Hi Alexis,


On 29/08/2021 17:42, Alexis Praga wrote:
> As an intermediate beginner in Haskell, I want to try to tackle a small
> issue for contributing to GHC.

Welcome! Thanks for trying your hand at contributing to GHC.


> I've looked into https://gitlab.haskell.org/ghc/ghc/-/issues/20261, but
> it's not clear how to integrate a running time in a regression test.
> 
> From what I understand, regression tests should be compared against
> success/failure or a given output.
> Should there be a hard-coded limit? If that's the case, how could I
> create such a test? I could not find a similar test in the source
> code.

Have a look in testsuite/tests/perf/compiler. That directory contains a
number of compiler performance tests, which work by having the test
runner compare metrics about the compilation run for the current commit
vs. previous commits. The all.T file lists the tests to run and the
metrics to collect.

There are more details on the wiki:
https://gitlab.haskell.org/ghc/ghc/-/wikis/building/running-tests/adding#performance-tests
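For illustration only, a perf-test entry in all.T looks roughly like 
this (the metric name and the 10% acceptance window here are 
hypothetical for this ticket; check the wiki page and 
testsuite/driver/testlib.py for the current conventions):

```python
# A hypothetical entry in testsuite/tests/perf/compiler/all.T:
# track compile-time allocations for the test, allowing 10% drift
# before the framework reports a regression against recorded baselines.
test('T20261',
     [collect_compiler_stats('bytes allocated', 10)],
     compile,
     [''])
```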

Hope this helps,

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/

Registered in England & Wales, OC335890
118 Wymering Mansions, Wymering Road, London W9 2NF, England


Re: Plan for GHC 9.2

2021-02-15 Thread Adam Gundry
[Re-sending from the correct address, apologies!]

It would be great to get RecordDotSyntax for selection into 9.2.

As I just commented on !4532 [1] there's one awkward point to resolve,
which is that 9.2 will probably not have `setField`, on which
RecordDotSyntax updates depend.

Cheers,

Adam

[1] https://gitlab.haskell.org/ghc/ghc/-/merge_requests/4532#note_330581


On 15/02/2021 10:33, Simon Peyton Jones via ghc-devs wrote:
> Ben
> 
> Can we get record dot syntax into 9.2?
> 
> * Shayne is really nearly there in !4532; he has been working
>   hard and recently.
> * It depends on my !4981 (was 4722) which fixes some bugs and
>   I'm keen to commit.
> 
> 
> So, is it ok in principle to pull the trigger on !4981, and hopefully !4532?
> 
> Simon
> 
> |  -Original Message-
> |  From: ghc-devs  On Behalf Of Ben Gamari
> |  Sent: 04 February 2021 18:56
> |  To: GHC developers 
> |  Subject: Plan for GHC 9.2
> |  
> |  
> |  tl;dr. Provisional release schedule for 9.2 enclosed. Please discuss,
> | especially if you have something you would like merged for
> |  9.2.1.
> |  
> |  Hello all,
> |  
> |  With GHC 9.0.1 at long-last out the door, it is time that we start
> |  turning attention to GHC 9.2. I would like to avoid making the mistake
> |  made in the 9.0 series in starting the fork in a state that required a
> |  significant amount of backporting to be releaseable. Consequently, I
> |  want to make sure that we have a fork schedule that is realistic given
> |  the things that need to be merged for 9.2. These include:
> |  
> |   * Update haddock submodule in `master` (Ben)
> |   * Bumping bytestring to 0.11 (#19091, Ben)
> |   * Finishing the rework of sized integer primops (#19026, John
> |  Ericson)
> |   * Merge of ghc-exactprint into GHC? (Alan Zimmerman, Henry)
> |   * Merge BoxedRep (#17526, Ben)
> |   * ARM NCG backend and further stabilize Apple ARM support? (Moritz)
> |   * Some form of coercion zapping (Ben, Simon, Richard)
> |   * Tag inference analysis and tag check elision (Andreas)
> |  
> |  If you see something that you would like to see in 9.2.1 please do
> |  holler. Otherwise, if you see your name in this list it would be great
> |  if you could let me know when you think your project may be in a
> |  mergeable state.
> |  
> |  Ideally we would strive for a schedule like the following:
> |  
> |  4 February 2021:   We are here
> | ~4 weeks pass
> |  3 March 2021:  Release branch forked
> | 1 week passes
> |  10 March 2021: Alpha 1 released
> | 3 weeks pass
> |  31 March 2021: Alpha 2 released
> | 2 weeks pass
> |  14 April 2021: Alpha 3 released
> | 2 weeks pass
> |  28 April 2021: Alpha 4 released
> | 1 week passes
> |  5 May 2021:Beta 1 released
> | 1 week passes
> |  12 May 2021:   Release candidate 1 released
> | 2 weeks pass
> |  26 May 2021:   Final release
> |  
> |  This provides ample time for stabilization while avoiding deviation
> |  from the usual May release timeframe. However, this would require that
> |  we move aggressively to start getting the tree into shape since the
> |  fork would be less than four weeks away. I would appreciate
> |  contributors'
> |  thoughts on the viability of this timeline.
> |  
> |  Cheers,
> |  
> |  - Ben


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/

Registered in England & Wales, OC335890
118 Wymering Mansions, Wymering Road, London W9 2NF, England


Re: PSA: type checker plugins to be affected by upcoming changes to GHC

2020-11-19 Thread Adam Gundry
Thanks Richard!

I'm happy to see this as I've always felt the flattening story to be
unnecessarily complex for type-checker plugins, and in fact uom-plugin
has been broken for a long time because of a previous GHC change to
flattening skolems.[1,2] I haven't checked but I am moderately hopeful
your work will at least make it easier to fix, if not fix it outright.

As part of removing Deriveds, do you plan to change the type-checker
plugin interface to drop the redundant argument? Although I suppose GHC
could simply pass an empty list to the plugin.

Ideally we would have a spec (or at least some tests!) for what
constraints get presented to plugins. The current implementation is
rather dependent on whatever GHC's solver happens to produce. In my
paper [3] I tried to specify plugins based on the published description
of OutsideIn(X), but that's far enough from the reality of GHC that it
isn't much help in practice. (One point it lacks is any treatment of
Deriveds, so I'm happy to see them go!)

Cheers,

Adam

[1] https://github.com/adamgundry/uom-plugin/issues/43
[2] https://gitlab.haskell.org/ghc/ghc/-/issues/15147
[3] https://adam.gundry.co.uk/pub/typechecker-plugins/




On 19/11/2020 04:45, Ben Gamari wrote:
> Iavor and Christiaan are two plugin authors that come to mind.
> 
> Ideally the patch would also include some migration documentation in the
> release notes.
> 
> Cheers,
> 
> - Ben
> 
> On November 18, 2020 11:20:40 PM EST, Richard Eisenberg
>  wrote:
> 
> Hi all,
> 
> I'm hard at work on two significant refactorings within GHC's constraint 
> solver. The first is at 
> https://gitlab.haskell.org/ghc/ghc/-/merge_requests/4149. It removes 
> flattening meta-variables and flattening skolems. This is a very nice 
> simplification. Instead, it just reduces type families directly. My other 
> patch (held up by the first) is at 
> https://gitlab.haskell.org/ghc/ghc/-/tree/wip/derived-refactor and will 
> remove Derived constraints, to be replaced by a little bit of cleverness in 
> suppressing certain confusing error messages. My guess is that either or both 
> of these will invalidate the current behavior of type-checker plugins. Sadly, 
> this is not just a case of finding a new function that replicates the old 
> behavior -- enough is changing under the hood that you might actually have to 
> rewrite chunks of your code.
> 
> I have never written a type-checker plugin, and so I don't currently have 
> advice for you. But if you are a plugin author affected by this change and 
> want help, please reach out -- I would be happy to walk you through the 
> changes, and then hopefully make a little video explaining the process to 
> other plugin authors.
> 
> Neither patch will make it for 9.0, but I expect both to be in 9.2. There 
> may be more where this came from 
> (https://gitlab.haskell.org/ghc/ghc/-/issues/18965) in the future, but it's 
> all for a good cause.
> 
> (I have bcc'd plugin authors that I'm aware of. Just adding this in case 
> you're surprised at receiving this email.)
> 
> Thanks,
> Richard



-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/

Registered in England & Wales, OC335890
118 Wymering Mansions, Wymering Road, London W9 2NF, England


Re: Parser depends on DynFlags, depends on Hooks, depends on TcM, DsM, ...

2020-09-18 Thread Adam Gundry
Right, we don't want plugins to continue to have access to the *current*
DynFlags, because I agree that needs to be cut down. I certainly
wouldn't retain it as-is just for plugins.

Instead, I'm merely arguing that we should continue to let plugins
modify (a) the command-line-flags-as-AST (which is what I presume
DynFlags should become), and (b) the Hooks (perhaps with one or two
additions, e.g. log_action). Apologies if I caused confusion earlier by
referring to (a) as "DynFlags".

Cheers,

Adam


On 18/09/2020 14:56, Moritz Angermann wrote:
> I'm not certain anything in HEAD actually breaks any plugin today. But
> the whole idea of plugins having full access to what currently is
> "DynFlags" is not something I believe we can sustain. @Sylvain Henry
> <mailto:sylv...@haskus.fr> is currently cleaning up a lot of unnecessary
> DynFlags usage. I'm not against keeping the necessary infrastructure for
> hooks and other interfaces with plugins, but I'd like to advocate
> towards not expecting DynFlags to keep existing in eternity. If we
> assume a subset of what used to be in DynFlags to be relevant to
> Plugins, let's collect that in say PluginHooks, but let's keep that
> interface minimal. And maybe that can be specified to stay stable.
> 
> DynFlags is our state kitchensink in GHC, and it is everywhere. The
> state is threaded through everything and the module is gargantuan. So
> far there seemed to be broad support in removing this wart.
> 
> Cheers, 
> Moritz
> 
> On Fri, Sep 18, 2020 at 5:52 PM Adam Gundry  <mailto:a...@well-typed.com>> wrote:
> 
> On 14/09/2020 13:02, Moritz Angermann wrote:
> > I believe this to already be broken in HEAD. DynFlags already got
> quite
> > an overhaul/break. I'd rather we drop supporting DynFlagPlugins. And
> > offer alternative stable interfaces. Though to be honest, I
> believe our
> > Plugin story is rather poor so far.
> >
> > Do you happen to know of DynFlagPlugins, Adam?
> 
> A few have been mentioned in the thread now. What specifically do you
> believe is broken in HEAD regarding DynFlags plugins, and is there an
> issue for it? AFAICS the hooks-plugin test which corresponds to the
> user's guide text is still there.
> 
> I think it is important to retain the ability for plugins to manipulate
> both DynFlags and Hooks, whether the latter are separated out of the
> former or not. Both have legitimate use cases, and plugins necessarily
> involve using unstable interfaces (at least until someone designs a
> stable interface). I agree that the current state of plugins/hooks is
> somewhat ad-hoc and could do with more effort put into the design (like
> much else in the GHC API!) but that doesn't mean we should remove things
> that work already.
> 
> Slightly tangential note: discussing this with Alp I learned about the
> log_action/dump_action/trace_action fields of DynFlags, which also seem
> to violate Simon's "We should think of DynFlags as an abstract syntax
> tree." And indeed it would be useful for plugins to be able to override
> log_action, especially combined with #18516, as then we would have a
> nice story for plugins overriding error message generation to allow for
> domain-specific error messages.
> 
> Cheers,
> 
> Adam
> 
> 
> > On Mon, Sep 14, 2020 at 7:09 PM Adam Gundry  <mailto:a...@well-typed.com>
> > <mailto:a...@well-typed.com <mailto:a...@well-typed.com>>> wrote:
> >
> >     I'm supportive of the goal, but a complication with removing
> hooks from
> >     DynFlags is that GHC currently supports "DynFlags plugins"
> that allow
> >     plugins to install custom hooks
> >   
>  
> (https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins).
> >     I guess this can be worked around, perhaps by passing hooks
> separately
> >     to DynFlags and providing a separate plugin interface to
> modify hooks.
> >     But doing so will necessarily break existing plugins.
> >
> >     Adam
> >
> >
> >     On 14/09/2020 11:25, Simon Peyton Jones via ghc-devs wrote:
> >     > I thought I’d sent a message about this DynFlags thing, but
> I can’t
> >     > trace it now.   So here’s a resend.
> >     >
> >     >  
> >     >
> >     > Currently
> >     >
> >     >   * The DynFlags record includes Hooks
> >     >   * Hook

Re: Parser depends on DynFlags, depends on Hooks, depends on TcM, DsM, ...

2020-09-18 Thread Adam Gundry
On 14/09/2020 13:02, Moritz Angermann wrote:
> I believe this to already be broken in HEAD. DynFlags already got quite
> an overhaul/break. I'd rather we drop supporting DynFlagPlugins. And
> offer alternative stable interfaces. Though to be honest, I believe our
> Plugin story is rather poor so far.
> 
> Do you happen to know of DynFlagPlugins, Adam?

A few have been mentioned in the thread now. What specifically do you
believe is broken in HEAD regarding DynFlags plugins, and is there an
issue for it? AFAICS the hooks-plugin test which corresponds to the
user's guide text is still there.

I think it is important to retain the ability for plugins to manipulate
both DynFlags and Hooks, whether the latter are separated out of the
former or not. Both have legitimate use cases, and plugins necessarily
involve using unstable interfaces (at least until someone designs a
stable interface). I agree that the current state of plugins/hooks is
somewhat ad-hoc and could do with more effort put into the design (like
much else in the GHC API!) but that doesn't mean we should remove things
that work already.

Slightly tangential note: discussing this with Alp I learned about the
log_action/dump_action/trace_action fields of DynFlags, which also seem
to violate Simon's "We should think of DynFlags as an abstract syntax
tree." And indeed it would be useful for plugins to be able to override
log_action, especially combined with #18516, as then we would have a
nice story for plugins overriding error message generation to allow for
domain-specific error messages.

Cheers,

Adam


> On Mon, Sep 14, 2020 at 7:09 PM Adam Gundry  <mailto:a...@well-typed.com>> wrote:
> 
> I'm supportive of the goal, but a complication with removing hooks from
> DynFlags is that GHC currently supports "DynFlags plugins" that allow
> plugins to install custom hooks
> 
> (https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins).
> I guess this can be worked around, perhaps by passing hooks separately
> to DynFlags and providing a separate plugin interface to modify hooks.
> But doing so will necessarily break existing plugins.
> 
> Adam
> 
> 
> On 14/09/2020 11:25, Simon Peyton Jones via ghc-devs wrote:
> > I thought I’d sent a message about this DynFlags thing, but I can’t
> > trace it now.   So here’s a resend.
> >
> >  
> >
> > Currently
> >
> >   * The DynFlags record includes Hooks
> >   * Hooks in contains functions, that mention TcM, DsM etc
> >
> >  
> >
> > This is bad.  We should think of DynFlags as an *abstract syntax
> tree*. 
> > That is, the result of parsing the flag strings, yes, but not much
> > more.  So for hooks we should have an algebraic data type representing
> > the hook /specification/, but it should not be the hook functions
> > themselves.  HsSyn, for example, after parsing, is just a tree with
> > strings in it.  No TyCons, Ids, etc. That comes much later.
> >
> >  
> >
> > So DynFlags should be a collection of algebraic data types, but should
> > not depend on anything else.
> >
> >  
> >
> > I think that may cut a bunch of awkward loops.
> >
> >  
> >
> > Simon
> >
> >  
> >
> > *From:*Simon Peyton Jones
> > *Sent:* 10 September 2020 14:17
> > *To:* Sebastian Graf  <mailto:sgraf1...@gmail.com>>; Sylvain Henry
> > mailto:sylv...@haskus.fr>>
> > *Cc:* ghc-devs mailto:ghc-devs@haskell.org>>
> > *Subject:* RE: Parser depends on DynFlags, depends on Hooks,
> depends on
> > TcM, DsM, ...
> >
> >  
> >
> > And for sure the **parser** should not depend on the **desugarer** and
> > **typechecker**.   (Which it does, as described below.)
> >
> >  
> >
> 
> https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins
> > S
> >
> >  
> >
> > *From:*ghc-devs  <mailto:ghc-devs-boun...@haskell.org>
> > <mailto:ghc-devs-boun...@haskell.org
> <mailto:ghc-devs-boun...@haskell.org>>> *On Behalf Of *Sebastian Graf
> > *Sent:* 10 September 2020 14:12
> > *To:* Sylvain Henry mailto:sylv...@haskus.fr>
> <mailto:sylv...@haskus.fr <mailto:sylv...@haskus.fr>>>
> > *Cc:* ghc-devs mailto:ghc-devs@haskell.org>
> <mailto:ghc-devs@haskell.org <

Re: Parser depends on DynFlags, depends on Hooks, depends on TcM, DsM, ...

2020-09-14 Thread Adam Gundry
On 14/09/2020 12:45, Simon Peyton Jones via ghc-devs wrote:
> |  I guess this can be worked around, perhaps by passing hooks separately
> |  to DynFlags and providing a separate plugin interface to modify hooks.
> |  But doing so will necessarily break existing plugins.
> 
> I'm not sure it will break anything.  It's hard to find just what API 
> plugin authors use... where is it documented?

The link to the GHC user's guide [1] I gave includes an example of a
DynFlags plugin overriding a hook. This was the key reason for
introducing DynFlags plugins originally [2] and is definitely used in
the wild [3]. TBH I'm not sure DynFlags plugins are yet used for
anything *other* than overriding hooks.


> All I'm arguing is that Hooks (which contains the hook functions themselves) 
> should not be part of DynFlags.  So for example:
> 
> dsForeigns :: [LForeignDecl GhcTc]
>            -> DsM (ForeignStubs, OrdList Binding)
> dsForeigns fos = getHooked dsForeignsHook dsForeigns' >>= ($ fos)
> 
> Here
>   getHooked :: (Functor f, HasDynFlags f) => (Hooks -> Maybe a) -> a -> f a
> 
> Why HasDynFlags?  Yes, DsM might need a Hooks field, but that's easy. No need 
> to stuff it in DynFlags!

Right, I agree completely. I don't know about the initial implementation
of hooks, but I suspect DynFlags was picked as a convenient way to pass
them throughout the compiler, at the cost of introducing more cycles.

I just suggest that either the Plugin type gets extended with a new
hooksPlugin field of type

[CommandLineOption] -> Hooks -> IO Hooks

or the type of the existing dynflagsPlugin field is changed to something
like

[CommandLineOption] -> (DynFlags, Hooks) -> IO (DynFlags, Hooks)

so that plugins can still set up hooks. Though perhaps there is an even
better story lurking here about combining Hooks and Plugins rather than
having a somewhat artificial separation between them...
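To make the two options concrete, here is a sketch with stand-in types 
(these are illustrative stubs, not GHC's real DynFlags, Hooks, or Plugin 
definitions, and the field names are just the ones proposed above):

```haskell
module Main where

type CommandLineOption = String
data DynFlags = DynFlags   -- stand-in for GHC's real DynFlags
data Hooks    = Hooks      -- stand-in for GHC's real Hooks

-- Option 1: a new, dedicated field on Plugin for installing hooks.
newtype HooksPlugin = HooksPlugin
  { runHooksPlugin :: [CommandLineOption] -> Hooks -> IO Hooks }

-- Option 2: widen the existing dynflagsPlugin field so it can
-- modify the flags and the hooks together.
newtype DynFlagsPlugin = DynFlagsPlugin
  { runDynFlagsPlugin
      :: [CommandLineOption] -> (DynFlags, Hooks) -> IO (DynFlags, Hooks) }

main :: IO ()
main = do
  -- A trivial identity plugin, just to exercise the interface.
  let p = HooksPlugin (\_opts hooks -> pure hooks)
  _ <- runHooksPlugin p [] Hooks
  putStrLn "ok"
```

Either shape would let GHC apply the plugin after parsing the 
command line but before the hooks are consulted.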

Adam


[1]
https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins
[2] https://gitlab.haskell.org/ghc/ghc/-/merge_requests/1580
[3] https://gitlab.haskell.org/alp/ghc-external-splices-plugin


> Simon
> 
> |  -Original Message-
> |  From: ghc-devs  On Behalf Of Adam Gundry
> |  Sent: 14 September 2020 12:08
> |  To: ghc-devs@haskell.org
> |  Subject: Re: Parser depends on DynFlags, depends on Hooks, depends on
> |  TcM, DsM, ...
> |  
> |  I'm supportive of the goal, but a complication with removing hooks from
> |  DynFlags is that GHC currently supports "DynFlags plugins" that allow
> |  plugins to install custom hooks
> |  (https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins).
> |  I guess this can be worked around, perhaps by passing hooks separately
> |  to DynFlags and providing a separate plugin interface to modify hooks.
> |  But doing so will necessarily break existing plugins.
> |  
> |  Adam
> |  
> |  
> |  On 14/09/2020 11:25, Simon Peyton Jones via ghc-devs wrote:
> |  > I thought I’d sent a message about this DynFlags thing, but I can’t
> |  > trace it now.   So here’s a resend.
> |  >
> |  >
> |  >
> |  > Currently
> |  >
> |  >   * The DynFlags record includes Hooks
> |  >   * Hooks in contains functions, that mention TcM, DsM etc
> |  >
> |  >
> |  >
> |  > This is bad.  We should think of DynFlags as an *abstract syntax
> |  tree*.
> |  > That is, the result of parsing the flag strings, yes, but not much
> |  > more.  So for hooks we should have an algebraic data type
> |  representing
> |  > the hook /specification/, but it should not be the hook functions
> |  > themselves.  HsSyn, for example, after parsing, is just a tree with
> |  > strings in it.  No TyCons, Ids, etc. That comes much later.
> |  >
> |  >
> |  >
> |  > So DynFlags should be a collection of algebraic data types, but
> |  should
> |  > not depend on anything else.
> |  >
> |  >
> |  >
> |  > I think that may cut a bunch of awkward loops.
> |  >
> |  >
> |  >
> |  > Simon
> |  >
> |  >
> |  >
> |  > *From:*Simon Peyton Jones
> |  > *Sent:* 10 September 2020 14:17
> |  > *To:* Sebastian Graf ; Sylvain Henry
> |  > 
> |  > *Cc:* ghc-devs 
> |  > *Subject:* RE: Parser depends on DynFlags, depends on Hooks, depends
>

Re: Parser depends on DynFlags, depends on Hooks, depends on TcM, DsM, ...

2020-09-14 Thread Adam Gundry
I'm supportive of the goal, but a complication with removing hooks from
DynFlags is that GHC currently supports "DynFlags plugins" that allow
plugins to install custom hooks
(https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins).
I guess this can be worked around, perhaps by passing hooks separately
to DynFlags and providing a separate plugin interface to modify hooks.
But doing so will necessarily break existing plugins.

Adam


On 14/09/2020 11:25, Simon Peyton Jones via ghc-devs wrote:
> I thought I’d sent a message about this DynFlags thing, but I can’t
> trace it now.   So here’s a resend.
> 
>  
> 
> Currently
> 
>   * The DynFlags record includes Hooks
>   * Hooks in contains functions, that mention TcM, DsM etc
> 
>  
> 
> This is bad.  We should think of DynFlags as an *abstract syntax tree*. 
> That is, the result of parsing the flag strings, yes, but not much
> more.  So for hooks we should have an algebraic data type representing
> the hook /specification/, but it should not be the hook functions
> themselves.  HsSyn, for example, after parsing, is just a tree with
> strings in it.  No TyCons, Ids, etc. That comes much later.
> 
>  
> 
> So DynFlags should be a collection of algebraic data types, but should
> not depend on anything else.
> 
>  
> 
> I think that may cut a bunch of awkward loops.
> 
>  
> 
> Simon
> 
>  
> 
> *From:*Simon Peyton Jones
> *Sent:* 10 September 2020 14:17
> *To:* Sebastian Graf ; Sylvain Henry
> 
> *Cc:* ghc-devs 
> *Subject:* RE: Parser depends on DynFlags, depends on Hooks, depends on
> TcM, DsM, ...
> 
>  
> 
> And for sure the **parser** should not depend on the **desugarer** and
> **typechecker**.   (Which it does, as described below.)
> 
>  
> https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/extending_ghc.html#dynflags-plugins
> S
> 
>  
> 
> *From:*ghc-devs  <mailto:ghc-devs-boun...@haskell.org>> *On Behalf Of *Sebastian Graf
> *Sent:* 10 September 2020 14:12
> *To:* Sylvain Henry mailto:sylv...@haskus.fr>>
> *Cc:* ghc-devs mailto:ghc-devs@haskell.org>>
> *Subject:* Parser depends on DynFlags, depends on Hooks, depends on TcM,
> DsM, ...
> 
>  
> 
> Hey Sylvain,
> 
>  
> 
> In https://gitlab.haskell.org/ghc/ghc/-/merge_requests/3971
> I had to fight once more with the transitive dependency set of the
> parser, the minimality of which is crucial for ghc-lib-parser
> <https://hackage.haskell.org/package/ghc-lib-parser>
> and tested by the CountParserDeps test.
> 
>  
> 
> I discovered that I need to make (parts of) `DsM` abstract, because it
> is transitively imported from the Parser for example through Parser.y ->
> Lexer.x -> DynFlags -> Hooks -> {DsM,TcM}.
> 
> Since you are our mastermind behind the "Tame DynFlags" initiative, I'd
> like to hear your opinion on where progress can be/is made on that front.
> 
>  
> 
> I see there is https://gitlab.haskell.org/ghc/ghc/-/issues/10961
> and https://gitlab.haskell.org/ghc/ghc/-/issues/11301
> which ask a related, but different question: They want a DynFlags-free
> interface, but I even want a DynFlags-free *module*.
> 
>  
> 
> Would you say it's reasonable to abstract the definition of `PState`
> over the `DynFlags` type? I think it's only used for pretty-printing
> messages, which is one of your specialties (the treatment of DynFlags in
> there, at least).
> 
> Anyway, can you think of or perhaps point me to an existing road 

Re: TTG: Handling Source Locations

2019-02-12 Thread Adam Gundry
us add more than one; that's redundant but not harmful.
>>>>
>>>> I disagree here. If we add locations to a node twice, then we'll have to 
>>>> use dL twice to find the underlying constructor. This is another case 
>>>> where the type system offers the producer flexibility but hamstrings the 
>>>> consumer.
>>>>
>>>>
>>>>> On Feb 12, 2019, at 7:32 AM, Vladislav Zavialov  
>>>>> wrote:
>>>>>
>>>>> I claim an SrcSpan makes sense everywhere, so this is not a useful
>>>>> distinction. Think about it as code provenance, an AST node always
>>>>> comes from somewhere
>>>>
>>>> I agree with this observation. Perhaps SrcSpan is a bad name, and 
>>>> SrcProvenance is better. We could even imagine using the new HasCallStack 
>>>> feature to track where generated code comes from (perhaps only in DEBUG 
>>>> compilers). Do we need to do this today? I'm not sure there's a crying 
>>>> need. But philosophically, we are able to attach a provenance to every 
>>>> slice of AST, so there's really no reason for uninformative locations.
>>>>
>>>>> My concrete proposal: let's just put SrcSpan in the extension fields
>>>>> of each node
>>>>
>>>> I support this counter-proposal. Perhaps if it required writing loads of 
>>>> extra type instances, I wouldn't be as much in favor. But we already have 
>>>> to write those instances -- they just change from NoExt to SrcSpan. This 
>>>> seems to solve all the problems nicely, at relatively low cost. And, I'm 
>>>> sure it's more efficient at runtime than either the previous ping-pong 
>>>> style or the current scenario, as we can pattern-match on constructors 
>>>> directly, requiring one less pointer-chase or function call.
>>>>
>>>> One downside of this proposal is that it means that more care will have to 
>>>> be taken when setting the extension field of AST nodes after a pass, 
>>>> making sure to preserve the location. (This isn't really all that 
>>>> different from location-shuffling today.) A quick mock-up shows that 
>>>> record-updates can make this easier:
>>>>
>>>>> data Phase = Parsed | Renamed
>>>>>
>>>>> data Exp p = Node (XNode p) Int
>>>>>
>>>>> type family XNode (p :: Phase)
>>>>> type instance XNode p = NodeExt p
>>>>>
>>>>> data NodeExt p where
>>>>>  NodeExt :: { flag :: Bool, fvs :: RenamedOnly p String } -> NodeExt p
>>>>>
>>>>> type family RenamedOnly p t where
>>>>>  RenamedOnly Parsed _ = ()
>>>>>  RenamedOnly Renamed t = t
>>>>>
>>>>> example :: Exp Parsed
>>>>> example = Node (NodeExt { flag = True, fvs = () }) 5
>>>>>
>>>>> rename :: Exp Parsed -> Exp Renamed
>>>>> rename (Node ext n) = Node (ext { fvs = "xyz" }) n
>>>>
>>>> Note that the extension point is a record type that has a field available 
>>>> only after renaming. We can then do a type-changing record update when 
>>>> producing the renamed node, preserving the flag in the code above. What's 
>>>> sad is that, if there were no renamer-only field, we couldn't do a 
>>>> type-changing empty record update as the default case. (Haskell doesn't 
>>>> have empty record updates. Perhaps it should. They would be useful in 
>>>> doing a type-change on a datatype with a phantom index. A clever compiler 
>>>> could even somehow ensure that such a record update is completely compiled 
>>>> away.) In any case, this little example is essentially orthogonal to my 
>>>> points above, and the choice of whether to use records or other structures 
>>>> are completely local to the extension point. I just thought it might make 
>>>> for a nice style.
>>>>
>>>> Thanks,
>>>> Richard

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, https://www.well-typed.com/
___
ghc-devs mailing list
ghc-devs@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/ghc-devs


Re: A type checker plugin for row types

2017-09-11 Thread Adam Gundry
Hi Nick,

This is great work, and I look forward to seeing the code once it is
ready. I've had a quick glance over your wiki page, and thought I should
send you some initial comments, though it deserves deeper attention
which I will try to find time to give it. :-)

I don't see a reference to Iavor's paper "Improving Haskell Types with
SMT" (http://yav.github.io/publications/improving-smt-types.pdf). If
you've not come across it, it might give a useful alternative
perspective on how plugins work, especially with regard to derived
constraints.

The following is based on my faulty memory, so apologies if it is out of
date or misleading...

> When/where exactly do Derived constraints arise?

Suppose I have a class with an equality superclass

class a ~ b => C a b

and a wanted constraint `C alpha Int`, for some touchable variable
`alpha`. This leads to a derived constraint `alpha ~ Int` thanks to the
superclass (derived means we don't actually need evidence for it in
order to build the core term, but solving it might help fill in some
touchable variables).  Sorry if this is obvious and not exact enough!
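[A self-contained illustration of the above — my construction, not from the original message. The superclass equality acts as an improvement rule: the wanted `C alpha Int` yields the derived `alpha ~ Int`, which fills in the touchable variable so the instance can be found.]

```haskell
{-# LANGUAGE FlexibleContexts, FlexibleInstances,
             MultiParamTypeClasses, TypeFamilies #-}

-- The equality superclass means solving a wanted `C alpha Int`
-- also produces the derived constraint `alpha ~ Int`.
class a ~ b => C a b where
  combine :: a -> b -> b

instance C Int Int where
  combine = (+)

-- The literal 2 has type alpha with wanted `C alpha Int`; the derived
-- equality unifies alpha := Int, after which the instance matches.
five :: Int
five = combine 2 (3 :: Int)

main :: IO ()
main = print five
```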

> When do touchables "naturally" arise in Given constraints?

Do you mean "touchable" or "unification variable" here (and elsewhere?).
A skolem is always untouchable, but the converse is not true.

I think that unification variables can arise in Given constraints, but
that they will always be untouchable. Suppose we have defined

f :: forall a b . ((a ~ b) => a -> b) -> Int

(never mind that it is ambiguous) and consider type-checking the call `f
id`. We end up checking `id` against type `a -> b` with given `a ~ b`
where `a` and `b` are unification variables. They must be untouchable,
however, otherwise we might unify them, which would be wrong.
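[A runnable reconstruction of this example, under my assumptions about the intent: the call `f id` is accepted because the given equality is used only locally, while `a` and `b` stay untouchable inside the implication.]

```haskell
{-# LANGUAGE RankNTypes, TypeFamilies #-}

f :: forall a b. ((a ~ b) => a -> b) -> Int
f _ = 0

-- Checking `id` against (a ~ b) => a -> b uses the given equality as a
-- local fact; the solver must not unify a := b outside the implication.
call :: Int
call = f id

main :: IO ()
main = print call
```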

Hope this helps,

Adam


On 10/09/17 23:24, Nicolas Frisby wrote:
> Hi all. I've been spending my free time for the last couple months on a
> type checker plugin for row types. The free time waxes and wanes;
> sending an email like this one was my primary goal for the past couple
> weeks.
> 
> At the very least, I hoped this project would let me finally get some
> hands on experience with OutsideIn. And I definitely have. But I've also
> made more progress than I anticipated, and I think the plugin is
> starting to have legs!
> 
> I haven't uploaded the code yet to github -- it's not quite ready to
> share. But I did do a write up on the dev wiki.
> 
>  
> https://ghc.haskell.org/trac/ghc/wiki/Plugins/TypeChecker/RowTypes/Coxswain
> 
> I would really appreciate any questions, comments, and --- boy, oh boy
> --- answers.
> 
> I hope to upload within a week or so, and I'll update that wiki page and
> reply to this email when I do.
> 
> Thanks very much. -Nick
> 
> P.S. -- I've CC'd and BCC'd people who I anticipate would be
> specifically interested in this (e.g. plugins, row types, etc). Please
> feel free to forward to others that come to mind; I know some inboxes
> abjectly can't afford default list traffic.
> 
> P.P.S. -- One hold up for the upload is: which license? I intend to
> release under BSD3, mainly to match GHC since one ideal scenario would
> involve being packaged with/integrated into GHC. But my brief recent
> research suggests that the Apache license might be more conducive to
> eventual widespread adoption. If you'd be willing to advise or even just
> refer me to other write ups, please feel free to email me directly or to
> start a separate thread on a more appropriate distribution list (CC'ing
> me, please). Thanks again.


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Overloaded record fields for 8.2?

2017-01-30 Thread Adam Gundry
Hi Ben, devs,

I'd like to propose that we merge my latest ORF patch
(https://phabricator.haskell.org/D2708) for 8.2 (modulo any code review
improvements, of course). The corresponding proposal discussion
(https://github.com/ghc-proposals/ghc-proposals/pull/6) seems to have
reached a broad consensus. But it's not clear to me if some kind of
final approval from the GHC committee is needed, or how to obtain that.

This patch makes breaking changes to the experimental OverloadedLabels
extension, and I think it would be good to make those quickly rather
than having another release with the "old" version. Moreover, the work
has been in progress for a long time, so it would be great to see it
finally make it into a GHC release.

Any objections?

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


OverloadedRecordFields request for review

2016-11-16 Thread Adam Gundry
Hi devs,

I just wanted to draw your attention to the OverloadedRecordFields
proposal [1], which I've just updated and now has a corresponding
implementation ready for review on Phab [2]. I'd very much appreciate
any feedback on either the design or implementation.

Thanks,

Adam

[1] https://github.com/ghc-proposals/ghc-proposals/pull/6
[2] https://phabricator.haskell.org/D2708


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Can we offer ~ without GADTs or type families?

2016-08-05 Thread Adam Gundry
On 05/08/16 19:08, David Feuer wrote:
> It seems to me that equality constraints could potentially be supported
> by an implementation with neither GADTs nor type families. Type families
> don't really seem to have much to do with it, and GADTs are strictly
> heavier (GADTs ~= ExistentialQuantification + TypeEquality).
> 
> Could we get a separate LANGUAGE pragma just for equality constraints?

I suggested this in #10431 [1], where there is some discussion of the
implications. I still think it is a good idea, and I don't think the
implementation would be very difficult.

All the best,

Adam

[1] https://ghc.haskell.org/trac/ghc/ticket/10431


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Rank2Types example not typechecking w/ GHC8. Bug or feature?

2016-05-29 Thread Adam Gundry
Hi Michael,

I think that GHC 8 is correct to reject this program without
ImpredicativeTypes (and if you enable ImpredicativeTypes, all bets are
off). Previous versions incorrectly accepted such programs, because they
didn't look through type synonyms. If you expand the type of the second
createResolver, you get

((String -> forall state . State state -> IO String), Dummy)

which has a quantified type in the first argument of the pair (i.e. it
requires impredicativity). See ticket #10194 for more details.
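[A common workaround, added here in the spirit of the thread rather than taken from it: wrap the polymorphic function in a newtype so the forall hides behind a constructor and no impredicative instantiation is needed.]

```haskell
{-# LANGUAGE RankNTypes #-}

data State a = State a
data Dummy   = Dummy

-- Instead of a type synonym with a forall on the right-hand side,
-- use a newtype; Resolver then mentions no top-level forall and can
-- sit inside a tuple, Maybe, etc.
newtype Handler result = Handler (forall state. State state -> IO result)

type Resolver = String -> Handler String

createResolver :: (Resolver, Dummy)
createResolver = (\_event -> Handler (\_state -> return "result"), Dummy)

runHandler :: Handler result -> State state -> IO result
runHandler (Handler h) = h

main :: IO ()
main = runHandler (fst createResolver "event") (State ()) >>= putStrLn
```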

All the best,

Adam


On 29/05/16 17:47, Michael Karg wrote:
> Hi devs,
> 
> could you please have a look at the following code snippet (modeled
> after a real-world app of mine)? There's a rank2type involved, and it
> doesn't type-check anymore when the type is e.g. part of a tuple,
> whereas everything's fine when it's the "outermost" type.
> 
> With GHC7.10 both variants type-check. Could anyone shed some light on
> what's behind this? Is the way the types are used in the snippet
> considered dispreferred or wrong under GHC8?
> 
> Thanks for having a look and hopefully pointing me to a page/ticket/...
> providing insight,
> Michael
> 
> 
> 
> {-# LANGUAGE Rank2Types #-}
> 
> module TestTypes where
> 
> data State a= State a
> 
> data Dummy  = Dummy
> 
> type Handler result = forall state . State state -> IO result
> 
> type Resolver   = String -> Handler String
> 
> 
> eventRouter :: Resolver -> String -> IO ()
> eventRouter resolver event =
> resolver event state >> return ()
>   where
> state :: State ()
> state = undefined
> 
> {-
> -- does type check
> createResolver :: Resolver
> createResolver = \event state -> return "result"
> 
> processor :: IO ()
> processor =
> getLine >>= eventRouter resolver >> processor
>   where
> resolver = createResolver
> -}
> 
> 
> -- does not type check when the rank 2 type isn't the "outermost" one?
> createResolver :: (Resolver, Dummy)
> createResolver = (\event state -> return "result", Dummy)
> 
> processor :: IO ()
> processor =
> getLine >>= eventConsumer resolver >> processor
>   where
> (resolver, _) = createResolver
> 
> {-
> • Couldn't match type ‘t’ with ‘Resolver’
>   ‘t’ is a rigid type variable bound by
> the inferred type of resolver :: t at TestTypes.hs:41:5
>   Expected type: (t, Dummy)
> Actual type: (Resolver, Dummy)
> -}


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: VisibleTypeApplication and AllowAmbiguousTypes

2016-03-15 Thread Adam Gundry
On 15/03/16 14:53, Richard Eisenberg wrote:
> I'm happy to change the setting, but my logic was this: With
> -XTypeApplications, there are no ambiguous types. Of course, there
> may be types that are unambiguous locally but might be ambiguous in
> downstream modules that don't use -XTypeApplications, and this is
> what Andrew is (quite validly) getting at.

Yes. I think the key point here is that TypeApplications is typically
required at use sites, whereas AllowAmbiguousTypes is needed at
definition sites.

AllowAmbiguousTypes is a heuristic: ambiguity can arise without it, and
even before TypeApplications there were cases in which
AllowAmbiguousTypes was necessary to permit polymorphic definitions that
could be used unambiguously in some contexts. With TypeApplications, any
definition can be used given a suitable context, but ambiguity is still
a useful rule for deciding whether a definition is likely to lead to
type inference trouble.

TL;DR I agree that we should drop the implication.
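[A small illustration of the definition-site/use-site split — mine, not from the thread.]

```haskell
{-# LANGUAGE AllowAmbiguousTypes, ScopedTypeVariables, TypeApplications #-}

import Data.Proxy (Proxy (..))
import Data.Typeable (Typeable, typeRep)

-- Definition site: 'a' occurs only in the constraint, so this signature
-- is ambiguous and needs AllowAmbiguousTypes to be accepted.
typeName :: forall a. Typeable a => String
typeName = show (typeRep (Proxy @a))

-- Use site: a type application resolves the ambiguity; only
-- TypeApplications is needed here.
main :: IO ()
main = putStrLn (typeName @Int)
```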

All the best,

Adam


> Richard
> 
> On Mar 15, 2016, at 9:16 AM, Ben Gamari <b...@smart-cactus.org>
> wrote:
> 
>> Andrew Martin <andrew.thadd...@gmail.com> writes:
>> 
>>> I'm posting this because Richard said it would be the best place
>>> to raise this issue. I know it's a little bit late in the GHC8
>>> development process, but I would like to get the idea out there.
>>> 
>>> To my knowledge, in GHC8, turning on VisibleTypeApplication will
>>> also turn on AllowAmbiguousTypes. I think that a better behavior
>>> for most end users would be to not have this happen.
>>> 
>>> I need AllowAmbiguousTypes turned on in modules where I want to
>>> declare functions that are presently uncalled. At a call site
>>> though, I only need VisibleTypeApplication, not both extensions.
>>> The only reason I bring this up is because having
>>> AllowAmbiguousTypes on is usually bad. It makes it possible to
>>> write functions that don't work, but you don't learn that until 
>>> you get to the call site. If I write libraries that require 
>>> VisibleTypeApplication to call certain functions, I don't want
>>> users accidentally ending up turning on AllowAmbiguousTypes in
>>> their own code.
>>> 
>> It sounds reasonable to me to require that those defining functions
>> to be used with type application enable AllowAmbiguousTypes. In my
>> opinion we should avoid adding extension implications unless the
>> extensions in question are essentially always used together and
>> this doesn't appear to be true in the case of TypeApplications and 
>> AllowAmbiguousTypes.
>> 
>> Cheers, - Ben



-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: ORF and renamer

2016-02-23 Thread Adam Gundry
Hi Alan,

I certainly made a few changes to the renamer as part of the ORF work,
but I wouldn't expect them to cause the behaviour you describe. Name
resolution for ambiguous record selectors is deferred to the typechecker
when the DuplicateRecordFields extension is enabled, but unambiguous
selectors (as in your example) are resolved by the renamer. Moreover, I
didn't touch anything to do with uniques. The record selector will be in
scope once, with a single Name containing a single Unique.

Unfortunately, I can't reproduce the behaviour you describe with:

ghc-8.0.0.20160204 Field1.hs -ddump-rn-trace -dppr-debug

Can you describe in more detail how to reproduce the problem?

All the best,

Adam


On 23/02/16 13:35, Alan & Kim Zimmerman wrote:
> I am working on updating HaRe for GHC 8.0.1, and have hit an issue
> with the following file
> 
> ---
> module Field1 where
> 
> --Rename field name 'pointx' to 'pointx1'
> 
> data Point = Pt {pointx, pointy :: Float}
> 
> absPoint :: Point -> Float
> absPoint p = sqrt (pointx p * pointx p +
>   pointy p * pointy p)
> --
> 
> It seems that after the renamer, each of the three instances of
> `pointx` has a different `nameUnique` value.
> 
> Is this because the final resolution is now done during type checking?
>  If so it makes the RenamedSource much harder to work with.
> 
> Alan



-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Impredicative types and forall regression in 8.0 ?

2016-02-15 Thread Adam Gundry
Hi Vincent,

On 15/02/16 09:44, Vincent Hanquez wrote:
> I recently came across a compilation problem on 8.0 that do not trigger
> on < 8.0. I've reduced the issue to this snippet with Rank2Types:
> 
> type Action = (forall a . Show a => a) -> Int
> data Command = Command (Maybe Action)
> 
> com1 = Command Nothing
> 
> With GHC 7.0 to 7.10, this compile and works.

This shouldn't be accepted without ImpredicativeTypes[*], but GHC
versions prior to 8.0 failed to look through the type synonym. Note that
if you say

data Command = Command (Maybe ((forall a . Show a => a) -> Int))

then earlier versions report an error. Take a look at GHC ticket #10194
for more details: https://ghc.haskell.org/trac/ghc/ticket/10194

> However on GHC 8.0, I now have:
> 
> Test.hs:19:16: error:
> • Cannot instantiate unification variable ‘a0’
>   with a type involving foralls: Action
> GHC doesn't yet support impredicative polymorphism
> • In the first argument of ‘Command’, namely ‘Nothing’
>   In the expression: Command Nothing
>   In an equation for ‘com1’: com1 = Command Nothing
> 
> I workarounded it by "inlining" the maybe and creating a proper wrapping
> type:
> 
> data Action2 = Action2 ((forall a . Show a => a) -> Int) | NoAction2
> data Command2 = Command2 Action2
> 
> com2 = Command2 NoAction2

This looks like a reasonable alternative.

Hope this helps,

Adam

[*] And if you turn on ImpredicativeTypes, GHC may or may not do the
right thing.


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Feature status for GHC 8.0

2015-11-19 Thread Adam Gundry
Hi Ben,

On 17/11/15 15:22, Ben Gamari wrote:
>  * OverloadedRecordFields      Adam Gundry
>   Documentation needed
>   Full ORF not happening?

That's right. The full status is described on the wiki [1]. The short
story is that the DuplicateRecordFields and OverloadedLabels extensions
are implemented and I will document them, but that I don't expect to
have the constraint solver extensions ready for 8.0.

While I think about it, are there any Template Haskell experts out there
who could advise on #11103 [2]?

Thanks,

Adam

[1] https://ghc.haskell.org/trac/ghc/wiki/Records/OverloadedRecordFields
[2] https://ghc.haskell.org/trac/ghc/ticket/11103

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Status of OverloadedRecordFields [Was: Re: Proposal: Dot as Postfix Function Apply]

2015-11-05 Thread Adam Gundry
[copying ghc-devs as the following is perhaps more relevant there]

On 04/11/15 10:44, Jeremy wrote:
> Dot as Postfix Function Apply
> (https://ghc.haskell.org/trac/ghc/wiki/Records/DeclaredOverloadedRecordFields/DotPostfix)
> was originally part of ORF, but dropped to keep the scope/discussion more
> manageable. Now that ORF is in for GHC 8.0, I would like to re-propose dot
> postfix syntax.

I'm going to stay out of the discussion over the proposed syntax
extension, but I feel like I should do some expectation management about
ORF in GHC 8.0. As the wiki page [1] makes clear, ORF has been divided
into three components, and so far only one of these is in GHC HEAD.

The first part is a language extension, DuplicateRecordFields, which
allows field names to be duplicated provided they are unambiguous. (It
does not allow any kind of polymorphism over record fields.) This is
already in HEAD.

The second part is another language extension, OverloadedLabels, which
introduces a new syntax #x for an identifier whose type is inferred and
may depend on the text of the identifier. This is implemented and up on
Phabricator [2], but not yet in HEAD. It should make GHC 8.0, but I'm
wary about giving cast-iron guarantees.

The third part, which is necessary for the OverloadedLabels syntax to be
used for polymorphism over record fields, involves typeclass constraints
that capture when a type has a particular field, with the instances
provided automatically by GHC. This is not yet implemented in the
intended form, and is unlikely to make GHC 8.0 (barring me having a
sudden outbreak of free time...).
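[A tiny sketch of the first part, DuplicateRecordFields — my example, not from the original message.]

```haskell
{-# LANGUAGE DuplicateRecordFields #-}

-- Two records in one module may share the field name `name`.
data Person  = Person  { name :: String }
data Company = Company { name :: String }

-- Uses that are unambiguous (here, via the constructor pattern) are
-- resolved as before; no polymorphism over fields is involved.
personName :: Person -> String
personName Person { name = n } = n

main :: IO ()
main = putStrLn (personName (Person { name = "Ada" }))
```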

Adam


[1] https://ghc.haskell.org/trac/ghc/wiki/Records/OverloadedRecordFields
[2] https://phabricator.haskell.org/D1331

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: A process for reporting security-sensitive issues

2015-09-03 Thread Adam Gundry
On 03/09/15 08:22, Michael Smith wrote:
> I feel there should be some process for reporting security-sensitive issues
> in GHC -- for example, #9562 and #10826 in Trac. Perhaps something like the
> SensitiveTicketsPlugin [3] could be used?
> 
> [1] https://ghc.haskell.org/trac/ghc/ticket/9562
> [2] https://ghc.haskell.org/trac/ghc/ticket/10826
> [3] https://trac-hacks.org/wiki/SensitiveTicketsPlugin

Thanks for raising this. While I see where you are coming from, I'm
going to argue against it, because I think it creates a false impression
of the security guarantees GHC provides. Such a process may give the
impression that there are people directly tasked with handling such
security bugs, which is not currently the case.

I think it is unreasonable for the security of a system to depend on GHC
having no type soundness bugs, particularly since GHC is actively used
for developing experimental type system features. #9562 has been open
for a year and we don't have a good solution.

Relatedly, I think the Safe Haskell documentation should prominently
warn about the existence of #9562 and the possibility of other type
soundness bugs, like it does for compilation safety issues.

What do others think?

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: TypeChecker plugins and the TcS monad; given constraints in GHC HEAD

2015-05-29 Thread Adam Gundry
On 27/05/15 23:42, Simon Peyton Jones wrote:
 I like EvTerm as a distinct data type precisely because it
 enumerates all the forms of evidence we need.  Once you allow arbitrary
 CoreExprs, we lose that explicit enumeration. 
 
 Now, that might be OK.   Evidence terms are proof terms: any well typed
 EvTerm is a proof of the predicate described by its type.  So if a
 CoreExpr has the right type, then it **should** be the case that the
 evidence term is OK (provided it’s not bottom).  So I can’t quite nail
 down why I like the explicit EvTerm; except the relatively minor reason
 that it puts off desugaring to the desugarer.

I can see the appeal of the explicit enumeration of the evidence
generated by GHC itself, but it's obviously not modular. I don't have a
particularly strong opinion about whether the current special-case
constructors should be subsumed by the embedding; this might remove some
boilerplate, but we could also keep the embedding for plugins only. I'm
also open to Richard's idea that we use HsExpr Id instead of CoreExpr if
the former works out more easily.

Can you elaborate on your parenthetical remark "provided it’s not
bottom"? I was under the impression that EvTerms could be bottom (e.g.
`EvDelayedError`) without compromising type safety.


 On a separate matter, can we eliminate the trap-door from
 TcPluginM to TcM?  That is, can you enumerate all the things you need
 from TcPluginM?  The reason I ask is that I’d like to stop building TcS
 on top of TcM, and instead make it a free-standing monad.  Reason: we
 increasingly want to do constraint solving outside a full-blown
 typechecker.  Eg. when doing pattern-match exhaustiveness checks. 
 Faking up a TcM monad with lots of error thunks seems wrong.

It's convenient to have an escape hatch, but I think making TcS a
separate monad is a good idea and I wouldn't like plugins to hold this
up. I don't think plugins are likely to need anything not also needed by
TcS (arbitrary IO perhaps excepted, but hopefully that shouldn't be a
problem).

Beyond the changes in Phab:D909, the only thing I'm still using
unsafeTcPluginTcM for is newUnique (in order to generate new skolem
variables), which could easily be added to the TcPluginM interface as
well. Anyone need anything else?

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Proposed changes to typechecker plugins API

2015-05-28 Thread Adam Gundry
Hi Jan,

Sadly I think your problems are more to do with lack of documentation in
the GHC API in general, than the specific extension I'm proposing! Your
use case should be possible with the existing setup: the easiest way is
probably something like this (untested):

mkNonCanonical $ CtDerived (mkTcEqPred (mkTyVarTy tyVar) ty) loc
  where loc = ctev_loc (cc_ev c)

Here `c` should be the `Ct` you started with; reusing its `ctev_loc`
makes it appear as if the new derived constraint came from the same
location. Non-canonical constraints are those that have not yet been
processed by GHC's constraint solver, which is fine for constraints
generated by plugins.

Hope this helps,

Adam


On 28/05/15 13:35, Jan Bracker wrote:
 Hi Adam, Hi Eric,
 
 At least for what I want to use them for [1] it would be nice if there
 was an easy way to say:
 
 If you are stuck on solving Constraint a b TypeC, then you should
 pick, e.g.:
  - a ~ TypeA and b ~ TypeB (What I am actually trying to say is: use
 the instance Constraint TypeA TypeB TypeC)
  - a ~ b
 
 Currently I am producing equality constraints, like this:
 
 mkDerivedTypeEqCt :: TcTyVar -> TcType -> TcPluginM Ct
 mkDerivedTypeEqCt tyVar ty = do
   (_, lclEnv) <- getEnvs
   return $ CTyEqCan
     { cc_ev = CtDerived -- :: CtEvidence
         { ctev_pred = ty -- :: TcPredType
           -- This matches type-wise, but I have no idea what actually
           -- belongs here.
         , ctev_loc = mkGivenLoc topTcLevel (UnifyForAllSkol [tyVar] ty)
             lclEnv -- :: CtLoc
           -- Again no idea what actually belongs here:
           --   topTcLevel :: TcLevel
           -- To what does this relate? I guess top level
           -- is ok for equality constraints
           --   (UnifyForAllSkol [tyVar] ty) :: SkolemInfo
           -- This one matches what we have at disposal (no idea if it is
           -- the right one).
           --   lclEnv :: TcLclEnv
           -- I just use the only one I know.
         }
     , cc_tyvar = tyVar -- :: TcTyVar
     , cc_rhs = ty -- :: TcType
     , cc_eq_rel = NomEq -- :: EqRel
       -- Alternative would be ReprEq. Whats the difference?
     }
 
 Which seems to be working, but still leaves a lot of open questions (see
 comments).
 
 Maybe my problems using the API are more related to missing
 documentation, then lack of API functionality.
 
 Jan
 
 [1]: https://mail.haskell.org/pipermail/ghc-devs/2015-February/008414.html


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: [Haskell-cafe] Disambiguating a Num/RealFrac instance

2015-05-28 Thread Adam Gundry
On 28/05/15 18:28, Brandon Allbery wrote:
 On Tue, May 26, 2015 at 8:35 PM, amin...@gmail.com wrote:
 
 Is there any way (without IncoherentInstances or RebindableSyntax)
 that I can let the user write e.g. `giveGPA 4.0` (and `giveGPA 4`)
 and get back `F 4` without getting type errors that 4.0's type is
 ambiguous? I can guarantee there won't be any additional instances
 of ToGPA.
 
 
 A typeclass with only one instance is nonsensical, and often a symptom
 of trying to use typeclasses as OO classes. All it's doing here is
 hurting you.

Like Brandon, I suspect this probably isn't what you should do. But if
you *really* want to do it, this works:

{-# LANGUAGE ExtendedDefaultRules, FlexibleInstances #-}

default (Float)

data GPA = F Float | Excuse String

class ToGPA a where
  giveGPA :: a -> GPA

instance ToGPA Float where
  giveGPA = F

instance ToGPA String where
  giveGPA = Excuse

x = giveGPA 4
y = giveGPA 4.0
z = giveGPA "Hello"

For more information:
https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/interactive-evaluation.html#extended-default-rules

Hope this helps,

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Proposed changes to typechecker plugins API

2015-05-27 Thread Adam Gundry
Hi devs,

I thought I should flag up some proposed changes relating to typechecker
plugins, which Christiaan, Iavor and I have been discussing. The quick
summary:

 * make it possible for plugins to create constraints (Phab:D909);

 * make it easier for plugins to define special type families;

 * embed CoreExpr in EvTerm.

For more details, see the wiki page:
https://ghc.haskell.org/trac/ghc/wiki/Plugins/TypeChecker#Post-7.10changestoTcPluginMAPI

Questions/review/comments very welcome.

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Interactions between type-checker plugins

2015-05-19 Thread Adam Gundry
Hi Christiaan,

It may well be that we can improve the typechecker plugin interface. In
particular, I at least haven't devoted a great deal of thought to how
multiple plugins work together in practice (although theoretically it is
possible to compose constraint solvers, even establishing termination of
the composition is a bit fiddly).

The thing that surprises me about your email is that when your plugin A sees
[W] GCD 6 8 + x ~ x + GCD 9 6
it emits
[W] GCD 6 8 ~ GCD 9 6
without solving the original constraint, so we end up with
[W] GCD 6 8 + x ~ x + GCD 9 6
[W] GCD 6 8 ~ GCD 9 6
and proceed from there. What I'd expect is for the original constraint
to be declared as solved (because it follows from the new wanted). Then
plugin B will run and notice that the constraint is insoluble; there's
no need for plugin A to keep track of the relationship between the
constraints. Does that make sense? Why does plugin A need to remember
the relationship between constraints it has seen on previous iterations?

Hope this helps,

Adam




On 19/05/15 11:35, Christiaan Baaij wrote:
 Hi,
 
 My apologies for this rather lengthy email.
 
 I have a question about how type-checker plugins should interact.
 My situation is the following:
 I have two type-checker plugins, one that can solve equalities involving the 
 standard type-level operations on kind Nat (+,*,-,^), and another 
 type-checker plugin that can prove equalities involving a new type-level 
 operation GCD.
 In the following, the type-checker plugin involving the standard type-level 
 operations is called ‘A’, and the type-checker plugin involving the new GCD 
 operations is called ‘B’.
 
 When type-checker plugin A encounters:
 [W] GCD 6 8 + x ~ x + GCD 9 6
 
 It knows that (+) is commutative, so it can prove the above equality _given_ 
 that "GCD 6 8 ~ GCD 9 6" holds.
 So what type-checker plugin A does now is emits a new wanted constraints:
 [W] GCD 6 8 ~ GCD 9 6
 And remembers that it emitted this wanted constraint.
 In type-checker plugin lingo, it returns:
 TcPluginOk [] ["[W] GCD 6 8 ~ GCD 9 6"]
 
 Now, whenever type-checker plugin A encounters
 [W] GCD 6 8 + x ~ x + GCD 9 6
 again, it checks to see if the discharged constraint
 [W] GCD 6 8 ~ GCD 9 6
 is already solved, disproven, or unsolved.
 If the discharged constraint is solved, it will return:
 TcPluginOk ["[W] GCD 6 8 + x ~ x + GCD 9 6"] []
 If the discharged constraint is disproven, it returns:
 TcPluginContradiction ["[W] GCD 6 8 + x ~ x + GCD 9 6"]
 And otherwise, it doesn’t do anything:
 TcPluginOk [] []
 
 Now, type-checker plugin B, the one that knows about GCD, comes along.
 It sees:
 [W] GCD 6 8 + x ~ x + GCD 9 6
 [W] GCD 6 8 ~ GCD 9 6
 It doesn’t know anything about (+); but it does know about GCD, and clearly 
 sees that GCD 6 8 is not equal to GCD 9 6.
 So it tells that GCD 6 8 ~ GCD 9 6 is insoluble.
 In type-checker plugin lingo it says:
 TcPluginContradiction ["[W] GCD 6 8 ~ GCD 9 6"]
 
 According to 
 https://github.com/ghc/ghc/blob/228ddb95ee137e7cef02dcfe2521233892dd61e0/compiler/typecheck/TcInteract.hs#L547
 What happens next is that the insoluble constraint
 [W] GCD 6 8 ~ GCD 9 6
 is taken off the work list for all plugins.
 However! the same thing happens when a plugin is able to prove a constraint.
 That is, if B would have (erroneously) returned: 
 TcPluginOk ["[W] GCD 6 8 ~ GCD 9 6"] []
 
 Then there is _no_ way for type-checker plugin A to know what happened.
 Was its discharged constraint taken off the work-list because it was 
 insoluble, or because it was solved?
 This is a problem because to me it seems that the work list is the only way 
 that two type-checker plugins can communicate.
 
 I guess my question is: if not through the work list, how are type-checker 
 plugins supposed to communicate?
 Am I going about it all wrong in terms how I should be writing type-checker 
 plugins?
 Or, should the type of ‘TcPluginSolver’ 
 (https://downloads.haskell.org/~ghc/7.10.1/docs/html/libraries/ghc-7.10.1/TcRnTypes.html#t:TcPluginSolver)
  be updated to?:
 
 ```
 type TcPluginSolver =  [Ct] -- ^ Given constraints
  -> [Ct] -- ^ Derived constraints
  -> [Ct] -- ^ Wanted constraints
  -> [Ct] -- ^ Insoluble constraints
  -> TcPluginM TcPluginResult
 ```
 
 Best regards,
 
 Christiaan


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Building patches that change format of interface files

2015-04-15 Thread Adam Gundry
[oops, forgot to cc ghc-devs]

Hi Janek,

On 15/04/15 13:20, Jan Stolarek wrote:
 Once I compile GHC I use the "cd ghc; make 2" trick to re-compile any
changes. That however does
 not work if my code changes modify the format of interface files - the
resulting compiler is
 unusable. In fact the only way to get a working compiler after
interface format change that I am
 aware of is to rebuild everything from scratch. Needless to say, this
takes a lot of time. Is
 there a better way?

I hit this problem quite often as well, but unfortunately don't know of
a general solution other than rebuilding everything.


 I'd like to test D841 but if this means rebuilding everything then
I'll have to pass.

In the case of D841, the interface file changes are very minor
(specifically, a new tag is used when an empty closed type family is
compiled, but of course none of them exist yet). Thus you might be able
to get away without a full rebuild. (I think ./configure incorporates
some git revision data into the interface file version number, so
presumably you have to avoid that as well.)

All the best,

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Desugaring introduces

2015-02-21 Thread Adam Gundry
Hi Gabor,

Interesting! While in principle it is true that t1 is the desugaring of
t2, GHC does typechecking before desugaring (even with RebindableSyntax)
in the interests of generating error messages that reflect what the user
actually wrote. The typechecker probably should treat t1 and t2
identically, but in practice this may be difficult to ensure. In this
case, I suspect the typechecking rules for do-notation assume that (>>=)
has a more usual type.

The user's guide section on RebindableSyntax says

 In all cases (apart from arrow notation), the static semantics should
 be that of the desugared form, even if that is a little unexpected.

so on that basis you're probably justified in reporting this as a bug.
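
As a point of contrast (this sketch is mine, not from the thread): with a
rank-1 rebound (>>=), the do-notation and its manual desugaring typecheck
identically, which suggests it is specifically the rank-2 argument type in
Gabor's snippet that trips up the do-notation typechecking rules.

```haskell
{-# LANGUAGE RebindableSyntax #-}

-- RebindableSyntax implies NoImplicitPrelude, so import just what we use.
import Prelude (Bool, Char, IO, String, putStrLn, undefined, (==))

-- A rank-1 rebound (>>=): simply apply the continuation to the value.
(>>=) :: a -> (a -> b) -> b
a >>= f = f a

return :: a -> a
return a = a

fail :: String -> a
fail _ = undefined

-- Manual desugaring...
t1 :: Char
t1 = 'd' >>= (\_ -> 'k')

-- ...and the equivalent do-block: both typecheck without
-- ImpredicativeTypes here.
t2 :: Char
t2 = do _ <- 'd'
        'k'

main :: IO ()
main = putStrLn [t1, t2]   -- prints "kk"
```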

Hope this helps,

Adam


On 21/02/15 11:42, Gabor Greif wrote:
 Hi devs,
 
 before I file a bug, I'd like to double check on a strange desugaring
 behaviour with RankNTypes and RebindableSyntax.
 
 Here is the snippet
 {{{
 {-# LANGUAGE RankNTypes, RebindableSyntax #-}
 {-# LANGUAGE ImpredicativeTypes #-}
 
 import qualified Prelude as P
 
  (>>=) :: a -> ((forall b . b) -> c) -> c
  a >>= f = f P.undefined
 return a = a
 fail s = P.undefined
 
  t1 = 'd' >>= (\_ -> 'k')
 
  t2 = do _ <- 'd'
 'k'
 
 main = P.putStrLn [t1, t2]
 }}}
 
 Without ImpredicativeTypes I get this error:
 {{{
 rebindtest.hs:13:9:
 Cannot instantiate unification variable ‘t0’
 with a type involving foralls: forall b. b
   Perhaps you want ImpredicativeTypes
  In a stmt of a 'do' block: _ <- 'd'
  In the expression:
    do { _ <- 'd';
 'k' }
  In an equation for ‘t2’:
  t2
    = do { _ <- 'd';
   'k' }
 }}}
 
 t1 is supposed to be the desugaring of t2. Strangely t2 only compiles
 with ImpredicativeTypes. Why? Isn't desugaring a purely syntactic
 transformation (esp. with RebindableSyntax)?
 
 Any hints welcome!
 
 Cheers,
 
 Gabor


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Proposal: ValidateMonoLiterals - Initial bikeshed discussion

2015-02-09 Thread Adam Gundry
Hi Merijn,

Thanks for persevering with explaining things to me. :-)


On 09/02/15 09:47, Merijn Verstraaten wrote:
 On 6 Feb 2015, at 21:31, Adam Gundry a...@well-typed.com wrote: 
 What does "all monomorphic cases" mean? Is the idea that GHC would
 typecheck an expression involving a literal integer, determine that
 the occurrence had type Even, then evaluate the TH splice *after* 
 typechecking? Whereas if the occurrence had a non-ground type, it
 would do something else?
 
 Yes, Typed TH already runs *after* type checking, which is what
 allows you to do validation based on the resulting type. The main
 reason why I was only proposing to do this for monomorphic values is,
 because, how could you possible validate a polymorphic literal? Which
 validation function would you use?
 
 You could ban polymorphic literals, but that'd involve eliminating
 most uses of polymorphic Num functions (as I mentioned as another
 email), which would break so much code it would render the extension
 unusable in real code. I'm open to better ideas on how to tackle
 this, the main reason I started this discussion is because I don't
 really like this "polymorphic literals fail at compile time" thing
 either. I just don't see how to solve it without going all dependent
 types on the problem.

In the absence of a coherent story for polymorphism, I think the right
thing to do is to be able to specify a particular validator, rather than
try to have type inference determine a monomorphic type and otherwise
get stuck...


 None of this is particularly persuasive, I'm afraid. Is it
 worthwhile introducing something new just to avoid having to write
 a quasi quote?
 
 Actually, I would be mildly ok with quasi quoters, BUT there
 currently is no Typed TH quasi quoter (as mentioned on the wiki
 page), additionally, such a quoter does not have access to Lift
 instances for all but a handful of datatypes until we have a more
 comprehensive way to generate Lift instances. I think both of these
 points are also highly relevant for this dicussion.

...so is the right solution to introduce Typed TH quasiquoters for
expressions? Sorry, I presumed such a thing existed, as Typed TH is
rather regrettably underdocumented. Is there any particular difficulty
with them, or is it just a Small Matter of Programming?

I think the lack of Lift instances is a separate problem; while it looks
like 7.10 will be better in this respect and dataToExpQ goes a fair way,
I agree that making them easier to generate would be nice. Perhaps a
generics-based default method combined with DeriveAnyClass would make
deriving Lift possible?

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Question about implementing `Typeable` (with kinds)

2015-02-09 Thread Adam Gundry
On 09/02/15 13:20, Simon Peyton Jones wrote:
 I think just add a new constructor to EvTerm.
 
 Yes, it’s special-purpose, but the **solver* *is special-purpose too. 
  And it does mean that we know exactly what forms of evidence we can
 generate!

Is there any problem in principle with allowing arbitrary HsExprs inside
an EvTerm? Hopefully that would subsume the type-lits, Typeable and
possible future cases. The grand plan with typechecker plugins is to
make it possible to implement such special-purpose constraint solvers
outside GHC itself; at the moment we can do that for equality
constraints, but not very easily for other sorts of constraints (like
Typeable).

Adam


 
 *From:*Iavor Diatchki [mailto:iavor.diatc...@gmail.com]
 *Sent:* 07 February 2015 20:11
 *To:* Simon Peyton Jones; ghc-devs@haskell.org
 *Subject:* Question about implementing `Typeable` (with kinds)
 
  
 
 Hello,
 
  
 
 I started adding custom solving for `Typeable` constraints, to work
 around the problem where kind parameters were missing from the
 representation of types. 
 
  
 
 The idea is as follows:
 
  
 
   1. Add a new field to `TypeRep` that remembers _kind_ parameters:
 
  
 
   TypeRep Fingerprint TyCon [TypeRep]{-kinds-} [TypeRep]{-types-}
 
  
 
   2. Modify the constraint solver, to solve constraints like this:
 
  - Kind-polymorphic type constructors don't get `Typeable` instances
 on their own
 
  - GHC can solve `Typeable` constraints on _concrete uses_ of
 polymorphic type constructors.
 
   More precisely, GHC can solve constraints of the form `Typeable k
 (TC @ ks)`, as long as:
 
(1) `k` is not a forall kind,
 
(2) the `ks` are all concrete kinds (i.e., they have no free kind
 variables).
 
  
 
 This all seems fairly straight-forward, but I got stuck on the actual
 implementation, in particular:
 
  
 
 *what `EvTerm` should I use when discharging a `Typeable` constraint?*
 
  
 
 I can create an `HsSyn` value for the required method (i.e., a
 function of type `Proxy# t -> TypeRep`).
 
 I can also cast this into a `Typeable` dictionary value.
 
 The issue is that I am left with an `HsSyn` expression, and not an `EvTerm`.
 
  
 
 So is there a way to treat an arbitrary expression as an `EvTerm`?
 
  
 
 In the implementation of the type-lits, I just added custom evidence,
 but this does not scale well (also, in that case the evidence is just a
 simple value, while here
 
 it is a bit more complex).
 
  
 
 Suggestions would be most appreciated!
 
  
 
 -Iavor


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Proposal: ValidateMonoLiterals - Initial bikeshed discussion

2015-02-06 Thread Adam Gundry
If we go for a separate syntax, what do we gain over normal quasiquotes
or $$(validate x)? Sure, literals could be made a little more beautiful,
but I'm not sure it justifies stealing more syntax (and what would that
syntax be?).

Without a separate syntax, I'm not sure I understand the original
proposal. The wiki page says GHC would replace
fromString/fromInteger/fromList expressions originating from literals
with a Typed TH splice along the lines of validate for all monomorphic
cases. What does all monomorphic cases mean? Is the idea what GHC
would typecheck an expression involving a literal integer, determine
that the occurrence had type Even, then evaluate the TH splice *after*
typechecking? Whereas if the occurrence had a non-ground type, it would
do something else?

None of this is particularly persuasive, I'm afraid. Is it worthwhile
introducing something new just to avoid having to write a quasiquote?

I *am* attracted to the idea of indexed classes in place of IsString/Num

  class KnownSymbol s => IsIndexedString (a :: *) (s :: Symbol) where
fromIndexedString :: a

  class KnownInteger i => IsIndexedInteger (a :: *) (i :: Integer) where
 fromIndexedInteger :: a

These have a smooth upgrade path from the existing class instances via

default fromIndexedString :: (KnownSymbol s, IsString a) => a
fromIndexedString = fromString (symbolVal (Proxy :: Proxy s))

default fromIndexedInteger :: (KnownInteger i, Num a) => a
fromIndexedInteger = fromInteger (integerVal (Proxy :: Proxy i))

and other instances can take advantage of the additional type
information. Perhaps we need to bring Dependent Haskell a bit closer
before this is justifiable...
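
As an illustrative sketch (mine, not from the thread): the Symbol-indexed
half of the idea already runs on current GHC if we pass the index via a
Proxy argument instead of relying on ambiguous types; the Integer-indexed
half is omitted since `KnownInteger`/`integerVal` are themselves part of
the proposal and do not exist yet.

```haskell
{-# LANGUAGE DataKinds, KindSignatures, MultiParamTypeClasses,
             FlexibleInstances #-}

import Data.Proxy   (Proxy (..))
import GHC.TypeLits (KnownSymbol, Symbol, symbolVal)

-- Indexed analogue of IsString: the instance sees the literal's contents
-- at the type level, so it can validate or transform them per type.
class KnownSymbol s => IsIndexedString a (s :: Symbol) where
  fromIndexedString :: Proxy s -> a

-- The "smooth upgrade path": any String-like type can recover the
-- literal's value with symbolVal.
instance KnownSymbol s => IsIndexedString String s where
  fromIndexedString = symbolVal

main :: IO ()
main = putStrLn (fromIndexedString (Proxy :: Proxy "hello"))
```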

Adam


On 06/02/15 17:24, Dan Doel wrote:
 Assuming a separate syntax, I believe that the criterion would be as
 simple as ensuring that no ValidateFoo constraints are left outstanding.
 The syntax would add the relevant validate call, and type variables
 involved in a ValidateFoo constraint would not be generalizable, and
 would have to be defaulted or inferred from elsewhere, similar to the
 monomorphism restriction. I'm not sure how difficult that would be to
 implement.
 
 I'm not terribly gung ho on this, though. It feels very ad hoc. Making
 validation vs. non-validation syntactic rather than just based on
 polymorphism seems somewhat less so, though; so I'd prefer that
 direction. Finding unused syntax is always a problem, of course.


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-27 Thread Adam Gundry
On 27/01/15 09:16, Simon Peyton Jones wrote:
 Adam, are you willing to update the wiki page to reflect the latest state of 
 the conversation, identifying remaining choices? That would be v helpful.

I'm on it now. It'll take a little while because I'm merging plans A and
B into a single coherent story.

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-27 Thread Adam Gundry
Yes, we can't make IV the magic class for which instances are generated.
As I pointed out earlier in the thread, we need to give an instance for
the function space that enforces the functional dependency (either with
an actual fundep or a type family), and keep a distinguished HasField
class. AFAICS it's still an open question as to whether that instance
should provide

(a) selector functions r - a
(b) lenses (a - f b) - s - f t
(c) both
(d) neither

but I'm starting to think (b) is the sanest option.
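
A minimal sketch of the functional-dependency point (my illustration;
`HasField`/`getField` here are placeholder names, not the proposed API):
with `name r -> a` the record type determines the field type, so a
composition of accessors can be inferred, which is exactly what the bare
IV-style instances below lose.

```haskell
{-# LANGUAGE DataKinds, KindSignatures, MultiParamTypeClasses,
             FunctionalDependencies, FlexibleInstances #-}

import Data.Proxy   (Proxy (..))
import GHC.TypeLits (Symbol)

-- The fundep "name r -> a" is the crucial part: knowing the field name
-- and the record type pins down the field type during inference.
class HasField (name :: Symbol) r a | name r -> a where
  getField :: Proxy name -> r -> a

data S = MkS { sFoo :: Bool }
data R = MkR { rBar :: S }

instance HasField "foo" S Bool where
  getField _ = sFoo

instance HasField "bar" R S where
  getField _ = rBar

-- The intermediate type S is determined by the fundep, so this composes
-- without any ambiguity or extra annotations on the middle type.
composed :: R -> Bool
composed = getField (Proxy :: Proxy "foo") . getField (Proxy :: Proxy "bar")

main :: IO ()
main = print (composed (MkR (MkS True)))
```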

Otherwise, I think we've more or less converged on the issues (apart
from the syntax question) and I'll update the wiki page appropriately.

On the syntax question, Edward, could you say more about how you would
expect the magic imports to work? If a module both declares (or imports)
a record field `x` and magically imports `x`, what does a use of `x`
mean? (In the original ORF, we didn't have the magic module, but just
said that record fields were automatically polymorphic... that works but
is a bit fiddly in the renamer, and isn't a conservative extension.)

Adam


On 27/01/15 00:59, Edward Kmett wrote:
 I'm also rather worried, looking over the IV proposal, that it just
 doesn't actually work.
 
 We actually tried the code under Haskell 98 records back when Gundry
 first started his proposal and it fell apart when you went to compose them.
 
 A fundep/class associated type in the class is a stronger constraint
 that a type equality defined on an individual instance.
 
 I don't see how
 
 @foo . @bar . @baz
 
 (or #foo . #bar . #baz as would be written under the concrete proposal
 on the wiki) 
 
 is ever supposed to figure out the intermediate types when working
 polymorphically in the data type involved. 
 
 What happens when the type of that chain of accessors is left to
 inference? You get stuck wallowing in AllowAmbiguousTypes territory:
 
 (#foo . #bar . #baz) :: (IV "foo" (c -> d), IV "bar" (b -> c), IV "baz"
 (a -> b)) => a -> d
 
 has variables 'b' and 'c' that don't occur on the right hand side, and
 which are only determinable by knowing that the instances you expect to
 see look something like:
 
 instance (a ~ Bool) => IV "x" (S -> a) where
   iv (MkS x) = x
 
 but that is too weak to figure out that S determines a unless S is
 already known, even if we just limit ourselves to field accessors as
 functions.
 
 -Edward


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-27 Thread Adam Gundry
On 27/01/15 09:19, Adam Gundry wrote:
 On 27/01/15 09:16, Simon Peyton Jones wrote:
 Adam, are you willing to update the wiki page to reflect the latest state of 
 the conversation, identifying remaining choices? That would be v helpful.
 
 I'm on it now. It'll take a little while because I'm merging plans A and
 B into a single coherent story.

Done. As I understand it, the key remaining choices (flagged up with the
phrase "Design question") are:

1. What are the IV instances provided in base? These could give selector
functions, lenses, both or neither.

2. How do we identify implicit values? Either we have a syntactic cue
(like `#` or `@`) or we do some magic in the renamer.

  - If the former, are normal unambiguous record selectors available as
well? Or do we allow/generate definitions like x = #x, as Neil suggests?

  - If the latter, what happens when a record field and an implicit
value are both in scope?

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-23 Thread Adam Gundry
Thanks for the feedback, Iavor!

On 23/01/15 19:30, Iavor Diatchki wrote:
 2. I would propose that we simplify things further, and provide just one
 class for overloading:
 
 class Field (name :: Symbol)
 rec   rec'
 field field'
   | name rec -> field
   , name rec' -> field'
   , name rec  field' -> rec'
   , name rec' field  -> rec
   where
   field :: Functor f => Proxy name -> (field -> f field') ->
   (rec   -> f rec')
 
 I don't think we need to go into lenses at all, the `field` method
 simply provides a functorial
 update function similar to `mapM`.   Of course, one could use the `lens`
 library to
 get more functionality but this is entirely up to the programmer.
 
 When the ORF extension is enabled, GHC should simply generate an
 instance of the class,
 in a similar way to what the lens library does.

There's something to be said for the simplicity of this approach,
provided we're happy to commit to this representation of lenses. I'm
attracted to the extra flexibility of the IsRecordField class -- I just
noticed that it effectively gives us a syntax for identifier-like Symbol
singletons, which could be useful in completely different contexts --
and I'd like to understand the real costs of the additional complexity
it imposes.


 3. I like the idea of `#x` desugaring into `field (Proxy :: Proxy x)`,
 but I don't like the concrete symbol choice:
   - # is a valid operator and a bunch of libraries use it, so it won't
 be compatible with existing code.

Ah. I didn't realise that, but assumed it was safe behind -XMagicHash.
Yes, that's no good.


   - @x might be a better choice; then you could write things like:
 view @x rec
 set  @x 3 rec
 over @x (+2) rec

This could work, though it has the downside that we've been informally
using @ for explicit type application for a long time! Does anyone know
what the status of the proposed ExplicitTypeApplication extension is?


   - another nice idea (due to Eric Mertens, aka glguy), which allows us
 to avoid additional special syntax is as follows:
 - instead of using special syntax, reuse the module system
 - designate a magic module name (e.g., GHC.Records)
 - when the renamer sees a name imported from that module, it
 resolves the name by desugaring it into whatever we want
 - For example, if `GHC.Records.x` desugars into `field (Proxy ::
 Proxy x)`, we could write things like this:
 
 import GHC.Records as R
 
 view R.x  rec
 set  R.x 3 rec
 over R.x (+2) rec

Interesting; I think Edward suggested something similar earlier in this
thread. Avoiding a special syntax is a definite advantage, but the need
for a qualified name makes composing the resulting lenses a bit tiresome
(R.x.R.y.R.z or R.x . R.y . R.z). I suppose one could do

import GHC.Records (x, y, z)
import MyModule hiding (x, y, z)

but having to manually hide the selector functions and bring into scope
the lenses is also annoying.

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-23 Thread Adam Gundry
On 23/01/15 10:17, Simon Marlow wrote:
 On 23/01/2015 04:12, Johan Tibell wrote:


  On Wed, Jan 21, 2015 at 5:48 PM, Simon Marlow marlo...@gmail.com wrote:

 On 21/01/2015 16:01, Johan Tibell wrote:

 My thoughts mostly mirror those of Adam and Edward.

 1) I want something that is backwards compatible.


 Backwards compatible in what sense?  Extension flags provide
 backwards compatibility, because you just don't turn on the
 extension until you want to use it.  That's how all the other
 extensions work; most of them change syntax in some way or other
 that breaks existing code.


 In this case in the sense of avoiding splitting code into a new-Haskell
 vs old-Haskell. This means that existing records should work well (and
 ideally also get the improved name resolution when used in call sites
 that have the pragma enabled) in the new record system.
 
 I understand that position, but it does impose some pretty big
 constraints, which may mean the design has to make some compromises.
 It's probably not worth discussing this tradeoff until there's actually
 a concrete proposal so that we can quantify how much old code would fail
 to compile and the cost of any compromises.

In this spirit, I've started to prepare a concrete proposal for a
revised OverloadedRecordFields design, based on recent feedback:

https://ghc.haskell.org/trac/ghc/wiki/Records/OverloadedRecordFields/Redesign

This would not necessarily include anonymous records at first, but they
do fit nicely as a potential later extension, and it would work well
with a slightly amended version of the record library in the meantime.
I'd be very interested to hear what you think of this.

Also, if someone would be prepared to flesh out a proposal based on the
anonymous records idea, that might be a useful point of comparison.

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-22 Thread Adam Gundry
On 21/01/15 21:51, Simon Marlow wrote:
 Adam, do you have any measurements for how much code gets generated for
 a record declaration with ORF, compared with a record declaration in GHC
 right now?  That's one thing that has been a nagging worry for me with
 ORF, but I just don't have any idea if it's a real problem or not.

Yes, that was something that was a bit unsatisfying about the original
implementation, though unfortunately I don't have hard numbers comparing
the relative code sizes. But Simon PJ and I have realised that we can be
much more efficient: the only things that need to be generated for
record declarations are selector functions (as at present) and updater
functions (one per field, per type). Everything else (typeclass dfuns,
type family axioms) can be made up on-the-fly in the typechecker. So I
don't think it will make a practical difference.


 Under Nikita's proposal, zero code is generated for a record
 declaration (since there isn't one), and the lenses are tiny
 expressions too.  There's some boilerplate in the library, but it's
 generated once and for all, and isn't that huge anyway.  The
 lightweightness of it from a code-size standpoint is very attractive.

Agreed. I'm coming to see how much of a virtue it is to be lightweight!
I'm a bit worried, however, that if we want

newtype T = MkT {| foo :: Int |}
x = view [l|foo|] (MkT 42) -- whatever our syntax for [l|...|] is

to be well-typed, we need an instance for FieldOwner "foo" Int T to be
generated somewhere (perhaps via GND), so I suspect the code generation
cost for non-anonymous overloaded records is about the same as with ORF.
Nikita's proposal would let you choose whether to pay that cost at
declaration time, which has advantages and disadvantages, of course.

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-22 Thread Adam Gundry
On 22/01/15 05:38, Edward Kmett wrote:
 On Wed, Jan 21, 2015 at 4:34 PM, Adam Gundry wrote:
 
 I'm surprised and interested that you view this as a major source of
 complexity. From my point of view, I quite liked how the ability to
 overload fields as both selector functions and arbitrary lenses turned
 out. Compared to some of the hairy GHC internal details relating to name
 resolution, it feels really quite straightforward. Also, I've recently
 worked out how to simplify and generalise it somewhat (see [1] and [2]
 if you're curious).
 
 
 I'm actually reasonably happy with the design we wound up with, but the
 need to mangle all the uses of the accessor with a combinator to extract
 from the data type is a perpetual tax paid, that by giving in and
 picking a lens type and not having to _also_ provide a normal field
 accessor we could avoid.

That's a fair point, at least provided one is happy to work with the
canonical lens type we choose, because all others will require a
combinator. ;-)

Actually, the simplifications I recently came up with could allow us to
make uses of the field work as van Laarhoven lenses, other lenses *and*
selector functions. In practice, however, I suspect this might lead to
somewhat confusing error messages, so it might not be desirable.

Adam


 [1] 
 https://github.com/adamgundry/records-prototype/blob/master/NewPrototype.hs
 [2] 
 https://github.com/adamgundry/records-prototype/blob/master/CoherentPrototype.hs

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-21 Thread Adam Gundry
[I should say, in case it wasn't clear from my previous email, that I'm
impressed by Nikita's work, excited to see this discussion revived, and
very keen to find the best solution, whether that builds on ORF or not.
Anyway, back down the rabbit hole...]


On 21/01/15 16:48, Simon Marlow wrote:
 On 21/01/2015 16:01, Johan Tibell wrote:
 My thoughts mostly mirror those of Adam and Edward.

 1) I want something that is backwards compatible.
 
 Backwards compatible in what sense?  Extension flags provide backwards
 compatibility, because you just don't turn on the extension until you
 want to use it.  That's how all the other extensions work; most of them
 change syntax in some way or other that breaks existing code.

Well, it's nice if turning on an extension flag doesn't break existing
code (as far as possible, and stolen keywords etc. excepted). In
ORF-as-is, this is mostly true, except for corner cases involving
higher-rank fields or very polymorphic code. I think it's something to
aim for in any records design, anonymous or not.


 4) There are issues with strictness and unpacking.
 
 Yes - probably the major drawbacks, along with the lack of type-changing
 updates.

Is there any reason why Nikita's proposal couldn't be extended to
support type-changing update, just like ORF? Though the cases in which
it can and cannot apply are inevitably subtle.

Also, I'd add fields with higher-rank types to the list of missing
features. From a user's perspective, it might seem a bit odd if

data T = MkT { foo :: forall a . a }

was fine but

data T = MkT {| foo :: forall a . a |}

would not be a valid declaration. (Of course, ORF can't overload foo
either, and maybe this is an inevitable wart.)


 5) I don't know if I want to commit the *language* to a particular lens
 type.

Agreed, but I don't think this need be an issue for either proposal. We
know from ORF that we can make fields sufficiently polymorphic to be
treated as selector functions or arbitrary types of lenses (though
treating them as van Laarhoven lenses requires either some clever
typeclass trickery in the base library, or using a combinator to make a
field into a lens at use sites).
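[Editorial note: for readers unfamiliar with the terminology, here is a minimal sketch of the van Laarhoven encoding and the kind of use-site combinator alluded to above. The names (`fieldLens`, `nameL`, etc.) are invented for illustration; this is not ORF's actual API.]

```haskell
{-# LANGUAGE RankNTypes #-}
import Data.Functor.Const (Const(..))
import Data.Functor.Identity (Identity(..))

-- A (simple) van Laarhoven lens: one functor-polymorphic function
-- subsumes both reading and updating a field.
type Lens s a = forall f. Functor f => (a -> f a) -> s -> f s

-- Hypothetical use-site combinator: build a lens from a field's
-- getter and setter (roughly what "make a field into a lens" means).
fieldLens :: (s -> a) -> (s -> a -> s) -> Lens s a
fieldLens get set k s = set s <$> k (get s)

data Person = Person { _name :: String } deriving (Show, Eq)

nameL :: Lens Person String
nameL = fieldLens _name (\p n -> p { _name = n })

view :: Lens s a -> s -> a
view l = getConst . l Const                -- read via the Const functor

over :: Lens s a -> (a -> a) -> s -> s
over l f = runIdentity . l (Identity . f)  -- update via Identity
```

For example, `view nameL (Person "Ada")` yields `"Ada"`, and `over nameL (++ "!") (Person "Ada")` yields `Person "Ada!"`.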


Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-21 Thread Adam Gundry
On 21/01/15 18:14, Edward Kmett wrote:
  5) I don't know if I want to commit the *language* to a particular lens
  type.
 
 Agreed, but I don't think this need be an issue for either proposal. We
 know from ORF that we can make fields sufficiently polymorphic to be
 treated as selector functions or arbitrary types of lenses (though
 treating them as van Laarhoven lenses requires either some clever
 typeclass trickery in the base library, or using a combinator to make a
 field into a lens at use sites).
 
 
 Admittedly that has also been the source of 90% of the complexity of the
 ORF proposal. There we _had_ to do this in order to support the use as a
 regular function.

I'm surprised and interested that you view this as a major source of
complexity. From my point of view, I quite liked how the ability to
overload fields as both selector functions and arbitrary lenses turned
out. Compared to some of the hairy GHC internal details relating to name
resolution, it feels really quite straightforward. Also, I've recently
worked out how to simplify and generalise it somewhat (see [1] and [2]
if you're curious).


 There is a large design space here, and the main thing Nikita's proposal
 brings to the table is slightly different assumptions about what such
 records should mean. This _could_ let us shed some of the rougher edges
 of ORF, at the price of having to lock in a notion of lenses.

Yes. It's good to explore the options. For what it's worth, I'm
sceptical about blessing a particular notion of lenses unless it's
really necessary, but I'm prepared to be convinced otherwise.


 I'm on the fence about whether it would be a good idea to burden
 Nikita's proposal in the same fashion, but I think it'd be wise to
 explore it in both fashions. My gut feeling though is that if we tie it
 up with the same restrictions as ORF you just mostly get a less useful
 ORF with anonymous record sugar thrown in.

I think there's a sensible story to tell about an incremental plan that
starts with something like ORF and ends up with something like Nikita's
anonymous records. I'll try to tell this story when I can rub a few more
braincells together...

Adam

[1]
https://github.com/adamgundry/records-prototype/blob/master/NewPrototype.hs
[2]
https://github.com/adamgundry/records-prototype/blob/master/CoherentPrototype.hs

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: GHC support for the new record package

2015-01-21 Thread Adam Gundry
As someone with quite a lot of skin in this game, I thought it might be
useful to give my perspective on how this relates to ORF. Apologies that
this drags on a bit...

Both approaches use essentially the same mechanism for resolving
overloaded field names (typeclasses indexed by type-level strings,
called Has/Upd or FieldOwner). ORF allows fields to be both selectors
and various types of lenses, whereas the record library always makes
them van Laarhoven lenses, but this isn't really a fundamental difference.
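[Editorial note: to make the shared mechanism concrete, here is a deliberately simplified illustration of a selection class indexed by a type-level string. This is not the actual ORF or record-library code; the real designs also handle update and type-changing fields, and the instances would be generated rather than hand-written.]

```haskell
{-# LANGUAGE DataKinds, KindSignatures, MultiParamTypeClasses,
             FunctionalDependencies, FlexibleInstances #-}
import Data.Proxy (Proxy(..))
import GHC.TypeLits (Symbol)

-- A field-selection class indexed by the field's name as a Symbol.
class Has (name :: Symbol) r a | name r -> a where
  getFld :: Proxy name -> r -> a

data Person = Person { personName :: String, personAge :: Int }

-- One instance per field, generated by the compiler in the real designs:
instance Has "name" Person String where
  getFld _ = personName

instance Has "age" Person Int where
  getFld _ = personAge
```

With this in scope, `getFld (Proxy :: Proxy "age") (Person "Ada" 36)` evaluates to `36`, with the functional dependency determining the result type from the field name and record type.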

The crucial difference is that ORF adds no new syntax, and solves
Has/Upd constraints for existing datatypes, whereas the record library
adds a new syntax for anonymous records and their fields that is
completely separate from existing datatypes, and solves FieldOwner
constraints only for these anonymous records (well, their desugaring).

On the one hand, anonymous records are a very desirable feature, and in
some ways making them separate is a nice simplification. However, they
are not as expressive as the existing Haskell record datatypes (sums,
strict/unpacked fields, higher-rank fields), and having two records
mechanisms is a little unsatisfying. Do we really want to distinguish

data Foo = MkFoo { bar :: Int, baz :: Bool }
data Foo = MkFoo {| bar :: Int, baz :: Bool |}

(where the first is the traditional approach, and the second is a
single-argument constructor taking an anonymous record in Edward's
proposed syntax)?

It might be nice to have a syntactic distinction between record fields
and normal functions (the [l|...|] in the record library), because it
makes name resolution much simpler. ORF was going down the route of
adding no syntax, so name resolution becomes more complex, but we could
revisit that decision and perhaps make ORF simpler. But I don't know
what the syntax should be.

I would note that if we go ahead with ORF, the record library could
potentially take advantage of it (working with ORF's Has/Upd classes
instead of defining its own FieldOwner class). Then we could
subsequently add anonymous records to GHC if there is enough interest
and implementation effort. However, I don't want to limit the
discussion: if there's consensus that ORF is not the right approach,
then I'm happy to let it go the way of all the earth. ;-)

(Regarding the status of ORF, Simon PJ and I had a useful meeting last
week where we identified a plan for getting it back on track, including
some key simplifications to the sticking points in the implementation.
So there might be some hope for getting it in after all.)

Adam


On 20/01/15 21:44, Simon Marlow wrote:
 For those who haven't seen this, Nikita Volkov proposed a new approach
 to anonymous records, which can be found in the record package on
 Hackage: http://hackage.haskell.org/package/record
 
 It had a *lot* of attention on Reddit:
 http://nikita-volkov.github.io/record/
 
 Now, the solution is very nice and lightweight, but because it is
 implemented outside GHC it relies on quasi-quotation (amazing that it
 can be done at all!).  It has some limitations because it needs to parse
 Haskell syntax, and Haskell is big.  So we could make this a lot
 smoother, both for the implementation and the user, by directly
 supporting anonymous record syntax in GHC.  Obviously we'd have to move
 the library code into base too.
 
 This message is by way of kicking off the discussion, since nobody else
 seems to have done so yet.  Can we agree that this is the right thing
 and should be directly supported by GHC?  At this point we'd be aiming
 for 7.12.
 
 Who is interested in working on this?  Nikita?
 
 There are various design decisions to think about.  For example, when
 the quasi-quote brackets are removed, the syntax will conflict with the
 existing record syntax.  The syntax ends up being similar to Simon's
 2003 proposal
 http://research.microsoft.com/en-us/um/people/simonpj/Haskell/records.html
 (there are major differences though, notably the use of lenses for
 selection and update).
 
 I created a template wiki page:
 https://ghc.haskell.org/trac/ghc/wiki/Records/Volkov
 
 Cheers,
 Simon


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: [Diffusion] [Build Failed] rGHC726ea08a6e58: Amend TcPluginM interface

2014-12-18 Thread Adam Gundry
On 18/12/14 09:25, Phabricator wrote:
 Harbormaster failed to build B2694: rGHC726ea08a6e58: Amend TcPluginM interface!

Sigh. It seems perf/compiler T3294 is failing on Harbormaster, but it
validated fine locally on x86_64. I doubt my commit is responsible, and
suspect this is just a wobbly test, but it's hard to know without the
full build log.

Is there a way to get full logs for failing Harbormaster builds? They
seem to be uploaded only if the build succeeds.

Thanks,

Adam


 USERS
   adamgundry (Author)
   GHC - Driver (Auditor)
   GHC - Type checker/inferencer (Auditor)
 
 COMMIT
   https://phabricator.haskell.org/rGHC726ea08a6e58
 
 EMAIL PREFERENCES
   https://phabricator.haskell.org/settings/panel/emailpreferences/
 
 To: adamgundry, GHC - Driver, GHC - Type checker/inferencer


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Typechecker plugins and 7.10

2014-12-12 Thread Adam Gundry
Hi Austin, devs,

I'm not sure what stage the 7.10 branch split and RC have got to, but if
possible I'd like to get Phab:D553 included (my special pleading is that
it makes relatively small, self-contained changes that will make it
slightly harder to shoot oneself in the foot when writing a plugin). I
realise you have to say no at some point though!

More generally, at Richard's suggestion I propose that (at least for
7.10) we explicitly make no guarantees about the stability of the
TcPluginM API, and advertise that we may choose to make breaking changes
between minor GHC releases. After all, this feature is highly
experimental and tightly coupled to GHC itself.

All the best,

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Serialising evidence generated by typechecker plugins

2014-12-11 Thread Adam Gundry
Hi folks,

I've just discovered tcIfaceCoAxiomRule, which is used when typechecking
a coercion from an interface file. It turns out that CoAxiomRules are
represented in interface files by their names, so tcIfaceCoAxiomRule
looks up this name in a map containing all the built-in typeNatCoAxiomRules.

Unfortunately, this lookup fails when a plugin has defined its own
CoAxiomRule (as both uom-plugin and type-nat-solver do)! This means that
if a module uses a plugin and exports some of the evidence generated via
an unfolding, importing the module may result in a tcIfaceCoAxiomRule panic.

At the moment, both plugins generate fake CoAxiomRules that can prove
the equality of any types, so one workaround would be to expose this
ability in the TcCoercion type (i.e. add the equivalent of UnivCo). In
the future, however, it would be nice if plugins could actually generate
bona fide evidence based on their own axioms (e.g. the abelian group
laws, for uom-plugin).

We can't currently serialise CoAxiomRule directly, because it contains a
function in the coaxrProves field. Could we support an alternative
first-order representation that could be serialised? This probably
wouldn't be as expressive, in particular it might not cover the built-in
axioms that define type-level comparison functions and arithmetic
operators, but it would allow plugins to axiomatize algebraic theories.
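[Editorial note: by way of illustration only, and not a proposed GHC data type, a first-order representation along these lines might look like the following.]

```haskell
-- Illustrative only: a serialisable, first-order stand-in for the
-- function-valued coaxrProves field. An axiom schema is an equation
-- between syntactic templates over numbered variables, so applying
-- it is pure substitution rather than arbitrary computation.
data TyTemplate
  = TVar Int                  -- schema variable
  | TCon String [TyTemplate]  -- type constructor application
  deriving (Show, Read, Eq)   -- Show/Read give cheap serialisation

data AxiomSchema = AxiomSchema
  { axName :: String
  , axLHS  :: TyTemplate
  , axRHS  :: TyTemplate
  } deriving (Show, Read, Eq)

-- e.g. commutativity of (*) from an abelian-group theory:
mulComm :: AxiomSchema
mulComm = AxiomSchema "mulComm"
  (TCon "*" [TVar 0, TVar 1])
  (TCon "*" [TVar 1, TVar 0])

-- Applying a schema is substitution of concrete types for variables.
subst :: (Int -> TyTemplate) -> TyTemplate -> TyTemplate
subst s (TVar i)    = s i
subst s (TCon c ts) = TCon c (map (subst s) ts)
```

As noted above, such a representation could not express the built-in computational axioms for comparison and arithmetic, but it would be enough for purely equational theories like the abelian group laws.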

Any thoughts?

Adam

P.S. I've updated https://phabricator.haskell.org/D553 with the
TcPluginM changes we discussed.


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Typechecker plugins: request for review and another workflow question

2014-11-18 Thread Adam Gundry
I hope Iavor will forgive me for stealing his thunder, but given the
looming 7.10 branch I thought I'd pick this up.  I've created a Phab
revision (https://phabricator.haskell.org/D489) with the latest
typechecker plugins implementation, and updated the wiki page
(https://ghc.haskell.org/trac/ghc/wiki/Plugins/TypeChecker).

Adam


On 17/11/14 15:45, Austin Seipp wrote:
 This looks excellent! Thank you Adam and Eric for helping Iavor with this.
 
 But I'm slightly confused by the git history since it seems to be
 cluttered with a few merges, and it seems like Eric has pushed the
 latest changes to all this. The branch is also a little bit behind
 master too.
 
 Iavor, would you like to:
 
   1) Incorporate all of the changes from Eric and Adam,
   2) Rebase your work on HEAD so we can read it in a digestible way?
 
 I still need to read over all the changes since my first review, since
 Adam addressed them. The 7.10 branch is at the end of this week, but
 this would be a really cool feature to have.
 
 Thanks!
 
 The branch for 7.10 is now the end of this week! It would be nice to get this in.


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Typechecker plugins: request for review and another workflow question

2014-11-17 Thread Adam Gundry
Thanks Eric!


On 16/11/14 19:33, Eric Seidel wrote:
 I've made a few changes based on your branch. Specifically I've
 removed the call to runTcPlugins inside solveFlats, and have replaced
 it with a specific runTcPluginsGiven that runs in a loop inside
 solveFlatsGiven much like your runTcPluginsFinal runs inside
 solveFlatsWanted. I think this is a bit cleaner given that you've
 split the wanteds-solving out already.

 The changes are at
https://github.com/gridaphobe/ghc/tree/wip/tc-plugins-els
 since I don't have commit access to GHC

I agree that it's better to split these out separately, if you and Iavor
are happy to work with only the unflattened constraints. I'm not
completely convinced by how solveFlatGivens/runTcPluginsGiven work now
though. It seems slightly odd that solveFlatGivens calls solveFlats and
runTcPluginsGiven with the same set of givens, even though solveFlats
might have simplified/discarded some.

Also, if I understand correctly, when a plugin generates new facts,
those will be added to the worklist directly, but also passed back in
the result Cts to be added to the worklist again. In the
solveFlatWanteds case, new facts are added to the worklist but not the
result Cts, which is fine.

Finally, I think the plugin should be allowed to solve given
constraints, just by discarding them (e.g. if the constraint is
something like 1 * x ~ x). It's slightly odd that the interface
requires the plugin to provide an evidence term in this case, but the
evidence associated with the CtGiven (ctev_evtm) will do.

I'll put some commits together shortly to address these fine details.


 Iavor and I also have a question about your change to the last
 statement in solveFlatWanteds. You're putting the unsolved wanteds in
 a field called wc_flat, which suggests that they ought to be
 flattened. But the unsolved wanteds were just unflattened a few lines
 above! Is this a problem, or is the wc_flat field in need of a new
 name? :)

This is a slightly unfortunate clash of terminology. I believe wc_flat
is so named because the constraints are "flat" in the sense of not being
implications, rather than in the sense of containing no type family
applications. This is also the reason for the name solveFlats.

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Typechecker plugins: request for review and another workflow question

2014-11-14 Thread Adam Gundry
Thanks, Simon! I've been convinced that TcS is more than we need, and I
think the right thing to do is to (optionally) invoke the plugin after
the constraints have been unflattened anyway. I've just pushed a commit
to wip/tc-plugins-amg that does this. Iavor, Eric, your views on how
convenient this alternative is would be most welcome. I'm also wondering
if the plugin should be told how many times it has been called, to make
it easier to prevent infinite loops.

I'm very keen to get this into 7.10, appropriately branded as a very
experimental feature. Austin, have I sufficiently addressed your
concerns about the hs-boot file and multiple flags? Is there anything
else we need, apart perhaps from tests and documentation, which I'll put
together next week?

Thanks,

Adam


On 12/11/14 11:16, Simon Peyton Jones wrote:
 Iavor, Adam, Eric
 
  
 
 I’m happy with the general direction of the plugins stuff, so I’m mostly
 going to leave it to you guys to plough ahead; but I am happy to respond
 to questions.
 
  
 
  * I still think it would be better to provide an escape hatch to the
 TcS, not merely the TcM, alongside the nice TcPluginM wrapper. Notably,
 Simon's new TcFlatten.unflatten needs TcS...
 
  
 
 I think the only reason for this is that ‘unflatten’ needs to set
 evidence bindings, which in turn requires access to tcs_ev_binds.  I
 think that everything else is in TcM.  So I suppose you could carry
 around the EvBindsVar if you really didn’t want TcS.  (And I can see why
 you wouldn’t; TcS has a lot of stuff you don’t need.)
 
  
 
 Simon
 
  
 
  
 
 *From:*Iavor Diatchki [mailto:iavor.diatc...@gmail.com]
 *Sent:* 10 November 2014 19:15
 *To:* Adam Gundry
 *Cc:* ghc-devs@haskell.org; Simon Peyton Jones
 *Subject:* Re: Typechecker plugins: request for review and another
 workflow question
 
  
 
 Hi,
 
  
 
 On Mon, Nov 10, 2014 at 1:31 AM, Adam Gundry a...@well-typed.com
 mailto:a...@well-typed.com wrote:
 
 
 On the subject of that "nearly", I'm interested to learn whether you
 have a suggestion to deal with unflattening, because the interface still
 works with flat constraints only. Simon's changes should make it more
 practical to unflatten inside the plugin, but it would be far easier (at
 least for my purposes) if it was simply given unflattened constraints. I
 realise that this would require the plugin loop to be pushed further
 out, however, which has other difficulties.
 
  
 
 Not sure what to do about this.  With the current setup, I think either
 way, the plugin would have to do some extra work.   Perhaps we should
 run the plugins on the unflattened constraints, and leave to the plugins
 to manually temporarily flatten terms from external theories?  For
 example, if the type-nat plugin saw `2 * F a ~ 4`, it could temporarily
 work with `2 * x ~ 4`, and then when it figures out that `x ~ 2`, it
 could emit `F a ~ 2` (non-canonical), which would go around again, and
 hopefully get fully simplified.
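[Editorial note: the temporary flattening Iavor describes can be sketched concretely with a toy type representation. This is not GHC's Type; the constructors and fresh-variable naming are invented for illustration.]

```haskell
-- Toy types: just enough to represent the example  2 * F a ~ 4.
data Ty = TyVar String | TyLit Integer | TyFam String [Ty] | TyMul Ty Ty
  deriving (Show, Eq)

-- Replace each type-family application with a fresh variable, and
-- remember the mapping so results can be translated back afterwards.
-- The Int argument threads a fresh-name supply.
flatten :: Int -> Ty -> (Ty, Int, [(String, Ty)])
flatten n t@(TyFam _ _) = (TyVar v, n + 1, [(v, t)])
  where v = "x" ++ show n
flatten n (TyMul s t)   =
  let (s', n1, bs) = flatten n  s
      (t', n2, bt) = flatten n1 t
  in (TyMul s' t', n2, bs ++ bt)
flatten n t             = (t, n, [])
```

Flattening `TyMul (TyLit 2) (TyFam "F" [TyVar "a"])` yields `TyMul (TyLit 2) (TyVar "x0")` together with the binding `("x0", TyFam "F" [TyVar "a"])`: the solver works with `2 * x0 ~ 4`, and once it concludes `x0 ~ 2`, the remembered binding lets the plugin emit `F a ~ 2` as Iavor suggests.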
 
  
 
  
 
 
 A few other issues, of lesser importance:
 
  * I still think it would be better to provide an escape hatch to the
 TcS, not merely the TcM, alongside the nice TcPluginM wrapper. Notably,
 Simon's new TcFlatten.unflatten needs TcS...
 
 I don't mind that but, if I recall correctly, to do this without more
 recursive modules, we had to split `TCSMonad` in two parts, one with the
 types, and one with other stuff.  Eric and I did this once, but we
 didn't commit it, because it seemed like an orthogonal change.
 
  
 
  
 
  * Is there a way for my plugin to solve a given constraint (e.g. to
 discard the uninformative a * 1 ~ a)?
 
 Yes, you'd say something like: `TcPluginOK [(evidence, a * 1 ~ a)] []`
 
  
 
 The first field of `TcPluginOK` are things that are solved, the second
 one is new work.
 
  
 
  * It is unfortunately easy to create infinite loops by writing plugins
 that emit wanteds but make no useful progress. Perhaps there should be a
 limit on the number of times round the loop (like SubGoalDepth but for
 all constraints)?
 
  
 
 Indeed, if a plugin keeps generating new work, we could go in a loop, so
 maybe a limit of some sort is useful.  However, note that if the plugin
 generates things that are already in the inert set, GHC should notice
 this and filter them, so we won't keep going.
 
  
 
  
 
  * Perhaps runTcPlugin should skip invoking the plugin if there are no
 wanteds?
 
  
 
 Oh, there is an important detail here that needs documentation!   GHC
 will call the plugin twice: once to improve the givens, and once to
 solve the wanteds.   The way to distinguish the two is exactly by the
 presence of the wanteds.
 
  
 
 Why might you want to improve the givens?  Suppose you had something
 like `x * 2 ~ 4` as a given:  then you'd really want the plugin to
 generate another given: `x ~ 2`, as this is likely to help the rest of
 the constraint solving process.
 
  
 
 
  * The use

Re: Typechecker plugins: request for review and another workflow question

2014-11-11 Thread Adam Gundry
I've just pushed wip/tc-plugins-amg, in which I remove the hs-boot file
and unify the core2core and typechecker plugins under a single -fplugin
flag. This did involve making a separate module for plugins, which I
think is probably a good thing.

I looked at using a hook instead, with a plugin to modify hooks, but I'm
not sure exactly where such a plugin should be invoked. Ideally we want
the typechecker modifications to work on a per-module basis, but most of
the hooks seem to have a wider scope than that.

Adam


On 11/11/14 03:08, Eric Seidel wrote:
 
 On Nov 10, 2014, at 07:58, Austin Seipp aus...@well-typed.com wrote:

 Hi Iavor;

 I took a very, very cursory glance. Naturally I am not a typechecker
 guru, but I did look over the mechanical changes/extensions to thread
 things around. Two things popped out to me:

 - 1) Ugh, a new .hs-boot file. I assume this wasn't added without
 good reason, but ideally we would be eliminating them quicker than we
 add them. :) I want to take a closer look at this; perhaps we can
 refactor something for you to remove the need for this.

 - 2) I am kind of not a fan of having separate 'plugins for
 core2core' and 'plugins for typechecking' flags, AKA -ftc-plugin and
 -fplugin. Ideally I would think all plugins could be uniformly
 specified by simply saying '-fplugin'. This mostly avoids the need for
 duplication and a naming convention/slew of flags for each case (which
 we have to catalog and document). There may be an easy way to make
 this the case; I haven't looked closely yet (it has been some time
 since I stared at the plugin code, even though Max wrote it and I
 helped get it merged!)
 
 I looked into this again a bit today and recall now why we made a separate
 Plugin type and flag; adding the TcPlugin to the existing Plugin type defined
 in CoreMonad creates a circular dependency between CoreMonad and TcRnTypes.
 
 We could (and perhaps should) move all of the plugin types into a separate
 module, but that would require pulling other types (e.g. CoreToDo) out of
 CoreMonad to break the circularity.



-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Typechecker plugins: request for review and another workflow question

2014-11-10 Thread Adam Gundry
Hi Iavor,


On 10/11/14 00:17, Iavor Diatchki wrote:
 Hello,

 I just finished merging HEAD into the branch implementing constraint
 solver plugins (`wip/tc-plugins`), so things should be fully up to
 date.  For ease of review, I squashed everything into a single commit:

 https://github.com/ghc/ghc/commit/31729d092c813edc4ef5682db2ee18b33aea6911

 could interested folks (I know of SimonPJ, Richard, and Adam) have a
 look and let me know if things look reasonable?

Thanks, this is fantastic work! It will be great to see this in HEAD and
7.10. I've been tracking your branch for a while to build my plugin for
units of measure, and I'm happy to report that it is nearly working...

On the subject of that "nearly", I'm interested to learn whether you
have a suggestion to deal with unflattening, because the interface still
works with flat constraints only. Simon's changes should make it more
practical to unflatten inside the plugin, but it would be far easier (at
least for my purposes) if it was simply given unflattened constraints. I
realise that this would require the plugin loop to be pushed further
out, however, which has other difficulties.

A few other issues, of lesser importance:

 * I still think it would be better to provide an escape hatch to the
TcS, not merely the TcM, alongside the nice TcPluginM wrapper. Notably,
Simon's new TcFlatten.unflatten needs TcS...

 * Is there a way for my plugin to solve a given constraint (e.g. to
discard the uninformative a * 1 ~ a)?

 * It is unfortunately easy to create infinite loops by writing plugins
that emit wanteds but make no useful progress. Perhaps there should be a
limit on the number of times round the loop (like SubGoalDepth but for
all constraints)?

 * Perhaps runTcPlugin should skip invoking the plugin if there are no
wanteds?

 * The use of ctev_evar in runTcPlugin is partial, and fails with a
nasty error if the plugin returns a non-wanted in the solved constraints
list. Would it be worth catching this error and issuing a sensible
message that chastises the plugin author appropriately?

 * Finally, I presume the comment on runTcPlugin that "The plugin is
provided only with CTyEq and CFunEq constraints" is simply outdated and
should be removed?

Apologies for the deluge of questions - please take them as evidence of
my eagerness to use this feature!

All the best,

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Typechecker plugins: request for review and another workflow question

2014-11-10 Thread Adam Gundry
On 10/11/14 15:58, Austin Seipp wrote:
  - 2) I am kind of not a fan of having separate 'plugins for
 core2core' and 'plugins for typechecking' flags, AKA -ftc-plugin and
 -fplugin. Ideally I would think all plugins could be uniformly
 specified by simply saying '-fplugin'. This mostly avoids the need for
 duplication and a naming convention/slew of flags for each case (which
 we have to catalog and document). There may be an easy way to make
 this the case; I haven't looked closely yet (it has been some time
 since I stared at the plugin code, even though Max wrote it and I
 helped get it merged!)

FWIW, I originally envisaged reusing the existing plugins machinery, and
I don't think there are any great problems in doing so (see
https://ghc.haskell.org/trac/ghc/wiki/Plugins/TypeChecker). In fact, I
had an early implementation of typechecker plugins that did exactly this.

I have been wondering, however, about another possible approach. We could:

 1. make the constraint solver changes use a *hook*, instead of the
plugins directly, and

 2. make it possible for plugins to install/modify hooks at some
suitable point in the compilation pipeline.

I don't know the hooks machinery very well, but if this is feasible it
seems like it would provide more power (plugins could modify any part of
GHC for which a hook is available) and avoid having multiple overlapping
ways of extending GHC. In the future, I can imagine wanting plugins to
hook into other parts of GHC (e.g. error message postprocessing for
domain-specific error reporting), and this seems like a good way to
achieve that.

Adam


 On Sun, Nov 9, 2014 at 6:17 PM, Iavor Diatchki iavor.diatc...@gmail.com 
 wrote:
 Hello,

 I just finished merging HEAD into the branch implementing constraint solver
 plugins (`wip/tc-plugins`), so things should be fully up to date.  For ease
 of review, I squashed everything into a single commit:

 https://github.com/ghc/ghc/commit/31729d092c813edc4ef5682db2ee18b33aea6911

 could interested folks (I know of SimonPJ, Richard, and Adam) have a look
 and let me know if things look reasonable?

 On a related note:  I know that we are using phabricator for code review,
 but I don't know how to use it yet, so please let me know if I can do
 something to make the review easier.

 -Iavor


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Current description of Core?

2014-10-22 Thread Adam Gundry
Hi,

On 22/10/14 10:26, Sophie Taylor wrote:
 Is the current description of Core still System FC_2 (described in
 https://www.seas.upenn.edu/~sweirich/papers/popl163af-weirich.pdf)?

There have been a few extensions since then, described in these papers:

http://research.microsoft.com/en-us/um/people/simonpj/papers/ext-f/

Also, if you're interested in the gory details of what GHC *really*
implements, as opposed to the sanitized academic version, Richard
Eisenberg put together a nice description of Core that you can find in
the GHC repository:

https://github.com/ghc/ghc/blob/master/docs/core-spec/core-spec.pdf?raw=true

Hope this helps,

Adam


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: Overloaded record fields: we ought to change the name of FldTy

2014-04-30 Thread Adam Gundry
Thanks Johan,

On 30/04/14 09:33, Johan Tibell wrote:
 I'm currently watching the Skills Matter talks you gave (great!) and I'm
 slightly worried that we're going to expose beginners (and intermediate
 users!) to a name called FldTy, which might be OK for a GHC-internal
 name, but is not very readable.
 
 If this is going to be exposed to users, could we name is something
 readable, like FieldType, Field, or something similar?

While I'm hoping to avoid exposing these names to most users more than
at present, I think you're right: we should pick slightly more
informative names. (Various others have also made this suggestion!) Perhaps:

Has   |->  HasField
FldTy |->  FieldType
Upd   |->  UpdatedField
UpdTy |->  UpdatedType
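[Editorial note: the first of these renamings is the one that eventually stuck. A class named HasField shipped in GHC.Records from GHC 8.2 onwards, with constraints solved automatically for ordinary record fields. A minimal usage sketch against that later API, for comparison:]

```haskell
{-# LANGUAGE DataKinds, FlexibleContexts, TypeApplications #-}
import GHC.Records (HasField(..))

data Person = Person { name :: String, age :: Int }

-- GHC solves HasField constraints automatically for record fields,
-- so no instance needs to be written by hand.
greet :: HasField "name" r String => r -> String
greet r = "Hello, " ++ getField @"name" r
```

Here `greet (Person "Ada" 36)` evaluates to `"Hello, Ada"`, and `greet` works for any record type with a `name :: String` field.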

If possible, I'd prefer to merge the existing patches more or less as
is, then I can work on the library design without continually needing to
fix merge conflicts.

Apart from naming, I'd like to tweak the design to improve the
presentation of inferred types that result from ORF code. Since I
originally implemented it, the constraint solver seems to have changed
in a way that makes the functional dependencies via type families
approach used lead to less pretty types. Obviously we need something a
little less brittle.

Cheers,

Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: OverloadedRecordFields

2014-04-18 Thread Adam Gundry
 rationalisation may make sense
 
 Simon
   
 | -Original Message-
 | From: Adam Gundry [mailto:a...@well-typed.com]
 | Sent: 24 February 2014 08:37
 | To: Simon Peyton Jones
 | Subject: Re: OverloadedRecordFields
 | 
 | Hi Simon,
 | 
 | My OverloadedRecordFields branches[1,2,3] are up to date with HEAD as of
 | last Saturday. Validate on linux x86_64 reports only one failure, the
 | haddock.Cabal perf test, which might well be due to my Haddock changes,
 | and I will investigate. I'm not sure how to run the Haddock test suite?
 | 
 | I am keen to get the code reviewed and into HEAD as soon as is
 | convenient, but I'm aware these are substantial changes, and don't want
 | to rush things. In particular, I would understand if you'd rather hold
 | them back until after the 7.8 final release.
 | 
 | How would you like to proceed?
 | 
 | Adam
 | 
 | [1] https://github.com/adamgundry/ghc
 | [2] https://github.com/adamgundry/packages-base
 | [3] https://github.com/adamgundry/haddock


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: OverloadedRecordFields

2014-02-25 Thread Adam Gundry
Simon,

On 25/02/14 16:18, Simon Peyton Jones wrote:
 I'm very happy to hear that... good stuff.
 
 I'm under water with ICFP submissions (deadline Sat).  Moreover I think it is 
 clearly too late to put this into 7.8; RC1 is out and I expect RC2 any day.
 
 So I suggest we plan to merge after 7.8 is out.  

Right. I wasn't suggesting to put OverloadedRecordFields in 7.8, just
the 7.9 master branch! But given the potential need to backport fixes
from master to the 7.8 branch, I think it makes sense to wait.


 Are the wiki pages up to date? 
 Records/OverloadedRecordFields
 Records/OverloadedRecordFields/Implementation
 Records/OverloadedRecordFields/Plan
 
 The first does not point to the latter two; Plan may mean Design... I 
 feel some rationalisation may make sense

They are up to date but I'll go over them and tidy them up. Alas, it
doesn't look like Trac supports redirects...

Adam
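
[Editorial sketch: the "three-parameter story" referred to in the quoted
message below can be approximated in ordinary Haskell as follows. The
instances are hand-written stand-ins for what the extension generates, and
a `Proxy` is used to pin down the field name, which the real extension
resolves from the field-name syntax itself. Names are illustrative.]

```haskell
{-# LANGUAGE DataKinds, KindSignatures, TypeFamilies,
             MultiParamTypeClasses, FlexibleContexts, FlexibleInstances #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (Symbol)

-- FldTy r f is the type of field f in record r.
type family FldTy r (f :: Symbol)

-- The three-parameter class, with the equality superclass tying the
-- third parameter to the field-type family.
class t ~ FldTy r f => Has r (f :: Symbol) t where
  getFld :: Proxy f -> r -> t

data T = MkT { foo :: Int }

-- Hand-written stand-ins for the automatically generated instances.
type instance FldTy T "foo" = Int
instance Has T "foo" Int where
  getFld _ = foo
```

With these definitions, `getFld (Proxy :: Proxy "foo") (MkT 42)` yields `42`;
because `FldTy T "foo"` is monomorphic, the result type is determined even
when the call site leaves it open.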


 | -Original Message-
 | From: Adam Gundry [mailto:a...@well-typed.com]
 | Sent: 24 February 2014 08:37
 | To: Simon Peyton Jones
 | Subject: Re: OverloadedRecordFields
 | 
 | Hi Simon,
 | 
 | My OverloadedRecordFields branches[1,2,3] are up to date with HEAD as of
 | last Saturday. Validate on linux x86_64 reports only one failure, the
 | haddock.Cabal perf test, which might well be due to my Haddock changes,
 | and I will investigate. I'm not sure how to run the Haddock test suite?
 | 
 | I am keen to get the code reviewed and into HEAD as soon as is
 | convenient, but I'm aware these are substantial changes, and don't want
 | to rush things. In particular, I would understand if you'd rather hold
 | them back until after the 7.8 final release.
 | 
 | How would you like to proceed?
 | 
 | Adam
 | 
 | [1] https://github.com/adamgundry/ghc
 | [2] https://github.com/adamgundry/packages-base
 | [3] https://github.com/adamgundry/haddock
 | 
 | 
 | On 17/01/14 10:55, Simon Peyton Jones wrote:
 |  Yes that sounds ok, thanks.  I'd prefer to have a write-up of what
 | goes wrong with the 2-parameter story, so that we don't forget.
 | 
 |  Simon
 | 
 |  | -Original Message-
 |  | From: Adam Gundry [mailto:a...@well-typed.com]
 |  | Sent: 17 January 2014 10:15
 |  | To: Simon Peyton Jones
 |  | Subject: OverloadedRecordFields
 |  |
 |  | Hi Simon,
 |  |
 |  | I'm conscious that things have gone off the boil a little wrt
 |  | OverloadedRecordFields, partially as a consequence of the delayed
 |  | 7.8 release but also my lack of time for other projects since
 |  | starting work for Well-Typed. With that in mind, I'd like to propose
 |  | a plan to get back on track:
 |  |
 |  | 1. Revert to the three-parameter story, where we have
 |  |
 |  | t ~ FldTy r f => Has r f t
 |  |
 |  | rather than
 |  |
 |  | Has r f.
 |  |
 |  | The two-parameter version generates significantly worse error
 |  | messages, and there are some other unresolved problems, so I'm not
 |  | sure it is worth the minor simplification.
 |  |
 |  | 2. Roll back some of the refactoring that I've struggled to get
 |  | right (in particular, trying to make the generated FldTy/UpdTy
 |  | axioms implicitTyThings). We can always revisit this in the future
 | though.
 |  |
 |  | 3. Merge HEAD into my branch: I suspect this will be a bit painful
 |  | by now, but presumably with 7.8 imminent there won't be many major
 |  | changes coming for a while?
 |  |
 |  | 4. Review the proposed changes with you and fix any show-stopping
 |  | problems.
 |  |
 |  | 5. Merge into HEAD after 7.8 is released.
 |  |
 |  | Does this sound plausible? I'm happy to Skype if you like.
 |  |
 |  | Cheers,
 |  |
 |  | Adam

-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: workaround to get both domain-specific errors and also multi-modal type inference?

2014-02-11 Thread Adam Gundry


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: workaround to get both domain-specific errors and also multi-modal type inference?

2014-02-07 Thread Adam Gundry
 that information in a domain-specific error message which is
 actually a "no instance of IndicativeClassName tag1 tag2" message.
 
  friendlier_plus :: (FriendlyEqCheck tag1 tag2 tag3, Num a)
 => U tag1 a -> U tag2 a -> U tag3 a
 
 The `friendlier_plus' function gives useful error messages if tag1,
 tag2, and tag3 are all fully monomorphic.
 
 However, it does not support inference:
 
  zero :: Num a => U tag a
  zero = U 0
  
  x = U 4 :: U [Foo,Baz] Int
 
  -- both of these are ill-typed :(
  should_work1 = (friendlier_plus zero x) `asTypeOf` x  -- tag1 is unconstrained
  should_work2 = friendlier_plus x x                    -- tag3 is unconstrained
 
 Neither of those terms type-check, since FriendlyEqCheck can't
 determine if the unconstrained `tag' variable is equal to the other
 tags.
 
 So, can we get the best of both worlds?
 
  best_plus ::
( FriendlyEqCheck tag1 tag2 tag3
  , tag1 ~ tag2, tag2 ~ tag3, Num a )
  => U tag1 a -> U tag2 a -> U tag3 a
 
 No, unfortunately not. Now the `should_work*' functions type-check,
 but an ill-typed use of `best_plus' gives the same poor error
 message that `plus' would give.
 
 Hence, the question at the top of this email.
 
 = two ideas =
 
 If my question at the top of this email cannot be answered in the
 affirmative, perhaps a small patch to GHC would make this sort of
 approach a viable lightweight workaround for improving
 domain-specific error messages.
 
 (I cannot estimate how difficult this would be to implement in the
 type-checker.)
 
 Two alternative ideas.
 
 1) An ALWAYS_ATTEMPT PRAGMA that you can place on the class C so
 that it is attempted even if a related ~ constraint fails.
 
 2) An OrElse constraint former, offering *very* constrained
 back-tracking.
 
  possibleAsTypeOf :: ((a ~ b) `OrElse` C a b) => a -> b -> a
  possibleAsTypeOf x _ = x
 
 Requirements: C must have no superclasses, no methods, and no fundeps.
 
 Specification: If (a~b) fails and (C a b) is satisfiable, then the
 original inequality error message would be shown to the user. Else,
 C's error message is used.
 
 ===
 
 You made it to the bottom of the email! Thanks again.
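
 [Editorial sketch: the FriendlyEqCheck idea described above can be
 approximated today with a closed type family plus TypeError, which
 arrived later, in GHC 8.0. All names here are illustrative, and the
 tags are simplified to Symbol rather than type-level lists.]

```haskell
{-# LANGUAGE DataKinds, TypeFamilies, ConstraintKinds,
             UndecidableInstances, KindSignatures, FlexibleContexts #-}

import Data.Kind (Constraint)
import GHC.TypeLits (ErrorMessage (..), Symbol, TypeError)

-- Units-tagged values, as in the message above.
newtype U (tag :: Symbol) a = U a deriving (Show, Eq)

-- Equal tags reduce to the empty constraint; unequal monomorphic tags
-- reduce to a domain-specific error message.
type family FriendlyEq (t1 :: Symbol) (t2 :: Symbol) :: Constraint where
  FriendlyEq t  t  = ()
  FriendlyEq t1 t2 = TypeError
    ('Text "Unit mismatch: " ':<>: 'ShowType t1
       ':<>: 'Text " vs " ':<>: 'ShowType t2)

friendlyPlus :: (FriendlyEq tag1 tag2, Num a)
             => U tag1 a -> U tag2 a -> U tag1 a
friendlyPlus (U x) (U y) = U (x + y)
```

 Note that this sketch has exactly the inference limitation discussed
 above: when a tag is left unconstrained, neither family equation can
 fire, so the `should_work*` examples remain ill-typed.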
 
 
 
 


-- 
Adam Gundry, Haskell Consultant
Well-Typed LLP, http://www.well-typed.com/


Re: OverloadedRecordFields status update

2013-09-09 Thread Adam Gundry
On 09/09/13 16:53, Gabor Greif wrote:
 On 9/9/13, Adam Gundry adam.gun...@strath.ac.uk wrote:
 On 06/09/13 17:45, Gabor Greif wrote:
 On 9/5/13, Adam Gundry adam.gun...@strath.ac.uk wrote:

 I have been working on a new extension, OverloadedRecordFields, and it
 is now essentially feature-complete. Unfortunately, I doubt it will make
 it into 7.8, as the changes are quite extensive, but I hope to get it in
 HEAD soon thereafter.

 That would be great, it appears to be a great example of
 -XConstraintKinds, too!

 It relies quite heavily on type-level strings (-XDataKinds), if that's
 what you mean?
 
 No, I mean ConstraintKinds, I assume all the Has* constraints have
 kind 'Constraint' ?
 (I strongly assume so).

Ah, I see your point. Yes, the kind of `Has` is

* -> Symbol -> * -> Constraint

although it is actually implemented as a typeclass in the base library,
so in this respect it's not particularly special. (Instances of the
typeclass are automatically generated and cannot be given in the normal
way, however.)
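
[Editorial sketch: a minimal hand-rolled analogue of such a class, showing
the kind above and how ConstraintKinds lets the constraint be abstracted.
The names and the functional dependency are illustrative, not the
extension's actual machinery, and the instance is written by hand.]

```haskell
{-# LANGUAGE DataKinds, KindSignatures, MultiParamTypeClasses,
             FunctionalDependencies, FlexibleInstances,
             FlexibleContexts, ConstraintKinds #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (Symbol)

-- A class of kind * -> Symbol -> * -> Constraint.
class Has r (f :: Symbol) t | r f -> t where
  getField :: Proxy f -> r -> t

-- ConstraintKinds lets Has constraints be named and abstracted over:
type HasName r = Has r "name" String

greet :: HasName r => r -> String
greet r = "Hello, " ++ getField (Proxy :: Proxy "name") r

data Person = Person { name :: String }

-- Hand-written stand-in for a generated instance.
instance Has Person "name" String where
  getField _ = name
```

Here `greet (Person "Adam")` evaluates to `"Hello, Adam"`, and `greet`
works for any record type with a suitable instance.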

Cheers,

Adam

 ...



Re: OverloadedRecordFields status update

2013-09-09 Thread Adam Gundry
On 06/09/13 17:45, Gabor Greif wrote:
 On 9/5/13, Adam Gundry adam.gun...@strath.ac.uk wrote:

 I have been working on a new extension, OverloadedRecordFields, and it
 is now essentially feature-complete. Unfortunately, I doubt it will make
 it into 7.8, as the changes are quite extensive, but I hope to get it in
 HEAD soon thereafter.
 
 That would be great, it appears to be a great example of
 -XConstraintKinds, too!

It relies quite heavily on type-level strings (-XDataKinds), if that's
what you mean?


 I would really appreciate comments on the design [1], for which there is
 
 How would such a feature interact with
 http://hackage.haskell.org/package/vinyl ?

The short answer is that it shouldn't. Vinyl (which I wasn't aware of
before now, thanks for the pointer!) basically replaces the Haskell
records system wholesale, making records entirely separate from data
declarations, and turning fields into first-class objects. This should
peaceably coexist with -XOverloadedRecordFields but I don't see a way to
integrate the two.

Adam


 Gabor
 
 a prototype implementation [2] that works in GHC 7.6.3. If you'd like to
 review the code, you can check out the overloaded-record-fields branch
 from the Github repositories [3, 4], and there are notes on the
 implementation on the wiki [5].

 Thanks!

 Adam

 [1]
 http://ghc.haskell.org/trac/ghc/wiki/Records/OverloadedRecordFields/Plan
 [2] https://github.com/adamgundry/records-prototype
 [3] https://github.com/adamgundry/ghc
 [4] https://github.com/adamgundry/packages-base
 [5]
 http://ghc.haskell.org/trac/ghc/wiki/Records/OverloadedRecordFields/Implementation

