Re: Options for targeting Windows XP?

2021-03-24 Thread Clinton Mead
Thanks all for your replies. Just going through what Ben has said step by
step:

My sense is that if you don't need the threaded runtime system it would
> probably be easiest to just try to make a modern GHC run on Windows XP.
>

Happy to run non-threaded runtime. A good chunk of these machines will be
single or dual core anyway.


> As Tamar suggested, it is likely not easy, but also not impossible. WinIO
> is indeed problematic, but thankfully the old MIO IO manager is still
> around (and will be in 9.2).
>

"Is still around"? As in, is it dead code in the code base, or can I
trigger GHC to use the old IO manager with a GHC option?

The possible reasons for Windows XP incompatibility that I can think of
> off the top of my head are:
>
>  * Timers (we now use QueryPerformanceCounter)
>

This page suggests that QueryPerformanceCounter should run on XP. Is this
incorrect?


>  * Big-PE support, which is very much necessary for profiled builds
>

I don't really need profiled builds.


>  * Long file path support (mostly a build-time consideration as Haskell
>build systems tend to produce very long paths)
>
>
I don't need to build on Windows XP either. I just need to run on Windows
XP, so hopefully this won't be an issue. Although if GHC was modified for
long file path support so that it could build itself, presumably that would
affect everything else it builds as well.


> There may be others, but I would start looking there. I am happy to
> answer any questions that might arise.
>
>
I'm guessing the way forward here might be a patch with two options:

1. -no-long-path-support/-long-path-support (default -long-path-support)
2. -winxp

The -winxp option shall:

- Require -no-long-path-support
- Conflict with -threaded
- Conflict with profiled builds
- Use the old IO manager (I'm not sure whether this is an option or how it
is done)
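For concreteness, the flag interactions could be sketched like this (all
names here are hypothetical, just to make the rules explicit; nothing like
this exists in GHC today):

```haskell
-- Hypothetical sketch of the proposed -winxp flag interactions.
-- None of these names exist in GHC; this just encodes the rules above.
data Flags = Flags
  { winXP       :: Bool
  , longPaths   :: Bool
  , threadedRTS :: Bool
  , profiled    :: Bool
  } deriving (Show, Eq)

-- | -winxp requires -no-long-path-support and conflicts with -threaded
-- and profiled builds.
checkWinXP :: Flags -> Either String Flags
checkWinXP f
  | not (winXP f) = Right f
  | longPaths f   = Left "-winxp requires -no-long-path-support"
  | threadedRTS f = Left "-winxp conflicts with -threaded"
  | profiled f    = Left "-winxp conflicts with profiled builds"
  | otherwise     = Right f  -- and would select the old MIO IO manager

main :: IO ()
main = mapM_ print
  [ checkWinXP (Flags True False False False)  -- accepted
  , checkWinXP (Flags True True False False)   -- rejected: long paths on
  , checkWinXP (Flags True False True False)   -- rejected: -threaded
  ]
```

That is, -winxp would only be valid with long path support, -threaded and
profiling all off, and would imply the old IO manager.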

What do you think (roughly speaking)?
___
ghc-devs mailing list
ghc-devs@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/ghc-devs


Re: Options for targeting Windows XP?

2021-03-24 Thread Ben Gamari
Clinton Mead  writes:

> I'm currently trying to bring my company around to using a bit of Haskell.
> One issue is that a number of our clients are based in South East Asia and
> need software that runs on Windows XP.
>
Ooph, that is quite tricky. Indeed, we dropped XP support with GHC 8.0, at
which point XP had already been EoL'd for seven years.

> Unfortunately it seems the last version of GHC that produces executables
> that run on Windows XP is GHC 7.10. Whilst this table suggests the issue
> may only be with running GHC 8.0+ on Windows XP, I've confirmed that GHC
> 8.0 executables (even "Hello World") will not run on Windows XP,
> presumably because of a non-XP WinAPI call in the runtime.
>
Indeed. The dropping of XP support was prompted by the need to use a
newer Win32 interface (I can't recall which in particular).

> My first thought would be to restrict myself to GHC 7.10 features (i.e.
> 2015). This would be a slight annoyance but GHC 7.10 still presents a
> reasonable language. But my concern would be that increasingly I'll run
> into issues with libraries that use extensions post GHC 7.10, particularly
> libraries with large dependency lists.
>
I would also be concerned about this. I wouldn't expect to be able to
get very far with GHC 7.10 in 2021.

> So there's a few options I've considered at this point:
>
> 1. Use GHCJS to compile to Javascript, and then dig out a version of NodeJS
> that runs on Windows XP. GHCJS seems to at least have a compiler based on
> GHC 8.6.
>
This is an option, although only you know whether this would fit your
application given your memory and CPU constraints. I also have no idea
how easy it would be to find a functional version of NodeJS.

> But then I had a thought: GHC Core isn't supposed to change much between
> versions, is it? Which made me come up with these approaches:
>
> 3. Hack up a script to compile programs using GHC 9 to Core, then feed that
> Core output into GHC 7.10. OR
>
> 4. Produce a chimera style GHC by importing the GHC 9.0 API and the GHC
> 7.10 API, and making a version of GHC that does Haskell -> Core in GHC 9.0
> and the rest of the code generation in GHC 7.10.
>

Sadly, I suspect this isn't going to work. While Core itself doesn't
change (that much), the primops do. Even getting Core produced by GHC
9.0 to build under GHC 8.10 would require a considerable amount of work.

> One issue with 4 will be that, presumably, because I'm importing the GHC
> 9.0 API and the 7.10 API separately, all their data types will technically
> be separate, so I'll need to basically deep copy the GHC 9.0 Core datatype
> (and perhaps others) to GHC 7.10 datatypes. But presuming they're largely
> similar this should be fairly mechanical.
>
I'm not sure how mechanical this would be, to be honest.

> So are any of these approaches (well, particularly 2 and 4) reasonable? Or
> am I going to run into big problems with either of them? Is there another
> approach I haven't thought of?
>
My sense is that if you don't need the threaded runtime system it would
probably be easiest to just try to make a modern GHC run on Windows XP.
As Tamar suggested, it is likely not easy, but also not impossible. WinIO
is indeed problematic, but thankfully the old MIO IO manager is still
around (and will be in 9.2).

The possible reasons for Windows XP incompatibility that I can think of
off the top of my head are:

 * Timers (we now use QueryPerformanceCounter)
 * Big-PE support, which is very much necessary for profiled builds
 * Long file path support (mostly a build-time consideration as Haskell
   build systems tend to produce very long paths)

There may be others, but I would start looking there. I am happy to
answer any questions that might arise.
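As a side note on the timer point: base exposes the monotonic clock
portably via GHC.Clock, and as I understand it the Windows RTS backs it
with QueryPerformanceCounter. A small sketch (nothing XP-specific here,
just the portable API):

```haskell
import GHC.Clock (getMonotonicTimeNSec)

-- Time an IO action with the monotonic clock (which, as I understand it,
-- the Windows RTS implements via QueryPerformanceCounter).
timed :: IO a -> IO (a, Double)
timed act = do
  t0 <- getMonotonicTimeNSec
  r  <- act
  t1 <- getMonotonicTimeNSec
  pure (r, fromIntegral (t1 - t0) / 1e9)  -- elapsed seconds

main :: IO ()
main = do
  (n, secs) <- timed (pure (sum [1 .. 1000 :: Int]))
  putStrLn (show n ++ " computed in " ++ show secs ++ "s")
```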

Cheers,

 - Ben




Re: Options for targeting Windows XP?

2021-03-24 Thread Phyx
Hi,

>  XP. GHCJS seems to at least have a compiler based on GHC 8.6.
> 2. Patch GHC with an additional command line argument to produce XP/Vista
compatible executables, perhaps by looking at the changes between 7.10 ->
8.0, and re-introducing the XP approach as an option.

This would be somewhat hard but not impossible for 8.0, which, if I recall,
dropped XP for some linker functionality. The higher you go, the more
difficult it becomes though.

When you get to 9.0 you don't have much hope, as there it's not just the
linker: the RTS itself heavily relies on functionality not available in
XP, including how we manage memory and do synchronization.

It's not just GHC that would need patching, however, but libraries such as
process as well. That is not to say it's impossible; you'd just have to
find ways to work around the bugs that caused us to change APIs to begin
with...

I can't speak for the community, but I wouldn't want to re-introduce XP as
a supported option in mainline. Parts of e.g. 9.0 (like WinIO) just won't
work on XP. The design itself is centered around new APIs, so supporting XP
essentially means a new design.

Kind regards,
Tamar

Sent from my Mobile



Re: Options for targeting Windows XP?

2021-03-24 Thread Carter Schonwald
In terms of net amount of work: I suspect GHCJS targeting either node or
some sort of browser plug-in may be the most humane, assuming the
associated browser/node support on XP is turnkey. I think there were some
genuine changes to the IO manager (the Haskell code in base for doing
efficient file system API stuff) on Windows, plus a few other things. There
may have also been changes elsewhere that Andreas, Tamar, and Ben Gamari
can speak to better.

More broadly, there are so many bug fixes and improvements that you'd miss
out on if you don't try to keep yourself current within the 3 most recent
GHC major releases and their associated libraries.



Options for targeting Windows XP?

2021-03-24 Thread Clinton Mead
I'm currently trying to bring my company around to using a bit of Haskell.
One issue is that a number of our clients are based in South East Asia and
need software that runs on Windows XP.

Unfortunately it seems the last version of GHC that produces executables
that run on Windows XP is GHC 7.10. Whilst this table suggests the issue
may only be with running GHC 8.0+ on Windows XP, I've confirmed that GHC
8.0 executables (even "Hello World") will not run on Windows XP, presumably
because of a non-XP WinAPI call in the runtime.

My first thought would be to restrict myself to GHC 7.10 features (i.e.
2015). This would be a slight annoyance but GHC 7.10 still presents a
reasonable language. But my concern would be that increasingly I'll run
into issues with libraries that use extensions post GHC 7.10, particularly
libraries with large dependency lists.

So there's a few options I've considered at this point:

1. Use GHCJS to compile to Javascript, and then dig out a version of NodeJS
that runs on Windows XP. GHCJS seems to at least have a compiler based on
GHC 8.6.
2. Patch GHC with an additional command line argument to produce XP/Vista
compatible executables, perhaps by looking at the changes between 7.10 ->
8.0, and re-introducing the XP approach as an option.

The issue with 1 is that, as well as being limited by how up to date GHCJS
is, it will increase install size and memory usage and decrease performance
on Windows XP machines, which in our environments are often quite old and
resource and memory constrained.

Approach 2 is something I'd be willing to put some work into if it was
practical, but my thought is that XP support was removed for a reason;
presumably using newer WinAPI functions simplified things significantly.
By re-adding XP support I'd be complicating GHC once again, and GHC would
effectively have to maintain two approaches. In addition, in the long term,
whenever a new WinAPI call is added one would now have to check whether
it's available in Windows XP, and if it's not, produce a Windows XP
equivalent. That might seem like just an extra support burden for already
busy GHC developers. But on the other hand, if the GHC devs would be happy
to merge a patch and keep up XP support, this would be the cleanest option.

But then I had a thought: GHC Core isn't supposed to change much between
versions, is it? Which made me come up with these approaches:

3. Hack up a script to compile programs using GHC 9 to Core, then feed that
Core output into GHC 7.10. OR
4. Produce a chimera style GHC by importing the GHC 9.0 API and the GHC
7.10 API, and making a version of GHC that does Haskell -> Core in GHC 9.0
and the rest of the code generation in GHC 7.10.

One issue with 4 will be that, presumably, because I'm importing the GHC
9.0 API and the 7.10 API separately, all their data types will technically
be separate, so I'll need to basically deep copy the GHC 9.0 Core datatype
(and perhaps others) to GHC 7.10 datatypes. But presuming they're largely
similar this should be fairly mechanical.

So are any of these approaches (well, particularly 2 and 4) reasonable? Or
am I going to run into big problems with either of them? Is there another
approach I haven't thought of?

Thanks,
Clinton


Re: On CI

2021-03-24 Thread Andreas Klebinger

> What about the case where the rebase *lessens* the improvement? That
is, you're expecting these 10 cases to improve, but after a rebase, only
1 improves. That's news! But a blanket "accept improvements" won't tell you.

I don't think that scenario currently triggers a CI failure. So this
wouldn't really change.

As I understand it the current logic is:

* Run tests
* Check if any cross the metric thresholds set in the test.
* If so check if that test is allowed to cross the threshold.

I believe we don't check that all benchmarks listed with an expected
in/decrease actually do so.
It would also be hard to do so reasonably without making it even harder
to push MRs through CI.
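As a toy model of that logic (hypothetical names; the real check lives in
the testsuite driver):

```haskell
data Direction = Increase | Decrease deriving (Eq, Show)

-- A measured metric change for one test, as a percentage delta.
data Change = Change { testName :: String, deltaPct :: Double }
  deriving (Eq, Show)

direction :: Change -> Direction
direction c = if deltaPct c < 0 then Decrease else Increase

-- A change "crosses the threshold" if its magnitude exceeds the tolerance.
crosses :: Double -> Change -> Bool
crosses tol c = abs (deltaPct c) > tol

-- Roughly the current behaviour: fail CI only for threshold-crossing
-- changes that are not in the expected list. Nothing checks whether an
-- expected change actually materialised.
ciFailures :: Double -> [(String, Direction)] -> [Change] -> [Change]
ciFailures tol expected =
  filter (\c -> crosses tol c && (testName c, direction c) `notElem` expected)

main :: IO ()
main = print (map testName
  (ciFailures 2.0 [("T123", Decrease)]
     [Change "T123" (-5.0), Change "T999" 4.0, Change "T555" 0.5]))
  -- prints ["T999"]: the expected decrease and the sub-threshold
  -- change pass; only the unexpected increase fails CI
```

The point being: an expected change that fails to materialise never shows
up in ciFailures, so CI stays green regardless.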

Andreas



Re: On CI

2021-03-24 Thread Moritz Angermann
Yes, this is exactly one of the issues that marge might run into as well:
the aggregate ends up performing differently from the individual ones. Now
we have marge to ensure that at least the aggregate builds together, which
is the whole point of these merge trains: not to end up in a situation
where two patches that are fine on their own produce a broken merged state
that doesn't build anymore.

Now we have marge to ensure every commit is buildable. Next we should run
regression tests on all commits on master (and that includes each and every
one that marge brings into master). Then we have visualisation that tells
us how performance metrics go up/down over time, and we can drill down into
commits if they yield interesting results either way.

Now let's say you had a commit that should have made GHC 50% faster across
the board, but somehow after being aggregated with other patches this
didn't happen anymore? We'd still expect this to somehow show up in each of
the individual commits on master, right?



Re: GHC 8.10 backports?

2021-03-24 Thread Moritz Angermann
More like abandoned backport attempt :D



Re: On CI

2021-03-24 Thread Richard Eisenberg
What about the case where the rebase *lessens* the improvement? That is, you're 
expecting these 10 cases to improve, but after a rebase, only 1 improves. 
That's news! But a blanket "accept improvements" won't tell you.

I'm not hard against this proposal, because I know precise tracking has its own 
costs. Just wanted to bring up another scenario that might be factored in.
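For concreteness, detecting that scenario mechanically would be
straightforward; a hypothetical sketch, not current testsuite behaviour:

```haskell
-- Hypothetical report: which tests were expected to improve but show no
-- improvement after a rebase. deltaPct is the measured metric change in
-- percent; negative means an improvement.
vanishedImprovements
  :: [String]            -- tests expected to improve
  -> [(String, Double)]  -- measured (test, deltaPct)
  -> [String]
vanishedImprovements expected measured =
  [ t | t <- expected, maybe True (>= 0) (lookup t measured) ]

main :: IO ()
main =
  -- 10 expected improvements, but only T1 actually improved:
  -- reports the nine that vanished (T2..T10).
  print (vanishedImprovements
           [ "T" ++ show i | i <- [1 .. 10 :: Int] ]
           [ ("T1", -3.0) ])
```

Whether that report should fail CI or merely be printed is a separate
question; even as a log line it would surface the news.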

Richard



Re: On CI

2021-03-24 Thread Andreas Klebinger

After the idea of letting marge accept unexpected perf improvements, and
looking at https://gitlab.haskell.org/ghc/ghc/-/merge_requests/4759
(which failed because a single test, for a single build flavour, crossed
the improvement threshold after rebasing), I wondered:

When would accepting an unexpected perf improvement ever backfire?

In practice I either have a patch that I expect to improve performance for
some things, so I want to accept whatever gains I get. Or I don't expect
improvements, so it's *maybe* worth failing CI for, in case I optimized
away some code I shouldn't have or something of that sort.

How could this be actionable? Perhaps having a set of indicators for CI,
such as:

"Accept allocation decreases"
"Accept residency decreases"

would be saner. I have personally *never* gotten value out of the
requirement to list the individual tests that improve. Usually a whole lot
of them do. Some cross the threshold, so I add them. If I'm unlucky I have
to rebase, and a new one might make it across the threshold.

Being able to accept improvements (but not regressions) wholesale might be a
reasonable alternative.

Opinions?



Re: GHC 8.10 backports?

2021-03-24 Thread Andreas Klebinger

Yes, only changing the rule (when not including the string changes) did
indeed cause regressions. I don't think it's worth having one without the
other.

But it seems you already backported this?
See https://gitlab.haskell.org/ghc/ghc/-/merge_requests/5263

Cheers
Andreas

Am 22/03/2021 um 07:02 schrieb Moritz Angermann:

The commit message from
https://gitlab.haskell.org/ghc/ghc/-/commit/f10d11fa49fa9a7a506c4fdbdf86521c2a8d3495
makes the changes to string seem required. Applying the commit on its own
doesn't apply cleanly and pulls in quite a bit of extra dependent commits.
Just applying the elem rules appears rather risky. Thus, while I agree that
it would be a nice fix to have, the amount of necessary code changes makes
me rather uncomfortable for a minor release :-/

On Mon, Mar 22, 2021 at 1:58 PM Gergő Érdi <ge...@erdi.hu> wrote:

Thanks, that makes it less appealing. In the original thread, I
got no further replies after my email announcing my "discovery" of
that commit, so I thought that was the whole story.

On Mon, Mar 22, 2021, 13:53 Viktor Dukhovni <ietf-d...@dukhovni.org> wrote:

On Mon, Mar 22, 2021 at 12:39:28PM +0800, Gergő Érdi wrote:

> I'd love to have this in a GHC 8.10 release:
> https://mail.haskell.org/pipermail/ghc-devs/2021-March/019629.html


This is already in 9.0, 9.2 and master, but it is a rather non-trivial
change, given all the new work that went into the String case.  So I am
not sure it is small/simple enough to make for a compelling backport.

There's a lot of recent activity in this space.  See also , which is not
yet merged into master (and might still be eta-reduced one more step).

I don't know whether such optimisation tweaks (not a bugfix) are in scope
for backporting; we certainly need to be confident they'll not cause any
new problems.  FWIW, 5259 is dramatically simpler...

Of course we also have  in much the same territory, but there we're still
blocked on someone figuring out what's going on with the 20% compile-time
hit with T13056, and whether that's acceptable or not...

--
    Viktor.