Hi Pali,

Apologies for the delayed response.

I treated cloud compilation as “free” in the context of the buildbots. If we 
can cross-compile (on Amazon EC2 or the like) GHCs that run on each arch we 
have buildbots for, the buildbots themselves will bear roughly 1/5 of their 
current load. I came to that figure from the buildbot page, where it looked 
like the average compile time was around 80 minutes and the average test 
suite run was around 20 minutes.
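For what it’s worth, the 1/5 figure is just this arithmetic (the rounded timings are assumptions read off the buildbot page, not exact measurements):

```python
# Rough check of the "1/5 the load" figure, using the approximate
# timings quoted above (rounded averages, assumed from the buildbot page).
compile_minutes = 80   # average full GHC build on a buildbot
test_minutes = 20      # average test-suite run

full_cycle = compile_minutes + test_minutes     # build + test today
test_only_fraction = test_minutes / full_cycle  # load if bots only run tests
print(test_only_fraction)  # 0.2, i.e. roughly 1/5 of the current load
```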

I see your point about cloud cross-compilation and buildbot testing not 
covering all regression cases. I think this is where the CI vs. nightly 
builds distinction applies well. Cloud compilation and buildbot testing may 
be fast enough to run CI on every patch set, while total regression coverage 
could be provided by nightly builds. Jenkins CI would let us roll our own CI 
with our own machines, cloud compute services, and a wide range of other 
content/auditing/workflow services.

That said, while I think it would be nice to have quick CI in addition to 
nightly builds, I don’t know if it’s sensible/desired for GHC. Since Jenkins 
CI is stable yet very actively developed, it seems it at least wouldn’t incur 
too much maintenance on our part. Of course, the devil is in the details, so 
I’d be happy to set it up on a few of my machines to investigate.

Will


On Jun 20, 2014, at 6:15 AM, Páli Gábor János <[email protected]> wrote:

> Hello William,
> 
> 2014-06-20 0:50 GMT+02:00 William Knop <[email protected]>:
>> 1. We have a pretty good spread of buildbots, but as far as I know there 
>> aren’t
>> very many of them. Running only the test suite would increase their utility 
>> by
>> roughly 5x (from looking at the buildbot time breakdowns [1]).
> 
> How would this increase their utility?  I naively believe the purpose
> of CI is to rebuild and test the source code after each changeset to
> see whether it introduces regressions.  Running only the test suite
> does not seem to achieve this.  Many regressions can only be observed
> at build time, which means the safest bet would be to rebuild and test
> everything on the very same platform.
> 
>> 2. Building ghc is time and resource intensive, which makes it hard for 
>> people
>> to host buildbots. Even though my machines are relatively new, I can’t 
>> usually
>> host one because it would interfere with my other work. I would be more
>> tempted to if it was limited to just the test suite, and perhaps others 
>> would as
>> well.
> 
> My buildbots complete the steps (git clone, full build, testing) in
> about 1 hour 40 minutes (with about 1 hour 15 minutes spent in the
> compilation phase), and they run in parallel, shifted by about an
> hour.  They run on the same machine, together with the coordination
> server.  This is just a 3.4-GHz 4-core Intel Core i5 with a couple of
> GB of RAM; I would not call it a high-end box, though.
> 
> Note that it is on purpose that the builders do not use -j for builds,
> meaning that they do not parallelize the invoked make(1) subprocesses,
> which naturally makes the builds take longer.  Perhaps it would be
> worth experimenting with incremental builds and allowing parallel
> builds, as they could cut down on the build times more efficiently.

_______________________________________________
ghc-devs mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/ghc-devs
