Hi Erik

I already LGTM'd the CL, but for the record:

While jumbo support was dropped, we're happy to support our external
contributors with this, provided the changes are minimal and no ongoing
support is expected.
So undeffing macros makes sense, and renaming minor things is also possible.
However, having tons of named namespaces would be out of scope.

Thanks,
Toon


On Thu, Feb 26, 2026 at 11:44 AM Erik Corry <[email protected]> wrote:

> It's not quite as radical as the jumbo build. As I recall, that tried
> to compile everything in one huge compilation unit. That meant it wasn't
> usable for regular development, because a single .cc file change would
> take 4 minutes to rebuild.
>
> In this case we are compiling up to 25 files at a time, and grouping
> them by directory. That means a .cc file change still rebuilds in only
> about 30s, but the saving in total CPU time is not as dramatic. It also
> means you don't get conflicts between completely unrelated files, only
> between files in the same directory. The 25-file limit is configurable
> in your 'gn gen' invocation or your args.gn; if you have a very large
> or small number of cores you may want to adjust it. I don't think
> GitHub builders have a huge number of cores, for example.
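> For illustration, configuring that might look like the following; the
> argument name below is a placeholder, not checked against the CL (see
> `gn args out.gn/x64.release --list` for the real name):
>
> ```
> # out.gn/x64.release/args.gn
> # "v8_cluster_size" is a hypothetical name for the 25-file limit.
> is_debug = false
> v8_cluster_size = 50
> ```
>
> After editing args.gn, the next ninja invocation regenerates the build
> files automatically, so no separate 'gn gen' rerun is needed.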
>
> Sadly your numbers show how much we have regressed.  Neither the
> regular build, nor this cluster build are close to those numbers.
>
> I tried to get comparable numbers to your 8 year old test:
> * Building just d8 (not the tests)
> * -j8
> * No ccache
> * Current origin/main
> * After ninja -t clean
> * Xeon (probably much faster than your CPU of 8 years ago)
>
> On Thu, Feb 26, 2026 at 10:03 AM Ben Noordhuis <[email protected]> wrote:
> > 1. V8, clean normal build, make -j8: 6:30m wall clock time, 47:48m cpu time
>
> Now, despite faster CPUs, it's half that speed: 13m49s wall clock time,
> 104:51m cpu time.
>
> > 2. V8, clean jumbo build, make -j1: 4:37m wall clock time, 4:34m cpu time
>
> I hadn't tried a -j1 cluster build until now, but here it is. Sadly,
> although the cpu time is 4x better than the non-cluster build, it's 6x
> slower than the old jumbo build.
> Cluster size turned up to max: 24m40s wall clock time, 24m31s cpu time.
>
> For a somewhat more realistic modern developer experience, here's the
> -j16 time for the same workload, default cluster size:
>
> $ time ninja -j16 -C out.gn/x64.release d8
> ninja: Entering directory `out.gn/x64.release'
> [2425/2425] LINK ./d8
>
> real 2m21.231s
> user 33m45.009s
> sys 1m37.414s
>
>
> >
> > It was so much faster in human time and CPU time - staggeringly so in
> > the case of the latter. Granted, that was 8 years ago!
> >
> > [0] https://chromium-review.googlesource.com/c/v8/v8/+/1890090
> > [1] https://github.com/nodejs/node/issues/18742
> >
> > --
> > --
> > v8-dev mailing list
> > [email protected]
> > http://groups.google.com/group/v8-dev
> > ---
> > You received this message because you are subscribed to the Google
> Groups "v8-dev" group.
> > To unsubscribe from this group and stop receiving emails from it, send
> an email to [email protected].
> > To view this discussion visit
> https://groups.google.com/d/msgid/v8-dev/CAHQurc_2dDruhqJG5RF1peNB9vVv9s5UuUwz9U_kVhnB5W-gtg%40mail.gmail.com
> .
>
>
>
> --
> --
> Erik Corry, working on V8 at Cloudflare
>
