On 10/28/19 9:17 PM, Marcos Caceres wrote:
> On Tuesday, October 29, 2019 at 3:27:52 AM UTC+11, smaug wrote:
>> Quite often one has just a laptop. Not compiling tons of Rust stuff all the
>> time would be really nice.
>> (I haven't figured out when stylo decides to recompile itself - it seems to be
>> somewhat random.)
>
> Probably a gross misunderstanding on my part, but the sccache project page states [1]:
>
> "It is used as a compiler wrapper and avoids compilation when possible, storing a
> cache in a remote storage using the Amazon Simple Cloud Storage Service (S3) API, the
> Google Cloud Storage (GCS) API, or Redis."
>
> I'm still (possibly naively) imagining that we will leverage "the cloud"™️
> to speed up compiles? Or am I totally misreading what the above is saying?
>
> [1] https://github.com/mozilla/sccache#sccache---shared-compilation-cache
My experience with other distributed compilation tools (distcc, icecc)
indicates that cloud resources are going to be of very limited use here.
Compiles are just way too sensitive to network bandwidth and latency,
especially when compiling with debuginfo, which tends to be extremely
large. Even if the network transfer takes way less time than the
compile, the sending/receiving scheduling never seems to work out very
well and things collapse down to a trickle.
Also, I've had very limited luck using slow local machines. A CPU
is not a CPU -- even on a local gigabit network, farming off compiles
to slow machines is more likely to slow things down than speed them up.
Despite the fancy graphical tools, I was never completely satisfied with
my understanding of exactly why that is. It could be that a lack of
parallelism meant that everything ended up repeatedly waiting on the
slow machine to finish the last file in a directory (or whatever your
boundary of parallelism is). Or it could be network contention,
especially when your object files have massive debuginfo portions. (I
always wanted to have a way to generate split debuginfo, and not block
on the debuginfo transfers.) The tools tended to show things working
great for a while, and then slowing down to a snail's pace.
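(Split debuginfo along those lines does exist now: GCC and Clang support
-gsplit-dwarf, a.k.a. "Fission", which moves the bulk of the debuginfo
into a separate .dwo file next to the object, so only the slim .o would
need to cross the network. Filenames here are just illustrative:)

```shell
# Split DWARF: most debuginfo lands in foo.dwo, leaving foo.o small.
gcc -g -gsplit-dwarf -c foo.c -o foo.o   # emits foo.o plus foo.dwo
# The debugger later finds foo.dwo beside foo.o; binutils' dwp tool can
# optionally package the .dwo files referenced by an executable.
```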
I've long thought [1] that predictive prefetching would be cool: when
you do something (e.g. pull from mozilla-central), a background task
starts prefetching cached build results that were generated remotely.
Your local compile would use them if they were available, or generate
them locally if not. That would at least do no harm (if you don't count
network bandwidth).
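The hit-or-build logic itself is trivial -- the hard part is the
prefetching and the cache keying, not the lookup. A rough sketch (names
and the hashing scheme here are illustrative, not any real tool's API):

```python
import hashlib

cache = {}  # stands in for a local artifact directory, warmed by a prefetcher

def cache_key(source: str, flags: tuple) -> str:
    """Key an artifact by the compilation inputs (source text + flags)."""
    h = hashlib.sha256()
    h.update(source.encode())
    h.update(repr(flags).encode())
    return h.hexdigest()

def compile_locally(source: str) -> bytes:
    return ("obj:" + source).encode()  # placeholder for a real compile

def get_object(source: str, flags: tuple) -> bytes:
    key = cache_key(source, flags)
    if key in cache:                    # hit: prefetched or previously built
        return cache[key]
    obj = compile_locally(source)       # miss: fall back to local compile
    cache[key] = obj
    return obj
```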
sccache's usage of S3 makes sense when running from within AWS. I'm
skeptical of its utility when running remotely. But I haven't tried
setting up sccache on my local network, and my internet connectivity
isn't great anyway.
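(For anyone who does want to experiment, pointing sccache at a shared
backend is mostly environment variables -- the bucket and host names
below are made up:)

```shell
# S3 backend (or swap in SCCACHE_REDIS for a local-network Redis):
export SCCACHE_BUCKET=my-build-cache          # made-up bucket name
export SCCACHE_REGION=us-west-2
# export SCCACHE_REDIS=redis://cachebox:6379  # made-up host

export RUSTC_WRAPPER=sccache                  # route rustc through sccache
export CC="sccache gcc" CXX="sccache g++"     # and C/C++ compiles too
sccache --show-stats                          # check hits/misses afterwards
```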
I really ought to put my decade-old desktop into action again. My last
attempt was with icecc, and though it worked pretty well when it worked,
the pain of keeping it alive wasn't worth the benefit.
[1] Ancient history -
https://wiki.mozilla.org/Sfink/Thought_Experiment_-_One_Minute_Builds
_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform