On Fri, Feb 19, 2010 at 11:29 AM, Kristian Rosenvold <
kristian.rosenv...@gmail.com> wrote:

> It turns out that the biggest blocker in achieving /any/ reliable
> concurrent building within Maven is the Java file system, which basically
> seems to be limited to single-threaded visibility of file updates; I'm still
> trying to figure out what the rules /are/ in this context and I'm hoping
> someone here knows ;)  (for the gory details as far as I've gotten, you can
> check out this post
> http://incodewetrustinc.blogspot.com/2010/02/concurrency-in-maven.html).
>
> Essentially this problem affects both the "parallel" mode and "weave" mode.
> Weave mode has higher concurrency and is hit harder. It seems like the only
> "solution" to this problem is aggressive forking, since this seems to
> delegate the whole issue to the OS. This may also mean that reliable
> concurrent building will only be available on OSes that support reliable
> concurrent file visibility
> (modern Linuxes do this very well; I don't know about the others).
>
> Just to illustrate: on the C2D build box, my primary test build fails about
> once every 300-400 times with problems of this kind. Running the same build
> on the i7 box fails 1 in 5 times (the C2D runs a 32-bit JVM whilst the i7
> runs a 64-bit VM (server), which may also be relevant) - but it's file
> visibility issues that cause the failure.
>
> I'm still trying to understand what rules actually apply or what it is
> actually possible to trust in this regard. If I turn on full paranoia, it
> seems to me like even the current parallel artifact download mechanism
> can be hit by this problem, since the files are not being written to disk
> by the thread that later consumes them.
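
A minimal sketch of the kind of handoff that should be safe, assuming the
problem is ordinary Java Memory Model visibility between threads in one JVM:
if the download thread only publishes the file through something that
establishes happens-before, e.g. a BlockingQueue, the consumer is guaranteed
to see the fully written file. The class and method names below are made up
for illustration, not Maven's actual code:

    import java.io.File;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Hypothetical handoff between a download thread and a consuming build
    // thread. put()/take() on the queue establish a happens-before edge, so
    // the consumer sees everything the downloader did before the put -
    // i.e. the completely written, closed file.
    public class ArtifactHandoff {
        private final BlockingQueue<File> finished = new LinkedBlockingQueue<File>();

        // download thread: call only after the file is fully written and closed
        public void downloaded(File artifactFile) throws InterruptedException {
            finished.put(artifactFile);
        }

        // consuming (build) thread: blocks until an artifact is ready
        public File awaitNext() throws InterruptedException {
            return finished.take();
        }
    }

Whether the file system misbehaves even beyond that, within a single JVM, is
exactly the open question, of course.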
>
> It seems to me like the only "reliable" way to do this (that is also
> "future-proof") is to run every module single-threaded until the package
> phase, in regular reactor order (using just 1 thread *total*). Then you can
> fork "test" for all reactor modules and thereafter complete the rest of the
> reactor on the same 1 thread you had initially. I'm probably overly
> paranoid.
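
Just to make the ordering concrete, here is a rough sketch of the schedule
being described - using plain threads for the test step only to show its
shape; in practice the "fork" would presumably be separate JVMs, which is
what delegates the problem to the OS. Module and its methods are
placeholders, not real Maven internals:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Sketch only: Module and its phase methods are made-up placeholders.
    class SingleThreadThenForkedTests {

        void build(List<Module> reactorOrder) throws Exception {
            // 1. one thread total: run everything up to and including
            //    package, in regular reactor order
            for (Module m : reactorOrder) {
                m.runPhasesUpToAndIncluding("package");
            }

            // 2. run the test phase for all modules concurrently
            ExecutorService pool = Executors.newFixedThreadPool(
                    Runtime.getRuntime().availableProcessors());
            List<Future<Void>> tests = new ArrayList<Future<Void>>();
            for (final Module m : reactorOrder) {
                tests.add(pool.submit(new Callable<Void>() {
                    public Void call() {
                        m.runPhase("test");
                        return null;
                    }
                }));
            }
            for (Future<Void> f : tests) {
                f.get(); // propagate any test failure
            }
            pool.shutdown();

            // 3. back on the original single thread for the remaining phases
            for (Module m : reactorOrder) {
                m.runRemainingPhases();
            }
        }

        interface Module {
            void runPhasesUpToAndIncluding(String phase);
            void runPhase(String phase);
            void runRemainingPhases();
        }
    }

With plain threads, submit() alone already gives a happens-before edge from
the packaging work to each test task; forking separate JVMs would hand the
question to the OS instead.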
>
> Anyone have any thoughts/experience with this subject? Plz send me da
> codez ;)
>

I'm wondering how the Eclipse IDE gets around this. If you have, say, 40 Java
projects interconnected through project dependencies and you do a full
clean of all projects, it rebuilds them concurrently, taking the
project dependency graph into account. And since projects use each other's
class files during compilation (or so it seems), I'm thinking that at some
point, for large projects, you would get the same issue every now and then. So
maybe somebody over there could explain their approach to you and how they
solved it (maybe they're not using javac at all?).

HTH
Jorg
