On Wed, Feb 10, 2010 at 10:58 AM, John Tamplin <j...@google.com> wrote:
> On Wed, Feb 10, 2010 at 10:45 AM, Lex Spoon <sp...@google.com> wrote:
>>
>> Is copying source code so inconvenient that it would be worth having a
>> slower build? I would have thought any of the following would work to
>> move source code from one machine to another:
>>
>> 1. rsync
>> 2. jar + scp
>> 3. "svn up" on the slave machines
>>
>> Do any of those seem practical for your situation, Alex?
>>
>> Overall, it's easy to provide an extra build staging as an option, but
>> we support a number of build stagings already....
>>
>
> What does make it difficult is that you can't have a pool of worker
> machines that can build any project that are asked of them without
> copying the sources to the worker for each request. For a large project,
> this can get problematic, especially when you have to send the
> transitive dependencies.

You assume the answer here, John. The question is: just why is copying
source code problematic to begin with? Can anyone put their finger on it?

One concern is that the copying might take too long. However, is there any
project where it would take more than a few seconds? A few seconds seems
like no big deal for any build large enough to bother parallelizing.

Another possible concern is the need for some extra build configuration.
It doesn't take much *build time* to copy the dependencies, but it takes
*developer time* to set it up. Here I agree that it is some amount of
extra work. However, it doesn't seem like much. You have to know what your
dependencies are, and you have already worked out how to copy
precompilation.ser, so how much more work is it to also send over the
source code?

Overall, I see that it worries people to send source code to the
CompilePerms nodes. Yet it seems entirely normal to me. When you do a
distributed build, all the remote workers must have their inputs copied to
them over the network.
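For concreteness, here is a minimal sketch of option 2 above ("jar + scp"). It is demonstrated locally with tar so it runs without a remote host or a JDK; the directory names are hypothetical stand-ins, and in a real setup the unpack step would run on the worker after an scp.

```shell
set -e
src=$(mktemp -d)      # stands in for the project's source tree
worker=$(mktemp -d)   # stands in for a worker machine's work area
bundle=$(mktemp)

# Populate a toy source tree.
mkdir -p "$src/com/example"
printf 'public class Hello {}\n' > "$src/com/example/Hello.java"

# Bundle the sources (in a real setup: jar cf sources.jar -C src .) ...
tar -cf "$bundle" -C "$src" .
# ... ship the bundle (scp sources.jar user@worker:) and unpack it:
tar -xf "$bundle" -C "$worker"

# The worker now has an identical copy of the sources.
diff -r "$src" "$worker"
```

The rsync variant is a one-liner (`rsync -a src/ user@worker:build/src/`) and has the advantage of only transferring files that changed since the last request.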
> Besides, what is gained by having the user have to arrange this copying
> themselves rather than the current method of sending it as part of the
> compile process? For example, distributed C/C++ compilers send the
> preprocessed source to the worker nodes, so they don't have to have the
> source or the same include files; we currently send the AST, which is a
> representation of the source; etc.

Compared to the status quo, we gain much faster builds. Compared to
automatically copying, we have a fully specced-out proposal. :) If we try
to automatically copy dependencies, how would we know exactly what to
copy?

Lex

--
http://groups.google.com/group/Google-Web-Toolkit-Contributors