Jason van Zyl <[EMAIL PROTECTED]> schrieb am 04.12.2006 17:43:29:
> > As I said before, I've had very good results with a build system in
> > which
> > you can specify arbitrary phases and say "this phase depends on
> > that one".
>
> And how much luck have you had showing that system to other people?
> And how much luck have other people had looking at what you've made
> without them having to consult you?
It's about as hard to explain and understand as the Maven dependency
system. It's just an odd idea at first glance.
You see, just like Maven, we have several "big" phases ("generate",
"compile", "link", "site").
Eventually, we added generate-includes, generate-source, generate-scripts,
generate-makefiles, ...
The key to making this simple was "inversion of dependency". In my project
(module) I don't say "my new phase depends on generate" but "generate
depends on my new phase". When the build tool decides to run
"generate", it first considers all the phases which "pre-depend" on it.
Like so:
## generate:: generate-includes
Now, MetaMake will look for projects which define "generate-includes" (or
new dependencies for it) and runs those phases/goals/targets first.
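To make the mechanism concrete, here is a minimal sketch (in Python, not MetaMake's actual C code; all names are illustrative) of how such inverted-dependency resolution might work: each project registers its new phase as a prerequisite of an existing phase, and the resolver runs those prerequisites, recursively, before the phase itself.

```python
from collections import defaultdict

# phase -> phases that must run *before* it. A line like
# "## generate:: generate-includes" in a project's makefile registers
# the new phase as a prerequisite of the predefined "generate" phase.
pre_deps = defaultdict(list)

def declare(big_phase, new_phase):
    """A project saying '## big_phase:: new_phase'."""
    pre_deps[big_phase].append(new_phase)

def resolve(phase, done=None, run=None):
    """Run everything that pre-depends on `phase`, then the phase itself."""
    if done is None:
        done, run = set(), []
    if phase in done:
        return run
    done.add(phase)
    for dep in pre_deps[phase]:
        resolve(dep, done, run)
    run.append(phase)
    return run

# Two independent projects hook into the predefined "generate" phase:
declare("generate", "generate-includes")
declare("generate", "generate-scripts")
declare("generate-includes", "icon-generator-install")

print(resolve("generate"))
# -> ['icon-generator-install', 'generate-includes', 'generate-scripts', 'generate']
```

Note that neither project had to know about the other, and neither had to touch the definition of "generate" itself; that is the "inversion".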
This way, each developer can create their own little world, making sure
all their phases are called in the correct order. You need a second
"generate include" phase? No problem. Hooking them into the big world is
also simple because of the predefined phases which are always there.
The implementation was dead simple (4h in plain C) and worked reliably on
the first attempt. We only had to add caching because searching 20'000
Makefiles for targets was a bit slow :-) My project (AROS) has been using
this build system for several years now. Currently, we have roughly 5'000
phases which, oddly enough, makes the build *simpler*.
Let me explain. You have a big Maven project with hundreds of modules (we
have about 200). How do you make sure that module X is built and ready
before module Y? Or, say, what if I need a generated file from a module
now, but not all of it? In current Maven, you have to move things around
in the parent POMs. The dependencies "leak".
In our world, I say:
Project X:
## install:: workbench-icon
## workbench-icon:: icon-generator-install
... create the workbench icon ...
Project Y:
## site:: workbench-icon
... use the icon ...
Let's imagine I have a clean checkout and want to see the website. Even
though workbench hasn't been built completely yet, the single icon file
can be generated once the icon generator has been installed. When I say
"mmake site", it will compile and install the icon generator, generate the
workbench icon and then build the HTML files.
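The point of the example is that resolution is demand-driven: "site" pulls in only the "workbench-icon" phase and its prerequisite, not the rest of workbench. A small self-contained sketch (again Python, phase names taken from the example above, everything else illustrative) shows the chain:

```python
from collections import defaultdict

# Inverted dependencies from the two projects:
pre = defaultdict(list)
pre["install"].append("workbench-icon")              # Project X: ## install:: workbench-icon
pre["workbench-icon"].append("icon-generator-install")  # ## workbench-icon:: icon-generator-install
pre["site"].append("workbench-icon")                 # Project Y: ## site:: workbench-icon

def schedule(phase, seen=None):
    """Return the phases to run, prerequisites first."""
    if seen is None:
        seen = set()
    order = []
    if phase not in seen:
        seen.add(phase)
        for dep in pre[phase]:
            order += schedule(dep, seen)
        order.append(phase)
    return order

print(schedule("site"))
# -> ['icon-generator-install', 'workbench-icon', 'site']
```

Asking for "site" never touches "install" or the rest of workbench, which is exactly the fine-grained behavior described above.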
In Maven, this level of detail is probably not possible. Maven projects
are more encapsulated than ours, and I don't think it's even necessary to
go to such lengths in Maven. Operating systems usually have many small
components with heavy dependencies between them, which is why we chose the
approach described above.
What I would like to see is a simple way to "'install' depends on my new
phase/target/goal" in a Mojo and in the execute-elements in the POM. That
way, I could do extra work before something is installed, without
influencing the standard lifecycle, other projects or plugins.
> Creating a system
> where you have random interaction can potentially create a system
> with extremely high infrastructural costs. Shared infrastructure
> means lower costs and that means a predictable system.
Random? Since when is the Maven dependency resolution "random"? I haven't
seen the code, but from what I can tell, it probably works exactly like
the MetaMake target resolution.
Regards,
--
Aaron Digulla
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]