Re: no more modules for specs...

2006-12-17 Thread Jason Dillon

On Dec 16, 2006, at 4:49 PM, David Jencks wrote:
Why do you want to rebuild released jars?  I certainly think the  
automated system should be rebuilding all the non-released code we  
know about, but I don't understand the point of ever rebuilding  
released code.  Is this because you think the jar in the remote  
repo will change?  I would think saving the expected hashcode and  
comparing with the actual hashcode would be more reliable.


Two reasons... one, to always be 100% sure that the codebase reflects the released binary; and two, to ensure that the codebase is always buildable.


This moves the trust from the released binary on some webserver  
somewhere back to the source code repository.



I don't really see rebuilding from source as a defense against the  
remote repo changing.  Everyone else is going to be using the  
remote repo, so even if we have a more correct locally built  
version everyone else will be screwed.


I don't see it that way at all... by building from source, if anything does happen to the remote artifacts, then it will be quickly apparent what happened... and what's more, the automated builds will keep working.  But as mentioned above, there is more to building the artifacts than simply defending against artifacts being altered or removed.



I would think using an svn based repo or keeping our own audit  
trail (such as the hashes for every released artifact we use) would  
be more reliable.  If some released artifact changes, I think no  
automated recovery is possible: someone has to figure out why and  
figure out what to do about it, since maven allegedly guarantees  
that it will never happen.


Sure it will happen... and has happened and will continue to happen.  Just because Maven's repo has some policy against artifact removal or tampering does not mean that someone can't hack into the system and change it... or a catastrophe might occur and lose a bunch of data.  But more so, other groups run their own repositories, and there is no way to ensure that they follow the same set of policies as the Maven folks do... and even with those policies in place, it is still possible that artifacts may change.  Thus it's not a really trustworthy repository of artifacts that we can rely upon to have the right artifacts to build past releases from.
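
The hash audit trail being debated here -- record a known-good checksum per released artifact, verify before each automated build -- could be sketched roughly like this. The repo layout and artifact names are illustrative, not from the thread:

```shell
# Hypothetical audit trail: record a SHA-1 per released artifact at
# release time, then verify the copy actually resolved before building.
mkdir -p repo
printf 'jar-bytes-v1' > repo/geronimo-spec-1.2.jar

# Recorded once, when the artifact is released.
sha1sum repo/geronimo-spec-1.2.jar > expected.sha1

# Run before every automated build; any mismatch demands human attention.
if sha1sum -c expected.sha1 > /dev/null 2>&1; then
  echo "artifacts verified"
else
  echo "artifact changed upstream -- stopping for manual review" >&2
fi
```

As David notes, no automated recovery is possible on a mismatch; the script can only stop the build and flag a human.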


So, to mitigate some of that risk, I set up our components to build from source... so that we can always have automated builds running.  Even in interim times like we just saw, where artifacts have not yet propagated to central, we could have still ensured that the build was functional.  But to do that effectively means that some practices for component integration need to be followed.




maybe I'm just being stupid but I'm not getting it yet.


No, you are not stupid...

I think my goals are simply a lot different from many others' regarding these builds.  I want to build a system that can be used to run builds 100% of the time regardless of what dependencies are propagated into what repo; builds that can be easily distributed across nodes, using SNAPSHOT artifact outputs from one build in another; builds where you can easily run a set of tests after a build and see the entire chain of changes which were pulled into the build, to correlate test pass/failures back to actual changes.


I am close to getting this done... recent work has created some extra problems for me to solve, though.  But you will be able to run TCK tests on a specific CTS build, which was generated from a specific server build, which was using a specific openejb2 build, etc.


Once the specs that have been released make it to central, then I can fix up the AH configuration and get back on track... we just lost a few days.


BUT... when the time does come that a spec needs to be fixed, or a new spec is introduced, then expect the automated builds to start failing with missing dependencies if there is no SNAPSHOT deployed... and if there is a SNAPSHOT deployed, then expect that at times builds may be using different artifacts for those modules... there is no way for me to determine where they came from or what specific code they contain.


And specs will change... this does happen... new ones do get added,  
that does happen, bugs are found, license muck might need to change  
again, blah, blah, blah.


I was trying to avoid any automated build breakage when we need to make any of those changes to our specs... but I guess I will just have to deal with it later when it does happen and shit starts breaking.  Or maybe I will just leave it broken and let someone else fix it... though my gut tells me it's gonna be me.


--jason


Re: no more modules for specs...

2006-12-16 Thread Jason van Zyl


On 16 Dec 06, at 7:49 PM, David Jencks wrote:



On Dec 16, 2006, at 1:58 PM, Jason Dillon wrote:


On Dec 16, 2006, at 9:33 AM, Jason van Zyl wrote:
IMO, we release source code. Binary distributions and maven  
artifacts are a convenience. If users can't build our source  
code, then there's a problem.


You think your users build from sources to make their Geronimo  
servers for production or are you talking about just the specs? I  
would argue that it's rare for users to want to build everything  
from source, but even if they only built the Geronimo sources  
they still need all the binary dependencies at which point the  
quality of the repository matters. I think the discussion is  
germane in the context of your users building production systems  
from source.


The *user* that wants to build everything from source is me... for automated builds.  For our builds (and, I had hoped, our releases too) that use the automated system, everything is always built from source for our components, so that I can be 100% assured that when I make a build I know exactly what code (from our components) was included.



My understanding is that geronimo (and openejb) are going to be  
using the latest released specs that we just voted on until someone  
finds a bug in one of them.


Why do you want to rebuild released jars?  I certainly think the  
automated system should be rebuilding all the non-released code we  
know about, but I don't understand the point of ever rebuilding  
released code.  Is this because you think the jar in the remote  
repo will change?  I would think saving the expected hashcode and  
comparing with the actual hashcode would be more reliable.


I don't really see rebuilding from source as a defense against the  
remote repo changing.  Everyone else is going to be using the  
remote repo, so even if we have a more correct locally built  
version everyone else will be screwed.  I would think using an svn  
based repo or keeping our own audit trail (such as the hashes for  
every released artifact we use) would be more reliable.  If some  
released artifact changes, I think no automated recovery is  
possible: someone has to figure out why and figure out what to do  
about it, since maven allegedly guarantees that it will never happen.


maybe I'm just being stupid but I'm not getting it yet.



You have it exactly. You would do exactly what large enterprises do in managing their own repositories. Some do it for security reasons, and some have to do it because the metadata for some projects is just crap (Spring and Hibernate are great examples of completely hosing users with bad metadata). It's not a trivial problem trying to make this work for all the projects using Maven, but I still believe it is possible and not an intractable problem. Use a repo in SVN if you have to, layer that with use of the central repository for things you know are good, and work toward tapering off the need for a custom repo.


Users will eventually be the impetus for changing the quality of the repositories, as it will be a great indicator: a project that can't get its shit together to do something relatively simple, like put some signed JARs on a webserver in a consistent way, probably doesn't have good test coverage, good CI, API stability, or a good level of quality. It will be a clear indicator of control over your processes, and folks trying to consume various artifacts will just go find something that can be immediately absorbed into their system, i.e. not use your stuff.


I'm definitely practical, and I have never advocated using the central repository to any enterprise client I've had. I'm the first to admit that a full audit trail and absolute levels of security are not in effect. But the central repository is still highly useful, especially when you use it to populate the repository you do use for your production purposes.


Jason.


thanks
david jencks



The remote repo is still there for other users that don't need  
that assurance or don't have time to go and build everything...  
but I do want that... and I believe that it is in the best  
interest of the community to get that too.


--jason








Re: no more modules for specs...

2006-12-16 Thread David Jencks


On Dec 16, 2006, at 1:58 PM, Jason Dillon wrote:


On Dec 16, 2006, at 9:33 AM, Jason van Zyl wrote:
IMO, we release source code. Binary distributions and maven  
artifacts are a convenience. If users can't build our source  
code, then there's a problem.


You think your users build from sources to make their Geronimo  
servers for production or are you talking about just the specs? I  
would argue that it's rare for users to want to build everything  
from source, but even if they only built the Geronimo sources they  
still need all the binary dependencies at which point the quality  
of the repository matters. I think the discussion is germane in  
the context of your users building production systems from source.


The *user* that wants to build everything from source is me... for automated builds.  For our builds (and, I had hoped, our releases too) that use the automated system, everything is always built from source for our components, so that I can be 100% assured that when I make a build I know exactly what code (from our components) was included.



My understanding is that geronimo (and openejb) are going to be using  
the latest released specs that we just voted on until someone finds a  
bug in one of them.


Why do you want to rebuild released jars?  I certainly think the  
automated system should be rebuilding all the non-released code we  
know about, but I don't understand the point of ever rebuilding  
released code.  Is this because you think the jar in the remote repo  
will change?  I would think saving the expected hashcode and  
comparing with the actual hashcode would be more reliable.


I don't really see rebuilding from source as a defense against the  
remote repo changing.  Everyone else is going to be using the remote  
repo, so even if we have a more correct locally built version  
everyone else will be screwed.  I would think using an svn based repo  
or keeping our own audit trail (such as the hashes for every released  
artifact we use) would be more reliable.  If some released artifact  
changes, I think no automated recovery is possible: someone has to  
figure out why and figure out what to do about it, since maven  
allegedly guarantees that it will never happen.


maybe I'm just being stupid but I'm not getting it yet.

thanks
david jencks



The remote repo is still there for other users that don't need that  
assurance or don't have time to go and build everything... but I do  
want that... and I believe that it is in the best interest of the  
community to get that too.


--jason





Re: no more modules for specs...

2006-12-16 Thread Jason Dillon
I'm sorry, I did not have the luxury of waiting... this all blew up in my face, effectively blocking progress I had verbally committed to finishing.  Now I have to come up with another solution... or simply stop working on this.


 * * *

I wish I could have convinced you (and others) that this versioning scheme was going to cause problems... before it actually caused them.  But it is clear that any further talk about the versioning was going to get us nowhere.


--jason


On Dec 16, 2006, at 10:57 AM, Dain Sundstrom wrote:

I wish you had just waited a day while we stabilized after I  
released 32 projects (or 5 if you don't want to count the specs  
individually).  I am going to be merging everything to the staging  
repo into the release dir.


After the merge we will not have the SNAPSHOT mess that created  
this problem in the first place.  I only hope we use this as a  
learning experience and do not use SNAPSHOTS so readily.


-dain

On Dec 15, 2006, at 11:24 PM, Jason Dillon wrote:


I agree and disagree... but I'm gonna shut up.

I think you guys are moving in positive direction with mvn...

I'm just pissed off that the recent changes to specs have derailed my automation efforts.


I certainly did not mean for this email to turn into a mvn ranting  
session.


and maybe tomorrow I can cope better with this and provide a  
reasonable response to this email.


I appreciate the work you guys have been doing... and while not  
perfect, IMO it is positive direction.


and I will leave it at that for now.

thanks for your time... sorry if I ruffled some feathers.







Re: no more modules for specs...

2006-12-16 Thread Jason Dillon

On Dec 16, 2006, at 9:33 AM, Jason van Zyl wrote:
IMO, we release source code. Binary distributions and maven  
artifacts are a convenience. If users can't build our source code,  
then there's a problem.


You think your users build from sources to make their Geronimo  
servers for production or are you talking about just the specs? I  
would argue that it's rare for users to want to build everything  
from source, but even if they only built the Geronimo sources they  
still need all the binary dependencies at which point the quality  
of the repository matters. I think the discussion is germane in the  
context of your users building production systems from source.


The *user* that wants to build everything from source is me... for automated builds.  For our builds (and, I had hoped, our releases too) that use the automated system, everything is always built from source for our components, so that I can be 100% assured that when I make a build I know exactly what code (from our components) was included.


The remote repo is still there for other users that don't need that  
assurance or don't have time to go and build everything... but I do  
want that... and I believe that it is in the best interest of the  
community to get that too.


--jason



Re: no more modules for specs...

2006-12-16 Thread Jason Dillon

On Dec 16, 2006, at 8:26 AM, Kevan Miller wrote:
Jason Dillon, what source are you checking out? geronimo/specs/ 
trunk? I see that geronimo/specs/trunk still contains many sub- 
directories. IIUC, this is just a point in time statement. geronimo/ 
specs/trunk/pom.xml should be updated to be at 1.3-SNAPSHOT. Those  
sub-directories should be going away. Only specs which are under  
development should reside in trunk. At least, that's what I thought  
we'd agreed to... Released specs should be in tags. If you want to  
build our released specs from source, you need to get them from  
tags and build each individually. If that's not working, then I'd  
agree there's a problem.


IMO it is very, very, very unnatural to move code from its trunk codeline to a tag, then back again when new changes need to be made.  I would never recommend doing that... ever.  IMO that is abuse of svn's tagging mechanism... and I think it only looks desirable because of issues with how we are using Maven, or with how Maven's release support is currently implemented.


--jason




Re: no more modules for specs...

2006-12-16 Thread Jason Dillon

On Dec 16, 2006, at 6:08 AM, Jason van Zyl wrote:
I think that the mvn build we have now is already fairly hard for folks to comprehend, and it would probably fall apart unless someone like me was here to answer everyone's questions and monitor changes to keep things in check.  I think that is no different than if we were using Ant.


I think you underestimate what other people know about Maven, in that they can apply their knowledge of the infrastructure from other projects here, provided you're following standard conventions. 
And it's complicated because you're fighting a losing battle with  
too many SNAPSHOTs. Even if the snapshot artifact resolution was  
perfect you would still have these problems. You are getting bitten  
by some severe technical problems in Maven, no doubt, but you are  
exacerbating the situation by using the "SNAPSHOT" identifier  
everywhere which inherently means "a shifting pile of sand" to  
Maven. Start moving to eliminate them and your instabilities will  
lessen tremendously.
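
The "shifting pile of sand" point can be made concrete with a toy resolver: the same -SNAPSHOT version string maps to whichever timestamped deploy happens to be newest, so two builds can silently use different artifacts. The layout and names below are illustrative, not Maven's actual resolution algorithm:

```shell
# Toy illustration of SNAPSHOT drift: "1.2-SNAPSHOT" resolves to the
# newest timestamped deploy present at resolution time.
mkdir -p repo/foo/1.2-SNAPSHOT
resolve() { ls repo/foo/1.2-SNAPSHOT | sort | tail -n 1; }

touch repo/foo/1.2-SNAPSHOT/foo-1.2-20061215.120000-1.jar
first=$(resolve)

# Someone else deploys a new snapshot between our two builds...
touch repo/foo/1.2-SNAPSHOT/foo-1.2-20061216.093000-2.jar
second=$(resolve)

echo "build 1 used: $first"
echo "build 2 used: $second"
```

The two builds resolve different jars even though the pom never changed, which is exactly the instability being argued over here.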


Why the heck are you going off on SNAPSHOTS?  The mail I sent initially was talking about versioning of a project's modules... and some pitfalls with remote repositories... and you basically turned that into SNAPSHOT this and SNAPSHOT that...



I actually think that Ant + a few tasks to manage limited remote  
dep inclusion + groovy scripting would be a very powerful  
combination and would be at least half as complicated as our Maven  
build.


That's what people always say, but I have first hand experience  
with many large companies with large builds and they are moving  
away from their homebrew systems because they are sinking projects.  
If you used Maven in a way you describe above you would have  
stability i.e. no snapshots. If you're going to script something  
then create a plugin to walk your dependencies and replace the  
"SNAPSHOT" markers with real timestamps and you will have  
stability. You are so much closer to having something working all the time than you think.


Again with the SNAPSHOT muck... it's more about the entire remote repo handling than it is about SNAPSHOTS.  Even if a project did not use any SNAPSHOTS, the same basic problems with remote repos exist... they are not part of the direct audit trail of the project, and it's quite easy for anyone to change any artifact, which could completely fuck your build over without you even knowing about it.


If you take the remote repository out of the mix, then this can't  
happen... since all of your inputs must be local, and probably in  
your source control repo, you have a changelog, you can see when  
someone changes an artifact and with a build automation system setup  
you can see when those changes break things.  Your build is also not  
at the mercy of your network connectivity... and thus less likely to  
freak out when a network hiccup occurs, less likely to waste your  
time building 80% of a project just to have it die on a missing  
dependency due to a repo being blacklisted. This is my point...  
nothing at all to do with using SNAPSHOTS or not.
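
A build with all inputs local can also fail fast, checking everything before work starts instead of dying 80% of the way through on a missing dependency. A minimal sketch of that pre-flight check (repo layout and artifact names are made up):

```shell
# Verify every required artifact exists locally before starting a long
# build; no network access means no mid-build surprises.
LOCAL_REPO=./local-repo
mkdir -p "$LOCAL_REPO/org/apache/geronimo"
touch "$LOCAL_REPO/org/apache/geronimo/spec-a-1.0.jar"

missing=0
for a in org/apache/geronimo/spec-a-1.0.jar \
         org/apache/geronimo/spec-b-1.0.jar; do
  if [ ! -f "$LOCAL_REPO/$a" ]; then
    echo "missing local input: $a" >&2
    missing=1
  fi
done

if [ "$missing" -ne 0 ]; then
  echo "refusing to start the build"
else
  echo "all inputs local; building"
fi
```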


Ant does not force a remote repo on to its projects, and IMO that is  
a huge plus over Maven.  Maven still has its plugins and reactor  
going for it though.  I still like Maven... I just want to have  
complete control over how and when it uses its remoteness to pull in  
stuff that will affect my builds.  And more so I want to be able to  
disable its default remoteness (central) and hardcode my projects to use an svn-based repo implementation that will always have the right versions of my dependencies.
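
A sketch of what that hardcoding might look like in a Maven 2 pom: a repository with the id `central` shadows the built-in default, so Maven never consults the public repo on its own. The svn-served URL is hypothetical:

```xml
<!-- Hypothetical: shadow "central" with a repo we control, so Maven
     never reaches out to the public repository by default. -->
<repositories>
  <repository>
    <id>central</id>
    <url>https://svn.example.org/our-artifact-repo/trunk</url>
    <releases><enabled>true</enabled></releases>
    <snapshots><enabled>false</enabled></snapshots>
  </repository>
</repositories>
```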



I still think remote repos suck... but, maybe you guys will  
eventually find a better solution to that.


I don't think they suck, I think you're just getting bitten  
severely by the erratic snapshot handling and overuse of snapshots.  
We will eventually get the repository under control as it's clear  
now they have become a mission critical resource for many people,  
we understand that and it will be fixed. There's going to be a lot  
of bitching when we actually turn on the gauntlet. Any release, for  
example which contains a "SNAPSHOT" identifier anywhere in the  
graph will simply be rejected.


No... as explained above... it's much, much, much more than the limited SNAPSHOT context which you seem to be stuck in.


--jason




Re: no more modules for specs...

2006-12-16 Thread Jason Dillon

On Dec 16, 2006, at 5:51 AM, Jason van Zyl wrote:
The central repository itself has always been pretty stable with no safeguards. I realize not having these safeguards is not great, but things don't just disappear off that machine. We have a huge problem, it appears, with the syncs we are pulling in automatically. Organization-wide syncs are soon going to stop and it's going to be per-project, so that when garbage appears we will know immediately who's polluting the repository; Archiva will also keep track of deletions. So yes, I agree on one hand that we need a watchdog in place, but we are not randomly jumbling stuff around on the central repository. We're getting burned from our source syncs and the misuse of SNAPSHOT repositories for the most part.


Put the contents of central under svn, and expose the change log... that would be a good start.  That will help reduce the window of error, though it does not completely eliminate it.  If you also allow the central repo to be pulled via svn's versioned DAV protocol, then you can also eliminate problems when an artifact is being changed... so that when the build runs, it will always use the same rev# when retrieving artifacts (picking whatever is the latest at the time it's initialized and keeping that for the entire cycle).


I think that all mvn repos should behave like this actually... and if  
they did I would have much less to complain about, since that would  
provide a rich audit trail and atomic access to artifacts.


Only thing left would be to allow projects to add specific tags to  
those artifacts on central during release time, so that at any point,  
a project could always get the complete set of artifacts it needed to  
build.



Another comment I will make is that I am fairly sure there are  
severe bugs in the maven artifact resolution process when  
snapshots are present.


There are a huge number, I believe it's completely unreliable and  
it's going to need an overhaul. It was very apparent from my last  
round of travels that in many cases especially when snapshots are  
used there are severe problems. I think we underestimated the use of snapshots and how prevalent their use would be for external dependencies.


Ya, maybe... I think almost everyone is using them for development,  
and most people who are working with other groups to get changes made  
are pulling in those snapshots.


This appears to be the recommended way to handle these types of rapidly changing dependencies using Maven.  IMO it's lossy and harmful in many cases... and that is why I prefer using source repos to handle this, so that I can always get a snapshot of the source code, build it, and integrate it.  I believe mvn's snapshots are really only a convenience for folks that don't want to build a bunch of stuff, but are willing to live with the possibility that they are not using the latest version of the codebase.


But, personally... and specifically for build automation, I would  
much rather work directly from source code so that I was 100% sure  
what was getting pulled in, and have a changelog to correlate with.   
And... well, currently Maven does more to get in the way to implement  
that solution than it does to help get there.



This is because if I remove all org.apache.geronimo.modules artifacts from my local repo and build the corresponding part of geronimo, then building online I usually get errors together with downloaded timestamped artifacts, whereas building offline succeeds.


Yup, that's a patch we applied for Jason to provide a stopgap  
solution. Where no snapshots will be updated when building.


Ya, thanks by the way.  Not sure when I will actually get to be able to use that though... I don't think I want to recommend that anyone use a pre-release of Maven to get around issues with the current version.



Note carefully that I am only building geronimo artifacts and  
there is no change whatsoever in non-geronimo artifacts in my  
local repo.  I think nearly every time we've made a change  
involving more than one module since we started using m2 and  
pushing snapshots to the snapshot repo we've had user complaints  
that the build is broken, and the solution always is to build  
offline.


Snapshots are an inherent instability, but there are definitely errors in working with snapshots in maven-artifact and it's bad. I see it as the most critical problem with 2.0.x. But moving toward using fewer of them, even if that means locking to some timestamped versions, will help greatly.


Timestamped versions cause enough problems by themselves... and I don't recommend using them directly.


The thing is that... we need a simple mechanism to easily pick up changes made by other dependency projects without needing them to make a release and/or changing our poms to pick up new bits.  It would be a PITA if every time a change was made to OpenEJB to support G

Re: no more modules for specs...

2006-12-16 Thread Jason Dillon

On Dec 16, 2006, at 1:14 PM, Kevan Miller wrote:
Jason Dillon said he was having a problem building our specs. I'm  
trying to understand his problem.


There are 2 issues... first, specs/trunk does not build as-is (which I assume dain is gonna tidy up)...


Second, the 1.2 branch now depends on specs which are not in one place for me (or anthill in this case) to easily check out, build, and then deliver to be used by the openejb2 and geronimo builds.  The code is now split across 30-some tags, which becomes a real pain for me to automate building or to manage as specs are changed and versions/tags need to be updated.
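
To make that pain concrete, here is roughly what the automation now has to do: one checkout-and-build per released spec tag instead of a single specs/trunk build. The tag names below are illustrative (the real list was 30-some), and the commands are only echoed, not run:

```shell
# Dry run: print the per-tag steps an automated build would need.
BASE=https://svn.apache.org/repos/asf/geronimo/specs/tags
for tag in geronimo-jta_1.1_spec-1.1 \
           geronimo-jms_1.1_spec-1.1 \
           geronimo-ejb_2.1_spec-1.1; do
  echo "svn checkout $BASE/$tag $tag && (cd $tag && mvn install)"
done
```

Every spec change then means updating this tag list and the version numbers it encodes, which is the maintenance burden being described.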


I really would like to say, that server/trunk uses all of the modules  
from specs/trunk, and that openejb/branches/2_2 uses specs/trunk.


Before the change, I could almost say that... but it was more like server/trunk uses a handful of modules from specs/trunk and the rest were all from public repos.  The recent change has made it so that none of the modules from specs/trunk are used, and that handful of modules is not yet on a public repo, which means that the codeline will not build.


And since I have a direct build-artifact-output dependency tree set up in Anthill to go from nothing to a CTS server... with that second link broken (specs) I can't build anything past it.  And I cannot finish up the TCK testsuite automation, which I was close to completing.


--jason


Re: no more modules for specs...

2006-12-16 Thread Kevan Miller


On Dec 16, 2006, at 12:33 PM, Jason van Zyl wrote:



On 16 Dec 06, at 11:26 AM, Kevan Miller wrote:



On Dec 15, 2006, at 6:41 PM, Jason Dillon wrote:

this change really killed me for all of the build automation I  
have been working on... and was one of the reasons why I had  
warned against using this style of versioning.


basically since this was removed, checking out the specs project  
and building it produces nothing, except for this:


org/apache/geronimo/specs/specs/maven-metadata-local.xml
org/apache/geronimo/specs/specs/1.2/specs-1.2.pom
org/apache/geronimo/specs/specs/1.2/specs-1.2-site.xml




I think this discussion has gotten a bit off track. I don't think  
we should be discussing archiva capabilities, at all.


IMO, we release source code. Binary distributions and maven  
artifacts are a convenience. If users can't build our source code,  
then there's a problem.


You think your users build from sources to make their Geronimo  
servers for production or are you talking about just the specs? I  
would argue that it's rare for users to want to build everything  
from source, but even if they only built the Geronimo sources they  
still need all the binary dependencies at which point the quality  
of the repository matters. I think the discussion is germane in the  
context of your users building production systems from source.


Jason van Zyl,
No, I don't expect users to be building specs from source. However,  
if users choose to build our sources, I expect our sources to build.  
If users find problems, I expect that they can patch our source and  
build patched versions of our software. Do you disagree with any of  
that?


Jason Dillon said he was having a problem building our specs. I'm  
trying to understand his problem.


The behavior and reliability of maven repos is a great discussion to  
have. It's just moving off of what I think is the core issue...


--kevan 
  


Re: no more modules for specs...

2006-12-16 Thread Dain Sundstrom
I wish you had just waited a day while we stabilized after I released  
32 projects (or 5 if you don't want to count the specs  
individually).  I am going to be merging everything to the staging  
repo into the release dir.


After the merge we will not have the SNAPSHOT mess that created this  
problem in the first place.  I only hope we use this as a learning  
experience and do not use SNAPSHOTS so readily.


-dain

On Dec 15, 2006, at 11:24 PM, Jason Dillon wrote:


I agree and disagree... but I'm gonna shut up.

I think you guys are moving in positive direction with mvn...

I'm just pissed off that the recent changes to specs have derailed my automation efforts.


I certainly did not mean for this email to turn into a mvn ranting  
session.


and maybe tomorrow I can cope better with this and provide a  
reasonable response to this email.


I appreciate the work you guys have been doing... and while not  
perfect, IMO it is positive direction.


and I will leave it at that for now.

thanks for your time... sorry if I ruffled some feathers.





Re: no more modules for specs...

2006-12-16 Thread Jason van Zyl


On 16 Dec 06, at 11:26 AM, Kevan Miller wrote:



On Dec 15, 2006, at 6:41 PM, Jason Dillon wrote:

this change really killed me for all of the build automation I  
have been working on... and was one of the reasons why I had  
warned against using this style of versioning.


basically since this was removed, checking out the specs project  
and building it produces nothing, except for this:


org/apache/geronimo/specs/specs/maven-metadata-local.xml
org/apache/geronimo/specs/specs/1.2/specs-1.2.pom
org/apache/geronimo/specs/specs/1.2/specs-1.2-site.xml




I think this discussion has gotten a bit off track. I don't think  
we should be discussing archiva capabilities, at all.


IMO, we release source code. Binary distributions and maven  
artifacts are a convenience. If users can't build our source code,  
then there's a problem.


You think your users build from sources to make their Geronimo  
servers for production or are you talking about just the specs? I  
would argue that it's rare for users to want to build everything from  
source, but even if they only built the Geronimo sources they still  
need all the binary dependencies at which point the quality of the  
repository matters. I think the discussion is germane in the context  
of your users building production systems from source.


Jason.



Jason Dillon, what source are you checking out? geronimo/specs/ 
trunk? I see that geronimo/specs/trunk still contains many sub- 
directories. IIUC, this is just a point in time statement. geronimo/ 
specs/trunk/pom.xml should be updated to be at 1.3-SNAPSHOT. Those  
sub-directories should be going away. Only specs which are under  
development should reside in trunk. At least, that's what I thought  
we'd agreed to... Released specs should be in tags. If you want to  
build our released specs from source, you need to get them from  
tags and build each individually. If that's not working, then I'd  
agree there's a problem.


--kevan





Re: no more modules for specs...

2006-12-16 Thread Kevan Miller


On Dec 15, 2006, at 6:41 PM, Jason Dillon wrote:

this change really killed me for all of the build automation I have  
been working on... and was one of the reasons why I had warned  
against using this style of versioning.


basically since this was removed, checking out the specs project  
and building it produces nothing, except for this:


org/apache/geronimo/specs/specs/maven-metadata-local.xml
org/apache/geronimo/specs/specs/1.2/specs-1.2.pom
org/apache/geronimo/specs/specs/1.2/specs-1.2-site.xml




I think this discussion has gotten a bit off track. I don't think we  
should be discussing archiva capabilities, at all.


IMO, we release source code. Binary distributions and maven artifacts  
are a convenience. If users can't build our source code, then there's  
a problem.


Jason Dillon, what source are you checking out? geronimo/specs/trunk?  
I see that geronimo/specs/trunk still contains many sub-directories.  
IIUC, this is just a point in time statement. geronimo/specs/trunk/ 
pom.xml should be updated to be at 1.3-SNAPSHOT. Those sub- 
directories should be going away. Only specs which are under  
development should reside in trunk. At least, that's what I thought  
we'd agreed to... Released specs should be in tags. If you want to  
build our released specs from source, you need to get them from tags  
and build each individually. If that's not working, then I'd agree  
there's a problem.
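The per-tag workflow Kevan describes can be sketched as a small driver loop. This is an illustrative sketch in Python, not project tooling: the tags URL and the example tag name are assumptions, and the function only constructs the commands rather than running them.

```python
# Sketch of the "build each released spec from its tag" loop described
# above. The tags URL and tag name are illustrative assumptions.
SPECS_TAGS_URL = "https://svn.apache.org/repos/asf/geronimo/specs/tags"

def build_commands(tags):
    """For each released spec tag, emit an svn checkout command and a
    Maven build command to run against the checked-out pom."""
    cmds = []
    for tag in tags:
        cmds.append(["svn", "checkout", f"{SPECS_TAGS_URL}/{tag}", tag])
        cmds.append(["mvn", "-f", f"{tag}/pom.xml", "install"])
    return cmds
```

An automation harness would feed the resulting command lists to `subprocess.run` (or similar) one tag at a time.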


--kevan


Re: no more modules for specs...

2006-12-16 Thread Jason van Zyl


On 16 Dec 06, at 2:50 AM, Jason Dillon wrote:


On Dec 15, 2006, at 11:05 PM, Jason van Zyl wrote:
I doubt it. When it came down to releases and updating artifacts  
and trying to tie everything all together. Lots of people might  
have problems but even now I bet there are people who could help  
with the build if necessary. If you did it in Ant you would be the  
only person who would know how it worked. You would get the nice  
Inner Platform Effect with a build your size:


http://thedailywtf.com/forums/69415/ShowPost.aspx

I'd bet my life you would have more overall problems using Ant.  
Just because you could get it to work doesn't mean it would scale  
or be something anyone else could comprehend. It's probably  
already hard enough with what you have.


Dang, I could not resist...

I think that the mvn build we have now is already fairly hard for  
folks to comprehend and would probably fall apart unless someone  
like me was here to answer everyone's questions and monitor changes  
to keep things in check.  I think that is no different than if we  
were using Ant.


I think you underestimate what other people know about Maven, in that  
they can apply their knowledge of the infrastructure from other  
projects here, provided you're following standard conventions. And  
it's complicated because you're fighting a losing battle with too  
many SNAPSHOTs. Even if the snapshot artifact resolution was perfect  
you would still have these problems. You are getting bitten by some  
severe technical problems in Maven, no doubt, but you are  
exacerbating the situation by using the "SNAPSHOT" identifier  
everywhere which inherently means "a shifting pile of sand" to Maven.  
Start moving to eliminate them and your instabilities will lessen  
tremendously.




I actually think that Ant + a few tasks to manage limited remote  
dep inclusion + groovy scripting would be a very powerful  
combination and would be at least half as complicated as our Maven  
build.




That's what people always say, but I have first hand experience with  
many large companies with large builds and they are moving away from  
their homebrew systems because they are sinking projects. If you used  
Maven in the way you describe above you would have stability, i.e. no  
snapshots. If you're going to script something, then create a plugin  
to walk your dependencies and replace the "SNAPSHOT" markers with  
real timestamps and you will have stability. You are so much closer  
to having something working all the time than you think.
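The core replacement step of such a plugin can be sketched briefly. This is a Python sketch under stated assumptions: the resolved-version map is hypothetical (a real plugin would read each snapshot's timestamped build from the repository's maven-metadata.xml), and the coordinates and timestamp shown are made up.

```python
# Hypothetical resolved-version map: in a real plugin this would come
# from the repository's maven-metadata.xml for each snapshot artifact.
RESOLVED = {
    ("org.apache.geronimo.modules", "geronimo-kernel", "1.2-SNAPSHOT"):
        "1.2-20061216.045717-3",
}

def lock_snapshot(group_id, artifact_id, version):
    """Replace a -SNAPSHOT version with its resolved timestamped build,
    if one is known; otherwise return the version unchanged."""
    if version.endswith("-SNAPSHOT"):
        return RESOLVED.get((group_id, artifact_id, version), version)
    return version
```

Walking the POM's dependency list and applying this to each entry pins the build to concrete artifacts instead of shifting snapshots.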


But... I'd rather have Maven with a richer control over how deps  
get pulled from remote repos, and more control over how local repos  
get installed/pulled into the cache.


 * * *

Anyways, I just want to be able to build G projects/components from  
source, pull in external binary deps and generate assemblies for  
specific branches.  This was working fine before... and only the  
recent change of the specs versioning has tossed me through a  
loop.  The solution is to make more project configurations to  
handle each spec, but that is not scalable at all...


And... well, I think the only fault here really is that people look  
at other mvn projects and just follow them... regardless of whether  
they make sense for the problem at hand.


I still think remote repos suck... but, maybe you guys will  
eventually find a better solution to that.


I don't think they suck, I think you're just getting bitten severely  
by the erratic snapshot handling and overuse of snapshots. We will  
eventually get the repository under control as it's clear now they  
have become a mission critical resource for many people, we  
understand that and it will be fixed. There's going to be a lot of  
bitching when we actually turn on the gauntlet. Any release, for  
example, which contains a "SNAPSHOT" identifier anywhere in the graph  
will simply be rejected.
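The rejection rule described here amounts to a walk over the full dependency graph. A minimal sketch in Python, assuming a toy graph representation (coordinate strings mapping to lists of dependency coordinates); the coordinate format is an illustration, not Maven's internal model:

```python
def find_snapshots(graph, root):
    """Walk a dependency graph (dict: coordinate -> list of dependency
    coordinates) and collect every coordinate containing 'SNAPSHOT'."""
    seen, bad, stack = set(), [], [root]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        if "SNAPSHOT" in node:
            bad.append(node)
        stack.extend(graph.get(node, []))
    return bad

def release_allowed(graph, root):
    """A release is accepted only if no SNAPSHOT appears anywhere."""
    return not find_snapshots(graph, root)
```

The point of gating at submission time is that the check runs once, centrally, instead of every consumer discovering the shifting dependency later.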


Jason.



--jason





Re: no more modules for specs...

2006-12-16 Thread Jason van Zyl


On 16 Dec 06, at 3:40 AM, David Jencks wrote:



On Dec 15, 2006, at 11:05 PM, Jason van Zyl wrote:



Then don't use those repos, or label them as snapshot repos. As  
far as Geronimo is concerned why do you need anything more then  
central as a source? Aside from your SNAPSHOT dependencies.


This will only stop when Archiva is in full effect. The only way  
to submit anything to central will be via Archiva. Any project that  
wishes to have the same stability will only take artifacts that  
have passed through an instance of Archiva. You'll know you're  
using an instance of Archiva because we'll have a wagon for doing  
that and it will be configured. It will eventually be the default.  
It will simply be the Grizzly client and Jetty using the Grizzly  
connector.


Jason, one thing I'd like to point out here is that to a large  
extent jdillon has been saying "the current state of maven remote  
repos is unreliable" and you are saying, "no, as soon as we get  
archiva, signatures, audit trails, etc etc etc working they will be  
reliable".  That's agreeing with jdillon that the current state of  
maven remote repos is unreliable since they don't have signed  
artifacts and an audit trail (at least).  Just because you wish  
remote repos worked and were reliable does not mean they are  
today.  I personally don't think they will be satisfactory until  
you have a revocation procedure in place as well as signing and an  
audit trail.  I suspect that making this distributed system  
reliable is going to be much much harder than you imagine: I hope  
I'm wrong because if it works it would be really great.


The central repository itself has always been pretty stable with no  
safeguards. I realize not having these safeguards is not great but  
things don't just disappear off that machine. We have a huge problem,  
it appears, with the syncs we are pulling in automatically.  
Organization-wide syncs are soon going to stop and it's going to be  
per-project, so that when garbage appears we will know immediately  
who's polluting the repository. Archiva will also keep track of  
deletions. So yes, I agree on one hand that we need a watchdog in  
place but we are not randomly jumbling stuff around on the central  
repository. We're getting burned from our source syncs and the misuse  
of SNAPSHOT repositories for the most part.


Another comment I will make is that I am fairly sure there are  
severe bugs in the maven artifact resolution process when snapshots  
are present.


There are a huge number, I believe it's completely unreliable and  
it's going to need an overhaul. It was very apparent from my last  
round of travels that in many cases especially when snapshots are  
used there are severe problems. I think we underestimated the use  
snapshots and how prevalent their use would be for external  
dependencies.


This is because if I remove all org.apache.geronimo.modules  
artifacts from my local repo and build the corresponding part of  
geronimo, if I build online I usually get errors together with  
downloaded timestamped artifacts whereas if I build offline the  
build succeeds.


Yup, that's a patch we applied for Jason to provide a stopgap  
solution, where no snapshots will be updated when building.


Note carefully that I am only building geronimo artifacts and there  
is no change whatsoever in non-geronimo artifacts in my local  
repo.  I think nearly every time we've made a change involving more  
than one module since we started using m2 and pushing snapshots to  
the snapshot repo we've had user complaints that the build is  
broken, and the solution always is to build offline.


Snapshots are an inherent instability, but there are definitely errors  
in how maven-artifact works with snapshots, and it's bad. I see it as  
the most critical problem with 2.0.x. But moving toward using less of  
them, even if that's locking to some timestamped versions, will help  
greatly.




Your complaints about any already released geronimo artifacts are  
totally irrelevant  unless you want to recommend we move back to m1  
since the 1.2-beta and 2.0-M1 are the first releases we've tried to  
do with m2 (except for specs, which got messed up in various other  
ways but have not been a giant problem until recently).


With m1 or m2 a release with snapshots is deadly. The practice seems  
to be something present regardless of what version of Maven you're  
using. The concept of a SNAPSHOT is the same in both versions though  
implemented differently.


Even in the face of the instability with SNAPSHOT handling in m2 I  
think you can eliminate a lot of it by getting off many of your  
SNAPSHOTs and I am trying to get out 2.0.5 which now contains a fix  
that always takes SNAPSHOTs locally if you have them.
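The difference between the default 2.0.x behaviour and the "always take local SNAPSHOTs" stopgap mentioned above can be sketched as two resolution policies. This is an illustrative Python sketch, not Maven's actual resolver; the timestamp comparison for the default case is a simplification.

```python
def pick_snapshot(local_build_time, remote_build_time, prefer_local=True):
    """Decide which copy of a snapshot to use.

    With the stopgap behaviour described above, a locally installed
    snapshot always wins. The default timestamp-comparing behaviour can
    pull a remote build over the one you just installed, which is the
    'build offline to make it work' failure mode described earlier."""
    if local_build_time is None:
        return "remote"          # nothing installed locally
    if prefer_local:
        return "local"           # stopgap: local always wins
    # simplified default: newer timestamp wins
    return "local" if local_build_time >= (remote_build_time or 0) else "remote"
```

With `prefer_local=True`, a developer who has just built module A locally is guaranteed that module B's build consumes that exact output.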


Jason.



thanks
david jencks






Re: no more modules for specs...

2006-12-16 Thread Matt Hogstrom


On Dec 15, 2006, at 10:51 PM, Jason van Zyl wrote:




IMO the remote repos are for user convenience *ONLY*,


That is horse shit. What they contain is a product of what you  
people put in them. You guys for example have something that is


I understand the passion and frustration in the thread.  However, we  
just went through a really crappy (not to over use the metaphor) time  
in the project and have moved to a productive stage.  Let's keep the  
communication respectful. I'm not picking on you  
Jason, I think Dain referred to lame arguments in another thread.   
I'm responding to this one as I've seen an increase in "attack" mode  
and I'm not going back there.


Thanks

Matt Hogstrom
[EMAIL PROTECTED]




Re: no more modules for specs...

2006-12-16 Thread David Jencks


On Dec 15, 2006, at 11:05 PM, Jason van Zyl wrote:



Then don't use those repos, or label them as snapshot repos. As far  
as Geronimo is concerned why do you need anything more then central  
as a source? Aside from your SNAPSHOT dependencies.


This will only stop when Archiva is in full effect. The only way to  
submit anything to central will be via Archiva. Any project that  
wishes to have the same stability will only take artifacts that  
have passed through an instance of Archiva. You'll know you're  
using an instance of Archiva because we'll have a wagon for doing  
that and it will be configured. It will eventually be the default.  
It will simply be the Grizzly client and Jetty using the Grizzly  
connector.


Jason, one thing I'd like to point out here is that to a large extent  
jdillon has been saying "the current state of maven remote repos is  
unreliable" and you are saying, "no, as soon as we get archiva,  
signatures, audit trails, etc etc etc working they will be  
reliable".  That's agreeing with jdillon that the current state of  
maven remote repos is unreliable since they don't have signed  
artifacts and an audit trail (at least).  Just because you wish  
remote repos worked and were reliable does not mean they are today.   
I personally don't think they will be satisfactory until you have a  
revocation procedure in place as well as signing and an audit trail.   
I suspect that making this distributed system reliable is going to be  
much much harder than you imagine: I hope I'm wrong because if it  
works it would be really great.


Another comment I will make is that I am fairly sure there are severe  
bugs in the maven artifact resolution process when snapshots are  
present.  This is because if I remove all org.apache.geronimo.modules  
artifacts from my local repo and build the corresponding part of  
geronimo, if I build online I usually get errors together with  
downloaded timestamped artifacts whereas if I build offline the build  
succeeds.  Note carefully that I am only building geronimo artifacts  
and there is no change whatsoever in non-geronimo artifacts in my  
local repo.  I think nearly every time we've made a change involving  
more than one module since we started using m2 and pushing snapshots  
to the snapshot repo we've had user complaints that the build is  
broken, and the solution always is to build offline.


Your complaints about any already released geronimo artifacts are  
totally irrelevant  unless you want to recommend we move back to m1  
since the 1.2-beta and 2.0-M1 are the first releases we've tried to  
do with m2 (except for specs, which got messed up in various other  
ways but have not been a giant problem until recently).


thanks
david jencks



Re: no more modules for specs...

2006-12-15 Thread Jason Dillon

On Dec 15, 2006, at 11:05 PM, Jason van Zyl wrote:
I doubt it. When it came down to releases and updating artifacts  
and trying to tie everything all together. Lots of people might  
have problems but even now I bet there are people who could help  
with the build if necessary. If you did it in Ant you would be the  
only person who would know how it worked. You would get the nice  
Inner Platform Effect with a build your size:


http://thedailywtf.com/forums/69415/ShowPost.aspx

I'd bet my life you would have more overall problems using Ant.  
Just because you could get it to work doesn't mean it would scale  
or be something anyone else could comprehend. It's probably already  
hard enough with what you have.


Dang, I could not resist...

I think that the mvn build we have now is already fairly hard for  
folks to comprehend and would probably fall apart unless someone like  
me was here to answer everyone's questions and monitor changes to keep  
things in check.  I think that is no different than if we were using  
Ant.


I actually think that Ant + a few tasks to manage limited remote dep  
inclusion + groovy scripting would be a very powerful combination and  
would be at least half as complicated as our Maven build.


But... I'd rather have Maven with a richer control over how deps get  
pulled from remote repos, and more control over how local repos get  
installed/pulled into the cache.


 * * *

Anyways, I just want to be able to build G projects/components from  
source, pull in external binary deps and generate assemblies for  
specific branches.  This was working fine before... and only the  
recent change of the specs versioning has tossed me through a loop.   
The solution is to make more project configurations to handle each  
spec, but that is not scalable at all...


And... well, I think the only fault here really is that people look  
at other mvn projects and just follow them... regardless of whether they make  
sense for the problem at hand.


I still think remote repos suck... but, maybe you guys will  
eventually find a better solution to that.


--jason


Re: no more modules for specs...

2006-12-15 Thread Jason Dillon

I agree and disagree... but I'm gonna shut up.

I think you guys are moving in a positive direction with mvn...

I'm just pissed off that the recent changes to specs have derailed my  
automation efforts.


I certainly did not mean for this email to turn into a mvn ranting  
session.


and maybe tomorrow I can cope better with this and provide a  
reasonable response to this email.


I appreciate the work you guys have been doing... and while not  
perfect, IMO it is a positive direction.


and I will leave it at that for now.

thanks for your time... sorry if I ruffled some feathers.

--jason


On Dec 15, 2006, at 11:05 PM, Jason van Zyl wrote:



On 16 Dec 06, at 12:10 AM, Jason Dillon wrote:


On Dec 15, 2006, at 7:51 PM, Jason van Zyl wrote:

IMO the remote repos are for user convenience *ONLY*,


That is horse shit. What they contain is a product of what you  
people put in them. You guys for example have something that is  
tagged, which you released and which is not reproducible. That's  
not Maven's fault, that's your fault. You are hosing anyone  
trying to consume your stuff. And if people are doing that to you  
then you should be irate, like anyone who looks at your  
tag for 1.1.1 should be. That's why the release plugin behaves the way  
it does. It screamed at you when you tried to release that 1.1.1 tag,  
I bet, and you side-stepped what you should have done and did  
something manually.


First off... this is *my opinion*... not sure how you can jump to  
the conclusion that my opinion is horse shit... or any mammal  
shit for that matter.  But I have been known to say that mvn is  
crap many times before... so if you feel better stating that my  
opinion is shit... well, then go for it.


Second, what assurance does any project have that any given  
artifact in the central repo will remain there as-is for the  
foreseeable future?


It's always been our policy that artifacts that we place in the  
repository are not removed. From syncing partners like Codehaus,  
Apache, ObjectWeb, or Mortbay they have to tell us that they do not  
want us to match the source they give us. We have never culled the  
repository.


There are already situations where bits have been added and  
removed (at least one which I requested to get around a sync  
problem from ASF) and a few others which I have heard about  
through others.


Only if the repository is diddled on the source side. We cannot  
control in all places what is done. You guys seem to manually  
delete artifacts from this side which wreaks havoc.




There is no audit trail for central or any other popular mvn repo,  
so any release manager with half their wits is going to think  
twice about trusting any content which is pulled from it.




Two separate issues. People behind firewalls generally do not and  
cannot use the central repository directly but they manage ones  
that do rely on central. And they can for releases because we do  
not delete anything. Every release must be signed, and that will get  
easier to do automatically; though checking the signatures is not yet  
fully automated, you can verify what you have. We will have a  
transition period where unsigned artifacts will be accepted but  
shortly anything coming in by any means will have to be signed and  
the metadata will carry with it a reference to its source.




one critical error I believe that everyone is making is thinking  
that deployed artifacts are permanent...


Deployed releases that we place in the repository are most  
certainly permanent.


What assurance do I have that artifacts are going to exist in the  
repo in 10 years?  20?


That is our policy and we will bolster our infrastructure but  
Ibiblio gives us the assurance that what we give them stays in  
perpetuity. That's their policy. That's why we still mirror things  
there even though we are using our own dedicated box. I also have  
backups from day one that are stored offsite along with my  
other backups.



How about the assurance that they have not been tampered with?


Signatures, which you have always been supposed to use for releases  
here. We don't enforce that yet, but anything coming from Apache  
should have them.


Digest files don't do jack... because if someone alters an artifact  
and I download it into an empty repo, then it all looks fine to  
the consumer... and it's highly regular for users to nuke their  
repo when problems happen... and problems are regular too.
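The hash audit trail David Jencks suggested earlier in the thread addresses exactly this weakness: if the expected digests are recorded somewhere the repository cannot touch (e.g. under version control), a swapped artifact no longer "looks fine" in a fresh repo. A minimal Python sketch; the artifact name is hypothetical and the pinned digest is a placeholder (it happens to be the SHA-1 of an empty file so the sketch is self-checking).

```python
import hashlib

# Hypothetical out-of-band audit trail: expected SHA-1s recorded at
# release time and kept under version control, not next to the jars.
PINNED = {
    "geronimo-kernel-1.2.jar":
        "da39a3ee5e6b4b0d3255bfef95601890afd80709",  # placeholder: SHA-1 of empty file
}

def verify(name, data):
    """Return True only if the artifact bytes match the pinned digest.
    A .sha1 file fetched from the same repo as the jar offers no such
    guarantee, since both can be replaced together."""
    expected = PINNED.get(name)
    actual = hashlib.sha1(data).hexdigest()
    return expected is not None and actual == expected
```

The design point is where the digest lives, not the digest algorithm: the same `.sha1` content becomes meaningful once an attacker cannot rewrite it alongside the artifact.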


It will not be long before it will simply be mandatory to have a  
PGP key sign your goods if they are going to make it into the  
repository.





Snapshots can be transient depending on the policy of the  
repository. What I've been told is that infrastructure is telling  
you to remove old versions from the repository at Apache, which  
can have disastrous effects. Your artifacts deployed to central  
should not disappear, so I'll go make sure that we're not deleting  
files that have been removed from the source.

Re: no more modules for specs...

2006-12-15 Thread Jason van Zyl


On 16 Dec 06, at 12:10 AM, Jason Dillon wrote:


On Dec 15, 2006, at 7:51 PM, Jason van Zyl wrote:

IMO the remote repos are for user convenience *ONLY*,


That is horse shit. What they contain is a product of what you  
people put in them. You guys for example have something that is  
tagged, which you released and which is not reproducible. That's  
not Maven's fault, that's your fault. You are hosing anyone trying  
to consume your stuff. And if people are doing that to you then  
you should be irate, like anyone who looks at your tag for  
1.1.1 should be. That's why the release plugin behaves the way it does.  
It screamed at you when you tried to release that 1.1.1 tag, I bet, and  
you side-stepped what you should have done and did something  
manually.


First off... this is *my opinion*... not sure how you can jump to  
the conclusion that my opinion is horse shit... or any mammal shit  
for that matter.  But I have been known to say that mvn is crap many  
times before... so if you feel better stating that my opinion is  
shit... well, then go for it.


Second, what assurance does any project have that any given artifact  
in the central repo will remain there as-is for the foreseeable future?


It's always been our policy that artifacts that we place in the  
repository are not removed. For syncing partners like Codehaus,  
Apache, ObjectWeb, or Mortbay, they have to tell us if they do not  
want us to match the source they give us. We have never culled the  
repository.


There are already situations where bits have been added and  
removed (at least one which I requested to get around a sync  
problem from ASF) and a few others which I have heard about through  
others.


Only if the repository is diddled on the source side. We cannot  
control in all places what is done. You guys seem to manually delete  
artifacts from this side which wreaks havoc.




There is no audit trail for central or any other popular mvn repo,  
so any release manager with half their wits is going to think twice  
about trusting any content which is pulled from it.




Two separate issues. People behind firewalls generally do not and  
cannot use the central repository directly but they manage ones that  
do rely on central. And they can for releases because we do not  
delete anything. Every release must be signed, and that will get  
easier to do automatically; though checking the signatures is not yet  
fully automated, you can verify what you have. We will have a  
transition period where unsigned artifacts will be accepted but  
shortly anything coming in by any means will have to be signed and  
the metadata will carry with it a reference to its source.




one critical error I believe that everyone is making is thinking  
that deployed artifacts are permanent...


Deployed releases that we place in the repository are most  
certainly permanent.


What assurance do I have that artifacts are going to exist in the  
repo in 10 years?  20?


That is our policy and we will bolster our infrastructure but Ibiblio  
gives us the assurance that what we give them stays in perpetuity.  
That's their policy. That's why we still mirror things there even  
though we are using our own dedicated box. I also have backups from  
day one that are stored offsite along with my other backups.



How about the assurance that they have not been tampered with?


Signatures, which you have always been supposed to use for releases  
here. We don't enforce that yet, but anything coming from Apache  
should have them.


Digest files don't do jack... because if someone alters an artifact and  
I download it into an empty repo, then it all looks fine to the  
consumer... and it's highly regular for users to nuke their repo  
when problems happen... and problems are regular too.


It will not be long before it will simply be mandatory to have a PGP  
key sign your goods if they are going to make it into the repository.





Snapshots can be transient depending on the policy of the  
repository. What I've been told is that infrastructure is telling  
you to remove old versions from the repository at Apache, which can  
have disastrous effects. Your artifacts deployed to central  
should not disappear so I'll go make sure that we're not deleting  
files that have been removed from the source. But we generally  
assume when we are syncing from a source that the source knows  
what it's doing. If you delete stuff on this end then we expect  
you to understand what effect that will have on your clients.


Many users may understand, but there is always one that will not,  
which throws off the entire system.


Which users? You, as a user and supplier of artifacts populating your  
repository?




If one weak link decided to re-release version 2.0 of something  
that effectively breaks compatibility with consumers of the  
previous 2.0 release, then the entire system is compromised... who  
is to say that any other artifact might just change under the covers.

Re: no more modules for specs...

2006-12-15 Thread Jason Dillon

On Dec 15, 2006, at 7:51 PM, Jason van Zyl wrote:

IMO the remote repos are for user convenience *ONLY*,


That is horse shit. What they contain is a product of what you  
people put in them. You guys for example have something that is  
tagged, which you released and which is not reproducible. That's  
not Maven's fault, that's your fault. You are hosing anyone trying  
to consume your stuff. And if people are doing that to you then  
you should be irate, like anyone who looks at your tag for  
1.1.1 should be. That's why the release plugin behaves the way it does.  
It screamed at you when you tried to release that 1.1.1 tag, I bet, and  
you side-stepped what you should have done and did something manually.


First off... this is *my opinion*... not sure how you can jump to the  
conclusion that my opinion is horse shit... or any mammal shit for  
that matter.  But I have been known to say that mvn is crap many times  
before... so if you feel better stating that my opinion is shit...  
well, then go for it.


Second, what assurance does any project have that any given artifact  
in the central repo will remain there as-is for the foreseeable  
future?  There are already situations where bits have been added  
and removed (at least one which I requested to get around a sync  
problem from ASF) and a few others which I have heard about through  
others.


There is no audit trail for central or any other popular mvn repo, so  
any release manager with half their wits is going to think twice  
about trusting any content which is pulled from it.



one critical error I believe that everyone is making is thinking  
that deployed artifacts are permanent...


Deployed releases that we place in the repository are most  
certainly permanent.


What assurance do I have that artifacts are going to exist in the  
repo in 10 years?  20?  How about the assurance that they have not  
been tampered with?  Digest files don't do jack... because if someone  
alters an artifact and I download it into an empty repo, then it  
all looks fine to the consumer... and it's highly regular for users  
to nuke their repo when problems happen... and problems are regular too.



Snapshots can be transient depending on the policy of the  
repository. What I've been told is that infrastructure is telling  
you to remove old versions from the repository at Apache, which can  
have disastrous effects. Your artifacts deployed to central should  
not disappear so I'll go make sure that we're not deleting files  
that have been removed from the source. But we generally assume  
when we are syncing from a source that the source knows what it's  
doing. If you delete stuff on this end then we expect you to  
understand what effect that will have on your clients.


Many users may understand, but there is always one that will not,  
which throws off the entire system.


If one weak link decided to re-release version 2.0 of something that  
effectively breaks compatibility with consumers of the previous 2.0  
release, then the entire system is compromised... who is to say that  
any other artifact might just change under the covers.


Granted that does not happen often, but it does happen, and will most  
certainly continue to happen.  Maybe it's possible to remove the  
chance of it happening from central, but projects do not just depend  
upon central... it's easy enough to set up a repo, then include that  
into your project.  And the managers of that repo may remove, add, or  
change artifacts at will.  So my comment about the transient nature  
of all mvn repos is much more general... and certainly not meant to  
knock mvn, but more as a warning that artifacts on remote repos are  
much more transient than many people (like you) believe they are.


which they are most certainly not, so we should generally always  
build from source to ensure that everything is being included at  
the right version.


No you shouldn't. That defeats the entire purpose of Maven.


Certainly not Jason... and I'm surprised to hear this from you as I  
have heard you talk about the plug-ability of build functionality as  
much more important to the purpose of maven than the artifact  
remoting.  IMO the plugin framework of mvn is much more important to  
the purpose of mvn than remote artifact handling... and more so the  
lossy artifact handling is more of a detriment to mvn than anything  
else.  I'd like to throw the remote artifact capabilities into the  
trash... and then we'd find a reliable and trustworthy build platform  
in mvn.


I can't even count the number of times that mvn builds have broken  
due to external dependency changes... no local source changes, but  
come in to the office the next day and the build is just fucked.



You have unmaintainable and non-reproducible builds because you  
have such a massive number of SNAPSHOTs and you never make any  
incremental releases. What do you expect when you're building upon  
sand?


Ya could be... mvn's snapshot

Re: no more modules for specs...

2006-12-15 Thread Jason van Zyl


On 15 Dec 06, at 6:41 PM, Jason Dillon wrote:

this change really killed me for all of the build automation I have  
been working on... and was one of the reasons why I had warned  
against using this style of versioning.


basically since this was removed, checking out the specs project  
and building it produces nothing, except for this:


org/apache/geronimo/specs/specs/maven-metadata-local.xml
org/apache/geronimo/specs/specs/1.2/specs-1.2.pom
org/apache/geronimo/specs/specs/1.2/specs-1.2-site.xml

since my automation setup actually builds projects, then holds on  
to the *exact* build outputs for use by projects which depend on  
them, anything that depends on a spec which is not already released  
and propagated to central and its mirrors will fail.  This means  
that anything that needs a snapshot will always fail.


 * * *

one critical error I believe that everyone is making is thinking  
that deployed artifacts are permanent...


Deployed releases that we place in the repository are most certainly  
permanent. Snapshots can be transient depending on the policy of the  
repository. What I've been told is that infrastructure is telling you  
to remove old versions from the repository at Apache, which can have  
disastrous effects. Your artifacts deployed to central should not  
disappear, so I'll go make sure that we're not deleting files that  
have been removed from the source. But we generally assume when we  
are syncing from a source that the source knows what it's doing. If  
you delete stuff on this end then we expect you to understand what  
effect that will have on your clients.
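
A minimal sketch of the hash-based audit trail David suggested  
earlier in the thread: record a SHA-1 for each released artifact and  
compare it later.  The file names here are stand-ins; Maven repos  
also publish a `.sha1` file (just the hex digest) next to each  
artifact, which could be checked the same way.

```shell
# Stand-in "artifact" so the sketch is self-contained.
printf 'jar bytes' > specs-1.2.jar

# Record the digest at release time (the audit trail).
sha1sum specs-1.2.jar | awk '{print $1}' > specs-1.2.jar.sha1

# Later: recompute and compare against the recorded digest.
expected=$(cat specs-1.2.jar.sha1)
actual=$(sha1sum specs-1.2.jar | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
  echo "artifact unchanged"
else
  echo "artifact changed or corrupted"
fi
```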


which they are most certainly not, so we should generally always  
build from source to ensure that everything is being included at  
the right version.


No you shouldn't. That defeats the entire purpose of Maven. You have  
unmaintainable and non-reproducible builds because you have such a  
massive number of SNAPSHOTs and you never make any incremental  
releases. What do you expect when you're building upon sand? Get the  
producers of your artifacts to release things more frequently; this  
ensures that SNAPSHOTs are removed from the chain. I look in your  
server build and you've got SNAPSHOTs **all** over the place.  
OpenEJB, ActiveMQ, Tranql, your spec JARs. You control, or have  
relationships with, most of these projects. You don't wait until it's  
time to release to get the producers of your dependencies to crap out  
some junk you can use. Get them to release, lock your stuff down to  
it, move on. Repeat. You should have no SNAPSHOTs other than the ones  
for your own projects in your POMs. If everything is changing  
everywhere it's total insanity. Building everything from source is  
not something you should be doing, because you should not have to. It  
is one of the most obvious signs that you are doing something very  
wrong. You have way too many points of variation and all your  
projects seem to be in constant swing with one another. If that's the  
case it begs the question whether they really are separate, and why  
not just lump them all together if they are so coupled, which is  
generally the case when you cannot eliminate SNAPSHOTs from your  
build. When you see a SNAPSHOT not from your own project in your POM,  
it is a warning sign.
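
To make the "lock your stuff down" advice concrete, a sketch of a  
dependency pinned to a released version rather than a SNAPSHOT (the  
coordinates and version here are illustrative, not an exact Geronimo  
POM):

```xml
<!-- Illustrative coordinates/version, not an exact Geronimo POM. -->
<dependency>
  <groupId>org.apache.geronimo.specs</groupId>
  <artifactId>geronimo-jta_1.0.1B_spec</artifactId>
  <version>1.0</version>  <!-- a release, not 1.0-SNAPSHOT -->
</dependency>
```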



IMO the remote repos are for user convenience *ONLY*,


That is horse shit. What they contain is a product of what you people  
put in them. You guys, for example, have something that is tagged,  
which you released, and which is not reproducible. That's not Maven's  
fault, that's your fault. You are hosing anyone trying to consume  
your stuff. And if people are doing that to you, then you should be  
as irate as anyone who looks at your tag for 1.1.1 should be. That's  
why the release plugin behaves the way it does. I bet it screamed at  
you when you tried to release that 1.1.1 tag, and you side-stepped  
what you should have done and did something manually.


so that you don't need to always go and build everything.  For  
automated builds it is much easier to build projects one by one,  
hold on to the exact outputs, then use those outputs as  
dependencies directly to ensure that dependent builds use the right  
artifacts.


That is just so wrong if you're using Maven. Why do you have to go  
build everything? It's because your dependency chain consists of so  
many transient SNAPSHOTs that you can't possibly rely on anything.  
Start with your specs and release them; when you have resolved a few  
issues, bump the micro number and release again. Eliminate them  
anywhere you can. Releasing is not painful with the release plugin  
when you have your SNAPSHOTs under control.
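
The "bump the micro number and release again" step is just  
incrementing the last component of the version string; a tiny shell  
sketch (the version here is an example, not a real Geronimo release),  
with the actual release normally driven by the release plugin's  
`release:prepare` and `release:perform` goals:

```shell
# Compute the next micro version from a released version string.
# (The version here is an example, not a real Geronimo release.)
version="1.2.0"
micro=${version##*.}                 # last dot-separated component
next="${version%.*}.$((micro + 1))"
echo "$next"                         # prints 1.2.1
```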




if the project is not going to build as one might expect when  
running `mvn install` from the root, and if that is because you  
want to version each of those bits separately and use the  
half-functional maven release plugin to handle releases... then each  
of these