What do you mean? If you specify -Dmaven.repo.local=./svn-repo (or
wherever the svn checkout is) and run the build offline, then the
repo won't get modified, and thus the only chance of a bad artifact
getting in there would be if someone checked in something bad, or if
the local `mvn install` got messed up; but generally `mvn install`
artifacts will never be checked into that repo.
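Concretely, the setup described above might look something like this
(the ./svn-repo path is a placeholder for wherever the svn checkout
actually lives, not the real Geronimo layout):

```shell
# Assume the svn-backed artifact repo has already been checked out to
# ./svn-repo (placeholder path; substitute the real checkout location).
mkdir -p ./svn-repo

# -o (offline) forbids any remote download, and -Dmaven.repo.local
# redirects the local repo to the checkout, so the build can only ever
# read artifacts that were deliberately checked in.
MVN_CMD="mvn -o -Dmaven.repo.local=./svn-repo clean install"
echo "$MVN_CMD"
```

Since -o disallows downloads entirely, a missing artifact fails the
build loudly instead of silently pulling in something from a remote.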
It is really too bad that there is not one local repo for downloaded
deps and a separate local repo for generated artifacts. The whole
local repo thing is lossy IMO and makes it very difficult to assure
the quality of releases.
--jason
On Dec 11, 2006, at 7:55 AM, Guillaume Nodet wrote:
> Also, keep in mind that there is no way to bypass the
> local repository afaik. So if a bad artifact goes into the
> user's local repo, it may disturb Geronimo's build, even
> if the Geronimo build only uses a single svn-based remote
> repo. In such a case, the only way to ensure that the
> build will work is to start from a clean local repo.
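For reference, "starting from a clean local repo" just means wiping
the default location or pointing the build at a fresh empty one; a
minimal sketch (directory names are arbitrary placeholders):

```shell
# Option 1: wipe the default local repo (~/.m2/repository) so every
# artifact is re-resolved from scratch. Destructive, so left commented.
# rm -rf ~/.m2/repository

# Option 2: point the build at a fresh, empty directory instead,
# leaving the normal local repo untouched.
mkdir -p ./clean-repo
echo "mvn -Dmaven.repo.local=./clean-repo clean install"
```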
>
> On 12/9/06, Jason Dillon <[EMAIL PROTECTED]> wrote:
>> On Dec 8, 2006, at 6:41 PM, Prasad Kashyap wrote:
>> >> ... At this point... and ya, I may be in a
>> >> bad mood now... I don't think that mvn is an appropriate tool for
>> >> building production quality products... period.
>> >
>> > I hear ye. I share the pain. But I fear the alternative - spending
>> > considerable time migrating to another build system.
>>
>> Ya, I know... I'm not suggesting that we change any time soon. But I
>> do fear that there is going to be some serious ongoing pain.
>>
>>
>> > When you return from your bad mood to your jolly good ole' self
>> > again,
>>
>> I dunno... I'm jaded now... good ole jolly jason was eaten by the big
>> angry maven monster... :-P
>>
>>
>> > can you please shed more light on what it would take to have this
>> > *ONE* repo; its pros and cons and such...
>>
>> I've sent a few emails about this in the past. Major hurdles to this
>> are going to be sysadmin/network overheads, ASF infra politics, and
>> of course keeping the artifacts in sync. There are just way too many
>> things that need to get downloaded, making the window for problems
>> really quite massive.
>>
>> I'm still trying to figure out how to effectively work around this
>> problem for an open community... in a corporate setting this is a no
>> brainer: set up a machine, back it up, set up Proximity or
>> maven-proxy to aggregate remote repos, then create a few local repos
>> backed by svn to hold custom artifacts or specific versions to help
>> reduce the risk incurred by remote artifact instability. Then each
>> project just lists that one repo.
>>
>> This works well, but due to the way maven works, other dependencies
>> may list repos, which will then get picked up and used for artifact
>> selection, which tends to pollute the sanity and stability, though
>> usually not too much. But it's yet another flaw in maven's
>> architecture: while it's flexible and easy for smaller projects,
>> it's nearly impossible to make any sort of assurances for larger,
>> more complicated projects. Actually, even for smaller ones it makes
>> it very, very difficult to ensure build stability over the life of
>> the project (past build repeatability and assumed future
>> compatibility, as at any time someone could publish a plugin or
>> artifact which completely breaks your build, often requiring days
>> to debug why).
>>
>> The only way around this is to have total ownership of imported
>> build artifacts and an effective paper trail for changes (i.e. svn
>> change logs).
>>
>> While maven has made many things simpler... it really has made it a
>> lot harder to implement stable, reliable and durable builds. :-(
>>
>> Anyways, all I can really think of to step around this problem is
>> to check in all of the artifacts which are needed into svn,
>> configure the build to use a checkout of that repo as its local
>> repo, and then always run offline. And periodically update the svn
>> repo from remotes as well as manage some artifacts by hand.
>> Essentially removing any remoteness from Maven, which is IMO key to
>> making builds stable, reliable and durable.
>>
>> Svn has all the artifacts needed, so svn co will get you the right
>> bits, and svn up will make sure it's the latest, so there's no need
>> to keep making all those network calls to check for artifacts,
>> which will speed the build up dramatically. Svn will always have a
>> trail of who changed what when, which can be easily correlated to
>> build failures using a CI tool. Mysterious dependency downloads,
>> metadata corruption, and bad network connections basically go away
>> from the list of normal problems we run into. The repository gets
>> labeled when the software gets labeled, so you can *always* go back
>> in time, check out an old release and build it... and have a very,
>> very, very high chance that it will work with no fuss; the only
>> things which may break it would be environment related (deep
>> windows folders, wrong jdk version, missing heap settings, etc).
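The svn side of the workflow sketched above might look like this; the
repo URL, paths, and tag name are all hypothetical placeholders:

```shell
# Hypothetical location of the svn-backed artifact repo.
REPO_URL="https://svn.example.org/repos/artifacts"

# A fresh checkout gets exactly the right bits; an update brings in
# whatever artifacts have been deliberately added since.
echo "svn checkout $REPO_URL/trunk svn-repo"
echo "svn update svn-repo"

# Label (svn copy) the artifact repo alongside each release, so an old
# release can always be rebuilt against the exact bits it shipped with.
echo "svn copy $REPO_URL/trunk $REPO_URL/tags/some-release-1.0 -m 'label artifacts for 1.0'"
echo "svn checkout $REPO_URL/tags/some-release-1.0 svn-repo-1.0"
```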
>>
>> Dunno if there are other options really... maybe... but I can't
>> think of any at the moment.
>>
>> I think the mvn plugin system is good, and it will get better once
>> they fix some of the annoying bugs... and even better once they
>> document the apis more. Wish the dang pom was not so verbose... or
>> didn't need to carry version details into each and every pom... but
>> those are all minor. The major issue is the remote repo. Once you
>> eliminate that, then mvn starts to look a whole lot more attractive
>> for serious production builds.
>>
>> --jason
>>
>>
>
>
> --
> Cheers,
> Guillaume Nodet