Marc Portier wrote:
not really,
the only idea I have is some naive cron job removing files from the repo:
1/ regular (monthly) removal of the complete repository of the
continuum 'user' would enforce pure clean builds, but
* it makes those first new builds of the month extremely slow
* dunno whether that same 'user'/repo is used by other projects as well
2/ another approach with a more 'continuous' feel would be to daily
remove all files that were downloaded XX days ago (30?):
find ${HOME}/.m2/repository -type f -mtime +30 -exec rm {} \;
forcing only those to be fetched again will somewhat spread the stress
of downloading that stuff again
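A dry-run version of that sweep, tried against a scratch directory first so nothing real gets deleted (the paths and file names here are stand-ins, not the actual repo):

```shell
# stand-in for ${HOME}/.m2/repository
REPO=$(mktemp -d)
touch -d '40 days ago' "$REPO/old.jar"   # simulate a stale artifact
touch "$REPO/fresh.jar"                  # simulate a recent download
# -print lists the deletion candidates without touching them
find "$REPO" -type f -mtime +30 -print
# once the list looks right, swap -print for -exec rm {} \;
```

Only old.jar shows up; the fresh file survives the 30-day cutoff.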
to introduce this scenario, however, you'd need to gradually step down
the age of the documents (just to make sure you don't get a big
download spike every 30 days, measured from the first time you put
this in place)
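One way to sketch that gradual step-down (the rollout date, the 120-day starting cutoff, and the pinned "today" are made-up numbers for illustration):

```shell
START="2007-06-01"                 # hypothetical date the policy went live
TODAY=$(date -d "2007-07-01" +%s)  # pinned "today" so the example is stable
ELAPSED=$(( (TODAY - $(date -d "$START" +%s)) / 86400 ))
AGE=$(( 120 - ELAPSED ))           # shrink the cutoff as days go by
if [ "$AGE" -lt 30 ]; then AGE=30; fi   # floor at the steady 30-day cutoff
echo "$AGE"                        # 30 days into the ramp: 120 - 30 = 90
```

In the real cron job you'd replace the pinned TODAY with plain `date +%s` and feed $AGE into the find command above instead of the hard-coded 30.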
some listings like:
$ for i in $(seq 0 10); do echo -n "#files older than ${i}0 days: ";
find ~/.m2/repository/ -type f -mtime +${i}0 | wc -l; done
might hint at which groups to remove manually (during the week before
starting the regular removal) in order to get an optimal spread
anyway: an additional challenge with this kind of hard cleaning is
making sure that continuum isn't starting/running builds while the
repo files are being removed...
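One crude way to serialize the two would be a shared lock; this sketch uses flock(1), and the lock path is an assumption (the build wrapper would have to take the same lock for it to mean anything):

```shell
LOCK=/tmp/m2-repo.lock   # hypothetical lock shared with the build wrapper
# the cron side: skip the sweep entirely if a build currently holds the lock
flock -n "$LOCK" -c 'echo sweeping' \
  || echo "build in progress, skipping sweep"
# the real sweep (the find ... -exec rm above) would run inside the -c command
```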
the above suggests that stuff like this might make more sense at the
continuum level, anyone?
or even better, inside maven: some kind of flag that would at
least check (not download) whether the jars/checksums are still
available/matching at the remote repositories
How about we just execute dependency:purge-local-repository from a
continuum profile in the root pom? That should clean out the repo
enough to guarantee a consistent build.
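Wiring that up could look roughly like the profile below. The plugin coordinates and the purge-local-repository goal are real, but the profile id, the phase binding, and the overall shape are just a sketch, not a tested configuration:

```xml
<!-- sketch: activate with mvn -Pcontinuum ... on the build server -->
<profile>
  <id>continuum</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <executions>
          <execution>
            <id>purge-local-repo</id>
            <phase>validate</phase>
            <goals>
              <goal>purge-local-repository</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```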
I wouldn't worry too much about the first build being slow, repo1 is
really fast nowadays.
Jorg