[m2] dependency ordering
Hi - I'm doing some processing with a custom mojo that relies on the dependency information to build an artifact, but the order of processing is important. Because MavenProject.getArtifacts() returns a set, it is unordered with respect to the pom.xml; I was wondering if it's possible to add extra information in the POM, something like:

    ...
    <dependency>
      <groupId>blah</groupId>
      <artifactId>thing</artifactId>
      <version>3.8.1</version>
      <order>1</order>
    </dependency>
    <dependency>
      <groupId>woo</groupId>
      <artifactId>yay</artifactId>
      <version>3</version>
      <order>2</order>
    </dependency>
    ...

Is there a good way of tackling this?

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
Re: [m2] dependency ordering
I was just looking at that and thinking that was probably the way I should do it. It just seemed a shame that I couldn't attach some arbitrary property inside the dependency tag, but it's no biggie.

Nigel

On 11/14/05, Brett Porter [EMAIL PROTECTED] wrote:

Right - transitivity defeats the ability to do this. What you need to do is specify the order in configuration to your plugin (and still declare the dependencies normally):

    <wars>
      <war>groupId:artifactId1</war>
      <war>groupId:artifactId2</war>
    </wars>

From this you can construct an artifact filter pretty easily and filter ${project.artifacts}. This is how the assembly plugin works.

- Brett
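Brett's suggestion - declare the desired order in the plugin configuration and filter ${project.artifacts} against it - can be sketched as below. This is only an illustration using plain 'groupId:artifactId' strings; real mojo code would operate on org.apache.maven.artifact.Artifact instances, and the class and method names here are made up:

```java
import java.util.*;

public class OrderedArtifacts {
    /** Return the artifacts from the (unordered) project set in the order
     *  given by the plugin configuration. Configured entries that are not
     *  in the set are skipped; artifacts not configured are ignored. */
    public static List<String> order(Set<String> projectArtifacts, List<String> configured) {
        List<String> result = new ArrayList<>();
        for (String key : configured) {
            if (projectArtifacts.contains(key)) {
                result.add(key);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // the unordered set a mojo would get from MavenProject.getArtifacts()
        Set<String> artifacts = new HashSet<>(Arrays.asList(
            "blah:thing", "woo:yay", "junit:junit"));
        // order declared in the plugin's <wars> configuration
        System.out.println(order(artifacts, Arrays.asList("woo:yay", "blah:thing")));
        // prints [woo:yay, blah:thing]
    }
}
```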
[m2] InstallMojo properties
A bit further into my experimenting with generating an m2 plugin - I have successfully created my mojo and bound it to the part of the lifecycle to do with packaging. My 'client' project has

    <packaging>uberwar</packaging>

to get the right binding. However, because of this, the install mojo assumes the extension is going to be [something].uberwar, when I actually want it to be [something].war. I had hoped

    <plugin>
      <artifactId>maven-install-plugin</artifactId>
      <configuration>
        <packaging>war</packaging>
      </configuration>
    </plugin>

would do the trick, but packaging is defined as read-only... Is there another (better?) way around this, or should I just write some more code to get around it?
Re: [m2] dependency ordering
The artifact being generated is effectively a merge of several other artifacts of the same type - a WAR file. It's important to get the ordering right, as the overwriting precedence matters. I was just thinking of needing an extra bit of user data, much like the properties you could put on dependencies in M1 that controlled things like whether the file got included in the manifest or not; a 'proper' ordering of all dependencies does sound harder (and maybe too hard to always be right in all circumstances..)

On 11/14/05, Brett Porter [EMAIL PROTECTED] wrote:

Can you describe what the relevance of the ordering is? There might be something already available. The suggestion of this tag is tricky: how do dependencies brought in transitively get ordered?

- Brett
use of ArtifactHandler
Hello - I have been trying to follow the configuration for ArtifactHandlers. I have in my components.xml:

    <component>
      <role>org.apache.maven.artifact.handler.ArtifactHandler</role>
      <role-hint>uberwar</role-hint>
      <implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
      <configuration>
        <extension>war</extension>
        <type>war</type>
        <packaging>war</packaging>
        <language>java</language>
        <addedToClasspath>false</addedToClasspath>
      </configuration>
    </component>

But, looking at DefaultArtifactHandler, the configuration never seems to be used, as there are only private member variables, and extension defaults to be the same as the type, which will always be 'uberwar' (I want it to be war). Am I missing something? Or is the intention to create your own subtype of ArtifactHandler rather than using the default (is the configuration not implemented?)
Re: use of ArtifactHandler
Yep - I'm pretty sure it's reading it, as there is also a LifecycleMapping which is being used correctly. Are the private member variables supposed to get set by some persistence mechanism from the configuration node?

On 11/17/05, John Casey [EMAIL PROTECTED] wrote:

You're defining this components.xml in a plugin, right? Do you have <extensions>true</extensions> defined in the plugin reference within your plugin-user POM? If not, it will use a default artifact handler that has the same type as your packaging, and the same extension as your packaging...

- -j
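As I understand John's suggestion, the project that *uses* the plugin needs to flag it as providing build extensions, so that the plugin's components.xml (the ArtifactHandler and LifecycleMapping) is actually loaded into the build. The plugin coordinates below are placeholders for whatever the uberwar plugin is actually called:

```xml
<build>
  <plugins>
    <plugin>
      <!-- hypothetical coordinates for the custom plugin -->
      <groupId>com.example.plugins</groupId>
      <artifactId>uberwar-maven-plugin</artifactId>
      <version>1.0-SNAPSHOT</version>
      <!-- without this, Maven falls back to a default artifact handler
           whose type and extension both equal the packaging ('uberwar') -->
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>
```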
convert a dependency to an artifact in plugin
I have a project using the groovy maven plugin to do some scripting. I can get access to my project and its dependencies from a project variable, and I can get access to the instance of the local repository by passing ${localRepository} in a property. What I want to do is, for a dependency, find where it is in the localRepository (by using pathOf). However, I need to make an Artifact (DefaultArtifact) from the Dependency, and I can't see how to do this. If I just create one, I can't populate the right ArtifactHandler values. Is there an easy way to do this?
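In a mojo, the usual route is, I believe, to have an org.apache.maven.artifact.factory.ArtifactFactory component injected and let its createArtifact methods take care of the ArtifactHandler wiring. Failing that, the path that pathOf() produces for the default (Maven 2) repository layout is simple enough to compute by hand. The sketch below illustrates that layout rule only; it is not the real Maven API, and the class and method names are made up:

```java
public class RepoLayout {
    /** Default m2 layout: groupId dots become directories, then
     *  artifactId/version/artifactId-version.extension */
    public static String pathOf(String groupId, String artifactId,
                                String version, String extension) {
        return groupId.replace('.', '/')
                + "/" + artifactId
                + "/" + version
                + "/" + artifactId + "-" + version + "." + extension;
    }

    public static void main(String[] args) {
        System.out.println(pathOf("org.hibernate", "hibernate", "3.2.4.ga", "jar"));
        // prints org/hibernate/hibernate/3.2.4.ga/hibernate-3.2.4.ga.jar
    }
}
```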
LATEST, SNAPSHOT and RELEASE
I could have sworn I'd read somewhere that it was possible to use dependency versions of LATEST, RELEASE and SNAPSHOT in a POM dependency, so as not to have to specify which particular version I needed. Did I just dream that? Or does it not apply to dependencies?
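For reference: my understanding is that LATEST and RELEASE are not looked up in any POM but resolved from the maven-metadata.xml that the repository maintains per artifact, something like the following (coordinates and versions here are purely illustrative):

```xml
<metadata>
  <groupId>com.example</groupId>
  <artifactId>thing</artifactId>
  <versioning>
    <!-- LATEST resolves to <latest>, RELEASE to <release> -->
    <latest>1.1-SNAPSHOT</latest>
    <release>1.0</release>
    <versions>
      <version>1.0</version>
      <version>1.1-SNAPSHOT</version>
    </versions>
    <lastUpdated>20071101120000</lastUpdated>
  </versioning>
</metadata>
```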
dependency:resolve and dependency:tree
Hello. I have a war project (actually, a cargo uberwar project) which I want to analyse using dependency:resolve / dependency:tree (and maybe even through site reporting) in order to find dependency conflicts. However, the dependency plugin does not show the jar files that are part of the dependent war files - it is cut off at the jar level. Is it possible to configure it to descend the dependency tree into WAR artifacts as well? Without this my dependency list looks fine, but I have WAR files with differing versions in them...

Nigel
Re: dependency:resolve and dependency:tree
.. and on a related note... When I do dependency:tree on a project that contains

    <dependency>
      <groupId>org.hibernate</groupId>
      <artifactId>hibernate</artifactId>
      <version>3.2.4.ga</version>
      <type>jar</type>
      <scope>compile</scope>
    </dependency>

the hibernate POM in my repository says:

    ...
    <dependency>
      <groupId>commons-collections</groupId>
      <artifactId>commons-collections</artifactId>
      <version>2.1.1</version>
    </dependency>
    ...

Why does dependency:tree show it as 3.1?!?

    [INFO] org.hibernate:hibernate:jar:3.2.4.ga:compile
    [INFO]    net.sf.ehcache:ehcache:jar:1.2.3:compile
    [INFO]    commons-logging:commons-logging:jar:1.1:compile
    [INFO]    asm:asm-attrs:jar:1.5.3:compile
    [INFO]    antlr:antlr:jar:2.7.6:compile
    [INFO]    cglib:cglib:jar:2.1_3:compile
    [INFO]    asm:asm:jar:1.5.3:compile
    [INFO]    commons-collections:commons-collections:jar:3.1:compile

Is this also the reason why my dependency convergence report looks OK, when really there's a mismatch?
Re: dependency:resolve and dependency:tree
Hmm - ok. FWIW, I'm seeing 3.1 on dependency:resolve as well.

Yes - the dependencies of the WAR (which will be the same as those inside it, if it's generated by maven). If I use cargo's merging (or, I think, just the war plugin itself) to build a war file out of other war fragments, merging them together, it's possible to get 2 versions of a dependency in the output WEB-INF/lib. It'd be really nice to be able to find out that this is happening in the convergence report or dependency analysis..

On 27/10/2007, Brian E. Fox [EMAIL PROTECTED] wrote:

It's possible that some other dependency has the 3.1 set. Use mvn -X to see the actual resolution; the tree mojo is still in progress. I don't understand your question about the war files. Are you expecting to see the dependencies inside the war, or OF the war?
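The behaviour Brian describes is Maven 2's "nearest wins" mediation: for each conflicting artifact, the version declared closest to the root of the dependency tree is chosen, so a commons-collections 3.1 brought in at a shallower depth beats the 2.1.1 that hibernate declares two levels down. A self-contained sketch of that rule (this is an illustration, not Maven's actual resolver code; names are made up):

```java
import java.util.*;

public class NearestWins {
    /** Each declaration is {key, version, depth}; the list is in discovery
     *  order. For each key, keep the version at the smallest depth; ties
     *  keep the first one seen, mirroring declaration-order tie-breaking. */
    public static Map<String, String> mediate(List<String[]> decls) {
        Map<String, String> chosen = new LinkedHashMap<>();
        Map<String, Integer> chosenDepth = new HashMap<>();
        for (String[] d : decls) {
            String key = d[0], version = d[1];
            int depth = Integer.parseInt(d[2]);
            Integer best = chosenDepth.get(key);
            if (best == null || depth < best) { // strictly nearer declaration wins
                chosen.put(key, version);
                chosenDepth.put(key, depth);
            }
        }
        return chosen;
    }

    public static void main(String[] args) {
        List<String[]> decls = Arrays.asList(
            new String[]{"org.hibernate:hibernate", "3.2.4.ga", "1"},
            // hibernate -> commons-collections 2.1.1, two hops from the root
            new String[]{"commons-collections:commons-collections", "2.1.1", "2"},
            // some other direct dependency pulls in 3.1 at depth 1
            new String[]{"commons-collections:commons-collections", "3.1", "1"});
        System.out.println(mediate(decls).get("commons-collections:commons-collections"));
        // prints 3.1
    }
}
```

mvn -X shows the real resolution trace, as Brian says, which reveals which nearer declaration supplied the winning version.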
maven-plugin-plugin failing in reactor
I have a mojo building with groovy. In my pom I have

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-plugin-plugin</artifactId>
      <version>2.3</version>
      <dependencies>
        <dependency>
          <groupId>org.codehaus.mojo.groovy</groupId>
          <artifactId>groovy-mojo-tools</artifactId>
          <version>1.0-beta-2</version>
        </dependency>
      </dependencies>
      <configuration>
        <extractors>
          <extractor>groovy</extractor>
        </extractors>
      </configuration>
    </plugin>

This all works fine when the plugin is compiled in isolation. However, when compiled as a module from the parent, it fails with

    [INFO] Error extracting plugin descriptor: 'No extractor for language: groovy'

Has anyone seen this, or is it just something for JIRA?
POM 'inclusion'? A tricky problem..
Hi maveneers. I have an interesting problem which I'm not sure how to solve, and I wonder if anyone else has met this before or has some good ideas..

We have 2 products, Alpha and Beta, a CommonCode project, and a SuperPom project (it's actually more complex than that, but this is just to simplify). Each of these has a separate repository (trunk) in SVN and can be released separately. All projects inherit through SuperPom for company-wide settings. In addition we have 2 'workspace' projects, AlphaWorkspace and BetaWorkspace, which use aggregation and svn:externals to bring their projects into a common build. The projects also use properties to define the versions of their dependencies - i.e. Alpha's POM has a property of <commoncode.version>1.0-SNAPSHOT</commoncode.version>. So we have something like

    AlphaWorkspace
    +--- SuperPom (5.0)
    +--- CommonCode (1.0-SNAPSHOT)
    +--- Alpha (2.1-SNAPSHOT)

and

    BetaWorkspace
    +--- SuperPom (5.0)
    +--- CommonCode (1.0-SNAPSHOT)
    +--- Beta (3.3-SNAPSHOT)

So far so good. Developers like this because they can do 'eclipse:eclipse' at the top level, the dependencies get worked out correctly, and they can patch things in CommonCode without having to worry too much about which project they're modifying.

At some point, Alpha needs to release. So it must release all dependent snapshots as well, meaning CommonCode. So AlphaWorkspace is released, and then moves to

    AlphaWorkspace
    +--- SuperPom (5.0)
    +--- CommonCode (1.1-SNAPSHOT)
    +--- Alpha (2.2-SNAPSHOT)

However, because CommonCode has changed, BetaWorkspace has now become

    BetaWorkspace
    +--- SuperPom (5.0)
    +--- CommonCode (1.1-SNAPSHOT)
    +--- Beta (3.3-SNAPSHOT)

Because Beta uses CommonCode:1.0-SNAPSHOT, not the 1.1-SNAPSHOT that is now HEAD in CommonCode, fresh 'clean' builds will fail, and existing developers suddenly find their code builds in an odd order, and changes they made in CommonCode don't seem to be appearing in the output until.. aha! someone's released CommonCode, and we have to change our property as well. In effect, Beta just wants the 'LATEST' version of CommonCode, whatever that happens to be in the repository.

So far, I've thought of:

- Set the version to 'LATEST' - no: only works for plugins
- Have a property in SuperPom for each project (commoncode.latest) - no: just pushes the problem one project back - you still have to change that version
- Have the versions set in some kind of buildscript before calling mvn: possible, but yuck
- Have some mojo contribute the properties: not sure if this would work - reactor build order is calculated before the plugin runs?
- Have some manual step / mojo to update the property commons.latest - hmm, another step for developers to forget..
- Some kind of POM 'import' function? Can't do yet.

Does anyone have any clever ideas, or solved this one themselves?
Re: POM 'inclusion'? A tricky problem..
The problem is that the projects always end up changing quite often - perhaps the example of CommonCode is a bad one; in reality we have about 3 'shared' projects that both teams might commit changes to. It's desirable that they move to using a 'release' (and remove the project from their workspace svn:externals) - that's really the purpose of trying to keep them as separate projects with their own lifecycles - but in practice this only seems to happen for short periods of time before someone needs to commit a change, and so needs to flip back onto a snapshot again. Which is fine - but when the other project does a release (e.g. at the end of their sprint), they have to be very fastidious about telling everyone 'hey, HEAD of CommonCode is moving to 1.1-SNAPSHOT - update your poms now'. My experience has been that when this happens, there's much wailing and gnashing of teeth :-(

As an aside, what would also be cool would be the ability to know the 'latest' when doing a CI build. What I want to do (but haven't tried yet) is to have an additional build profile which, instead of using, say, the declared version of CommonCode, uses the LATEST - that way, if that build is failing, you know you've got trouble ahead if you want to migrate.

Thinking about it, I guess POM inclusion wouldn't help any more than inheritance, as it'd also have to be versioned. I guess what I could do is have an externalised descriptor of what the versions on each project's /trunk are, and have a mojo that fails the build if the project says it's on a SNAPSHOT, but wants it to be the 'latest' SNAPSHOT and there's a mismatch. Ideally I could get it to modify the pom to be correct and have it automatically re-start the entire build - but I don't know how practical that would be to achieve.

On 11/3/07, Brett Porter [EMAIL PROTECTED] wrote:

Is there a reason to always have common code in snapshot as a dependency? Could it not be set to the latest release and only updated when a change from the new one is needed?

- Brett
Bug in property resolution?
Hi. I have a big reactor project. In one project, there is a property set as

    <properties>
      <commons-version>2.2.1</commons-version>
    </properties>

and in a later project, it is

    <properties>
      <commons-version>2.2.4-SNAPSHOT</commons-version>
    </properties>
    ..
    <build>
      <plugins>
        <plugin>
          <groupId>commons</groupId>
          <artifactId>commons-maven</artifactId>
          <version>${commons-version}</version>
          <executions>
    ...

The differing versions are desired, and correct. If I build the later project in isolation, commons-version = 2.2.4-SNAPSHOT. If I build it in the reactor, then commons-version = 2.2.1 (it seems). Bug? Or intended?
Re: Bug in property resolution?
Building a test project shows that the property resolution actually seems fine (so it is getting the right version); it's just not invoking the plugin - instead I get

    [ERROR] BUILD ERROR
    [INFO] 'mymojo' was specified in an execution, but not found in the plugin

Which is odd - the task is definitely there (I can see it in the descriptor in the repository), and it works in isolation, just not when invoked by the reactor.. Anyone got any good ideas?
Re: moving from maven 1 to maven 2, part 2
I'd really (really, really) try not to mess with the maven versioning - it's a recipe for having to custom-write loads of stuff, and it really is pretty fundamental to the operation of maven. Without it, I'm not sure there's much reason to change if what you have is currently working...

That said... is there some reason the JARs *can't* have a version (i.e. why can't you create 4 website projects with the dependencies of the JARs set to particular versions)? You can always write a custom script or MOJO (ANT or GROOVY) to move things about during the build (including the repository) - perhaps that's a way forward?

On Nov 13, 2007 2:31 PM, Christian Andersson [EMAIL PROTECTED] wrote:

Quick recap.. Hi there, first some history :-) I'm currently using maven 1.0.3 (yes, I know it is old, but it works for me) and we are currently switching from cvs to subversion. Along with this switch we are also going to try to switch from our old maven to maven 2 (2.0.7).

We are developing web applications that share many of our projects, and we have several installed out at the customers; unfortunately, due to our own laziness and some reasons from the customers, they are not always updated at the same time to a newer version. So what we have is something like this (example):

    website1 using v1 of all jar files
    website2 using v1.02 of some jar files and v1 of the rest
    website3 using v1.1 of some jar files and v1 of the rest
    website4 using v1.02 of some jar files and v1 of the rest

This is no problem, since different versions can be stored in the repository at the same time. However, some external dependencies and, unfortunately, some of our own do not have version information, or are still in V1 even though there are differences between 2 sites. When developing for the different sites, we checked out the different branches (source code) from cvs into dedicated folders, for example

    branches/website1/projects/ (all projects)
    branches/website2/projects/ (all projects)
    branches/website3/projects/ (all projects)
    branches/website4/projects/ (all projects)

and to be able to support our unversioned jar files (or different jar files with the same version number) we simply put the repository for each website inside the directory for the website (using MAVEN_HOME_LOCAL and MAVEN_HOME_USER, so that we could have per-site repositories and settings):

    branches/website1/repository/...
    branches/website2/repository/
    branches/website3/repository/
    branches/website4/repository/

Now, after such a long description, here is my question: from what I have learned about maven 2, there are no such environment variables that I can set, and maven2 reads settings from only 3 different places - the global settings, the settings in the user's home directory, and the settings for the project. Does that mean I cannot create a per-site settings.xml with the localRepository set to a specific directory? I can probably create a settings.xml file per project, but that means many files in many places that need to be set up, and if one of them is wrong.. I tried creating a settings.xml for the parent (pom) project, but that did not look like it was working (still using /home/user/.m2/repository). Can this be done, or are there better ways to do this? (Yes, versioning all files would be best, but that is not a realistic option at the moment..)

--
Christian Andersson - [EMAIL PROTECTED]
Configuration and Collaboration for OpenOffice.org
Open Framework Systems AS
http://www.ofs.no
Re: moving from maven 1 to maven 2, part 2
Sure, but by doing that you're effectively trying to dodge the bullet of having correct versions for artifacts, which isn't really what m2 is designed to do. It's quite common to have 'unversioned' 3rd party jars. Best solution is to version them yourself, and deploy them somewhere locally - either (minimally) an HTTP server (such as SVN itself), or into an artifact repository such as proximity, archiva or artifactory. On Nov 13, 2007 2:54 PM, Christian Andersson [EMAIL PROTECTED] wrote: I'm not sure what you mean by messing with maven versioning since what I basicly want is to have different local repositories for different projects without having to create a settings.xml for each project.. that would take care of our immediate problem and in time we could start using maven everywhere and also get some 3rd party providers to start doing versioning of jar files.. (which they don't today) Nigel Magnay skrev: I'd really (really really) try not to mess with the maven versioning - it's a recipe for having to custom write loads of stuff, and it really is pretty fundamental to the operation of maven. Without it, I'm not sure there's much reason to change if it's currently working... That said... Is there some reason the JARs *can't* have a version (I.E. why can't you create 4 website projects with the dependencies of the JARs set to particular versions)? You can always write a custom script or MOJO (ANT or GROOVY) to move things about during the build (including the repository) - perhaps that's a way forward? On Nov 13, 2007 2:31 PM, Christian Andersson [EMAIL PROTECTED] wrote: quick recap.. Hi there, first some history :-) I'm currently using maven 1.0.3 (yes, I know it is old, but it works for me) and we are currently switching from cvs to subversion. 
Along with this switch from cvs to subversion we are also going to try to switch from our old maven to maven 2 (2.0.7) We are developing web applications that share many of our projects and we have several installed out at the customers, unfourtunally due to our own laziness and some reasons from the customers, they are not always updated at the same time to a newer version.. so, what we have is then something like this (example) website1 using v1 of all jar files website2 using v1.02 of some jar files and v1 of the rest website3 using v1.1 of some jar files and v1 of the rest website4 using v1.02 of some jar files and v1 of the rest this is no problem since different versions can be stored in the repository at the same time.. however some external dependencies and unfourtunally some of our own, do not have version information, or are still in V1 even though there are differences between 2 sites.. when developing for the different sites we checked out the different branches (source code) from cvs into dedicated folders.. for example branches/website1/projects/ (all projects) branches/website2/projects/ (all projects) branches/website3/projects/ (all projects) branches/website4/projects/ (all projects) and to be able to support our unversioned jar files (or different jar files with the same version number) we simple put the repository for each website inside the directory for the website (using the MAVEN_HOME_LOCAL and MAVEN_HOME_USER so that we could have per site repositories and settings. branches/website1/repository/... branches/website2/repository/ branches/website3/repository/ branches/website4/repository/ now after such long description here is my question.. from what I have learned about maven 2 there is no such environment variables that I can set and maven2 reads only settings from 3 different places? the global settings, the settings in the users home directory and the settings for the project.. 
That means I cannot create a per-site settings.xml with the localRepository set to a specific directory? I can probably create a settings.xml file per project, but that means many files in many places that need to be set up, and if one of them is wrong.. I tried creating a settings.xml for the parent (pom) project, but that did not look like it was working (still using /home/user/.m2/repository). Can this be done, or are there better ways to do this? (Yes, versioning all files would be best, but that is not a realistic option at the moment.)

-- Christian Andersson - [EMAIL PROTECTED] Configuration and Collaboration for OpenOffice.org Open Framework Systems AS http://www.ofs.no

- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
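For what it's worth, a per-checkout settings.xml can point localRepository anywhere; the catch in Maven 2 is getting it picked up, since only ~/.m2/settings.xml and the global settings are read automatically. A sketch (the path is an example); you can pass such a file explicitly with mvn -s path/to/settings.xml, and -Dmaven.repo.local=... on the command line is another commonly used override:

```xml
<!-- e.g. branches/website1/settings.xml - path is an example -->
<settings>
  <localRepository>/home/user/branches/website1/repository</localRepository>
</settings>
```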
Re: maven 2.0.7 assembly plugin
You can do a jar containing all dependencies with the following descriptor. However, the excludes don't work properly in the assembly plugin, so if you have dependencies that are signed, you'll annoyingly have to fix them afterwards. You can also pack jars into /lib and use something like one-jar - this is quicker, but I found one-jar loads everything into memory before running (which fell over for me because my project is big).

<assembly>
  <id>uberjar</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <outputDirectory></outputDirectory>
      <outputFileNameMapping></outputFileNameMapping>
      <unpack>true</unpack>
      <scope>runtime</scope>
      <unpackOptions>
        <excludes>
          <exclude>/META-INF/*.RSA</exclude>
          <exclude>/META-INF/*.DSA</exclude>
          <exclude>/META-INF/*.SF</exclude>
          <exclude>/META-INF/*.rsa</exclude>
          <exclude>/META-INF/*.dsa</exclude>
          <exclude>/META-INF/*.sf</exclude>
        </excludes>
      </unpackOptions>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <directory>target/classes</directory>
      <outputDirectory></outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

On Nov 14, 2007 2:04 AM, Steve Taylor [EMAIL PROTECTED] wrote:

In theory, at the command line from the directory of the parent project, just type:

mvn -DdescriptorId=jar-with-dependencies assembly:assembly

and you should get a single jar with everything in it. In practice, what you actually get is a jar with just META-INF/MANIFEST.MF. So I am also really looking forward to the answer to this one. You'd think a simple problem like this would have a simple solution. Hopefully it does. I still want to believe that maven can make my life easier instead of following the inversion-of-difficulty pattern inherent in many java tools and frameworks (i.e. difficult tasks are simplified and simple tasks become difficult).

Brandon Enochs wrote:

I'm trying to assemble a multi-module project with a single jar file for all of the modules in my project and a lib directory with all of the jar files for my dependencies.
Does anyone know how to do this with the assembly plugin?
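One thing worth trying (a sketch, not guaranteed for every 2.0.x combination): binding the built-in jar-with-dependencies descriptor into the package phase of the module that should produce the uber-jar, rather than running assembly:assembly from the parent. The `single` goal shown here exists in later assembly plugin releases; older ones used `attached`:

```xml
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-uberjar</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```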
M2 : using assembly plugin to copy part of the repository
I have a pom.xml with various dependencies included in it. I want, in the output JAR file, to include a directory that contains a subsection of the repository from those dependencies. I can nearly get there by using the assembly descriptor (at the end), but I have 2 problems:

1) I want to use the directory the M2 repo would use - the example below is creating a directory com.cswgroup.kms.kes.config when I really want com/cswgroup/kms/kes/config - is there a way of getting at ${groupId} with / separators?

2) I would like to include the .pom descriptors as well. Is there a way of specifying this with the assembly plugin, or should I look at rolling my own MOJO?

<assembly>
  <id>directory</id>
  <formats>
    <format>jar</format>
  </formats>
  <dependencySets>
    <dependencySet>
      <includes>
        <include>com.cswgroup.kms.kes.config:config-kms</include>
      </includes>
      <outputFileNameMapping>${groupId}/${artifactId}/${version}/${artifactId}-${version}.${extension}</outputFileNameMapping>
      <outputDirectory>/</outputDirectory>
      <unpack>false</unpack>
      <scope>runtime</scope>
    </dependencySet>
  </dependencySets>
</assembly>
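For question 1, later releases of the assembly plugin grew a dependencySet flag that lays files out in repository layout for you (slash-separated groupId directories); whether it is available in the plugin version you're on is an assumption, so treat this as a sketch rather than a confirmed fix:

```xml
<!-- sketch: useRepositoryLayout appeared in later assembly plugin versions -->
<dependencySet>
  <includes>
    <include>com.cswgroup.kms.kes.config:config-kms</include>
  </includes>
  <outputDirectory>/</outputDirectory>
  <useRepositoryLayout>true</useRepositoryLayout>
  <unpack>false</unpack>
  <scope>runtime</scope>
</dependencySet>
```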
Re: Modules Building Out of Order
Are you using the assembly plugin? Could be MASSEMBLY-97 (http://jira.codehaus.org/browse/MASSEMBLY-97) or MASSEMBLY-102, raised way back in May but not looking any closer to being fixed.

On 04/10/06, SingleShot [EMAIL PROTECTED] wrote:

I have a master POM and many modules. For months we've been building fine. We occasionally add a new module with no problem whatsoever. We typically will delete our artifacts from our local repository to make sure there are no leftovers from defunct modules. Now, for the past few days, when we do a build (using a cleaned repository), our modules appear to build out of order. The Reactor build order: list at the start of the build looks fine, but the build eventually tries to access a module out of order, complaining it is not in the repository. If we go down and build that module directly, it builds. We go back to the top and rebuild, and it gets further, but has the same error at a later point. If we manually build every module in order, then do a top-down build, it works. So... I'm asking if anyone has any hints or suggestions as to what could cause a perfectly ordered build to suddenly build out of order. Thanks.
using ant to do the packaging
Hello listers. I am binding things like ant tasks into the lifecycle by using the maven-antrun-plugin bound to the package phase. The ant script usually overwrites the target/blah.jar file with something new. This works well, and I get what I expect in the target directory of my build. However, in the continuum repository I don't get that - I get the jar file that I would have had if I hadn't done the ANT step. I'm assuming this is because the jar file is assembled in some other way - is there a way for me to get it to deploy the right thing?
Re: using ant to do the packaging
I think I fixed one problem, which was to do with a case issue. I do have ant separately, but I think maven pulls it down automatically. Now the only problem I have remaining is that when I have an assembly item with an ID (so the jar becomes -myid.jar), it doesn't appear in the continuum repository (but does in the local user repo..) :(

On 09/10/06, r maclean [EMAIL PROTECTED] wrote:

Hi Nigel: I'm struggling with my Jar packaging... though I cannot offer you a solution, I am just curious to know if you had to install ANT separately and declare it either in the classpath or put the ant jars in the lib as suggested in the Maven2 doc (Better Builds with Maven)... this on top of the ant-plugins? thanks.
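In case it helps anyone hitting this later: one way to make sure a jar produced outside the normal jar:jar flow gets installed/deployed is to attach it explicitly with the build-helper plugin (attach-artifact attaches under a classifier, which fits the -myid.jar case). A sketch - the file path and classifier below are examples, not from the thread:

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-ant-output</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <!-- example path/classifier: the jar produced by the antrun step -->
            <file>${project.build.directory}/blah-myid.jar</file>
            <type>jar</type>
            <classifier>myid</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
```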
[m2] Using the eclipse (3.2) compiler
I've been trying to get maven(2) to use the eclipse compiler, which is implied to be possible by fiddling with maven-compiler-plugin and plexus-compiler. However, I can't get it to work (in particular, I can't persuade it that the project is java 1.5, even if I pass the right values in <compilerArgument>-source 5.0 -target 5.0</compilerArgument>). Does anyone have this working? It seems particularly important - debugging with hot-code replace is pretty much totally broken for me because of the differences in the compilers being used - I was hoping I could fix it with a profile to use the eclipse compiler.
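For the archives: the usual recipe is to select the eclipse compiler via compilerId and add the plexus-compiler-eclipse adapter as a plugin-level dependency, setting source/target in the plugin configuration rather than via compilerArgument. A sketch - the adapter version below is an assumption; match it to your maven-compiler-plugin:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <compilerId>eclipse</compilerId>
    <source>1.5</source>
    <target>1.5</target>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-compiler-eclipse</artifactId>
      <!-- version is an assumption -->
      <version>1.5.3</version>
    </dependency>
  </dependencies>
</plugin>
```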
Re: Poor man's web.xml merging
I think your merge.xml is required to have a context-params section, e.g.:

<webXml>
  <contextParams>
    <strategy name="ChooseByName">
      <default>
        <strategy name="Preserve"/>
      </default>
      <choice name="contextConfigLocation">
        <strategy name="NodeMerge">
          <context-param>
            <param-name>$left:param-name</param-name>
            <param-value>$left:param-value $right:param-value</param-value>
          </context-param>
        </strategy>
      </choice>
    </strategy>
  </contextParams>
</webXml>

On 07/11/06, Arnaud Bailly [EMAIL PROTECTED] wrote:

Ouuup !

[INFO] [cargo:uberwar]
[INFO] [ERROR] BUILD ERROR
[INFO] Exception merging web.xml
[DEBUG] Trace
org.apache.maven.lifecycle.LifecycleExecutionException: Exception merging web.xml
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:559)
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalWithLifecycle(DefaultLifecycleExecutor.java:475)
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:454)
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:306)
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:273)
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:140)
 at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:322)
 at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:115)
 at org.apache.maven.cli.MavenCli.main(MavenCli.java:256)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:585)
 at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
 at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
 at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
 at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
Caused by: org.apache.maven.plugin.MojoExecutionException: Exception merging web.xml
 at org.codehaus.cargo.maven2.UberWarMojo.doWebXmlMerge(UberWarMojo.java:214)
 at org.codehaus.cargo.maven2.UberWarMojo.execute(UberWarMojo.java:169)
 at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:412)
 at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:534)
 ... 16 more
Caused by: java.lang.NullPointerException
 at org.codehaus.cargo.maven2.UberWarMojo.doWebXmlMerge(UberWarMojo.java:208)
 ... 19 more
[INFO] Total time: 7 seconds
[INFO] Finished at: Tue Nov 07 10:55:44 CET 2006
[INFO] Final Memory: 5M/11M

Here is my merge.xml:

<uberwar>
  <wars>
    <war>toto:tutu-web</war>
    <war>toto:tutu-testui</war>
  </wars>
</uberwar>

-- OQube software engineering \ génie logiciel Arnaud Bailly, Dr. \web http://www.oqube.com
Surefire tests occasionally failing because the classpath is wrong
In my multiproject, I use commons-lang. In my pom.xml (at end) I have the dependency set (excluded once to avoid a conflict). However, the build sometimes fails - but not always - trying to find a class in commons-lang. Doing a mvn -X shows it isn't present on the test classpath. Is there some way I can persuade it to be on the classpath correctly?

<dependency>
  <groupId>acegisecurity</groupId>
  <artifactId>acegi-security</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <exclusion>
      <!-- don't use this old version of commons-lang -->
      <artifactId>commons-lang</artifactId>
      <groupId>commons-lang</groupId>
    </exclusion>
    ...
  </exclusions>
</dependency>
...
<dependency>
  <!-- this ought to be fetched transitively -->
  <groupId>commons-lang</groupId>
  <artifactId>commons-lang</artifactId>
  <version>2.1</version>
  <type>jar</type>
  <scope>test</scope>
</dependency>
surefire transitive classpath bug ?
I think this is a bug in surefire. If I run my project with mvn -X I see in the output:

[DEBUG] Retrieving parent-POM: org.apache.myfaces.maven:myfaces-master::1.0.2 for project: org.apache.myfaces.core:myfaces-core-project:pom:1.1. from the repository.
[DEBUG] org.apache.myfaces.core:myfaces-impl:jar:1.1.3:compile (selected for compile)
[DEBUG] commons-codec:commons-codec:jar:1.3:compile (selected for compile)
[DEBUG] commons-collections:commons-collections:jar:3.1:compile (selected for compile)
[DEBUG] commons-logging:commons-logging:jar:1.0.4:compile (selected for compile)
[DEBUG] commons-el:commons-el:jar:1.0:compile (selected for compile)
[DEBUG] commons-logging:commons-logging:jar:1.0.3:compile (removed - nearer found: 1.0.4)
[DEBUG] org.apache.myfaces.core:myfaces-api:jar:1.1.3:compile (selected for compile)
[DEBUG] javax.servlet:jstl:jar:1.1.0:compile (selected for compile)
[DEBUG] commons-digester:commons-digester:jar:1.6:compile (selected for compile)
[DEBUG] commons-beanutils:commons-beanutils:jar:1.6:compile (selected for compile)
[DEBUG] commons-logging:commons-logging:jar:1.0:compile (removed - nearer found: 1.0.4)
[DEBUG] commons-collections:commons-collections:jar:2.0:compile (removed - nearer found: 3.1)
...
[DEBUG] org.springframework:spring:jar:2.0:compile (selected for compile)
[DEBUG] commons-logging:commons-logging:jar:1.0.4:compile (removed - nearer found: 1.1)
[DEBUG] commons-logging:commons-logging:jar:1.1:compile (selected for compile)
[DEBUG] avalon-framework:avalon-framework:jar:4.1.3:compile (selected for compile)
[DEBUG] javax.servlet:servlet-api:jar:2.3:compile (selected for compile)
[DEBUG] commons-logging:commons-logging:jar:1.0:compile (removed - nearer found: 1.0.4)
... etc., etc.

However, the version of commons-logging that appears in my classpath for test is 1.0.4, NOT 1.1 as it appears it ought to be. Additionally, this means that avalon-framework is not present in my surefire test classpath, causing tests to break.
Re: surefire transitive classpath bug ?
But the -X shows 2 lines for commons-logging, both claiming to be selected for compile:

[DEBUG] commons-logging:commons-logging:jar:1.1:compile (selected for compile)
[DEBUG] commons-logging:commons-logging:jar:1.0.4:compile (selected for compile)

Though actually the compile works; it's the tests that don't. This is actually not because of commons-logging, but one of the dependencies that 1.1 includes but 1.0.4 does not. I also have another problem elsewhere where tests *randomly* fail, because commons-lang isn't on the test classpath, even though it's the very first jar file specified in the POM! This leads me to suspect that the dependencies are being stuffed in a hashmap somewhere, and, dependent upon the phase of the moon, one particular version 'wins'. This has basically destroyed our ability to do CI builds, and developers must 'mvn install - if tests fail, rinse, repeat'. I'll file a JIRA with full -X output; I've been testing with surefire 2.3-SNAPSHOT, which may be better. As is always the case with these things, turning on -X tends to make it work ;-S

On 08/12/06, Wendy Smoak [EMAIL PROTECTED] wrote:

On 12/8/06, Nigel Magnay [EMAIL PROTECTED] wrote: I think this is a bug in surefire. If I run my project with mvn -X I see in the output: ... [DEBUG] commons-logging:commons-logging:jar:1.0:compile (removed - nearer found: 1.0.4) ... However, the version of commons-logging that appears in my classpath for test is 1.0.4, NOT 1.1 as it appears that it ought to be. Additionally, this means that avalon-framework is not present in my surefire test classpath, causing tests to break.

If 'removed - nearer found: 1.0.4' is the last thing you see, then that's the version Maven has chosen. The list archives contain plenty of discussion about Maven 2.0.4's approach to transitive dependency resolution, especially when dependencyManagement is in use. It's something that has been improved for the next release.
The usual fix is to explicitly declare the dependency version you want, in the project pom where the problem is occurring. If that doesn't help, we need to see the pom and the full output of mvn -X. -- Wendy
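Explicitly declaring the version you want can also go in the parent pom's dependencyManagement, so every module resolves the same commons-logging. A minimal sketch of that fix:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```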
Re: Maven shell
I actually had a related issue - our project is comprised of many jar and war fragments and, good though the jetty plugin is, it's helpful when debugging webapps in eclipse when hotswap fails (which is always) to be able to have it automatically copy changed class files into the running application. The existing eclipse integrations (last time I looked) seemed to concentrate on replacing the IDE compile function with maven, which is a total non-starter because it takes *forever*. I did something simple in .net (mostly because when I go near the maven codebase, the vast array of poorly documented dependencies like plexus scares me and I don't have the time to figure it all out), but I think a decent editor that didn't force people to hack XML would be nice, as well as the ability to do things like:

- detect possibly unneeded dependencies
- promote common dependencies to a parent pom
- UI for setting up common plugins (compile, report, assembly)

delivered as an eclipse plugin would be useful.

On 04/02/07, Arnaud Bailly [EMAIL PROTECTED] wrote:

Hello, Sure, I would be interested in working on such a tool. My particular need is that I want to make a continuous testing tool independent of any IDE and based on information in the POM. As for the virtual ant system, that could be thought of as a kind of GUI over a maven shell. One could think of adding a web-based GUI, something like a finer-grained continuum. I have a bit of time, so maybe I could start something. Ideas, code, specs are welcome... regards, -- OQube software engineering \ génie logiciel Arnaud Bailly, Dr. \web http://www.oqube.com
Re: Maven shell
>> I did something simple in .net (mostly because when I go near the maven codebase, the vast array of poorly documented dependencies like plexus scares me and I don't have the time to figure it all out)

> That's also what is holding me back! I love using maven, I think the people behind it did a great job, but contributing in your spare time is somewhat challenging as coding information is lacking, particularly, as you noted, on related projects.

I think the documentation is getting better - it's a shame that maven has a whiff of wheel-reinventing going on, which is disappointing (plexus not spring, wagon not vfs) - which means these technologies have to be grasped that are *only* useful to maven. I gave up trying to fix bugs because the barrier to entry is just too great for the time I have :-(

>> I think a decent editor that didn't force people to hack XML would be nice, as well as the ability to do things like - detect possibly unneeded dependencies

> Could you please elaborate on this? Do you think about something like extracting dependencies from compiled classes and inferring unused deps?

That kind of thing would be good, but also something simple like: A and B are both modules of C, and all have dependency X, so you could remove it.

>> - promote common dependencies to a parent pom

> Something like easy dependencyManagement?

Yes - exactly that.

>> - UI for setting up common plugins (compile, report, assembly)

> Could you please elaborate on this one too?

What I've got is: a Tree View on the left-hand side, with the hierarchy of POM projects (modules shown as children). Click on a POM item, and the right-hand side is a tabbed dialog with various sections (build, dependencies, plugins, etc). For plugins, it'd be nice to have a GUI for setting up, say, reports, with textboxes to fill in rather than having to remember the right XML values.

> BTW, did you try the pomtools plugin?

Yep - the tree looks rather like what I did for the dependencies view. I'm after more of a GUI thing - I find some stuff like the archetype generator is useful, but I use it so rarely that in the interval between uses I've forgotten all the required parameters, and it ends up taking me longer tracking them down than crafting a new POM from adjusting a similar existing one...
deploy problem
I've a recent version of archiva in tomcat (windows), and maven 2.0.5 on the client; I'm having problems doing deploys. With wagon-1.0-beta-2 it complains of insufficient storage - some files do deploy, however. In the server logs, I get an error reported: Unable to service DAV request due to unconfigured DavServerComponent for prefix []. I've attached some more log info - bug or user error?

-- request --
http://team.cswgroup.local:8080/repository
GET /archiva/repository/snapshot/com/cswgroup/commons/2.2-SNAPSHOT/maven-metadata.xml.sha1 HTTP/1.1
expires: 0
accept-encoding: gzip
x-wagon-version: 1.0-beta-2
cache-store: no-store
pragma: no-cache
cache-control: no-cache
x-wagon-provider: wagon-webdav
user-agent: Jakarta Commons-HttpClient/2.0.2
host: team.cswgroup.local:8080

2007-02-28 16:26:59,420 [http-8080-Processor23] ERROR [RepositoryServlet] - Servlet.service() for servlet RepositoryServlet threw exception javax.servlet.ServletException: Unable to service DAV request due to unconfigured DavServerComponent for prefix [].
 at org.codehaus.plexus.webdav.servlet.multiplexed.MultiplexedWebDavServlet.service(MultiplexedWebDavServlet.java:93)
 at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
 at com.opensymphony.webwork.dispatcher.FilterDispatcher.doFilter(FilterDispatcher.java:189)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:202)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
 at com.opensymphony.module.sitemesh.filter.PageFilter.doFilter(PageFilter.java:39)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:202)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
 at com.opensymphony.webwork.dispatcher.ActionContextCleanUp.doFilter(ActionContextCleanUp.java:88)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:202)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
 at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
 at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
 at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
 at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
 at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
 at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
 at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:869)
 at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:664)
 at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
 at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
 at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
 at java.lang.Thread.run(Thread.java:619)

-- request --
http://team.cswgroup.local:8080/repository
MKCOL /archiva/repository/snapshot/ HTTP/1.1
user-agent: Jakarta Commons-HttpClient/2.0.2
host: team.cswgroup.local:8080
cookie: $Version=0; JSESSIONID=824AEDF2E41F39C723C69DA9C7AB6984; $Path=/archiva
Re: Deploy failing to load webdav extension
I'm seeing similar problems - there are artifacts that are in ibiblio that are not in repo1. I added http://mirrors.ibiblio.org/pub/mirrors/maven2 into my topmost pom, and most things then worked. Depressingly, archiva currently refuses to proxy correctly for the above, and is still failing to deploy artifacts with webdav (different errors depending upon which version of wagon dav I use). This is depressing, since I have to use windows for our repository (so no SCP). Maybe I'll give Artifactory a go. I've already ditched continuum for Hudson.

On 01/03/07, drekka [EMAIL PROTECTED] wrote:

Hi all, A month or so ago I installed archiva and was able to deploy to it using the webdav extension as suggested in the documentation. I just tried again today and it failed, telling me that the dav protocol is unsupported. I tried using both beta-2 and beta-2 extensions. Turning on debug, I see log messages indicating that it cannot load the jetty 4.2.12 pom. So I go over to repo1.maven.org and take a look. Sure enough, there is no pom for this version of jetty. Does anyone have any idea what's changed or what I'm missing?? ciao Derek
Re: [ANN] Artifactory - new Maven 2 proxy repository
My experience so far:

Archiva: alpha; doesn't work (random webdav deployment failures), loads of bugs, low rate of progress. Feels dead.
Proximity: works; slightly confusing (don't like the separation of metadata); lots of new releases constantly; hard to configure (hacking around with spring config files) - our install takes *forever* to restart.
m2proxy: simple, but simple.

Fingers crossed that artifactory hits the sweet spot...

On 05/03/07, Wim Deblauwe [EMAIL PROTECTED] wrote: Yes, but it gets quite difficult to see the differences between them...

2007/3/5, Dan Tran [EMAIL PROTECTED]: It is good to see a healthy set of competitors in the maven-proxy space :-).

On 3/4/07, Frederic Simon [EMAIL PROTECTED] wrote: Sorry about the link explanation: the username/password for the guest live demo is guest/guest.

On 3/5/07, Yoav Landman [EMAIL PROTECTED] wrote:

Hi all, We would like to announce the immediate availability of Artifactory, a Maven 2 enterprise proxy. Artifactory offers advanced proxying, caching and security facilities to answer the needs of a robust, reproducible and independent build environment using Maven 2. It uses a JSR-170 Java Content Repository (JCR) for storage, which makes it extremely easy to manage searchable metadata, and provide extended features such as security, transacted operations, auditing, locking, etc. Artifactory is distributed under APLv2 at http://artifactory.sourceforge.net. It is currently available as a downloadable archive that can be run out of the box (with default settings). An install script to run it as a Linux service is also provided. A (limited) guest live demo is available at http://www.jfrog.org/artifactory/ You are welcome to give it a go! Cheers, Yoav Landman, The Artifactory Team
-- Frederic Simon - Senior Architect AlphaCSP Israel Malam Group Phone: +972 3 5312388 * Fax: +972 3 5312512 Mobile: +972 54 954301 13 Gosh Etzion St., Givat Shmuel 54030, Israel http://www.alphacsp.com/ http://www.malam.com/ -- Vigilog - an open source log file viewer: http://vigilog.sourceforge.net Blog: http://www.jroller.com/page/Fester
mirroring polluting local repository?
I have an annoying problem. I have an artifactory mirror running, and it seems to be working correctly. I have various mirrors set up, one of which is for the apache-m2-snapshots. This all works correctly, and files are getting placed in the right repositories. However, in my local repo I'm getting the metadata stored in the wrong place. For example, in repo\org\apache\maven\skins\maven-default-skin I have maven-metadata-central.xml. But this *isn't* the metadata from central, it's the metadata from the snapshot repo! Running site:site with -X yields no clues, as it doesn't tell you *where* it thinks it is getting this data from:

[INFO] [site:site]
[DEBUG] Searching for parent-POM: org.codehaus.cargo:cargo-core::1.0-SNAPSHOT of project: null:cargo-core-api:pom:1.0-SNAPSHOT in relative path: ../pom.xml
[DEBUG] Using parent-POM from the project hierarchy at: '../pom.xml' for project: null:cargo-core-api:pom:1.0-SNAPSHOT
[DEBUG] Searching for parent-POM: org.codehaus.cargo:cargo-parent::3-SNAPSHOT of project: null:cargo-core:pom:1.0-SNAPSHOT in relative path: ../pom/pom.xml
[DEBUG] Using parent-POM from the project hierarchy at: '../pom/pom.xml' for project: null:cargo-core:pom:1.0-SNAPSHOT
[DEBUG] Searching for parent-POM: org.codehaus.cargo:cargo-parent::3-SNAPSHOT of project: null:cargo-core:pom:1.0-SNAPSHOT in relative path: ../pom/pom.xml
[DEBUG] Using parent-POM from the project hierarchy at: '../pom/pom.xml' for project: null:cargo-core:pom:1.0-SNAPSHOT
[INFO] artifact org.apache.maven.skins:maven-default-skin: checking for updates from central
[WARNING] *** CHECKSUM FAILED - Checksum failed on download: local = '134b49badcc7a6de341b1d4f6c0355846ec2224d'; remote = 'fa58f552f7b0d2d697f7b006e4b007996bcbaae1' - RETRYING
[WARNING] *** CHECKSUM FAILED - Checksum failed on download: local = '134b49badcc7a6de341b1d4f6c0355846ec2224d'; remote = 'fa58f552f7b0d2d697f7b006e4b007996bcbaae1' - IGNORING
[DEBUG] maven-default-skin: resolved to version 1.1-SNAPSHOT from repository central
[DEBUG] Skipping disabled repository central
[DEBUG] maven-default-skin: using locally installed snapshot
[DEBUG] Skipping disabled repository central
[INFO] [ERROR] BUILD ERROR

It seems like it's pulling this information from the wrong mirror, but I can't work out why...?
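One thing to check (an assumption about the cause, not a confirmed diagnosis): if a mirror's mirrorOf matches more than one remote repository, everything that comes through it gets attributed to whichever repository id the request was made under, and the local maven-metadata-<id>.xml files can end up mixed. Scoping each mirror in settings.xml to a single repository id keeps the metadata files distinct; the ids and URLs below are examples:

```xml
<mirrors>
  <!-- example: one narrowly-scoped mirror per remote repository -->
  <mirror>
    <id>central-mirror</id>
    <mirrorOf>central</mirrorOf>
    <url>http://artifactory.example.local/repo/central</url>
  </mirror>
  <mirror>
    <id>apache-snapshots-mirror</id>
    <mirrorOf>apache.snapshots</mirrorOf>
    <url>http://artifactory.example.local/repo/apache-snapshots</url>
  </mirror>
</mirrors>
```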
Re: Maven's included with Leopard (Mac OS 10.5)
Symlink /usr/share/maven to whichever install you want to use.

On Nov 15, 2007 5:00 PM, Ryan Scott [EMAIL PROTECTED] wrote:

Does anyone know how to fix this? I tried using darwin ports to upgrade, but leopard keeps overlaying the install with 2.0.6, which does not work for my application. What were they thinking, baking this into the operating system? This is almost as bad as the Java story. Are they trying to drive developers away from Apple on purpose? Scott Ryan CTO Soaring Eagle L.L.C. Denver, Co. 80129 www.soaringeagleco.com www.theryansplace.com (303) 263-3044 [EMAIL PROTECTED]

On Nov 15, 2007, at 9:53 AM, Lally Singh wrote:

Just fyi:

[EMAIL PROTECTED] ~]$ uname -a
Darwin hc65210f0.dhcp.vt.edu 9.0.0 Darwin Kernel Version 9.0.0: Tue Oct 9 21:35:55 PDT 2007; root:xnu-1228~1/RELEASE_I386 i386
[EMAIL PROTECTED] ~]$ which mvn
/usr/bin/mvn
[EMAIL PROTECTED] ~]$ mvn --version
Maven version: 2.0.6

I only noticed this when I upgraded and noticed my build failing (I needed 2.0.7 :-) ). Congrats! -- H. Lally Singh Ph.D. Candidate, Computer Science Virginia Tech
Re: Maven users interested in the future of Proximity
Is this going to beta any time soon? Our artifactory proxy has rotted and started serving 0-byte files randomly (who knows if it's the underlying repo); I was never that keen on proximity's 'configure me by hacking spring config xml', particularly if it's about to be deprecated soon for something more shiny..

On Nov 21, 2007 11:10 PM, Jason van Zyl [EMAIL PROTECTED] wrote:

http://blogs.sonatype.com/jvanzyl/2007/11/21.html Thanks, Jason -- Jason van Zyl Founder, Apache Maven jason at sonatype dot com
mojo cookbook; resolving transitively
The Mojo Developer Cookbook (http://docs.codehaus.org/display/MAVENUSER/Mojo+Developer+Cookbook) is great. In it, there's a section on Resolving Transitively. This works great - however, the artifact set that is resolved is based on Maven's idea of which graph edges ought to be considered for resolution. For example, if I have a WAR file, then any declared dependencies that are JARs (and POMs, I think) are included, and those artifacts have their dependencies added to the list (and so on, recursively). The end result resolves to a single version for each JAR artifact. However - if the WAR file has a WAR dependency, then it will not be considered, and any JAR files in there won't be included in the set that the version is resolved in. This is not the behaviour that I want: I have a WAR of WARs, and I want to calculate the resolved set of dependencies across all of them. a) How can I interject my own requirements for dependency traversal? b) Is / how is this changed in 2.1 - all the methods on ArtifactResolver have been condensed down into one..
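The traversal question in (a) can be illustrated with a toy model. To be clear, this is not the Maven 2 ArtifactResolver API (whose ArtifactFilter only controls what appears in the result set, not which edges are walked) — it is just a sketch of what a per-edge traversal hook would do, with all names invented:

```java
import java.util.*;

// Toy model of transitive resolution with a per-edge traversal predicate.
// Stock Maven 2.0 does not offer this hook; this only illustrates the idea.
public class Traverse {

    static final class Dep {
        final String id, type;
        final List<Dep> deps;
        Dep(String id, String type, List<Dep> deps) {
            this.id = id; this.type = type; this.deps = deps;
        }
    }

    // Collect everything reachable through edges whose type the filter accepts.
    static Set<String> resolve(Dep root, Set<String> traversableTypes) {
        Set<String> out = new LinkedHashSet<>();
        walk(root, traversableTypes, out);
        return out;
    }

    static void walk(Dep node, Set<String> types, Set<String> out) {
        for (Dep d : node.deps) {
            // skip edges whose type is filtered out, and nodes already seen
            if (!types.contains(d.type) || !out.add(d.id)) continue;
            walk(d, types, out);
        }
    }

    public static void main(String[] args) {
        Dep logging = new Dep("commons-logging", "jar", List.of());
        Dep innerWar = new Dep("inner-war", "war", List.of(logging));
        Dep uberWar = new Dep("uber-war", "war", List.of(innerWar));
        // jar-only traversal never descends into the nested WAR:
        System.out.println(Traverse.resolve(uberWar, Set.of("jar")));
        // admitting war edges pulls in the nested WAR's JARs too:
        System.out.println(Traverse.resolve(uberWar, Set.of("jar", "war")));
    }
}
```

The point of the sketch: "which artifacts end up in the set" is decided at the edges, which is exactly the knob the WAR-of-WARs case needs.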
mojo bug when not in a reactor ?
Hi. I use the standard bit of code to get all the transitively-resolved artifacts from a project:

Artifact art2 = /* some artifact in the repository */ ...
resolver.resolve(art2, remoteRepositories, localRepository);
MavenProject pomProject = mavenProjectBuilder.buildFromRepository(art2, remoteRepositories, localRepository);
Set artifacts = pomProject.createArtifacts(artifactFactory, null, null);
ArtifactFilter filter = null;
ArtifactResolutionResult arr = resolver.resolveTransitively(artifacts, art2, localRepository, remoteRepositories, artifactMetadataSource, filter);
Set resartifacts = arr.getArtifacts();

This is good - BUT - I get different results in the resartifacts set when I'm running in the reactor compared to when I'm not (!) If I run in the reactor, some of the transitive dependencies have already been seen by maven. So if I print out all the dependencies, I get some entries like this: Artifact: active project artifact: artifact = com.blah.modules:blah-services:jar:1.0-SNAPSHOT:compile; project: [EMAIL PROTECTED] In this project (com.blah...), there are exclude nodes specified. They seem to be obeyed in this instance. When I run out-of-reactor, I (of course) don't get an 'active project artifact', just a plain one. *However*, it seems that the excludes haven't been honoured (or, they haven't been honoured correctly). Is there any oddity around how resolution in these instances is supposed to operate?
Re: [POLL] Why are you not able to use the most recent maven release?
2.0.7, with local patches. Because I've tried about 3 times to get someone to apply a fix for MNG-3284 (or tell me if the attached patch won't work because of something non-obvious), and it just never goes anywhere. :-/ On Fri, Mar 7, 2008 at 9:44 PM, Brian E. Fox [EMAIL PROTECTED] wrote: I get the sense that lots of people are using older versions of Maven due to various regressions. As we get closer to 2.1 alpha, we need to ensure that we identify the regressions across the 2.0 line so that we can make sure they are fixed in 2.1 and so that users can upgrade to a recent 2.0.x before trying out 2.1. If this is the case for you, please reply and state the version you're using and why (preferably referring to a Jira). We will use this information to prioritize issues for 2.0.10 and beyond. Thanks.
Re: parallel building
Not as far as I know. The Hudson CI tool claims to be able to parallel-build in the correct order, but I don't know how good it is at it. IMHO it'd be a very powerful addition - if you've got something like an 8-core Mac, it'd be nice to keep more of it busy. Farming out builds to separate machines (something I believe XCode can do) would be even nicer... Maybe there's a JIRA issue for this. On 10/04/07, emerson cargnin [EMAIL PROTECTED] wrote: Is there any way to make multi-module builds (in both maven 1 and maven 2) work in parallel when maven detects a project has no dependencies (that need to be built beforehand)? thanks emerson
Re: parallel building
I have added it as MNG-3004. I had a quick play to see if I could get it to work - I created a POC that could build in parallel, but my threads don't seem to have the requisite plexus gubbins in their classpath, and I'm not sure how all that stuff works. If someone could tell me how to fix that it'd be good - I know supplementary things will need more thought (e.g. interleaved logging from the different threads), but it oughtn't to be insurmountable. On 10/04/07, emerson cargnin [EMAIL PROTECTED] wrote: I had a look on Hudson and I think it just distribute different projects, not the same project build: https://hudson.dev.java.net/masterSlave.html I couldn't found in any of Jira projects any thing regarding to parallel multi module build. Does anyone knows if this search is right? : http://jira.codehaus.org/secure/IssueNavigator.jspa?reset=truepid=10500pid=11124pid=11533pid=11123pid=11125pid=11126pid=11191pid=11211pid=11212pid=11127pid=11128pid=11227pid=11129pid=11226pid=11213pid=11130pid=11214pid=11131pid=11310pid=11361pid=11132pid=11133pid=11134pid=11530pid=11341pid=11431pid=11532pid=11141pid=11215pid=11135pid=11136pid=11441pid=11137pid=11216pid=11138pid=11340pid=11217pid=11218pid=11220pid=11221pid=11241pid=11293pid=11139pid=11140pid=11142pid=11143pid=11144pid=11391pid=11250pid=11145pid=11531pid=11390pid=11223pid=11483pid=11540pid=11146pid=11147pid=11224pid=11149pid=11150pid=11362pid=11225pid=11095pid=11481pid=10940pid=0query=parallelsummary=truedescription=truebody=true Thanks a lot Emerson
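The scheduling idea in this thread can be sketched independently of Maven: submit each module to a thread pool, starting it only once everything it depends on has finished. The module names here are invented and the "build" is a no-op that just records ordering; a real implementation would also need the plexus wiring mentioned above:

```java
import java.util.*;
import java.util.concurrent.*;

// Toy parallel build scheduler: a module runs as soon as its dependencies
// have completed, up to the thread-pool limit.
public class ParallelBuild {

    // deps maps each module to the set of modules it depends on.
    static List<String> build(Map<String, Set<String>> deps, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        Map<String, CompletableFuture<Void>> done = new HashMap<>();
        List<String> order = Collections.synchronizedList(new ArrayList<>());
        for (String module : topological(deps)) {
            // a future that completes when all prerequisites are built
            CompletableFuture<Void> ready = CompletableFuture.allOf(
                    deps.get(module).stream().map(done::get)
                        .toArray(CompletableFuture[]::new));
            // "building" a module here is just recording its name
            done.put(module, ready.thenRunAsync(() -> order.add(module), pool));
        }
        done.values().forEach(CompletableFuture::join);
        pool.shutdown();
        return new ArrayList<>(order);
    }

    // Depth-first topological sort so prerequisites are scheduled first.
    static List<String> topological(Map<String, Set<String>> deps) {
        List<String> out = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        for (String m : deps.keySet()) visit(m, deps, seen, out);
        return out;
    }

    static void visit(String m, Map<String, Set<String>> deps,
                      Set<String> seen, List<String> out) {
        if (!seen.add(m)) return;
        for (String d : deps.get(m)) visit(d, deps, seen, out);
        out.add(m);
    }

    public static void main(String[] args) {
        Map<String, Set<String>> deps = Map.of(
                "core", Set.of(),
                "web", Set.of("core"),
                "cli", Set.of("core"),
                "dist", Set.of("web", "cli"));
        // "core" always first, "dist" always last; web/cli may run concurrently
        System.out.println(build(deps, 4));
    }
}
```

The interleaved-logging concern from the thread is visible even here: independent modules ("web", "cli") may complete in either order.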
m2 assembly confusion
I have a project that is building in Hudson, that uses the m2 assembly plugin. In the project pom, it explicitly sets the version of that plugin to 2.1. It builds correctly from the commandline, and it mostly builds correctly in hudson, but occasionally (I suspect it might be an update issue) it fails - and it seems to be using / including BOTH the 2.2-beta-1 plugin AND the 2.1. I don't know if this is an m2 bug, or a hudson bug, or a combination (m2 2.0.5, hudson 1.106); attached is the build output:

[INFO] [jar:jar]
[INFO] Building jar: e:\tomcat-home\.hudson\jobs\KES\workspace\trunk\sample-publication\workflow-guide-package\target\workflow-guide-package-2.0-SNAPSHOT.jar
-
this realm = app0.child-container[org.apache.maven.plugins:maven-assembly-plugin]
urls[0] = file:/C:/Documents and Settings/tomcat/.m2/repository/org/apache/maven/plugins/maven-assembly-plugin/2.2-beta-1/maven-assembly-plugin-2.2-beta-1.jar
urls[1] = file:/C:/Documents and Settings/tomcat/.m2/repository/org/apache/maven/plugins/maven-assembly-plugin/2.1/maven-assembly-plugin-2.1.jar
urls[2] = file:/C:/Documents and Settings/tomcat/.m2/repository/org/codehaus/plexus/plexus-archiver/1.0-alpha-6/plexus-archiver-1.0-alpha-6.jar
urls[3] = file:/C:/Documents and Settings/tomcat/.m2/repository/org/apache/maven/maven-archiver/2.0.4/maven-archiver-2.0.4.jar
urls[4] = file:/C:/Documents and Settings/tomcat/.m2/repository/org/apache/maven/shared/file-management/1.0/file-management-1.0.jar
Number of imports: 0
this realm = plexus.core.maven
urls[0] = file:/E:/XAMPP/xampp/tomcat/webapps/hudson/WEB-INF/lib/maven-interceptor-1.106.jar
urls[1] = file:/c:/maven/lib/commons-cli-1.0.jar
urls[2] = file:/c:/maven/lib/doxia-sink-api-1.0-alpha-7.jar
urls[3] = file:/c:/maven/lib/jsch-0.1.27.jar
urls[4] = file:/c:/maven/lib/jtidy-4aug2000r7-dev.jar
urls[5] = file:/c:/maven/lib/maven-artifact-2.0.5.jar
urls[6] = file:/c:/maven/lib/maven-artifact-manager-2.0.5.jar
urls[7] = file:/c:/maven/lib/maven-core-2.0.5-javadoc.jar
urls[8] = file:/c:/maven/lib/maven-core-2.0.5.jar
urls[9] = file:/c:/maven/lib/maven-error-diagnostics-2.0.5.jar
urls[10] = file:/c:/maven/lib/maven-model-2.0.5.jar
urls[11] = file:/c:/maven/lib/maven-monitor-2.0.5.jar
urls[12] = file:/c:/maven/lib/maven-plugin-api-2.0.5.jar
urls[13] = file:/c:/maven/lib/maven-plugin-descriptor-2.0.5.jar
urls[14] = file:/c:/maven/lib/maven-plugin-parameter-documenter-2.0.5.jar
urls[15] = file:/c:/maven/lib/maven-plugin-registry-2.0.5.jar
urls[16] = file:/c:/maven/lib/maven-profile-2.0.5.jar
urls[17] = file:/c:/maven/lib/maven-project-2.0.5.jar
urls[18] = file:/c:/maven/lib/maven-reporting-api-2.0.5.jar
urls[19] = file:/c:/maven/lib/maven-repository-metadata-2.0.5.jar
urls[20] = file:/c:/maven/lib/maven-settings-2.0.5.jar
urls[21] = file:/c:/maven/lib/plexus-interactivity-api-1.0-alpha-4.jar
urls[22] = file:/c:/maven/lib/wagon-file-1.0-beta-2.jar
urls[23] = file:/c:/maven/lib/wagon-http-lightweight-1.0-beta-2.jar
urls[24] = file:/c:/maven/lib/wagon-http-shared-1.0-beta-2.jar
urls[25] = file:/c:/maven/lib/wagon-provider-api-1.0-beta-2.jar
urls[26] = file:/c:/maven/lib/wagon-ssh-1.0-beta-2.jar
urls[27] = file:/c:/maven/lib/wagon-ssh-common-1.0-beta-2.jar
urls[28] = file:/c:/maven/lib/wagon-ssh-external-1.0-beta-2.jar
urls[29] = file:/c:/maven/lib/xml-apis-1.0.b2.jar
urls[30] = file:/C:/Documents and Settings/tomcat/.m2/repository/ant/ant/1.5/ant-1.5.jar
Number of imports: 0
this realm = plexus.core
urls[0] = file:/c:/maven/core/plexus-container-default-1.0-alpha-9.jar
urls[1] = file:/c:/maven/core/plexus-utils-1.1.jar
Number of imports: 0
-
[HUDSON] Archiving e:\tomcat-home\.hudson\jobs\KES\workspace\trunk\sample-publication\workflow-guide-package\pom.xml
[HUDSON] Archiving e:\tomcat-home\.hudson\jobs\KES\workspace\trunk\sample-publication\workflow-guide-package\target\workflow-guide-package-2.0-SNAPSHOT.jar
[INFO] [ERROR] BUILD ERROR
[INFO]
[INFO] Internal error in the plugin manager executing goal
'org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-1:attached': Unable to find the mojo 'org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-1:attached' in the plugin 'org.apache.maven.plugins:maven-assembly-plugin' org/codehaus/plexus/archiver/ArchiveFileFilter
[INFO]
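One common way to keep a single plugin version in play, whatever module triggers the download, is to pin it via pluginManagement in the (parent) POM — a sketch; whether this also cures the Hudson-side update behaviour is a separate question:

```xml
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.1</version>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
```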
problem with release:prepare and type / packaging
I have a multi-module project that works fine for everything, apart from release:perform. I think I know why, but I don't know how to fix it. Basically, I have a project that is defined as <packaging>uberwar</packaging>. This artifact is built into a .WAR file. The dependency that includes it has <type>war</type> (which is what it is). When release:prepare is executed, it fails, because it claims that it can't find the referenced war file in any local repository (error similar to the one attached at the end). I think this is something to do with the type/packaging difference - but I don't see why it's behaving differently in the release plugin. Is this a bug or is there a workaround?

Missing:
--
1) com.cswgroup.kms.kes.apps:kms:war:2.2

  Try downloading the file manually from the project website. Then, install it using the command:
  mvn install:install-file -DgroupId=com.cswgroup.kms.kes.apps -DartifactId=kms \
      -Dversion=2.2 -Dpackaging=war -Dfile=/path/to/file

  Path to dependency:
  1) com.cswgroup.kms.kes.installer:KES-Install:jar:2.2
  2) com.cswgroup.kms.kes.apps:kms:war:2.2
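For reference, the two declarations being discussed look like this — the producer declares the custom packaging, while the consumer declares the type it expects to resolve (coordinates taken from the error above):

```xml
<!-- producer module: custom lifecycle, emits a .war -->
<packaging>uberwar</packaging>

<!-- consumer module: declares the dependency by the type it is consumed as -->
<dependency>
  <groupId>com.cswgroup.kms.kes.apps</groupId>
  <artifactId>kms</artifactId>
  <version>2.2</version>
  <type>war</type>
</dependency>
```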
Massive number of threads being created on project shutdown ??
Hi. I've recently switched to using OS X. Our project, which builds fine on Windows, displays some rather odd behaviour. When maven closes down (after a BUILD FAILURE message), the machine slows down. Observing the process in Activity Monitor shows that the number of java threads leaps from about 10 to about 250, all of which appear to be trying to delete something (I'm less sure of this as I'm new to reading the inspection output). This means the machine effectively freezes for about 5 minutes whilst the thread count steadily goes down... This is maven 2.0.7 on OS X's java 1.5.0_07-164. Any ideas?
JAR file with a different extension ?
Hi. I'm building a project with the jar:install-snapshot goal, which works fine. I need the file to actually have the extension .wsr when it is assembled into an EAR file. Should I do this with something like a rename / repository:copy-jar, or some kind of step before ear:ear runs? -- Activiti Ltd. 7th Floor, Nicholsons House Nicholsons Walk Maidenhead SL6 1LD DDI: 01628 513 625 Mob 07973 285 424
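One pre-ear:ear approach is a preGoal in maven.xml that copies the built jar to a .wsr alongside it. This is a sketch only — it assumes the standard Maven 1 properties maven.build.dir and maven.final.name point at the jar that was just built:

```xml
<!-- maven.xml: copy the jar to a .wsr before the EAR is assembled -->
<project xmlns:ant="jelly:ant">
  <preGoal name="ear:ear">
    <ant:copy file="${maven.build.dir}/${maven.final.name}.jar"
              tofile="${maven.build.dir}/${maven.final.name}.wsr"/>
  </preGoal>
</project>
```

The EAR descriptor would then need to reference the .wsr file rather than the jar.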
reactor being called when goal is not explicit
We have the following layout:

/Proj - root
/Proj/Build - maven.xml that uses the reactor
/Proj/Hub/P1 - 1st project
/Proj/Hub/P2 - 2nd project

Our Build reactor calls an install goal in P1 and P2, which they define in their own maven.xml fine. I can go into either P1 or P2 directly and do 'maven install', and it does the expected thing. However, if I run another goal such as site:generate from P1, it seems to 'magically' find the reactor project in /Proj/Build, get the paths all wrong, and die. How can I tell it I don't want it to do whatever clever things it is attempting?
problem with linkcheck:linkcheck
For some reason, some of my projects crash in site:generate when it does the linkcheck part. I've attached the (rather large) exception - are there any ideas as to what it could be and how I might fix it?

BUILD FAILED
File.. file:/C:/Documents and Settings/nigelm/.maven/plugins/maven-linkcheck-plugin-1.2/plugin.jelly
Element... linkcheck:linkcheck
Line.. 91
Column 9
Implementing class
com.werken.werkz.UnattainableGoalException: Unable to obtain goal [site]
file:/C:/Documents and Settings/nigelm/.maven/plugins/maven-linkcheck-plugin-1.2/plugin.jelly:91:9: linkcheck:linkcheck Implementing class
at com.werken.werkz.Goal.fire(Goal.java:646)
at com.werken.werkz.Goal.attain(Goal.java:575)
at com.werken.werkz.Goal.attainPrecursors(Goal.java:488)
at com.werken.werkz.Goal.attain(Goal.java:573)
at com.werken.werkz.WerkzProject.attainGoal(WerkzProject.java:193)
at org.apache.maven.plugin.PluginManager.attainGoals(PluginManager.java:531)
at org.apache.maven.MavenSession.attainGoals(MavenSession.java:265)
at org.apache.maven.cli.App.doMain(App.java:466)
at org.apache.maven.cli.App.main(App.java:1117)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at com.werken.forehead.Forehead.run(Forehead.java:551)
at com.werken.forehead.Forehead.main(Forehead.java:581)
org.apache.commons.jelly.JellyTagException: file:/C:/Documents and Settings/nigelm/.maven/plugins/maven-linkcheck-plugin-1.2/plugin.jelly:91:9: linkcheck:linkcheck Implementing class
at org.apache.commons.jelly.impl.DynamicBeanTag.doTag(DynamicBeanTag.java:243)
at org.apache.commons.jelly.impl.StaticTagScript.run(StaticTagScript.java:145)
at org.apache.commons.jelly.impl.ScriptBlock.run(ScriptBlock.java:135)
at org.apache.maven.jelly.tags.werkz.MavenGoalTag.runBodyTag(MavenGoalTag.java:78)
at org.apache.maven.jelly.tags.werkz.MavenGoalTag$MavenGoalAction.performAction(MavenGoalTag.java:99)
at com.werken.werkz.Goal.fire(Goal.java:639)
at com.werken.werkz.Goal.attain(Goal.java:575)
at com.werken.werkz.WerkzProject.attainGoal(WerkzProject.java:193)
at org.apache.maven.jelly.tags.werkz.MavenAttainGoalTag.doTag(MavenAttainGoalTag.java:126)
at org.apache.commons.jelly.impl.TagScript.run(TagScript.java:279)
at org.apache.commons.jelly.impl.ScriptBlock.run(ScriptBlock.java:135)
at org.apache.commons.jelly.TagSupport.invokeBody(TagSupport.java:233)
at org.apache.commons.jelly.tags.core.IfTag.doTag(IfTag.java:88)
at org.apache.commons.jelly.impl.TagScript.run(TagScript.java:279)
at org.apache.commons.jelly.impl.ScriptBlock.run(ScriptBlock.java:135)
at org.apache.commons.jelly.TagSupport.invokeBody(TagSupport.java:233)
at org.apache.commons.jelly.tags.core.ForEachTag.doTag(ForEachTag.java:145)
at org.apache.commons.jelly.impl.TagScript.run(TagScript.java:279)
at org.apache.commons.jelly.impl.ScriptBlock.run(ScriptBlock.java:135)
at org.apache.commons.jelly.TagSupport.invokeBody(TagSupport.java:233)
at org.apache.commons.jelly.tags.core.IfTag.doTag(IfTag.java:88)
at org.apache.commons.jelly.impl.TagScript.run(TagScript.java:279)
at org.apache.commons.jelly.impl.ScriptBlock.run(ScriptBlock.java:135)
at org.apache.commons.jelly.TagSupport.invokeBody(TagSupport.java:233)
at com.werken.werkz.jelly.PostGoalTag$1.firePostGoal(PostGoalTag.java:87)
at com.werken.werkz.Goal.firePostGoalCallbacks(Goal.java:710)
at com.werken.werkz.Goal.fire(Goal.java:654)
at com.werken.werkz.Goal.attain(Goal.java:575)
at com.werken.werkz.WerkzProject.attainGoal(WerkzProject.java:193)
at org.apache.maven.jelly.tags.werkz.MavenAttainGoalTag.doTag(MavenAttainGoalTag.java:126)
at org.apache.commons.jelly.impl.TagScript.run(TagScript.java:279)
at org.apache.commons.jelly.impl.ScriptBlock.run(ScriptBlock.java:135)
at org.apache.maven.jelly.tags.werkz.MavenGoalTag.runBodyTag(MavenGoalTag.java:78)
at org.apache.maven.jelly.tags.werkz.MavenGoalTag$MavenGoalAction.performAction(MavenGoalTag.java:99)
at com.werken.werkz.Goal.fire(Goal.java:639)
at com.werken.werkz.Goal.attain(Goal.java:575)
at com.werken.werkz.Goal.attainPrecursors(Goal.java:488)
at com.werken.werkz.Goal.attain(Goal.java:573)
at com.werken.werkz.WerkzProject.attainGoal(WerkzProject.java:193)
at org.apache.maven.jelly.tags.werkz.MavenAttainGoalTag.doTag(MavenAttainGoalTag.java:126)
RE: problem with linkcheck:linkcheck
I think I have discovered why - the project depends on an early version of apache commons-httpclient - if I take this out, I can generate the linkchecks (but not build). Is there any way to control the classpath that the plugin gets, independent of the java compilation? ---Original Message-- --From: Nigel Magnay --Sent: 05 May 2004 14:44 --To: '[EMAIL PROTECTED]' --Subject: problem with linkcheck:linkcheck -- [quoted exception snipped]
Using reactor to generate navigation.xml
Is there any example of using the reactor to auto-generate a navigation.xml for subprojects? I have:

<j:forEach var="reactorProject" items="${reactorProjects}">
  <echo>${reactorProject.artifactId}</echo>
</j:forEach>

But I haven't looked at how to write out the navigation.xml file, so I was wondering if someone had already done the donkeywork?
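A possible shape for the donkeywork — an untested sketch, assuming the Jelly core j:file tag is available to serialize its body to disk, and that xdocs/navigation.xml is where the site plugin looks:

```xml
<!-- maven.xml fragment: write a navigation.xml listing each reactor module -->
<goal name="gen-navigation">
  <j:file name="${basedir}/xdocs/navigation.xml" prettyPrint="true">
    <project name="modules">
      <body>
        <menu name="Modules">
          <j:forEach var="reactorProject" items="${reactorProjects}">
            <item name="${reactorProject.artifactId}"
                  href="/${reactorProject.artifactId}/index.html"/>
          </j:forEach>
        </menu>
      </body>
    </project>
  </j:file>
</goal>
```

The href layout is an assumption; adjust it to match how the subproject sites are deployed.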
Maven with CruiseControl
I'm getting some very odd behaviour running Maven from CC. It seems that every other build, my ${user.home}/build.properties file is not being read, so my build fails - the net effect is that about 50% of builds fail all the time. Is there any reason this file would not get read?
RE: Maven with CruiseControl
It's windows. It's really odd, because sometimes it works, then the next build it doesn't, and then it does (etc., etc.). I can run the maven build by hand and it's fine, so I'm guessing it might be CC to blame :( ---Original Message-- --From: Brett Porter [mailto:[EMAIL PROTECTED] --Sent: 20 May 2004 00:16 --To: 'Maven Users List' --Subject: RE: Maven with CruiseControl -- --Works for me... Might be environmental? -- --Is this on windows or unix? -- --Is CC running as the user you expect? -- --- Brett -- [quoted original message snipped]
RC3 problem with clover
After upgrading to RC3, we have a problem with the clover plugin. It complains in a lot of our projects that it can't get maven.test.compile.src.set. This is because we don't want to run any tests in that project, we are merely instrumenting the classes so that when they are run later (inside Jboss), the coverage information is created. Are there any easy fixes for this or do I have to hack it?
Plugin taglibs
Hello. I have a plugin where I have declared a taglib:

<define:taglib uri="myuri">
  <define:tag name="yadayada">
    <!-- etc etc -->
  </define:tag>
</define:taglib>

I can do plugin:install fine. However, when I try and use it in my script, with:

<project xmlns:mine="myuri">
  ...
  <goal ...>
    <mine:yadayada/>
  </goal>
</project>

I get: Tag library requested that is not present: 'myuri' in plugin 'null'. I'm evidently missing something here.. how do I 'register' my plugin? I've examined the other plugins in the cache and I can't figure it out... Cheers, Nigel
CruiseControl, Maven, modificationsets
Hi people. We're using CruiseControl and Maven quite happily for individual project builds and larger 'continuous integration' type stuff. Cruisecontrol gets launched with the cvs modificationset detecting whether there have been any changes to a part of the cvs repo relevant to that project. It would be very useful to have an additional modificationset that used the maven dependency list in order to find out if any of the dependencies has been modified in the local (or maybe remote) repo. I think it might be relatively easy to call out to maven to get it to show the timestamp for all the dependencies that it is going to use, and compare them to a previously stored list. Has anyone already written anything like this?
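The comparison step could be as simple as diffing a stored timestamp snapshot against a fresh one. This toy version leaves out gathering the real data (e.g. File.lastModified() on each artifact in the local repository) and the made-up file names are only illustrative:

```java
import java.util.*;

// Sketch of the modificationset idea: report which dependencies are new
// or have a newer timestamp than the previously stored snapshot.
public class DepChange {

    static List<String> changed(Map<String, Long> previous, Map<String, Long> current) {
        List<String> out = new ArrayList<>();
        for (Map.Entry<String, Long> e : current.entrySet()) {
            Long old = previous.get(e.getKey());
            // unseen dependency, or timestamp moved forward => modified
            if (old == null || old < e.getValue()) out.add(e.getKey());
        }
        Collections.sort(out); // stable output for reporting
        return out;
    }

    public static void main(String[] args) {
        Map<String, Long> prev = Map.of("commons-logging-1.0.jar", 100L,
                                        "junit-3.8.1.jar", 200L);
        Map<String, Long> now = Map.of("commons-logging-1.0.jar", 150L,
                                       "junit-3.8.1.jar", 200L);
        System.out.println(changed(prev, now)); // [commons-logging-1.0.jar]
    }
}
```

A CruiseControl modificationset wrapper would then trigger a build whenever the returned list is non-empty, and persist the new snapshot.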
maven with eclipse, mevenide et al
Hi people. We've been using maven for a while now, with cruisecontrol, very nicely indeed, for builds. Some of us use maven to do our local box builds, but more don't than do, and a part of that is probably our heavy addiction to Eclipse. Now, I've been using the IDE integration stuff, but I have a question (actually a 'how are you guys doing this?'). I've tried googling for this but I can't get search terms that provide a good answer :-) We have some projects that produce EJBs. So, in essence, we have 3 projects:

product-ejb : EJB Jar
  product-ejb/target/classes
  product-ejb/target/xdoclet
  product-ejb/target - ejb artefact appears here
product-client : Client Jar
  product-client/target/classes
  product-client/target/xdoclet
  product-client/target - ejb client jar appears here
product-root : Common
  product-root/src : effectively all the source code

Where product-ejb and product-client set their source dir to be '../product-root/src'. This is nice because our legacy ant scripts (that some people still live and breathe) just live entirely in product-root. This all works fine for maven builds, but of course eclipse doesn't like its source files to be anywhere other than a direct descendant of the project directory. I suspect this layout is a bit wasteful - for one thing, xdoclet and the compiler get run twice, but I'm not sure of the best route to go. symlinks for the src directories occur to me, but that scares me both because we're NT, and because it may confuse our source control (and our developers). How are other people doing this?
Re: maven with eclipse, mevenide et al
Yes, I'm thinking that the current structure probably needs to be modularised in a more intelligent way. It's not that maven is unhappy, it's more the eclipse problem of the sourcecode being 'elsewhere'. We used to have 1 project controlled by ant that built a heap of artefacts, such as client jar and ejb jar. So I effectively created 2 maven projects that did one thing each, and just said 'my sourcecode is actually in ..\common\src'. I guess I should get the common project (or project*s*) to do the compilation, and then do some kind of jar-ing or UberJaring to build my artefacts. On Tue, 26 Oct 2004 11:33:51 +0200, Eric Pugh [EMAIL PROTECTED] wrote: Take a look at the updated (in CVS) docs. I added a bit about using the generated source directory to import your xdoclet directory as a source directory. That way, when eclipse does a clean compile, it doesn't wipe out your source... I think you are using xdoclet to generate a lot of the ejb stuff right? This works perfect for that... As far as how you are reading in the common code, shouldn't that be a separate jar (and therefore project) that you are referring to? Maven doesn't really like to read in source from one place into multiple jars... Eric [quoted original message snipped]
Re: maven with eclipse, mevenide et al
The issue isn't the number of source directories, it's the location. product-ejb has in its project.xml:

  <sourceDir>../product/source/src</sourceDir>

This isn't allowed in eclipse. I could set it to an absolute path, but that's nasty as it assumes everyone checks out to the same location.

On Tue, 26 Oct 2004 17:24:15 -0700, Kenneth Simpson [EMAIL PROTECTED] wrote:

This all works fine for maven builds, but of course eclipse doesn't like its source files to be anywhere other than a direct descendent of the project directory.

Hmm, that's news to me. I have 6 independent source directories in one project in Eclipse. Try going into the Properties for the project, click on the Java Build Path and add the source directories under Source. Also, there should be some maven goals for Eclipse to help set both the classpath and the maven repository path.
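For what it's worth, Eclipse can reference source outside the project directory via a linked folder rather than an absolute source path. A hedged sketch (the link name and location below are illustrative, not from the thread) of the relevant fragment of the project's .project file:

```xml
<!-- Fragment of an Eclipse .project file: a linked folder named
     "common-src" pointing at the shared source tree (type 2 = folder).
     The location is illustrative; a workspace path variable can replace
     the absolute path so checkouts can live anywhere. -->
<linkedResources>
  <link>
    <name>common-src</name>
    <type>2</type>
    <location>W:/dev/product-root/src</location>
  </link>
</linkedResources>
```

Adding `<classpathentry kind="src" path="common-src"/>` to the .classpath then makes the linked folder a source directory, which sidesteps the "direct descendant" restriction without symlinks.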
Re: use of ArtifactHandler
I have an 'uberwar' mojo; its components.xml has a LifecycleMapping and an ArtifactHandler definition, both with the role-hint of uberwar. My project that uses this mojo has a pom.xml with <packaging>uberwar</packaging>. The lifecycle is working correctly; my mojo is being called in the packaging step. It does a getArtifact().setFile() correctly - the package should have an extension of .war. The components.xml for the mojo has the configuration that states <extension>war</extension>. However, when the install plugin runs, it copies artifact.war into the repository as artifact.uberwar. Attaching a debugger seems to show that when the installer determines the name for the artifact in the repository, it asks defaultArtifactHandler.getExtension() - the value of its member variable is null, so it defaults to the type (which is uberwar). So it sounds like I'm configuring the DefaultArtifactHandler wrongly if it's not getting the config I am passing..

On 11/17/05, John Casey [EMAIL PROTECTED] wrote:

Yes, the private member vars are injected with values from the configuration. That's how plexus works by default (plexus is the underlying container Maven uses). So, you have a POM with <packaging>uberwar</packaging>, and it's setting the extension to 'uberwar'? Or, is it that you have a dependency in another POM with <type>uberwar</type>, and it's looking for a dependency artifact with an extension of 'uberwar' rather than 'war'? Sorry, I'm a little confused. -j

Nigel Magnay wrote:
| Yep - I'm pretty sure it's reading it, as there is also a LifecycleMapping which is being used correctly.
|
| Are the private member variables supposed to get set by some persistence mechanism from the configuration node?
|
| On 11/17/05, John Casey [EMAIL PROTECTED] wrote:
| You're defining this components.xml in a plugin, right? Do you have <extensions>true</extensions> defined in the plugin reference within your plugin-user POM?
If not, it will use a default artifact handler that has the same type as your packaging, and the same extension as your packaging... -j
|
| Nigel Magnay wrote:
| | Hello - I have been trying to follow the configuration for ArtifactHandlers - I have in my components.xml:
| |
| |   <component>
| |     <role>org.apache.maven.artifact.handler.ArtifactHandler</role>
| |     <role-hint>uberwar</role-hint>
| |     <implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
| |     <configuration>
| |       <extension>war</extension>
| |       <type>war</type>
| |       <packaging>war</packaging>
| |       <language>java</language>
| |       <addedToClasspath>false</addedToClasspath>
| |     </configuration>
| |   </component>
| |
| | But, looking at DefaultArtifactHandler, the configuration never seems to be used as there are only private member variables, and extension defaults to be the same as the type, which will always be 'uberwar' (I want it to be war).
| |
| | Am I missing something? Or is the intention to create your own subtype of ArtifactHandler rather than using the Default (is the configuration not implemented?)
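John Casey's suggestion above - registering the plugin with extensions enabled so its components.xml (lifecycle mapping and artifact handler) gets loaded - would look something like this in the consuming project's pom.xml; the plugin coordinates are illustrative:

```xml
<!-- Sketch: declare the packaging plugin with <extensions>true</extensions>
     so Maven loads its components.xml. groupId/artifactId are illustrative,
     not the actual coordinates from the thread. -->
<build>
  <plugins>
    <plugin>
      <groupId>com.example</groupId>
      <artifactId>uberwar-maven-plugin</artifactId>
      <version>1.0-SNAPSHOT</version>
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>
```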
[m2] plugin default properties
I have a mojo which is pretty much like the Jar mojo. I know I can set the main class for the manifest with something like:

  <plugin>
    <groupId>mygroup</groupId>
    <artifactId>myplugin</artifactId>
    <configuration>
      <archive>
        <manifest>
          <mainClass>true</mainClass>
        </manifest>
      </archive>
    </configuration>
  </plugin>

However, since the main class is always the same, is there a way for me to set up my mojo so I don't have to do this? I've tried looking at pluginManagement, adding a property of @parameter expression=${archive.manifest.mainClass} and seeing if I could manually set the value on the manifest, but I can't seem to find a way around it...
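One approach (a sketch, not from the thread): Maven 2 mojo parameters can declare a default in the javadoc annotation, so the POM only needs a configuration entry when the value differs. The class name and default value below are hypothetical, and the fragment assumes the maven-plugin-api dependency is on the plugin's classpath:

```java
import org.apache.maven.plugin.AbstractMojo;

/**
 * Hypothetical jar-like mojo with a baked-in default main class.
 *
 * @goal myjar
 * @phase package
 */
public class MyJarMojo extends AbstractMojo
{
    /**
     * Main class for the manifest. Users can still override it via
     * <configuration>, but no POM entry is needed for the common case.
     *
     * @parameter expression="${mainClass}" default-value="com.example.Main"
     */
    private String mainClass;

    public void execute()
    {
        getLog().info( "Using main class: " + mainClass );
        // ... build the archive, putting mainClass into the manifest ...
    }
}
```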
Re: use of ArtifactHandler
Ah - I have found out why I'm having trouble - my configuration looks like:

  <configuration>
    <extension>war</extension>
    <type>uberwar</type>
  </configuration>

This works correctly when I build the project that uses the plugin. However, if the project that uses the plugin is built as part of a multiproject build, it fails, using the (incorrect) extension of uberwar again. Also in the trace I get:

  [ERROR] Nonexistent component: org.apache.maven.lifecycle.mapping.LifecycleMappinguberwar

Which seems not to be correct, because it must have read this mapping in order to execute the plugin (which it does - I can see in the output that it is calling my plugin correctly). Is this a bug, or is there something else I need to set?

On 11/18/05, Brett Porter [EMAIL PROTECTED] wrote:

As I said, your type should be uberwar - not war (it needs to match the role-hint, as stated in the docs).

On 11/18/05, Nigel Magnay [EMAIL PROTECTED] wrote:

I have an 'uberwar' mojo; its components.xml has a LifecycleMapping and an ArtifactHandler definition, both with the role-hint of uberwar. My project that uses this mojo has a pom.xml with <packaging>uberwar</packaging>. The lifecycle is working correctly; my mojo is being called in the packaging step. It does a getArtifact().setFile() correctly - the package should have an extension of .war. The components.xml for the mojo has the configuration that states <extension>war</extension>.
Re: use of ArtifactHandler
I think the components.xml is correct (at end) - it certainly looks the same as the plexus examples. My project that uses this plugin works entirely correctly, *unless* it is part of a multiproject build, in which case it uses the wrong extension. I don't know why this would be the case unless I've missed something?

In the same directory:

W:\kms\dev\apps\kms> mvn install
[INFO] Scanning for projects...
[INFO] Building KMS Application Code
[INFO]    task-segment: [install]
[INFO] [cargo2:uberwar]
[INFO] [install:install]
[INFO] Installing W:\1244 - Knowledge Management System (KMS)\dev\apps\kms\target\kms-2.0-SNAPSHOT.war to C:\Documents and Settings\nigel.magnay\.m2\repository\com\cswgroup\kms\kms\2.0-SNAPSHOT\kms-2.0-SNAPSHOT.war
[INFO] BUILD SUCCESSFUL
[INFO] Total time: 1 minute 9 seconds
[INFO] Finished at: Thu Nov 24 11:46:53 GMT 2005
[INFO] Final Memory: 3M/6M

As part of a multiproject:

[INFO] Building KMS Application Code
[INFO]    task-segment: [install]
[INFO] [cargo2:uberwar]
[INFO] [install:install]
[INFO] Installing W:\1244 - Knowledge Management System (KMS)\dev\apps\kms\target\kms-2.0-SNAPSHOT.war to C:\Documents and Settings\nigel.magnay\.m2\repository\com\cswgroup\kms\kms\2.0-SNAPSHOT\kms-2.0-SNAPSHOT.uberwar

Config of plugin:

  <component-set>
    <components>
      <component>
        <role>org.apache.maven.lifecycle.mapping.LifecycleMapping</role>
        <role-hint>uberwar</role-hint>
        <implementation>org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping</implementation>
        <configuration>
          <phases>
            <package>org.codehaus.cargo.maven2:cargo-maven2-plugin:uberwar</package>
            <install>org.apache.maven.plugins:maven-install-plugin:install</install>
            <deploy>org.apache.maven.plugins:maven-deploy-plugin:deploy</deploy>
          </phases>
        </configuration>
      </component>
      <component>
        <role>org.apache.maven.artifact.handler.ArtifactHandler</role>
        <role-hint>uberwar</role-hint>
        <implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
        <configuration>
          <type>uberwar</type>
          <extension>war</extension>
          <packaging>uberwar</packaging>
        </configuration>
      </component>
    </components>
  </component-set>

On 11/23/05, Brett Porter [EMAIL PROTECTED] wrote:

I'm losing track of this in all the bits and pieces as to what you really have. I'd suggest looking at the Plexus plugin and comparing the components.xml to your own: http://svn.plexus.codehaus.org/trunk/plexus-maven-plugin/ HTH, Brett

On 11/24/05, Nigel Magnay [EMAIL PROTECTED] wrote:

Ah - I have found out why I'm having trouble - my configuration looks like:

  <configuration>
    <extension>war</extension>
    <type>uberwar</type>
  </configuration>

This works correctly when I build the project that uses the plugin. However, if the project that uses the plugin is built as part of a multiproject build, it fails, using the (incorrect) extension of uberwar again. Also in the trace I get [ERROR] Nonexistent component: org.apache.maven.lifecycle.mapping.LifecycleMappinguberwar, which seems not to be correct, because it must have read this mapping in order to execute the plugin (which it does - I can see in the output that it is calling my plugin correctly). Is this a bug, or is there something else I need to set?

On 11/18/05, Brett Porter [EMAIL PROTECTED] wrote: As I said, your type should be uberwar - not war (it needs to match the role-hint, as stated in the docs).

On 11/18/05, Nigel Magnay [EMAIL PROTECTED] wrote: I have an 'uberwar' mojo; its components.xml has a LifecycleMapping and an ArtifactHandler definition, both with the role-hint of uberwar. My project that uses this mojo has a pom.xml with <packaging>uberwar</packaging>. The lifecycle is working correctly; my mojo is being called in the packaging step. It does a getArtifact().setFile() correctly - the package should have an extension of .war. The components.xml for the mojo has the configuration that states <extension>war</extension>.
Re: multi module issues
I'm actively working on improving merging support in cargo for web.xml and the container-specific .xml files; also to support plugging in strategies for merging other files that may be in your WARs (e.g. struts-config.xml and other application-specific items). Some of it is committed to cargo, some is waiting on patches, some I have yet to submit - mainly because cargo is getting ready for a 0.7 release and it is probably worthwhile to let that stabilise before adding a lot more new functionality. I've also adapted its maven2 plugin to configure the merging to happen as an 'uberwar' task. cargo-dev is the place to be, as Vincent says, if you're interested in these things in cargo..

On 22/12/05, Kevin Galligan [EMAIL PROTECTED] wrote:

Just wanted to comment quickly on the message from Vincent (I just saw this message for the first time...). I abandoned the xml merging for right now. I was having trouble with the cargo xml merging code. I added some stuff myself, but after the war plugin added the war file composition, I just figured for now we would replicate the web.xml data manually rather than merge them. However, I would really like to get the web.xml merging to work. Is there anybody actively working on the web.xml merger? I'm also pretty clueless. What's cargo? What is that code being used for currently? I'd possibly like to work on that, or get somebody in my group to work on that, but I'm not sure what the protocol is. Thanks, -Kevin

-Original Message-
From: Richard Allen [mailto:[EMAIL PROTECTED]
Sent: Wednesday, 30 November 2005 14:01
To: Maven Users List
Subject: Re: Multi-module wars

[snip] However, I would like to see the maven-war-plugin support the web.xml merge functionality that Kevin Galligan's hack includes. Maybe you can raise this as a JIRA issue, Kevin?
Just a note here for anyone interested in implementing this: there's code in Cargo to merge web.xml files and this library is being improved with support for also merging container-specific deployment descriptors. There are also generic classes for merging any xml files. It's in there: http://tinyurl.com/dtwet Check the org.codehaus.cargo.module.webapp.WebXmlMerger class along with the files in org.codehaus.cargo.module.merge. Hope it helps, -Vincent
[m2] programmatic exclusion of a dependency
Hi people. I'm trying to produce a fix for the cobertura mojo. When it instruments the code, it adds cobertura as a dependency to the project's 'test' cycle by doing this.project.setDependencyArtifacts( set ); where the set includes the artifact cobertura-1.7 in the 'test' scope. The problem is that the cobertura POM includes asm-2.1. This is a dependency for running cobertura to *instrument* the code, but it is not a dependency at runtime (and it prevents the instrumented code from working for hibernate projects). I hoped I might be able to add a filter in order to prevent this happening - however, I think the bit where the filter is used (DefaultPluginManager::resolveTransitiveDependencies) just creates a ScopeArtifactFilter manually, so I have no opportunity to change it to something that might scope out the dependency. Am I barking up the wrong tree here, or is there a better way of trying to do this?
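If the dependency were declared in a POM rather than injected programmatically, the declarative equivalent of "scope out asm" would be an exclusion. A sketch for comparison only - the coordinates follow the thread but should be treated as illustrative:

```xml
<!-- Declarative analogue of what the mojo needs to do programmatically:
     pull in cobertura for the test scope, but keep its asm dependency
     off the classpath. Coordinates are illustrative. -->
<dependency>
  <groupId>cobertura</groupId>
  <artifactId>cobertura</artifactId>
  <version>1.7</version>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>asm</groupId>
      <artifactId>asm</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```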
Re: cobertura plugin breaks with maven 2.0.2 - works fine with 2.0
You need to svn-update the cobertura mojo and rebuild it - the bug has been fixed for 2.0.2.

On 26/01/06, David Sag [EMAIL PROTECTED] wrote:

I have just updated my maven 2 to 2.0.2 and now the instrumentation of my classes fails when using the cobertura plugin. I get NoClassDefFoundError: org/apache/maven/wagon/util/FileUtils. Any clues as to what changed between 2.0 and 2.0.2 that would lead to this error? Most frustrating. Kind regards, Dave Sag
[m2] specifying the surefire-booter for cobertura?
I'm suffering from SUREFIRE-30, which is down to the fact that the cobertura plugin uses surefire-booter-1.5.2.jar - which has this problem. I have tried adding a different booter into the plugin:

  <plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>cobertura-maven-plugin</artifactId>
    <version>2.0-SNAPSHOT</version>
    <executions>
      <execution>
        <goals>
          <goal>clean</goal>
        </goals>
      </execution>
    </executions>
    <dependencies>
      <dependency>
        <groupId>org.apache.maven.surefire</groupId>
        <artifactId>surefire-booter</artifactId>
        <version>1.5.3-SNAPSHOT</version>
      </dependency>
    </dependencies>
  </plugin>

But mvn -X shows it's still using the 1.5.2 jar:

  [DEBUG] Adding to surefire test classpath: C:\Documents and Settings\nigel.magnay\.m2\repository\org\apache\maven\surefire\surefire\1.5.2\surefire-1.5.2.jar

Short of recompiling all these dependent files (and having to distribute them internally so everyone else's build works), is there a way of overriding this behaviour?
Re: Merge static generated files in WAR?
If you need to do merging of web.xml, you may want to look at cargo: http://cargo.codehaus.org/Merging+WAR+Files

On 19/04/06, Gwyn [EMAIL PROTECTED] wrote:

Hi, I'd appreciate any suggestions as to the best way to do the above... I've got some generated files (web.xml, weblogic.xml, taglib.tld) that I generate to target/xdoclet/META-INF/, but I've also got some static files in src/main/webapp/, so I can't simply use the target folder as the warSourceDirectory value. I can pick up the web.xml via the maven-war-plugin's webXml config, but I don't know how I could get the other generated files into the WAR. Options seem to be: 1) Get the war-plugin to also pull them in 2) Generate them into the exploded war 3) Generate them into src/main/webapp/ (I suspect all of the above will need some form of special filtering for web.xml...) Any suggestions? /Gwyn
is CONTINUUM-609 really in 1.0.3?
It's in the release notes - I try http://box:8080/continuum/repository but just get a 404?
Re: is CONTINUUM-609 really in 1.0.3?
Hmm.. I must be doing something wrong - are there any docs for this or handy pointers? It sounds a really useful feature - 1.0.3 is really nice. On 25/04/06, Brett Porter [EMAIL PROTECTED] wrote: It's not browsable, but it will work just fine when used from Maven. - Brett On 4/25/06, Nigel Magnay [EMAIL PROTECTED] wrote: It's in the release note - I try http://box:8080/continuum/repository but just get a 404 ?
Re: is CONTINUUM-609 really in 1.0.3?
Aha - that fixed it (and a little local difficulty in forgetting I needed to do mvn -U). Is there a wiki for this kind of stuff that I could add to? I can't believe I'll be the last to tie my own shoelaces together on that one.

On 25/04/06, Emmanuel Venisse [EMAIL PROTECTED] wrote:

Have you defined the deployment repository field in the configuration screen? It's empty by default. Emmanuel

Nigel Magnay wrote: Hmm.. I must be doing something wrong - are there any docs for this or handy pointers? It sounds a really useful feature - 1.0.3 is really nice. On 25/04/06, Brett Porter [EMAIL PROTECTED] wrote: It's not browsable, but it will work just fine when used from Maven. - Brett On 4/25/06, Nigel Magnay [EMAIL PROTECTED] wrote: It's in the release notes - I try http://box:8080/continuum/repository but just get a 404?
Re: [ANN] JAVAWUG BOF XVI Videos Available Now!
So, nothing to do with maven then?

On 27/04/06, Peter Pilgrim [EMAIL PROTECTED] wrote:

Hi All, I am very proud to announce that two video presentations are now downloadable from Google Video Beta sites. These presentations were recorded at our sixteenth Java Web User Group meet-up that took place at Oracle's City of London offices on Friday, 17th February 2006. In case you are wondering, the videos are free to view and/or download. Here are the available videos and the slides: Phil Zoio presents Strecks: Java 5 extension framework for Struts http://video.google.com/videoplay?docid=9029377979060230803 Emmanuel Okyere presents Spring into RIFE http://video.google.com/videoplay?docid=1163267747129711638 For more info visit www.javawug.com -- Peter Pilgrim
Re: Multiple CPUs
Yes - but all these dependencies are calculable from the POM files. I think about this every time my machine sits with 7 idle cores... The biggest initial issue is that the local repo is not threadsafe (2 mvn instances running in parallel can crash in interesting ways). That's the first thing to fix, but it's not a small thing.

On Wed, Apr 9, 2008 at 3:08 PM, Nino Saturnino Martinez Vazquez Wael [EMAIL PROTECTED] wrote:

Hmm, but isn't it a problem if modules depend on each other? If module a depends on module b... And I think that if implemented it should be scalable to xxx cpus.. I guess module a would have to wait for module b or something..

VELO wrote: Well, maybe the best way to do that is to add support for maven to run modules in parallel. Not sure how to, but if maven ran two modules at the same time on a dual-core machine, that means a big gain, I believe. VELO

On Wed, Apr 9, 2008 at 8:21 AM, Benedikt Thelen [EMAIL PROTECTED] wrote:

Hi there, I am sort of a maven newbie. At our workplace we have a quite big Cocoon project in development and we use maven to build it. Building usually takes 5-6 minutes, which is quite a while. I noticed using gkrellm and htop that maven only uses one of the two processors (Lenovo Thinkpad with Intel Core Duo) in my notebook. Question: Is there a way to tell maven to use both CPUs while building? I searched Google a lot but I didn't find anything. Greetings, Benedikt Thelen
Re: Tip about Skinny Wars
FWIW, I used a slightly different approach in the cargo uberwar plugin (which generates a War-of-Wars). I made it look at the WAR dependencies and create 'phantom' pom projects in the repository, which can then be passed to the normal maven dependency reconciliation framework to work out what JARs ought to end up in /WEB-INF/lib. I consider it a totally hacky solution, but it avoids having to make your dependent WAR files 'special' in any way. The root problem is that in maven 2.0.x, it's really hard to use the dependency resolution stuff outside the 'default' way (the WAR = no transitive dependencies rule is burnt into the WAR metatype, so nothing is considered for resolution beyond that point). I'm told this is being fixed in 2.1, but I haven't checked for about 6 months to see if there's any progress.

On Wed, Jun 4, 2008 at 11:07 AM, Arnaud HERITIER [EMAIL PROTECTED] wrote:

Not totally. It can fix "Automatically inherit transitive dependencies from the war artifact" (the default Maven behaviour is not to inherit transitively from war dependencies). But not "Develop with classes included in the war artifact /WEB-INF/classes directory by including them in the project classpath." Arnaud

On Wed, Jun 4, 2008 at 11:17 AM, Milos Kleint [EMAIL PROTECTED] wrote:

I'm wondering if the solution would render the war-path plugin obsolete? It's causing trouble in embedded use. http://issues.appfuse.org/browse/APF-645 Milos

On 6/3/08, Arnaud HERITIER [EMAIL PROTECTED] wrote:

Hi all, I would like to share with you a workaround I found for the problem of transitive dependencies in skinny wars. In the documentation it is said that: "Now the painful part. Your EAR's pom.xml needs to list every dependency that the WAR has. This is because Maven assumes fat WARs and does not include transitive dependencies of WARs within the EAR." A workaround for this is to define 2 dependencies for each war: one for the war itself, and another for the war's pom to retrieve transitive dependencies.
With that you'll have something like this in your ear dependencies:

  <dependencies>
    <dependency>
      <groupId>com.acme</groupId>
      <artifactId>war1</artifactId>
      <version>1.0.0</version>
      <type>war</type>
    </dependency>
    <dependency>
      <groupId>com.acme</groupId>
      <artifactId>war1</artifactId>
      <version>1.0.0</version>
      <type>pom</type>
    </dependency>
  </dependencies>

I'm using maven 2.0.9. I'll do more tests tomorrow and I'll update the doc: http://maven.apache.org/plugins/maven-war-plugin/examples/skinny-wars.html Cheers, arnaud
Re: fatal dependency management flaw in maven?
> There would be other ways to accomplish this -- for instance, if Maven were aware of the license (if it were published in the POM), you could put

It is published in the pom. You'd probably still have to cope with libraries that are (say) GPL but don't declare themselves in the pom as such.
Re: fatal dependency management flaw in maven?
> However, even if you did assume that, it would be easy to work around if you enforced a full build from scratch to delete the local repo. As a developer you may decide to avoid doing a full build, but our continuous integration environment would certainly enforce it and would catch the problem as soon as you commit to source control.

No - if Foo is a test dependency in module A, and a compile dependency in module B, and A is built earlier than B, the testing of A will cause Foo to be downloaded and be in the local repo, ready to be compiled against as a dependency of B.

The enforcer plugin is almost certainly what you want. There's a million and one arbitrary and conflicting rules code shops want to impose on their process - codifying them by parsing the (pom) definition(s) is the correct thing to do, rather than trying to layer multiply faceted selections on top of the repository mechanism. I am (slightly) surprised there isn't a dependency black/whitelisting enforcer rule - but just because I haven't seen one doesn't mean there isn't one out there.

2008/7/1 Stuart McCulloch [EMAIL PROTECTED]:

2008/6/30 Ishaaq Chandy [EMAIL PROTECTED]: Well, assuming that a hypothetical implementation of maven only downloads compile/runtime deps from the repo that we actively control and restrict access to, wouldn't that be safe enough? I can't think of a scenario where this would lead to accidents unless someone somehow accidentally got the permission to deploy to this restricted repo and then accidentally deployed an unvetted artifact to it. Even if you mark something as a compile-scope dep when it should really have been a test-scope dep, you would immediately hit a problem because then this hypothetical implementation of maven would abort because the artifact would not be in the vetted repo (even though it may legitimately exist in the test/plugin repo).
Just off the top of my head, here's one possibility: developer A adds a test dependency on artifact Foo, which has a restrictive license; developer B updates and rebuilds - Foo is downloaded from central to his local repository; developer B thinks Foo would be useful in his code and adds a compile dependency on it. Now, because Foo is already in his local repository, IIUC Maven won't bother checking the remote repositories and developer B will happily start writing code that uses Foo. Of course, I admit that just because I can't see a way for this to reasonably accidentally happen does not mean that it is impossible. -- Cheers, Stuart
Re: fatal dependency management flaw in maven?
... and lo - I can't even read - http://maven.apache.org/plugins/maven-enforcer-plugin/rules/bannedDependencies.html On Mon, Jun 30, 2008 at 4:48 PM, Nigel Magnay [EMAIL PROTECTED] wrote: However, even if you did assume that, it would be easy to work around if you enforced a full build from scratch to delete the local repo. As a developer you may decide to avoid doing a full build but our continuous integration environment would certainly enforce it and would catch the problem as soon as you commit to source control. No - if Foo is a test dependency in module A, and a compile dependency in module B, and A is built earlier than B, the testing of A will cause Foo to be downloaded and be in the local repo, ready to be compiled as a dependency against B. The enforcer plugin is almost certainly what you want. There's a million and one arbitrary and conflicting rules code shops want to impose on their process - codifying it by parsing the (pom) definition(s) is the correct thing to do, rather than trying to layer multiply faceted selections on top of the repository mechanism. I am (slightly) surprised there isn't a dependency black/whitelisting enforcer rule - but just because I haven't seen one doesn't mean there isn't one out there. 2008/7/1 Stuart McCulloch [EMAIL PROTECTED]: 2008/6/30 Ishaaq Chandy [EMAIL PROTECTED]: Well, assuming that a hypothetical implementation of maven only downloads compile/runtime deps from the repo that we actively control and restrict access to, wouldn't that be safe enough? I can't think of a scenario where this would lead to accidents unless someone somehow accidently got the permission to deploy to this restricted repo and then accidently deployed an unvetted artifact to it. 
Even if you mark something as a compile-scope dep when it should really have been a test-scope dep, you would immediately hit a problem: this hypothetical implementation of maven would abort because the artifact would not be in the vetted repo (even though it may legitimately exist in the test/plugin repo). just off the top of my head, here's one possibility:
- developer A adds a test dependency on artifact Foo, which has a restrictive license
- developer B updates and rebuilds - Foo is downloaded from central to his local repository
- developer B thinks Foo would be useful in his code and adds a compile dependency on it
- now, because Foo is already in his local repository, IIUC Maven won't bother checking the remote repositories, and developer B will happily start writing code that uses Foo.
Of course, I admit that just because I can't see a way for this to reasonably accidentally happen does not mean that it is impossible. -- Cheers, Stuart
Re: fatal dependency management flaw in maven?
On Mon, Jun 30, 2008 at 5:10 PM, Ishaaq Chandy [EMAIL PROTECTED] wrote: 2008/7/1 Nigel Magnay [EMAIL PROTECTED]: However, even if you did assume that, it would be easy to work around if you enforced a full build from scratch to delete the local repo. As a developer you may decide to avoid doing a full build but our continuous integration environment would certainly enforce it and would catch the problem as soon as you commit to source control. No - if Foo is a test dependency in module A, and a compile dependency in module B, and A is built earlier than B, the testing of A will cause Foo to be downloaded and be in the local repo, ready to be compiled as a dependency against B. ... and the continuous integration environment could go the next step and clear out the local repo between each module - possibly ugly, but doable. Well... depends on how you do your builds, I guess. However, these are just workarounds - if maven had some way to store scope-context along with the repository, these would be a moot point. I don't speak for the devs, but I just don't see this happening. Firstly, because there are too many arbitrary scopes that you could define that would be equally valid (license, internationalized, snapshot vs. release, 'blessed' releases, black/whitelisted versions.. and on and on). Secondly, because the repo managers are swimming in the opposite direction. They're all aggregating upstream repositories into a unified cache view - an 'I don't care where this dep came from, just gimme' view. My guess is that such adjustments would be viewed as an unnecessary complication. The enforcer plugin is almost certainly what you want. There are a million and one arbitrary and conflicting rules code shops want to impose on their process - codifying it by parsing the (pom) definition(s) is the correct thing to do, rather than trying to layer multiply-faceted selections on top of the repository mechanism. 
I am (slightly) surprised there isn't a dependency black/whitelisting enforcer rule - but just because I haven't seen one doesn't mean there isn't one out there. On the contrary, I actually do want to codify this with my pom - i.e. I want to be able to instruct maven via my pom on how to decide which of the configured repos to use when downloading certain types of dependencies. And no, the enforcer plugin can't do what I want - unless I have misunderstood its documentation. i.e., there is no way that it can stop developers from adding compile/runtime deps that have not been previously vetted and yet at the same time allow arbitrary test/plugin deps and their transitive deps - i.e. a white-list for compile/runtime deps and their transitive deps, and no list for all other deps. They can add them, but then the enforcer should fail the build. It's close to some of the other rules (e.g. no snapshots, dependency blacklist); it might just need a custom rule, it might need an extension (e.g. to consider particular scopes). Either way, yours is a use-case that fits entirely within the scope of the enforcer plugin - if it can't be done with a custom rule, my guess would be that you're several orders of magnitude more likely to get the enforcer plugin enhanced as opposed to the core dependency and repository layout mechanism adjusted. And it would be a useful rule to have added, since it might also encourage projects to clean up their license sections in the poms.
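For reference, a whitelist along the lines being discussed could be sketched with the stock bannedDependencies rule - this is only a sketch, not something from the thread: the coordinates under includes are illustrative, and note that (as discussed above) the stock rule is not scope-aware, so a custom rule or extension would still be needed to treat test/plugin deps differently.

```xml
<!-- Sketch: ban everything by default, then re-include vetted coordinates.
     The org.example.vetted groupId is illustrative, not a real whitelist. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>vet-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <excludes>
              <!-- ban all dependencies... -->
              <exclude>*</exclude>
            </excludes>
            <includes>
              <!-- ...except the vetted ones (illustrative coordinates) -->
              <include>org.example.vetted:*</include>
            </includes>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```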
Re: maven failing to resolve artifact that exists in repository
What version of maven are you using? If not the latest, try 3.0.3.. On Sat, Oct 15, 2011 at 5:21 AM, Mike Power dodts...@sonic.net wrote: I am really confused. Maven is failing to resolve an artifact that I can easily find in the remote repository. I am trying to build an open source project whose git repository is found at https://github.com/magnayn/Jenkins-Repository.git I get the following error:

Downloading: http://download.java.net/maven/2//org/kohsuke/stapler/stapler/1.167/stapler-1.167.jar
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] Failed to resolve artifact.

Missing:
----------
1) org.kohsuke.stapler:stapler:jar:1.167

  Try downloading the file manually from the project website.

  Then, install it using the command:
      mvn install:install-file -DgroupId=org.kohsuke.stapler -DartifactId=stapler -Dversion=1.167 -Dpackaging=jar -Dfile=/path/to/file

  Alternatively, if you host your own repository you can deploy the file there:
      mvn deploy:deploy-file -DgroupId=org.kohsuke.stapler -DartifactId=stapler -Dversion=1.167 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]

  Path to dependency:
      1) com.nirima.jenkins.repository:pom:pom:0.6.2-SNAPSHOT
      2) org.jenkins-ci.main:jenkins-core:jar:1.417
      3) org.kohsuke.stapler:stapler-adjunct-timeline:jar:1.3
      4) org.kohsuke.stapler:stapler:jar:1.167

----------
1 required artifact is missing.

for artifact: com.nirima.jenkins.repository:pom:pom:0.6.2-SNAPSHOT

from the specified remote repositories: central (http://repo1.maven.org/maven2), m.g.o-public (http://maven.glassfish.org/content/groups/public/)

It seems to be only trying central. However it is not available on central. It is available on m.g.o-public. 
Here is the url: http://maven.glassfish.org/content/groups/public/org/kohsuke/stapler/stapler/1.161/stapler-1.161.jar Why is maven going to central to find the jar, and not going to m.g.o-public? It says it tried m.g.o-public but I do not see that happening.
RepositorySystem resolving to classes directory in clover build..
I am trying to diagnose a failure when running flex-mojos' copy-resources when running a build with clover enabled. I've seen references to this also being a problem when building with m2e. Flex-mojos' CopyMojo finds dependencies, then copies them into the output (for building up a WAR file). To do this, it utilises the following stanza:

Artifact artifact = repositorySystem.createArtifactWithClassifier( groupId, artifactId, version, type, classifier );
if ( !artifact.isResolved() )
{
    ArtifactResolutionRequest req = new ArtifactResolutionRequest();
    req.setArtifact( artifact );
    req.setLocalRepository( localRepository );
    req.setRemoteRepositories( remoteRepositories );
    ArtifactResolutionResult res = repositorySystem.resolve( req );
}

When running in a normal build, this results in artifact.getFile() returning a resolved file for copying (in this instance, a .SWF file). However, when running with clover (mvn clean clover2:setup test clover2:aggregate clover2:clover), when it hits this project the build fails. The reason for this is that when this code is executed for one of the artifacts defined further up the build, instead of resolving to the .SWF file, it resolves to the ${build.dir}/classes directory. I don't know whether this is a bug in clover, in maven, in aether or in flex-mojos... any pointers would be useful, as this prevents us from using java code-coverage in our builds..
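As a diagnostic aid - this is a sketch, not flex-mojos' actual code, and ResolvedArtifactGuard/isUsableArtifact are hypothetical names - one way to detect the failure mode above before copying is to check whether the resolved path is a real file rather than a reactor project's classes directory:

```java
import java.io.File;

// Sketch only: flex-mojos does not contain this class. It illustrates
// detecting the failure mode where artifact.getFile() resolves to a
// ${build.dir}/classes directory instead of a packaged .swf file.
public class ResolvedArtifactGuard {

    // A resolved artifact is only copyable if it points at a real file,
    // not at a directory (or at nothing at all).
    public static boolean isUsableArtifact(File f) {
        return f != null && f.isFile();
    }

    public static void main(String[] args) {
        // The temp dir stands in for a reactor classes directory here.
        File classesDir = new File(System.getProperty("java.io.tmpdir"));
        System.out.println(isUsableArtifact(classesDir)); // a directory -> false
    }
}
```

A mojo could use such a check to fail fast with a clear message ("resolved to a directory, not an archive") instead of a confusing copy error later in the build.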
Re: [DISCUSS] Should the Maven PMC be an example of how we want the Maven Community to behave (was Re: svn commit: r1506778 - /maven/site/trunk/content/markdown/project-roles.md)
That whole section I find pretty bizarre.
- Apache is about (open-source) software.
- Writing code is *good*.
- Forks are *good*.
I'm put in mind of Linus' talk about why git distribution is so important - that 'if you don't think I'm doing a good job, then you can just take your code from another maintainer'. *That's* what keeps a project honest and responsive to the users. I would have thought that the kinds of people who are interested in writing maven-esque code would be some of the people you'd want on a PMC. If they have a long-running fork or a reimplementation, surely they would be lobbying for its integration? Merging is also good. If, despite this, they're choosing to do this elsewhere, and/or are having trouble merging projects in, isn't that a pretty sad indictment of the health of the project? Isn't it a bit like saying boo-hoo, those that are doing the actual work might go work in their own sandpit if we won't play ball, let's excommunicate them? Unless (as some have suspected for a while) Apache isn't about software anymore, it's about the continued existence of Apache (cf. OpenOffice) - a political edifice where projects go to die. That's certainly what those added paragraphs say to me. On Thu, Jul 25, 2013 at 2:16 PM, Stephen Connolly stephen.alan.conno...@gmail.com wrote: There are two schools of thought amongst the current members of this project's PMC. Without wanting to deliberately tip my hand and reveal where my opinion is, we would like to solicit the opinions of the community that we serve. Please give us your thoughts. The topic is essentially: Do you want the members of the Maven PMC to be social leaders of the Maven community, whose actions demonstrate the best community behaviour? The alternative is that members of the Maven PMC are here purely to complete the legal requirements that an Apache TLP has delegated to PMCs. This is not black and white... The answer can be grey... And everyone is human so can make mistakes... 
So community, what are you expecting? - Stephen Connolly On Thursday, 25 July 2013, wrote:

Author: jdcasey
Date: Wed Jul 24 23:21:58 2013
New Revision: 1506778
URL: http://svn.apache.org/r1506778
Log: Adding section on PMC standards of community commitment

Modified: maven/site/trunk/content/markdown/project-roles.md

Modified: maven/site/trunk/content/markdown/project-roles.md
URL: http://svn.apache.org/viewvc/maven/site/trunk/content/markdown/project-roles.md?rev=1506778&r1=1506777&r2=1506778&view=diff
==============================================================================
--- maven/site/trunk/content/markdown/project-roles.md (original)
+++ maven/site/trunk/content/markdown/project-roles.md Wed Jul 24 23:21:58 2013
@@ -176,6 +176,29 @@ The Project Management Committee has the
 * Voting on release artifacts.
 * <!-- TODO: get the rest of these -->
+ Standards for Community Commitment
+
+In the spirit of supporting the health of our community, Project
+Management Committee members refrain from actions that subvert the
+functioning of the committee itself.
+
+First, Project Management Committee members should not maintain long-running
+forks of Maven code outside of the project itself. Making significant
+changes to Maven code outside of the project displays a lack of
+investment in the community. Additionally, attempting to re-integrate
+a large number of code changes in bulk overwhelms the ability of
+volunteers in the community to review (and potentially veto) the
+changes. This effectively thwarts the policing function of the
+PMC.
+
+Second, Project Management Committee members should not divert
+work on redesigning, reimplementing, or improving Maven code to
+alternative projects outside of this community for the purposes of
+reintroducing them as replacement for existing Maven code. While there
+is a danger here of falling into a Not Invented Here mentality, new projects
+created by Maven PMC members strictly to replace Maven code should not be
+allowed. 
+
 ### [Project Management Chair](http://www.apache.org/foundation/how-it-works.html#pmc-chair)
 For various legal reasons, there are certain things that the Apache

-- Sent from my phone
Re: Jenkins and Maven
You install https://github.com/takari/takari-local-repository Or fix the bug from, like, 2007? https://issues.apache.org/jira/browse/MNG-2802 On Thu, Oct 13, 2016 at 4:41 PM, Benson Margulies wrote: > So, here's a specific puzzle. I want to enable multiple branches in > the Jenkins job and concurrent builds. How do I avoid two jobs trying > to write the same local repo at the same time? > >
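For what it's worth, with Maven 3.3+ a core extension like this can be registered per-project via a .mvn/extensions.xml file - the following is only a sketch: the coordinates are those published in the takari-local-repository README, and the version shown is illustrative (check the project page for the current one):

```xml
<!-- .mvn/extensions.xml - sketch; version is illustrative -->
<extensions>
  <extension>
    <groupId>io.takari.aether</groupId>
    <artifactId>takari-local-repository</artifactId>
    <version>0.11.2</version>
  </extension>
</extensions>
```

The extension adds file locking around local-repository access, which is what lets concurrent Jenkins jobs share one local repo safely.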
building dependent (multi)-projects
I have a set of M2 projects that depend on each other:

    A
    |
    B
   / \
  X   Y

What I'd like is that if project Y changes, then Y gets built automatically; if successful, then B (because a dependency has changed), then A. (A is the project installer in this instance.) Is this something that continuum can do, or am I better off sticking to a 'master build' type affair with cruisecontrol?
using ant to do the packaging
Hello listers I am binding things like ant tasks into the lifecycle by using the maven-antrun-plugin bound to the package phase. The ant script usually overwrites the target/blah.jar file with something new. This works well, and I get what I expect in the target directory of my build. However, in the continuum repository, I don't get that - I get the jar file that I would have had if I hadn't done the ANT step. I'm assuming this is because the jar file is assembled in some other way - is there a way for me to get it to deploy the right thing ?
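For context, the kind of binding described above looks roughly like this in the pom - a sketch only: the jar task and blah.jar name are illustrative, and note that the antrun plugin of that era used a tasks element (later versions renamed it to target):

```xml
<!-- Sketch: maven-antrun-plugin bound to the package phase,
     overwriting the jar the normal packaging step produced. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <!-- illustrative ant work: update target/blah.jar in place -->
          <jar destfile="${project.build.directory}/blah.jar"
               basedir="${project.build.directory}/extra-content"
               update="true"/>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The deploy/install phases pick up whatever artifact the project's attached-artifact metadata points at, which is why an ant overwrite of the file on disk can diverge from what gets deployed.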
Re: using ant to do the packaging
I think I fixed one problem which was to do with a case issue. I do have ant separately, but I think maven pulls it down automatically. Now the only problem I have remaining is when I have an assembly item with an ID (so the jar becomes -myid.jar), it doesn't appear in the continuum repository (but does in the local user repo..) :( On 09/10/06, r maclean [EMAIL PROTECTED] wrote: Hi Nigel: I'm struggling with my Jar packaging...though I cannot offer you a solution, I am just curious to know if you had to install ANT separately and declare it either in the classpath or put the ant jars in the lib as suggested in the Maven2 doc (Better Builds with Maven)...this on top of the ant-plugins? thanks. Nigel Magnay [EMAIL PROTECTED] wrote: Hello listers I am binding things like ant tasks into the lifecycle by using the maven-antrun-plugin bound to the package phase. The ant script usually overwrites the target/blah.jar file with something new. This works well, and I get what I expect in the target directory of my build. However, in the continuum repository, I don't get that - I get the jar file that I would have had if I hadn't done the ANT step. I'm assuming this is because the jar file is assembled in some other way - is there a way for me to get it to deploy the right thing ?