Re: Adding support for new dependency mediation strategy

2013-08-26 Thread Jörg Schaible
Hi Stephen,

Stephen Connolly wrote:

[snip]

 It's better than that... I am not sure if I said it earlier or not, so I
 will try to say it now.
 
 When we get the next format, there are probably actually three files we
 want to deploy:
 
 foo-1.0.pom (the legacy 4.0.0 model)
 foo-1.0-build.pom (the new 5.0.0+ model)
 foo-1.0-deps.pom (the new 5.0.0+ model)
 
 Now foo-1.0.pom should be a resolved pom with only the bare minimum
 required elements, e.g. dependencies and hopefully nothing else... may
 need dependencyManagement, but I think we can collapse that down. No
 parent element.

OK, this works for releases, but what about SNAPSHOTs? For SNAPSHOTs it is 
quite normal that your parent is also a SNAPSHOT, and you would produce all 
kinds of problems if you try to resolve/collapse SNAPSHOT parents for 
SNAPSHOT artifacts that are installed or deployed.
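
To make the collapsed legacy pom concrete, here is a minimal sketch of what a deployed foo-1.0.pom of the shape Stephen describes might look like (the element set and coordinates are illustrative assumptions, not anything specified in this thread):

```xml
<!-- hypothetical foo-1.0.pom: fully resolved, no <parent>, no build/profiles -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>foo</artifactId>
  <version>1.0</version>
  <packaging>jar</packaging>
  <dependencies>
    <!-- versions already resolved at deploy time; no property placeholders -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>bar</artifactId>
      <version>2.1</version>
    </dependency>
  </dependencies>
</project>
```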

[snip]

- Jörg


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Re: Adding support for new dependency mediation strategy

2013-08-26 Thread Stephen Connolly
On 26 August 2013 08:27, Jörg Schaible joerg.schai...@scalaris.com wrote:

 [snip]

 OK, this works for releases, but what about SNAPSHOTs? For SNAPSHOTs it is
 quite normal that your parent is also a SNAPSHOT, and you would produce all
 kinds of problems if you try to resolve/collapse SNAPSHOT parents for
 SNAPSHOT artifacts that are installed or deployed.


Why?

Or perhaps you are confusing what I mean?

Basically the foo-1.0.pom that gets deployed/installed is the result of
help:effective-pom with some bits removed, such as properties, build,
reporting, profiles, etc.

When building from a checkout, the reactor will have everything... and if
you are depending on a deployed/installed -SNAPSHOT then the behaviour will
remain the same.

And since this would be for a new Maven, we need only concern ourselves
that the contract of the new Maven's classpath and property behaviour is
correct... thus we don't have to preserve the current craziness when you
have a dependency that has transitive dependencies where parts of the GAV
are linked by properties.
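
As an illustration of the property-linked GAVs meant here (a made-up example), a pom in the dependency graph may declare:

```xml
<!-- illustrative only: the dependency's version is resolvable solely via a
     property, typically inherited from a parent pom -->
<properties>
  <bar.version>2.1</bar.version>
</properties>
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>bar</artifactId>
    <version>${bar.version}</version>
  </dependency>
</dependencies>
```

Resolving such a pom correctly requires the whole parent chain, which is exactly what a collapsed deployed pom would avoid.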

In short, by separating the build-time pom from the deployed pom, we can
maintain defined, reproducible behaviour[1] *and* migrate the schema.

[1]: That does not mean that Maven 4.0 will allow you to reproduce all of
the classpath hacks that you can with Maven 2/3... some of those hacks are
stupid (even if people insist on using them)... but it should mean that
whatever classpath constructs you can do in Maven 4.0 get mapped correctly,
on a best-effort basis, to the legacy clients.







Re: Adding support for new dependency mediation strategy

2013-08-26 Thread Arnaud Héritier
I think it doesn't work if you have several levels of inheritance in
different projects and you don't republish all intermediate artifacts.
Let's imagine projectC inherits from projectB, which inherits from
projectA.
If all of them are SNAPSHOTs, then today if you change projectA and
republish it, you'll see your changes in projectC.
With the resolution you'll have to wait for projectB to be republished to
see the change in projectC.

No?
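
In pom terms the scenario is (coordinates made up for illustration): projectC declares projectB as its parent, and projectB in turn declares projectA:

```xml
<!-- projectC/pom.xml (sketch) -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>projectB</artifactId>
  <version>1.0-SNAPSHOT</version>
</parent>
```

Today, building projectC re-resolves projectB's parent chain and so picks up the latest projectA SNAPSHOT; with a pre-collapsed deployed pom for projectB, projectA's values would be frozen in at the time projectB was deployed.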

Arnaud


On Mon, Aug 26, 2013 at 9:55 AM, Stephen Connolly 
stephen.alan.conno...@gmail.com wrote:

 [snip]




-- 
-
Arnaud Héritier
http://aheritier.net
Mail/GTalk: aheritier AT gmail DOT com
Twitter/Skype : aheritier


Re: Adding support for new dependency mediation strategy

2013-08-26 Thread Stephen Connolly
On 26 August 2013 09:06, Arnaud Héritier aherit...@gmail.com wrote:

 I think it doesn't work if you have several levels of inheritance in
 different projects and you don't republish all intermediate artifacts.
 Let's imagine projectC inherits from projectB, which inherits from
 projectA.
 If all of them are SNAPSHOTs, then today if you change projectA and
 republish it, you'll see your changes in projectC.
 With the resolution you'll have to wait for projectB to be republished to
 see the change in projectC.

 No?


Ok, yes, you could have that issue... but who is to say that it isn't a good
thing? (or to put it another way, if you have that problem you are being
stupid)

We will have dependencies in the legacy poms only.

If you are building projectC with Maven 4.0 it will be pulling down the
foo-1.0-build.pom files, traversing the tree to generate its dependency
tree, so Maven 4.0 will be pulling down the parents.

In short, if you are building with Maven < 4.0 you should not be using a
GAV that was built by Maven >= 4.0 as your parent... the legacy poms are
there so that you can use those GAVs as dependencies, not as parents.

When building projectC with mvn4 it will pull down its parents and deploy a
legacy pom, so you would need to do a new `mvn4 install` to pick up the new
parents in the legacy code build that you are running with `mvn3
install`... but:

* Why can you not build that project with mvn4? (which would fix your issue
by not looking at the legacy poms)

-Stephen






Re: bug MNG-5459 (failure to resolve pom artifact from snapshotVersion in maven-metadata.xml)

2013-08-26 Thread Claudio Bley
At Sat, 24 Aug 2013 16:40:57 +0200,
Robert Scholte wrote:
 
 Hi Claudio,
 
 my holidays are over, so time to have a look at this.
 I've attached a testcase to the original issue, but I'm not able to
 reproduce it.

Your testcase works fine for me. That is, after I fixed the
maven-metadata.xml file that somehow got truncated - which probably
reveals another bug:

After applying your patch, I also only got the -SNAPSHOT
version. While investigating, I found that there were 2 exceptions
stored inside the versionResult in DefaultArtifactResolver.java:318:

--
[0] MetadataNotFoundException
Could not find metadata 
org.apache.maven.its:dep-mng5459:0.4.0-SNAPSHOT/maven-metadata.xml in local 
([...]\maven-aether-provider\target\local-repo)

[1] EOFException
no more data available - expected end tag </metadata> to close start tag 
<metadata> from line 22, parser stopped on END_TAG seen ...</versioning>\n\n... 
@48:1
--

But the result was happily used further on, although the
maven-metadata.xml could never be fully parsed since the document end tag was
missing. IMO, it makes no sense to use a result with /attached/
exceptions, but I may be missing something here.
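
For reference, a truncated maven-metadata.xml of the kind the parser complains about would look roughly like this (a sketch reconstructed from the error message; the exact content of the test file is not shown in this thread):

```xml
<metadata>
  <groupId>org.apache.maven.its</groupId>
  <artifactId>dep-mng5459</artifactId>
  <version>0.4.0-SNAPSHOT</version>
  <versioning>
    <snapshot>
      <timestamp>20130404.090532</timestamp>
      <buildNumber>2</buildNumber>
    </snapshot>
    <lastUpdated>20130404090532</lastUpdated>
  </versioning>
<!-- file ends here: the closing </metadata> tag is missing, so the
     parser fails with an EOFException after </versioning> -->
```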


Here's the output of the test run after I added the </metadata> tag:

expected:[@todo] but was:[Could not find artifact
 org.apache.maven.its:dep-mng5459:pom:0.4.0-20130404.093655-3 in repo
 [...]


With my patch applied, this changes to:

expected:[@todo] but was:[Could not find artifact
org.apache.maven.its:dep-mng5459:pom:0.4.0-20130404.090532-2 in repo


Claudio
-- 
AV-Test GmbH, Henricistraße 20, 04155 Leipzig, Germany
Phone: +49 341 265 310 19
Web:http://www.av-test.org

Eingetragen am / Registered at: Amtsgericht Stendal (HRB 114076)
Geschaeftsfuehrer (CEO): Andreas Marx, Guido Habicht, Maik Morgenstern




Re: bug MNG-5459 (failure to resolve pom artifact from snapshotVersion in maven-metadata.xml)

2013-08-26 Thread Robert Scholte

Duh, a bad copy/paste without any alarm bells from my IDE.

Thanks, it's fixed.

Robert


On Mon, 26 Aug 2013 12:50:36 +0200, Claudio Bley cb...@av-test.de wrote:


[snip]




RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-26 Thread Richard Sand
Sorry, just correcting myself: I was referring to the Maven Shade Plugin, not 
the Mojo shade plugin.

-Richard

-Original Message-
From: Richard Sand [mailto:rs...@idfconnect.com] 
Sent: Monday, August 26, 2013 4:54 PM
To: 'Maven Users List'; 'Maven Developers List'
Subject: RE: artifact attached by plugin not appearing in subsequent plugins

[snip]

RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-26 Thread Richard Sand
Hi all,

Mirko thanks for your reply. I think the plugin API should have *some* simple 
mechanism for transferring a generated artifact from one plugin to the next.

In my example, the project creates a standard named artifact, i.e. 
${project.build.finalName}.war, and the obfuscation plugin also creates a new 
(generated) jar artifact with the classifier "small", i.e. 
${project.build.finalName}-small.jar, containing specifically the classes it 
has processed.

To get the war file to include this new generated artifact, I have to place it 
directly into the WEB-INF/lib folder, whereas by default it would go into 
${project.build.directory}. Obviously this isn't burdensome. But my question 
was whether it would make sense to have the war plugin consider attached 
artifacts. I actually created issue MWAR-304 with a simple patch to do this, 
but the developer agreed with your point that attaching an artifact should only 
be used for placing said artifact into a repository, not as a mechanism for 
propagating the artifact to other plugins 
(https://jira.codehaus.org/browse/MWAR-304).
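
For the record, the WEB-INF/lib placement described above can be approximated with the war plugin's webResources configuration; this is a sketch, and the directory and filename of the generated jar are assumptions:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <webResources>
      <resource>
        <!-- assumed location of the obfuscated jar -->
        <directory>${project.build.directory}</directory>
        <includes>
          <include>${project.build.finalName}-small.jar</include>
        </includes>
        <!-- copy it into the webapp's lib folder -->
        <targetPath>WEB-INF/lib</targetPath>
      </resource>
    </webResources>
  </configuration>
</plugin>
```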

However this general use case (propagating a generated artifact between 
plugins) continues to vex me. Example - I tried using the mojo shade plugin to 
generate a jar to include in a web application. Same type of use case - I want 
the shade plugin to use the artifact generated by a previous plugin. The shade 
plugin only accepts an artifact as an input; I cannot give it a path to a 
filename to include as I did above with the war plugin. So either way I need to 
modify the shade plugin - either to accept filesystem paths to include, or a 
boolean to tell it to check for dynamically attached artifacts. Since Maven 3.0 
includes MavenProject.getAttachedArtifacts(), it seems silly not to use it. If 
a plugin is going to accept artifacts as input, why shouldn't an attached 
artifact be considered? It seems like the natural and transparent mechanism to 
propagate generated, attached artifacts between plugins. Just choose your 
classifier and go.

Granted, the shade plugin should also have a parameter to include filesystem 
paths.

Best regards,

Richard

-Original Message-
From: Mirko Friedenhagen [mailto:mfriedenha...@gmail.com] 
Sent: Tuesday, August 20, 2013 11:40 AM
To: Maven Users List; Maven Developers List
Subject: RE: artifact attached by plugin not appearing in subsequent plugins

Hello Richard,

x-posted to dev, as the war-plugin is a core plugin:
- IMO attaching would be the wrong term, as it has another meaning.
- This is more of a generated jar (like generated sources, classes, etc.)
- IMO packages should go in Maven modules of their own.

Regards Mirko
--
Sent from my mobile
On Aug 20, 2013 5:13 PM, Richard Sand rs...@idfconnect.com wrote:

 Is there any merit to the idea of having a configuration option in 
 maven-war-plugin to include attached artifacts in the webapp in the 
 same way it includes dependent artifacts?

 -Richard

 -Original Message-
 From: Mirko Friedenhagen [mailto:mfriedenha...@gmail.com]
 Sent: Tuesday, August 20, 2013 6:20 AM
 To: Maven Users List
 Subject: RE: artifact attached by plugin not appearing in subsequent 
 plugins

 Richard,

 AFAIK attachArtifact just tells Maven to install an additional binary 
 to its local cache, or to deploy it to the distribution repository.

 What you want, as far as I understand, is to create an artifact which 
 will be picked up later on and included in a war? You should probably 
 create a separate module project, which creates the jar and just 
 include this jar as runtime dependency in your war project.

 Regards Mirko
 --
 Sent from my mobile
 On Aug 20, 2013 7:42 AM, Richard Sand rs...@idfconnect.com wrote:

  I concluded that this was a missing feature of maven-war-plugin, 
  where it simply wasn't looking to see if there were attached resources.
 
  I supplied a simple patch to the handleArtifacts() method to have 
  that method also handle attached artifacts. You can see the report here:
  https://jira.codehaus.org/browse/MWAR-304
 
  -Richard
 
  -Original Message-
  From: Richard Sand [mailto:rs...@idfconnect.com]
  Sent: Monday, August 19, 2013 6:19 PM
  To: 'Maven Users List'
  Subject: artifact attached by plugin not appearing in subsequent 
  plugins
 
  Hi all - I've been stuck for a while trying to get an artifact 
  injected by a plugin to apply to subsequent plugins/goals in a 
  project.
 
  I have a project which generates a web application. My use case here 
  is the obfuscator plugin which I wrote, which creates a jar file 
  called projectname-small.jar. The plugin creates the jar file using 
  MavenProjectHelper.attachArtifact(). The plugin executes during the 
  packaging phase, before the maven-war-plugin. The jar file is 
  created successfully, and the call to attachArtifact() returns with 
  no errors, but the maven-war-plugin does not see the jar file and 
  therefore doesn't include it in the results. When I turn on 
  debugging 

RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-26 Thread Richard Sand
Hi Wayne - that seems a very inefficient approach, having 5 or 6 separate 
modules to manage to achieve a single assembly. The point is that Maven does 
have phases, goals, and lifecycles - why not use them? The MavenProject object 
already provides the mechanism for one plugin to see the attachments from a 
previous plugin, so again it's a why-not question.

The way you present the choice of having a single Maven module certainly makes 
it sound distasteful, but I think dividing the construction of a single atomic 
component into multiple modules because the plugins cannot be chained together 
is more unappealing. Especially when the Maven architecture and API make it so 
easy to do.

BTW I haven't touched Ant in at least 6 years, so I doubt I'm an Ant-oriented 
person.  :-)

-Richard

-Original Message-
From: Wayne Fay [mailto:wayne...@gmail.com]
Sent: Monday, August 26, 2013 5:59 PM
To: Maven Users List
Subject: Re: artifact attached by plugin not appearing in subsequent plugins

 Mirko thanks for your reply. I think the plugin API should have *some* 
 simple mechanism for transferring a generated artifact from one plugin to 
 the next.

One way people on this list implement this is by viewing their various Maven 
modules as being a series of transformations which are applied to successive 
artifacts.

lib, common, etc
depended on by war
depended on by obfuscator
depended on by shader
depended on by assembly
eventually produces an obfuscated, shaded war file in a tar.gz along with the 
latest compatible version of Tomcat in an OS-specific package ready for simple 
deployment.

An Ant-oriented person might put all this in a single Maven module and force 
the various plugins to bind to different phases and customize or hard-code the 
configuration to force this pipeline just to achieve all of this in a single 
module build. A Maven-oriented person probably would not.

Wayne

-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org







