Re: suggestion regarding MJAVADOC-564 - javadoc:fix enforcing order of tags

2019-01-16 Thread Richard Sand
A comparator will help order the tags, but we still want to preserve the original 
order of the found tags we aren't fixing. For example, if we aren't fixing @see 
tags, we want to honor the existing order of those tags.
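
For illustration, a stable sort gives exactly that: a minimal, self-contained sketch (my own demo code, not the plugin's; the canonical tag order here is made up) showing that tags with equal names keep their found order:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.List;

    class TagOrderDemo {
        // Illustrative canonical ordering of tag names
        static final List<String> CANONICAL =
            Arrays.asList("param", "return", "throws", "see", "since");

        public static void main(String[] args) {
            List<String[]> tags = new ArrayList<>(Arrays.asList(
                new String[] { "see", "Foo#bar()" },
                new String[] { "param", "x the input" },
                new String[] { "see", "Foo#baz()" },   // must stay after the first @see
                new String[] { "return", "the result" }));
            // List.sort is a stable sort, so equal keys (both "see")
            // keep the order in which they were found in the source.
            tags.sort(Comparator.comparingInt((String[] t) -> {
                int i = CANONICAL.indexOf(t[0]);
                return i < 0 ? Integer.MAX_VALUE : i;  // unknown tags sink to the end
            }));
            for (String[] t : tags) {
                System.out.println("@" + t[0] + " " + t[1]);
            }
        }
    }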

I’ll give it a shot

Sent from my iPhone

> On Jan 16, 2019, at 3:40 PM, Robert Scholte  wrote:
> 
> It would be great if the fix-goal code could be simplified.
> In fact I thought about moving code to a different package+classes, which 
> should make it easier to read, maintain, and unit test. But that's a huge task.
> It is still a bit too abstract for me to see the complete picture.
> 
> The first thing I had in mind when reading this was the need for a comparator 
> to reorder the tags.
> Not sure if that would replace the need for commons-collections4.
> 
> I would suggest to give it a try.
> 
> thanks,
> Robert
> 
>> On Wed, 16 Jan 2019 20:13:05 +0100, Richard Sand  
>> wrote:
>> 
>> Hi all - I'd like to provide a fix for MJAVADOC-564, where the javadoc:fix 
>> goal will sometimes output the javadoc tags in incorrect order. The problem 
>> occurs because of the way the fix goal initially reads in the tags and then 
>> iterates over them to make the prescribed corrections. There is an internal 
>> wrapper class inside the AbstractFixJavadocMojo called JavaEntityTags that 
>> maintains various state of the processing, including maintaining the 
>> original values of all tags, the mappings of @param and @throws to values, 
>> etc. There is a lot of logic in the mojo to keep track of original and 
>> updated values, determining whether an update is necessary, etc., and I 
>> think that it can be simplified by keeping an ordered, multi-valued map of 
>> all tags, both the original and added/updated. My suggestion is to add a 
>> trivial inner class called JavaEntityValue to hold both the original and 
>> updated value, i.e.:
>> 
>> class JavaEntityValue {
>>  String originalValue;
>>  String updatedValue;
>> }
>> 
>> and then keep a multi-valued, ordered map in JavaEntityTags, e.g.
>> 
>> LinkedHashMap<String, List<JavaEntityValue>> allTags;
>> 
>> This will allow all of the existing logic inside the mojo to manipulate the 
>> "allTags" map instead of writing directly to the output buffer, so we can 
>> emit all of the results at the very end of the mojo, writing the tags in the 
>> prescribed order while still preserving the existing order of all instances 
>> of a particular tag and of any unknown tags.
>> 
>> I'd like to use commons-collections4 (which will not override or conflict 
>> with the existing collections dependency, which is 3.2.1) for the "allTags" 
>> map to create this combined linked/multivalued map with its decorators.
>> 
>> Any thoughts about this? If it's OK, I'll build it and issue a PR against 
>> the current 3.1.0-SNAPSHOT master.
>> 
>> Best regards,
>> 
>> Richard



suggestion regarding MJAVADOC-564 - javadoc:fix enforcing order of tags

2019-01-16 Thread Richard Sand
Hi all - I'd like to provide a fix for MJAVADOC-564, where the 
javadoc:fix goal will sometimes output the javadoc tags in incorrect 
order. The problem occurs because of the way the fix goal initially 
reads in the tags and then iterates over them to make the prescribed 
corrections. There is an internal wrapper class inside the 
AbstractFixJavadocMojo called JavaEntityTags that maintains various 
state of the processing, including maintaining the original values of 
all tags, the mappings of @param and @throws to values, etc. There is a 
lot of logic in the mojo to keep track of original and updated values, 
determining whether an update is necessary, etc., and I think that it 
can be simplified by keeping an ordered, multi-valued map of all tags, 
both the original and added/updated. My suggestion is to add a trivial 
inner class called JavaEntityValue to hold both the original and updated 
value, i.e.:


class JavaEntityValue {
 String originalValue;
 String updatedValue;
}

and then keep a multi-valued, ordered map in JavaEntityTags, e.g.

LinkedHashMap<String, List<JavaEntityValue>> allTags;

This will allow all of the existing logic inside the mojo to manipulate 
the "allTags" map instead of writing directly to the output buffer, so 
we can emit all of the results at the very end of the mojo, writing the 
tags in the prescribed order while still preserving the existing order 
of all instances of a particular tag and of any unknown tags.
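
A minimal JDK-only sketch of that shape (the class and method names here are mine, for illustration; they are not a proposed patch), including the single end-of-run emission pass:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    class JavaEntityTagsSketch {
        static class JavaEntityValue {
            String originalValue;
            String updatedValue;
        }

        // Keyed by tag name; both the key order and the per-tag value
        // order are insertion order, preserving what was found.
        private final Map<String, List<JavaEntityValue>> allTags = new LinkedHashMap<>();

        void record(String tag, String original, String updated) {
            JavaEntityValue v = new JavaEntityValue();
            v.originalValue = original;
            v.updatedValue = updated;
            allTags.computeIfAbsent(tag, k -> new ArrayList<>()).add(v);
        }

        // At the very end of the mojo, emit everything in one pass.
        void writeTo(StringBuilder out) {
            for (Map.Entry<String, List<JavaEntityValue>> e : allTags.entrySet()) {
                for (JavaEntityValue v : e.getValue()) {
                    String text = (v.updatedValue != null) ? v.updatedValue : v.originalValue;
                    out.append(" * @").append(e.getKey()).append(' ').append(text).append('\n');
                }
            }
        }
    }

Whether commons-collections4's decorators still buy anything over this plain computeIfAbsent pattern is, I think, part of the question below.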


I'd like to use commons-collections4 (which will not override or conflict 
with the existing collections dependency, which is 3.2.1) for the 
"allTags" map to create this combined linked/multivalued map with its 
decorators.


Any thoughts about this? If it's OK, I'll build it and issue a PR against 
the current 3.1.0-SNAPSHOT master.


Best regards,

Richard





Re: [GitHub] maven issue #69: MNG-5899 Reactor should use reduced dependency pom

2017-04-30 Thread Richard Sand
If the shaded artifact is built as a submodule with a different artifact 
ID, then the shade plugin can publish its dependency-reduced POM (DRP) as 
that module's main POM. That lets other projects depend on the shaded 
artifact with only its remaining (unshaded) dependencies.


I had tried to use the shade plugin along with the install & deploy 
plugins so that I could build the main project, shaded artifact, and DRP 
within a single POM, but was told that such practice was contrary to the 
Maven philosophy that all artifacts produced by a given POM have that POM 
as their parent. So moving shade into a submodule with settings like this 
was the only solution I could find that works:


  true
  
false

  false

Not sure if this adds anything to the thread but it worked for me

-Richard




-- Original Message --
From: "grahamtriggs" 
To: dev@maven.apache.org
Sent: 4/27/2017 1:30:24 PM
Subject: [GitHub] maven issue #69: MNG-5899 Reactor should use reduced 
dependency pom



Github user grahamtriggs commented on the issue:

https://github.com/apache/maven/pull/69

> Mutated pom.xml files must not invalidate original reactor 
ProjectDependencyGraph. More specifically, if the original graph 
allowed certain build order, the new graph must still allow the same 
order.


That might be a practical limitation right now, but I wouldn't mind 
having a dynamic build order. The two things that should matter are 
that builds complete and have the same final outcome. Dealing with the 
issues of being able to "pull up" a dependency in the reactor, and knowing 
what can be built / what is waiting on something else to be built, might 
actually benefit scalability with parallel executions.


Seems like there is a more important design question here: should 
a project that is built, installed to a repository, and then depended on 
by a completely separate build behave the same when it is included in a 
reactor?


If you can create an artifact, and a custom pom for install / 
deployment to a repository that differs from the project pom, then to 
my mind that should be what is seen by any module including it as a 
dependency, even in the same reactor.


The concern is about adding new dependencies, and whilst that is 
technically possible, I'm not sure that it needs to be supported - it 
could have just been made a dependency of the project anyway.


The real issue - particularly with the shade plugin - is that you 
want to embed things in an artifact, and not have other projects having 
to try and pull them in as dependencies. To be honest, it would 
actually be better if this was native to the pom, rather than part of 
the shade plugin, because then you could express what dependencies you 
want to embed, and this information would then be communicated to other 
projects depending on it. Then they would not only not pull in the 
transitive dependency for the embedded code, they would also be able to 
determine if the embedded version is compatible with their 
requirements.







Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
Agree with Andreas, I can see the usefulness of MRJars too. APIs with 
lambdas come to mind as a good example, and interfaces with default 
methods; those are just a couple of Java 8 features I've avoided 
adding because of needing backward compatibility.
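
To make that concrete, here's the kind of thing I mean (my own toy example, not code from any project): a single default method pins the artifact to Java 8+ class files, which is exactly what an MRJar variant could isolate.

    // Compiling this requires -source/-target 1.8 or later, so shipping
    // it in the main artifact cuts off pre-Java-8 consumers entirely.
    public interface Greeter {
        String name();

        default String greet() {
            return "Hello, " + name() + "!";
        }
    }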


Eclipse's Maven support in general is subpar, and it may be 
difficult to avoid having multiple projects for different runtimes. But 
I could envision a scenario where I'd simply have my project configured 
in Eclipse as Java9, but in my POM configured for other builds. Eclipse 
wouldn't be able to tell me if a particular piece of code was compatible 
with the other runtimes, but Maven sure would, and maybe Eclipse will 
add the capability. In any event I wouldn't want the limitations of a 
particular IDE to drive the POM design, and in particular to require 
multi-module projects when a single project could suffice. If I as the end 
user can live with the limitations of the IDE (e.g. its not being 
multi-runtime aware) then I should be allowed to do so.


Richard Sand | CEO
IDF Connect, Inc.
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651




-- Original Message --
From: "Andreas Gudian" <andreas.gud...@gmail.com>
To: "Maven Developers List" <dev@maven.apache.org>
Sent: 8/31/2016 4:52:16 PM
Subject: Re: Building jar targeting multiple Java versions, including 9


2016-08-31 22:02 GMT+02:00 Tibor Digana <tibor.dig...@googlemail.com>:

 >> So, we can try to have different source folders for different runtimes.

 No. If you are developers you would not say this.
 No developer would develop and merge the same code multiple times!!!
 Look, now let the maven-compiler-plugin download jars of the same
 artifact which was released for versions 1.7 and 1.8, and now let the
 compiler wrap those two versions with the current version 1.9 which is
 ready to be released. Now you have a fat MRjar.
 This means any Java version of the artifact can be chosen on top of
 JDK9.

 >> Most other plugins would need to be executed for each version as well -
 javadoc, junit

 No, because the previous versions were already tested and processed.

 >> Again if they *don't* need a separate execution, then why is MRJar
 needed?

 Exactly, and now I cannot imagine a company which is going to complicate
 their project with a stupid MRJar. And why would they? If they have WildFly
 8.2.1, which supports Java 8, do you think their EE project would
 struggle with MRJar? Of course not. If they have WildFly 10.x
 supporting Java 9, then again no developer would build an MRJar.

 I think MRJar is suitable only for JDK internals, and it is only
 because Oracle has a problem with the JCP, because the JCP does not let
 Oracle break backwards compatibility and introduce dynamic language
 features. So Oracle is trying for a compromise.
 Oracle is trying to let you build the java.* JDK with javac, interchange
 internal modules, etc.
 Nothing for other non-JDK projects.



That's a bit off-topic, as this thread is about the _how_ and not the
_why_, but I'd like to point out that I disagree with you here ;). MRJars
would be great for a lot of frameworks and libraries that you would _use_
in your end-product. Frameworks that offer different alternatives or
extended features requiring a newer JDK now have to build and maintain
artifacts either with different versions or with different artifactIds
(e.g. funky-annotations and funky-annotations-jdk8, where the -jdk8 variant
contains annotations with @Repeatable) -- there are a lot of examples out
there. And that's not only cumbersome for maintainers, but also for users
(which is the bad thing).
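
(For anyone who hasn't met it, @Repeatable is the Java 8 feature the -jdk8 variant carried. A tiny self-contained illustration, with made-up annotation names:)

    import java.lang.annotation.Repeatable;

    // The container annotation that @Repeatable requires
    @interface Schedules {
        Schedule[] value();
    }

    @Repeatable(Schedules.class)
    @interface Schedule {
        String cron();
    }

    class Job {
        // Legal only on Java 8+: the same annotation applied twice
        @Schedule(cron = "0 0 * * *")
        @Schedule(cron = "30 12 * * *")
        void run() {}
    }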

So go MRjars! Making them easily consumable for users is the top priority,
and it looks like a solid approach.

Making them easy to build for the maintainers is way less important (as
that's only a tiny percentage of the Maven users out there), but it sure
needs to work with different IDEs, and especially Eclipse would have
problems handling different target JDKs in the same project.

So for me as an Eclipse user, I'd have to take the road with the
multi-module projects. But that's okay, as I'm just building a framework,
and separating the different variants for different JDKs also has a lot of
benefits, as already described above: testing, generating javadocs, and all
that other stuff is already there and can be used without having to change
anything. For me, a project collecting the internally separate artifacts
and combining them into an MRjar would be totally fine.

Andreas





 --
 Cheers
 Tibor




Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
Hi Paul - I certainly believe that you've thought this through a lot 
further than I have. :-) I'm just speaking about a general sense of 
striving to keep projects as simple as possible. I think it's sometimes 
a bit too easy to state that some build pattern should just be done with 
a multi-module project, when a small fix or improvement would neatly 
accomplish the same in a single POM. So I'm just making a general 
sweeping statement, not one specifically about MRJars.


Now I'm just typing out loud for discussion...

Java9 is definitely a paradigm shift, more so than almost any other 
release of Java. I see your point: the MRJar isn't just about having 
source and target versions in the compiler plugin. If the MRJar doesn't 
leverage new runtime features, e.g. lambdas in Java8, then what's the 
point of even having the MRJar? Just set the target to the lowest common 
denominator and be done with it.
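
(As background on what the runtime side of an MRJar actually does: JDK 9's JarFile API is version-aware, so the same entry name resolves to the copy under META-INF/versions/<N> when one exists. A sketch; the jar path argument and the entry name are placeholders:)

    import java.io.File;
    import java.io.IOException;
    import java.util.jar.JarFile;

    class MrJarProbe {
        public static void main(String[] args) throws IOException {
            // Open the jar as the current runtime would see it
            try (JarFile jar = new JarFile(new File(args[0]), true,
                    JarFile.OPEN_READ, Runtime.version())) {
                System.out.println("multi-release: " + jar.isMultiRelease());
                // On a multi-release jar this returns the versioned entry
                // (META-INF/versions/9/...) when running on JDK 9+
                System.out.println(jar.getJarEntry("com/example/Greeter.class"));
            }
        }
    }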


So the actual source code will be different for each Java runtime; 
otherwise there's no point. So we can try to have different source 
folders for different runtimes. Of course, most IDEs aren't going to 
support enforcing different compiler settings etc. for different source 
folders within a single project, but let's set that aside for now and 
think just about the Maven implications:

- Maven itself would have to be run with Java9
- The compiler plugin would need a different configuration for each Java version
- Most other plugins would need to be executed for each version as well: javadoc, junit, surefire, etc. Again, if they *don't* need a separate execution, then why is the MRJar needed?
- At the very least, each execution would have unique attached artifacts, either with classifiers or new artifact IDs
- State would need to be maintained from plugin to plugin about the specific JRE in use for that invocation
That certainly sounds like a multi-module project - at least that may be 
the easiest solution. But if I wanted to use profiles and have a 
separate profile for each JRE, and at the very end of the POM aggregate 
the artifacts together - I can see that working as well.


You know, I've had similar struggles with multi-artifact projects as 
well, regardless of Java9. For example, for one particular project I build 
multiple WAR files, each with a different classifier, with some 
packaging customizations for different deployment models: e.g. I have one 
WAR file that includes slf4j+logback, and another WAR file that has the 
slf4j+JUL bridge. There are other permutations too; e.g. some packages 
have obfuscated code while others do not. So I have multiple executions 
of maven-war-plugin, and in some cases I need to explicitly invoke the 
install & deploy plugins too. But I have resisted breaking this up into 
multiple projects with multiple POMs simply because it's easier (for me) 
to work with all of the packaging and distribution logic in one place, 
and Maven provides the capabilities, with profiles, executions, etc., to 
give the flexibility I need. But there have certainly been times when 
I've thought about just going multi-module.


I guess my long rambling point is that I can see MRJar being done both 
ways, and wouldn't want to pigeon-hole anyone into doing it one way vs 
another.


Richard Sand | CEO
IDF Connect, Inc. <http://www.idfconnect.com/>
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651




-- Original Message --
From: "Paul Benedict" <pbened...@apache.org>
To: "Maven Developers List" <dev@maven.apache.org>; "Richard Sand" 
<rs...@idfconnect.com>

Sent: 8/31/2016 11:50:26 AM
Subject: Re: Building jar targeting multiple Java versions, including 9

Richard, I share your sentiment. I've given this subject some thought and
I've come to the conclusion that a full project life cycle for each Java
version is necessary for good measure. You will want to write main and test
code for each class. IDEs treat individual projects as wholly contained, so
that means their own IDE project preferences too (code style, compiler
version, etc.). I believe a mental shift is necessary here (not directed at
you, per se, but directed toward anyone wanting to do a Multi-Release JAR)
to accept that these really are individual projects -- not just subsets.

However, I am completely willing to hear the opposite and learn why my
opinion is wrong too. Feel free to tell me why it's better as one project.
The MRJAR feature is so new I am bound to learn much from others.

Cheers,
Paul

On Wed, Aug 31, 2016 at 10:46 AM, Richard Sand <rs...@idfconnect.com> 
wrote:


 Understood. I guess it wouldn't be horrible if it required a multi-module
 Maven project, but I would still prefer to avoid introducing a requirement
 for multi-module projects anywhere.

 Richard Sand | CEO
 IDF Connect, Inc.
 2207 Concord Ave, #359
 Wilmington | Delaware 19803 | USA
 Office:

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
Understood. I guess it wouldn't be horrible if it required a 
multi-module Maven project, but I would still prefer to avoid introducing 
a requirement for multi-module projects anywhere.


Richard Sand | CEO
IDF Connect, Inc.
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651




-- Original Message --
From: "Paul Benedict" <pbened...@apache.org>
To: "Richard Sand" <rs...@idfconnect.com>
Cc: "ZML-Apache-Maven-Developers" <dev@maven.apache.org>
Sent: 8/31/2016 11:10:33 AM
Subject: Re: Building jar targeting multiple Java versions, including 9

To be clear, I was purely addressing the concern of a Multi-Release 
JAR.


Cheers,
Paul

On Wed, Aug 31, 2016 at 10:09 AM, Richard Sand <rs...@idfconnect.com> 
wrote:


 I definitely concur with Robert's point: "I don't think we make developers
 very happy if they are advised to have a multimodule project just to be
 able to compile the module-info file." I can live with the requirement
 that I must run Maven with Java9 to support creating module-info and
 having the additional modules created by a separate plugin. Simplicity
 wherever possible.

 Best regards,

 Richard Sand | CEO
 IDF Connect, Inc.
 2207 Concord Ave, #359
 Wilmington | Delaware 19803 | USA
 Office: +1 888 765 1611 | Fax: +1 866 765 7284
 Mobile: +1 267 984 3651


 -- Original Message --
 From: "Robert Scholte" <rfscho...@apache.org>
 To: "ZML-Apache-Maven-Developers" <dev@maven.apache.org>; "Paul 
Benedict"

 <pbened...@apache.org>
 Sent: 8/31/2016 10:39:52 AM
 Subject: Re: Building jar targeting multiple Java versions, including 
9


 Hi Paul,

 no problem to move it to this thread. It is indeed about "The Maven
 Way", although we may need to start from the beginning, explaining the
 issue we're facing.

 Let's use ASM as an example: their 6.0_ALPHA has been built like this,
 although without Maven.
 ASM is an all-purpose Java bytecode manipulation and analysis framework.
 IIRC their code is compatible with Java 1.2, but they've also added a
 module-info for those who want to use this dependency within a
 Java9/Jigsaw project.
 module-info MUST be built with Java9 with source/target or release = 9.
 Other sources must be built with an older JDK with source/target 1.2.

 There are several ways to solve this:
 - multi-module project and multi-release jar like Paul suggests. However,
 IIRC the specs say that the module-info MUST exist in the root.
 - 2 source folders, src/main/java and src/main/jigsaw, both writing to
 target/classes. Here it is quite clear what happens per source folder.
 - 1 source folder and all the magic of calling javac twice in the
 maven-compiler-plugin. I started with this, but I don't like it. Details
 are below.
 - 1 source folder and 2 execution blocks (one excluding module-info, one
 only including module-info).

 We shouldn't be looking at Maven alone, but also at IDE support. AFAIK
 Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due to
 the m2eclipse extensions.

 Now back to Paul's suggestion. I don't think we make developers very
 happy if they are advised to have a multimodule project just to be able
 to compile the module-info file.
 I am aware that this is mainly an issue for library builders; end users
 simply build everything with Java9, no problems there. From library
 builders you can expect that they don't mind adding extra configuration
 to build their jar, but forcing them to set up a Maven multimodule
 project is not really nice.
 I would expect the module-info close to the related source files, so I
 would prefer it in the same (single) MavenProject.

 *Unless* IDEs are so strong with handling multi-release jars that it
 looks like I'm adjusting the module-info, even though it is actually
 located somewhere else.

 So let's see the opinions from others.

 thanks,
 Robert

 On Wed, 31 Aug 2016 15:59:04 +0200, Paul Benedict <pbened...@apache.org> wrote:

 Robert, I'm responding to dev@maven so we can discuss Maven philosophies...

 I believe the pattern should be based on a multi-module project. Each
 module should target the expected JDK version. Then introduce a new "mrjar"
 type for the parent that knows how to bind them all together into a
 Multi-Release JAR.

 Cheers,
 Paul

 On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte <rfscho...@apache.org> wrote:

 I've been working on the implementation of this in the
 maven-compiler-plugin, but I'm not really pleased with the result.
 The problem is that in the worst case scenario we have to work with 3
 different versions of Java:
 - The Maven Runtime (set as JAVA_HOME)
 - JDK for the module-info.java
 - JDK for all other source files.

 The example below worked because all three were set to JDK9.
 But b

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
I definitely concur with Robert's point: "I don't think we make 
developers very happy if they are advised to have a multimodule project 
just to be able to compile the module-info file." I can live with the 
requirement that I must run Maven with Java9 to support creating 
module-info and having the additional modules created by a separate 
plugin. Simplicity wherever possible.


Best regards,

Richard Sand | CEO
IDF Connect, Inc.
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651

-- Original Message --
From: "Robert Scholte" <rfscho...@apache.org>
To: "ZML-Apache-Maven-Developers" <dev@maven.apache.org>; "Paul 
Benedict" <pbened...@apache.org>

Sent: 8/31/2016 10:39:52 AM
Subject: Re: Building jar targeting multiple Java versions, including 9


Hi Paul,

no problem to move it to this thread. It is indeed about "The Maven Way", 
although we may need to start from the beginning, explaining the issue 
we're facing.

Let's use ASM as an example: their 6.0_ALPHA has been built like this, 
although without Maven.
ASM is an all-purpose Java bytecode manipulation and analysis framework. 
IIRC their code is compatible with Java 1.2, but they've also added a 
module-info for those who want to use this dependency within a 
Java9/Jigsaw project.
module-info MUST be built with Java9 with source/target or release = 9. 
Other sources must be built with an older JDK with source/target 1.2.
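
(For readers following along, the file in question is tiny. Something like this, using an illustrative module name rather than ASM's real one, is all that has to be compiled at release 9:

    // src/main/java/module-info.java -- compiles only under JDK 9+
    module org.example.bytecode {
        // the packages exposed to Jigsaw consumers would be listed here, e.g.:
        // exports org.example.bytecode;
    }

Everything else in the library stays at source/target 1.2.)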


There are several ways to solve this:
- multi-module project and multi-release jar like Paul suggests. However, 
IIRC the specs say that the module-info MUST exist in the root.
- 2 source folders, src/main/java and src/main/jigsaw, both writing to 
target/classes. Here it is quite clear what happens per source folder.
- 1 source folder and all the magic of calling javac twice in the 
maven-compiler-plugin. I started with this, but I don't like it. Details 
are below.
- 1 source folder and 2 execution blocks (one excluding module-info, one 
only including module-info).


We shouldn't be looking at Maven alone, but also at IDE support. AFAIK 
Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due to 
the m2eclipse extensions.


Now back to Paul's suggestion. I don't think we make developers very 
happy if they are advised to have a multimodule project just to be 
able to compile the module-info file.
I am aware that this is mainly an issue for library builders; end users 
simply build everything with Java9, no problems there. From library 
builders you can expect that they don't mind adding extra configuration 
to build their jar, but forcing them to set up a Maven multimodule 
project is not really nice.
I would expect the module-info close to the related source files, so I 
would prefer it in the same (single) MavenProject.


*Unless* IDEs are so strong with handling multi-release jars that it 
looks like I'm adjusting the module-info, even though it is actually 
located somewhere else.


So let's see the opinions from others.

thanks,
Robert

On Wed, 31 Aug 2016 15:59:04 +0200, Paul Benedict <pbened...@apache.org> wrote:


Robert, I'm responding to dev@maven so we can discuss Maven 
philosophies...

I believe the pattern should be based on a multi-module project. Each 
module should target the expected JDK version. Then introduce a new 
"mrjar" type for the parent that knows how to bind them all together into 
a Multi-Release JAR.

Cheers,
Paul

On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte <rfscho...@apache.org>
wrote:


I've been working on the implementation of this in the
maven-compiler-plugin, but I'm not really pleased with the result.
The problem is that in the worst case scenario we have to work with 3
different versions of Java:
- The Maven Runtime (set as JAVA_HOME)
- JDK for the module-info.java
- JDK for all other source files.

The example below worked because all three were set to JDK9.
But based on the source/target of 1.6 I cannot predict which JDK is used,
only that it is at least JDK6. Should the plugin switch to another JDK?

And if you want to compile with source/target 1.5 or less, you're in
trouble. There's something called toolchain, where you can specify the
JavaHome per version, but in case of maven-compiler-plugin it assumes that
all java-related plugins and execution blocks want to use the same
toolchain through the whole Maven project.
The good news is that for the maven-jdeps-plugin I improved this part in
Maven 3.3.1, since this plugin only works with Java8 and above, which
doesn't have to be the same JDK to compile the sources with. Now you can
simply say: I want the toolchain for version X. This feature needs to be
added to the plugin.

That said I think I will write a recipe for this. This is first of all an
issue for library writers who want to have their jar compat

Re: opinions on MJAVADOC-451

2016-08-06 Thread Richard Sand

Hi all -

> the request for the skip-parameter started as a requirement to break an
> infinite loop. When I discovered how the plugin was used and pointed out
> that binding the plugin to a different phase would avoid it, the loop was
> gone. Even so, the request for the skip parameter stayed.

Untrue. I opened MJAVADOC-451 first. It was the first time I'd submitted 
anything to the Maven community. I hadn't even found the looping 
behavior yet. The point was that the looping behavior was trivial to fix 
because I had *already* added the skip.



> I consider the javadoc:fix goal in the same range as the release:prepare &
> release:perform combination (where everything is executed twice) and the
> cobertura-maven-plugin (where the tests are executed twice), and the fix
> goal is probably even worse.

javadoc:fix with force=true is akin to running the checkstyle plugin but 
having it actually fix the formatting in-band. But this behavior 
requires force=true; you don't accidentally stumble onto it.


So forget about force=true for a moment. What if I simply want to force 
ALL of my developers, because they are so lazy about entering proper 
javadocs, to have the fix goal executed interactively? But I want them 
to be able to explicitly skip it, or for the CI servers to skip it?


The point is, there are many ways that people can find to use the 
capabilities of a given plugin.


> You really must understand what this goal does under the hood. It calls
> clirr:check, which always executes compile up front since it requires
> compiled classes. If you bind javadoc:fix to the compile phase, you'll get
> the infinite loop. Binding it to process-classes would fix this.
> But from then on sources and classes are out of sync! This means that
> plugins like maven-xref-plugin give you wrong information.
> So to be perfect, after executing javadoc:fix, one should recompile the
> code.
> But wait, now the sources in SCM are out of sync too. If you get an
> exception and try to fix this based on the line numbers in SCM, you're
> fooled. So it must be javadoc:fix + compiler:compile + scm:commit.
> We can only force plugins to be executed up front, not afterwards (which
> is probably a good thing).
>
> It is impossible to come to one truth right now.
>
> For this reason I'd go for activating the requiresDirectInvocation flag,
> which means it cannot be used as part of the lifecycle anymore (so yes,
> we have control over how goals can be used). However, this will make the
> plugin useless for persons like Richard.

Here is one point I agree with you on: *if* this goal must NEVER be 
allowed to run in the lifecycle, then the skip parameter makes no sense, 
and requiresDirectInvocation should be set to true.

But remember, the goal already has the force=true option. It's not like 
people can accidentally introduce this without giving some thought to 
what they are doing.


So in the end I think you are simply taking away a potentially useful 
option. I totally understand the risks of having the local files 
changing and the line numbers not lining up with what is in SCM. But I've 
decided, the way my team operates, that I accept that risk, and I am 
otherwise getting good value out of this plugin. So I would be 
disappointed if the plugin changed to requiresDirectInvocation. We could 
add an additional INFO message when using the goal with force=true, 
saying "Hey, we're about to automatically scan through and potentially 
change your code!"

I think everyone on the list agrees that a skip parameter is good *if* 
the plugin remains as it is and can be executed as part of the 
lifecycle. So I'm pushing for a decision here:


1) add the skip option (and potentially a stronger warning message on 
force=true as well)

or

2) set requiresDirectInvocation to true


> So for me the impact of adding skip to the javadoc:fix goal is way too
> big if it reflects use as part of the lifecycle.
>
> And as said: in this case the profile solution is fine.
>
> regards,
> Robert

> ps. Adding the skip-parameter to the AbstractMojo is not possible unless
> every plugin adds the skip logic to its own execute() method. Give it a
> try ;) We need to wait for the Java8 runtime, where we can add a default
> method for it to the Mojo interface.

Agreed, I was thinking the same thing about Java8 and the Mojo 
interface. Maven 4.0?
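
A sketch of that Java8 idea (purely hypothetical; the real Mojo interface only declares execute(), and these names are invented): a default method could centralize the skip check so each plugin stops re-implementing it.

    public interface SkippableMojo {
        boolean isSkip();     // each mojo still supplies its own flag

        void doExecute();     // the goal's real work

        // Java 8 default method: shared skip logic without an abstract base class
        default void execute() {
            if (isSkip()) {
                return;
            }
            doExecute();
        }
    }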


Best regards,

Richard





Re: opinions on MJAVADOC-451

2016-08-05 Thread Richard Sand
What about putting the skip option into an abstract base plugin class? Just 
thinking out loud

Sent from my iPhone

> On Aug 5, 2016, at 6:51 PM, Christopher  wrote:
> 
>> On Fri, Aug 5, 2016 at 11:19 AM Gary Gregory  wrote:
>> 
>>> On Aug 5, 2016 7:41 AM, "Benson Margulies"  wrote:
>>> 
>>> On Fri, Aug 5, 2016 at 10:37 AM, Christopher 
>> wrote:
 I'm always in favor of adding skip properties. They are very useful for
 manipulating builds for debugging, running on CI servers, controlling
 executions in child poms, etc. Even if it's not recommended to run
 unattended, it's still possible, so a skip option is useful.
>>> 
>>> Me too.  I don't see the point about arguing that some goal 'doesn't
>>> belong in a lifecycle'. We can't predict what people will find useful.
>> 
>> Why not add the ability to skip various (named) activities to maven itself?
> 
> That exists. It's called profiles... but it's very verbose in the POM. It's
> much more pleasant to work with if the behavior of the individual plugin is
> flexible through configuration.
> 
> A feature could be added to Maven to skip specific named executions of
> plugins, I suppose... but that sounds like a feature that would require
> some discussion and debate before showing up in some future version of
> Maven. The proposal is something which could be done today, which would
> work, regardless of the version of Maven you're using, and which is
> consistent with similar configurable behavior in countless existing plugins.




Re: opinions on MJAVADOC-451

2016-08-04 Thread Richard Sand

Anyone want to give this a quick read/opinion? :-)

-Richard

-- Original Message --

From: "Richard Sand" <rs...@idfconnect.com>
To: dev@maven.apache.org
Sent: 8/1/2016 6:33:30 PM
Subject: opinions on MJAVADOC-451


Hi all,

I'd like to ask for opinions on 
https://issues.apache.org/jira/browse/MJAVADOC-451. Robert Scholte and 
I have been discussing this off list and essentially disagree on it.


The request is very simple - to add a "skip" parameter to the 
javadoc:fix goal. In my projects we are using the fix goal unattended, 
i.e. with the parameter "force=true", as part of the regular build 
lifecycle.


Most goals (including javadoc) that run in the regular lifecycle have a 
skip option. Robert's position (and forgive me if I misrepresent this 
at all Robert and please weigh in) is that javadoc:fix should not be 
used in the lifecycle and that the goal should in fact have 
requireDirectInvocation=true. He also pointed out to me that I can 
create a profile to enable/disable the goal as an alternative.


My opinion is that, since the goal does not require direct invocation, 
running within the lifecycle has to be considered acceptable use of the 
goal. And a skip parameter adds five lines of code, is a common and 
normal pattern used by many other plugins/goals, and allows the goal to 
be used in this fashion without introducing even more profiles.


I've submitted patches for this issue and also several other issues in 
the javadoc plugin as I continue to work through getting the goal to 
work well automated. Just pointing out that I'm not just asking for the 
larger community to do stuff to make my life easier - I'm trying to 
contribute as best I can and provide patches for what I uncover.


Best regards,

Richard



opinions on MJAVADOC-451

2016-08-01 Thread Richard Sand

Hi all,

I'd like to ask for opinions on 
https://issues.apache.org/jira/browse/MJAVADOC-451. Robert Scholte and I 
have been discussing this off list and essentially disagree on it.


The request is very simple - to add a "skip" parameter to the 
javadoc:fix goal. In my projects we are using the fix goal unattended, 
i.e. with the parameter "force=true", as part of the regular build 
lifecycle.


Most goals (including javadoc) that run in the regular lifecycle have a 
skip option. Robert's position (and forgive me if I misrepresent this at 
all Robert and please weigh in) is that javadoc:fix should not be used 
in the lifecycle and that the goal should in fact have 
requireDirectInvocation=true. He also pointed out to me that I can 
create a profile to enable/disable the goal as an alternative.


My opinion is that, since the goal does not require direct invocation, 
running within the lifecycle has to be considered acceptable use of the 
goal. And a skip parameter adds five lines of code, is a common and 
normal pattern used by many other plugins/goals, and allows the goal to 
be used in this fashion without introducing even more profiles.
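
For reference, the pattern in question really is about this small; a sketch in the maven-plugin-annotations style (the mojo class and property name are illustrative, not necessarily what the plugin would adopt):

    import org.apache.maven.plugin.AbstractMojo;
    import org.apache.maven.plugin.MojoExecutionException;
    import org.apache.maven.plugins.annotations.Mojo;
    import org.apache.maven.plugins.annotations.Parameter;

    @Mojo(name = "fix")
    public class FixExampleMojo extends AbstractMojo {

        // The conventional skip flag, overridable from the command line
        @Parameter(property = "example.fix.skip", defaultValue = "false")
        private boolean skip;

        @Override
        public void execute() throws MojoExecutionException {
            if (skip) {
                getLog().info("Skipping execution.");
                return;
            }
            // ... the goal's actual work goes here ...
        }
    }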


I've submitted patches for this issue and also several other issues in 
the javadoc plugin as I continue to work through getting the goal to 
work well automated. Just pointing out that I'm not just asking for the 
larger community to do stuff to make my life easier - I'm trying to 
contribute as best I can and provide patches for what I uncover.


Best regards,

Richard



Re[2]: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

2016-07-08 Thread Richard Sand
Hi Robert - I attached a patch to MJAVADOC-452 for the onlySaveIfChanged 
fix following your suggestion - I pass a change-detection boolean into 
the rewriting methods so we know if a change has been made during 
processing.
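
In outline, the change looks something like this (a sketch with invented method names, not the actual patch):

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;

    class RewriteSketch {
        // Each rewriting step reports whether it actually changed anything
        boolean fixParamTags(StringBuilder src) { /* ... */ return false; }
        boolean fixThrowsTags(StringBuilder src) { /* ... */ return false; }

        void rewrite(Path file) throws IOException {
            StringBuilder src = new StringBuilder(
                new String(Files.readAllBytes(file), StandardCharsets.UTF_8));
            boolean changed = false;
            changed |= fixParamTags(src);
            changed |= fixThrowsTags(src);
            if (changed) {  // only touch the file when a fix was applied
                Files.write(file, src.toString().getBytes(StandardCharsets.UTF_8));
            }
        }
    }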


Best regards,

Richard

-- Original Message --
From: "Robert Scholte" <rfscho...@apache.org>
To: "Maven Developers List" <dev@maven.apache.org>
Sent: 6/29/2016 1:34:19 PM
Subject: Re: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

Let's start with a simple segment: don't update the file if there are no 
changes. That's an isolated issue.
Your suggestion is to do a String compare, which seems to me quite 
expensive with large files. Instead I would introduce a changed-flag. 
AFAIK all rewriting methods are now void-methods. Return true if changed; 
if changed, then rewrite the file.

After that we can go for the next part.

Robert

On Wed, 29 Jun 2016 06:30:34 +0200, Richard Sand <rs...@idfconnect.com> 
 wrote:




Hi Maven Developers - now that Javadoc plugin 2.10.4 is released and
work has commenced on the 3.0 version, can we revisit the patch I
submitted for MJAVADOC-452? It's difficult to break the patch into
smaller patches because most of the fixes are interrelated, but I'd be
happy to recreate the fixes on the 2.10.4 tag or the latest trunk. How
can I help?

-Richard

Richard Sand <mailto:rs...@idfconnect.com>
June 8, 2016 at 2:11 PM
Hi Robert,

For the skip parameter, the test is very simple. Usage of the fix goal
without disabling CLIRR will result in a loop where fix invokes clirr,
and clirr runs fix.

I understand your reticence about this skip option - but in scenarios
where one wants to have the plugin execute automatically with "force",
having an override is useful. So from my perspective it is useful and
it follows the patterns that other plugins use, including the javadoc
plugin - just this goal is missing a skip option.

I didn't break the patch up into smaller patches, but I did annotate
all of the changes that the patch performs line-by-line in the Jira
case. I hope that helps "demystify" the fixes.

Best regards,

Richard


Robert Scholte <mailto:rfscho...@apache.org>
June 8, 2016 at 2:01 PM
Hi Richard,

this release is to push the final 2.x version before moving to a pure
Maven 3 implementation (there were quite some fixed issues waiting to
be released).
Once released I will continue, also including some Java9 related
improvements. That will give me enough time to go through your
patches. Since you weren't able to split it into smaller pieces I need
more time to verify and apply it.

thanks,
Robert

ps. In case you are still convinced the skip parameter is required,
you need a complete testcase to show it to me. In general *I* won't
apply any requests to add a skip parameter; there are often better
solutions.

On Wed, 08 Jun 2016 00:53:33 +0200, Richard Sand
<rs...@idfconnect.com> wrote:









Re[2]: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

2016-06-29 Thread Richard Sand
OK I can do that, agreed that is an isolated change. Should I go off the 
trunk or the 2.10.4 tag?


-Richard

-- Original Message --
From: "Robert Scholte" <rfscho...@apache.org>
To: "Maven Developers List" <dev@maven.apache.org>
Sent: 6/29/2016 1:34:19 PM
Subject: Re: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

Let's start with a simple segment: don't update the file if there are no 
changes. That's an isolated issue.
Your suggestion is to do a String compare, which seems to me quite 
expensive with large files. Instead I would introduce a changed-flag. 
AFAIK all rewriting methods are now void-methods. Return true if changed; 
if changed, then rewrite the file.

After that we can go for the next part.

Robert

On Wed, 29 Jun 2016 06:30:34 +0200, Richard Sand <rs...@idfconnect.com> 
 wrote:




Hi Maven Developers - now that Javadoc plugin 2.10.4 is released and
work has commenced on the 3.0 version, can we revisit the patch I
submitted for MJAVADOC-452? It's difficult to break the patch into
smaller patches because most of the fixes are interrelated, but I'd be
happy to recreate the fixes on the 2.10.4 tag or the latest trunk. How
can I help?

-Richard

Richard Sand <mailto:rs...@idfconnect.com>
June 8, 2016 at 2:11 PM
Hi Robert,

For the skip parameter, the test is very simple. Usage of the fix goal
without disabling CLIRR will result in a loop where fix invokes clirr,
and clirr runs fix.

I understand your reticence about this skip option - but in scenarios
where one wants to have the plugin execute automatically with "force",
having an override is useful. So from my perspective it is useful and
it follows the patterns that other plugins use, including the javadoc
plugin - just this goal is missing a skip option.

I didn't break the patch up into smaller patches, but I did annotate
all of the changes that the patch performs line-by-line in the Jira
case. I hope that helps "demystify" the fixes.

Best regards,

Richard


Robert Scholte <mailto:rfscho...@apache.org>
June 8, 2016 at 2:01 PM
Hi Richard,

this release is to push the final 2.x version before moving to a pure
Maven 3 implementation (there were quite some fixed issues waiting to
be released).
Once released I will continue, also including some Java9 related
improvements. That will give me enough time to go through your
patches. Since you weren't able to split it into smaller pieces I need
more time to verify and apply it.

thanks,
Robert

ps. In case you are still convinced the skip parameter is required,
you need a complete testcase to show it to me. In general *I* won't
apply any requests to add a skip parameter; there are often better
solutions.

On Wed, 08 Jun 2016 00:53:33 +0200, Richard Sand
<rs...@idfconnect.com> wrote:









Re: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

2016-06-28 Thread Richard Sand


Hi Maven Developers - now that Javadoc plugin 2.10.4 is released and 
work has commenced on the 3.0 version, can we revisit the patch I 
submitted for MJAVADOC-452? It's difficult to break the patch into 
smaller patches because most of the fixes are interrelated, but I'd be 
happy to recreate the fixes on the 2.10.4 tag or the latest trunk. How 
can I help?


-Richard

Richard Sand <mailto:rs...@idfconnect.com>
June 8, 2016 at 2:11 PM
Hi Robert,

For the skip parameter, the test is very simple. Usage of the fix goal 
without disabling CLIRR will result in a loop where fix invokes clirr, 
and clirr runs fix.


I understand your reticence about this skip option - but in scenarios 
where one wants to have the plugin execute automatically with "force", 
having an override is useful. So from my perspective it is useful and 
it follows the patterns that other plugins use, including the javadoc 
plugin - just this goal is missing a skip option.


I didn't break the patch up into smaller patches, but I did annotate 
all of the changes that the patch performs line-by-line in the Jira 
case. I hope that helps "demystify" the fixes.


Best regards,

Richard


Robert Scholte <mailto:rfscho...@apache.org>
June 8, 2016 at 2:01 PM
Hi Richard,

this release is to push the final 2.x version before moving to a pure 
Maven 3 implementation (there were quite some fixed issues waiting to 
be released).
Once released I will continue, also including some Java9 related 
improvements. That will give me enough time to go through your 
patches. Since you weren't able to split it into smaller pieces I need 
more time to verify and apply it.


thanks,
Robert

ps. In case you are still convinced the skip parameter is required, 
you need a complete testcase to show it to me. In general *I* won't 
apply any requests to add a skip parameter, there are often better 
solutions.


On Wed, 08 Jun 2016 00:53:33 +0200, Richard Sand 
<rs...@idfconnect.com> wrote:








Re: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

2016-06-08 Thread Richard Sand

Hi Robert,

For the skip parameter, the test is very simple. Usage of the fix goal 
without disabling CLIRR will result in a loop where fix invokes clirr, 
and clirr runs fix.


I understand your reticence about this skip option - but in scenarios 
where one wants to have the plugin execute automatically with "force", 
having an override is useful. So from my perspective it is useful and it 
follows the patterns that other plugins use, including the javadoc 
plugin - just this goal is missing a skip option.


I didn't break the patch up into smaller patches, but I did annotate all 
of the changes that the patch performs line-by-line in the Jira case. I 
hope that helps "demystify" the fixes.


Best regards,

Richard


Robert Scholte <mailto:rfscho...@apache.org>
June 8, 2016 at 2:01 PM
Hi Richard,

this release is to push the final 2.x version before moving to a pure 
Maven 3 implementation (there were quite some fixed issues waiting to 
be released).
Once released I will continue, also including some Java9 related 
improvements. That will give me enough time to go through your 
patches. Since you weren't able to split it into smaller pieces I need 
more time to verify and apply it.


thanks,
Robert

ps. In case you are still convinced the skip parameter is required, 
you need a complete testcase to show it to me. In general *I* won't 
apply any requests to add a skip parameter, there are often better 
solutions.


On Wed, 08 Jun 2016 00:53:33 +0200, Richard Sand 
<rs...@idfconnect.com> wrote:




Richard Sand <mailto:rs...@idfconnect.com>
June 7, 2016 at 6:53 PM
The patch I supplied for MJAVADOC-452, 451, 434 and 420 won't be 
considered for inclusion? I can recreate the patch off of the latest 
trunk if it would help


Best regards,

Richard




Sent from my iPhone




Re: [VOTE] Release Apache Maven Javadoc Plugin version 2.10.4

2016-06-07 Thread Richard Sand
The patch I supplied for MJAVADOC-452, 451, 434 and 420 won't be considered for 
inclusion? I can recreate the patch off of the latest trunk if it would help

Best regards,

Richard




Sent from my iPhone
> On Jun 7, 2016, at 2:39 PM, Robert Scholte  wrote:
> 
> Hi,
> 
> We solved 12 issues:
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12317529&version=12331967&styleName=Text
> 
> There are still a couple of issues left in JIRA:
> https://issues.apache.org/jira/issues/?jql=project%20%3D%2012317529%20AND%20status%20%3D%20Open%20ORDER%20BY%20key%20DESC%2C%20priority%20DESC
> 
> Staging repo:
> https://repository.apache.org/content/repositories/maven-1277/
> https://repository.apache.org/content/repositories/maven-1277/org/apache/maven/plugins/maven-javadoc-plugin/2.10.4/maven-javadoc-plugin-2.10.4-source-release.zip
> Source release checksum(s):
> maven-javadoc-plugin-2.10.4-source-release.zip sha1: 
> 9bd4cef08be14f6c313db68a6cd57b97e449e0aa
> 
> Staging site:
> http://maven.apache.org/plugins-archives/maven-javadoc-plugin-LATEST/
> 
> Guide to testing staged releases:
> http://maven.apache.org/guides/development/guide-testing-releases.html
> 
> Vote open for 72 hours.
> 
> [ ] +1
> [ ] +0
> [ ] -1
> 




patch for MJAVADOC-452

2016-05-26 Thread Richard Sand
Hi Dev community –



I recently opened MJAVADOC-452 and uploaded a patch to it that fixes
several problems with the Javadoc plugin’s “fix” goal, making it much more
reliable to use the goal in an automated fashion. Is anyone from that
plugin team interested in taking a look at it and evaluating it for
inclusion in the next minor build? The patch I provided was off of the
2.10.3 release tag but I’d be happy to recreate the patch against the trunk
if you want to incorporate it.



Best regards,



Richard


need filesystem-level operations in assembly plugin

2013-09-10 Thread Richard Sand
Hi all - I'm using maven-assembly-plugin to create zip distributions of
our web application. It takes in the Apache Tomcat zip artifact, and the
war artifact for the app, and creates an assembly.

There are a couple of file operations I haven't figured out how to do yet.
For example, Tomcat unzips to a folder called apache-tomcat-7.0.42,
which I want to rename to just tomcat. I figured I'd use the exec plugin
to fork an rm command, which I guess will work in theory but seems
shoddy.
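
For comparison, the rename itself is one call in plain Java; this is roughly what a small file-manipulation mojo (or an antrun script) would end up doing, with the paths taken from the example above:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    class RenameDemo {
        public static void main(String[] args) throws IOException {
            Path unpacked = Paths.get("target/assembly/apache-tomcat-7.0.42");
            Path renamed = Paths.get("target/assembly/tomcat");
            // Files.move renames in place when source and target are on
            // the same filesystem
            Files.move(unpacked, renamed);
        }
    }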

Does anyone support having some swiss-army-knife file manipulation
capabilities (at least rename & move) in the assembly plugin?

Best regards,

Richard




maven shade patch

2013-09-04 Thread Richard Sand
Hi all - if anyone is interested, I opened a JIRA ticket, MSHADE-154, to
submit a patch to the shade plugin to be able to use an attached artifact as
input, similar to what the assembly plugin supports. The patch adds a
configuration parameter, alternativeInputClassifier, which tells the plugin
to look for an artifact with the specified classifier in the project
attachments, and to use that as the primary artifact. While I was at it, I also
submitted a patch for ticket 139 which adds a parameter called
replaceOriginalArtifact; it defaults to true but, when explicitly set
to false, tells shade not to overwrite the original project artifact. I hope
these patches are useful!
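
The selection logic described above boils down to something like this (a sketch against the Maven Artifact API; the class and method are mine, not the patch's actual code):

    import java.util.List;
    import org.apache.maven.artifact.Artifact;

    class AttachedArtifactSelector {
        // Pick an attached jar with the requested classifier as the input,
        // falling back to the main project artifact.
        static Artifact select(Artifact main, List<Artifact> attached, String classifier) {
            if (classifier == null) {
                return main;  // default behavior: use the main artifact
            }
            for (Artifact a : attached) {
                if ("jar".equals(a.getType()) && classifier.equals(a.getClassifier())) {
                    return a;
                }
            }
            return main;
        }
    }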

Best regards,

Richard







RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-29 Thread Richard Sand
Notice that the maven-assembly-plugin behaves exactly as I'd hoped - it has
a boolean useProjectAttachments for dependencySets, and an
attachmentClassifier to look for an attached alternative to the main
project artifact. Those are exactly the two features I added to
maven-shade-plugin and maven-war-plugin. 

-Original Message-
From: Richard Sand [mailto:rs...@idfconnect.com] 
Sent: Wednesday, August 28, 2013 5:05 PM
To: 'Maven Developers List'
Subject: RE: artifact attached by plugin not appearing in subsequent plugins

I always appreciate a good Star Wars reference. Actually there are 12 Maven
modules overall in this particular project, so I don't think I fear multiple
modules. Could we have split the actual Jersey application classes into a
separate project outside of the war project? Yes, agreed. 

What I do object to is creating another module with no purpose except to
hold a single build plugin. That way leads to the dreaded Maven Tribbles,
and nobody wants to see that...!

I'm just not following where this aversion to using multiple plugins
together is coming from. I really just don't get it. Can you address that?
I'll concede to you that I can let the POMs proliferate - but I'd like
to approach the conversation from the angle of why *wouldn't* we want the
output artifact of one plugin to be the input artifact of another? So all
kidding aside, if this is contrary to the Tao of Maven I'd really appreciate
understanding why. Thanks!

-Richard

-Original Message-
From: Stephen Connolly [mailto:stephen.alan.conno...@gmail.com]
Sent: Wednesday, August 28, 2013 4:20 PM
To: Maven Developers List
Subject: Re: artifact attached by plugin not appearing in subsequent plugins

But you shouldn't be having java code in module 3 (as it is a war module) so
that should be split in two to follow best practice... It sounds like you
have multiple-module-phobia... That way is the way of fear. Fear leads to
anger... Anger leads to the dark side. Embrace the many modules on the light
side of the maven-force

On Wednesday, 28 August 2013, Richard Sand wrote:

 I thought about it, but it's very complicated to obfuscate in stages 
 because some of the classes to be obfuscated in projects 3 and 4 rely 
 on the obfuscated entrypoints in project 2.

 Basically obfuscation has to be the last thing done before final 
 testing and assembly.

 -Original Message-
 From: Stephen Connolly
 [mailto:stephen.alan.conno...@gmail.com]
 Sent: Wednesday, August 28, 2013 3:12 PM
 To: Maven Developers List
 Subject: Re: artifact attached by plugin not appearing in subsequent 
 plugins

 Sounds like you should be obfuscating *in* module 2 and then adding 
 the dependency with classifier to module 3 and 4 (or do two 
 obfuscations in module 2 if you need different flavours)

 On Wednesday, 28 August 2013, Richard Sand wrote:

  Hi all,
 
  Wayne, thanks for the feedback. I understand where you're coming from.
  I've written out here a concrete example of what I want to do with 
  these plugins, and with maven in general wrt passing generated 
  artifacts between plugins. Hopefully others will weigh in on whether 
  this approach is good and maven-esque.
 
  I have 4 maven projects:
 
  1) a global parent pom
  2) a common API project which creates a jar
  3) a web application project (a Jersey JAX-RS service fwiw) which 
  creates a war and depends on #2
  4) a java client for #3 (based on JerseyClient), which also depends 
  on
  #2
 
  1st example: In project 3, I use my obfuscation plugin on the common 
  API and the generated classes. That resulting jar file needs to go 
  into the WEB-INF/lib folder of the war file. My proposed change to 
  the maven war plugin would make it automatically look for a 
  generated jar artifact attached to the project from a previous 
  plug-in and package it appropriately. The current alternative, which 
  admittedly is no big deal, is to simply have the output artifact 
  created by my plugin go directly into WEB-INF/lib. But adding a 
  boolean config variable to maven-war-plugin to tell it to look for 
  attached jars and include them seems logical as well as trivial to 
  me, but this could just be because of my relative newness to all 
  things Maven
 
  2nd example: in project 4, I again use my obfuscation plugin on the 
  common API and generated classes, and then want to take that 
  artifact and, along with the rest of the dependencies, use the 
  maven-shade-plugin to build an uberjar. More problematic than 
  example 1 above because the shade plugin only looks at artifacts and 
  has no hook to include anything directly from the filesystem. So 
  what I did was modify the shade 2.1 source to include a new 
  configuration
 parameter. For simplicy's sake I simply called it 
  alternativeInputClassifier. When present, it instructs shade to 
  look for an attached artifact with the artifact name of the project, 
  of type jar, and with the specified

RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-28 Thread Richard Sand
Hi all,

Wayne, thanks for the feedback. I understand where you're coming from. I've 
written out here a concrete example of what I want to do with these plugins, 
and with maven in general wrt passing generated artifacts between plugins. 
Hopefully others will weigh in on whether this approach is good and maven-esque.

I have 4 maven projects:

1) a global parent pom
2) a common API project which creates a jar
3) a web application project (a Jersey JAX-RS service fwiw) which creates a war 
and depends on #2
4) a java client for #3 (based on JerseyClient), which also depends on #2

1st example: In project 3, I use my obfuscation plugin on the common API and 
the generated classes. That resulting jar file needs to go into the WEB-INF/lib 
folder of the war file. My proposed change to the maven-war-plugin would make 
it automatically look for a generated jar artifact attached to the project by a 
previous plugin and package it appropriately. The current alternative, which 
admittedly is no big deal, is to simply have the output artifact created by my 
plugin go directly into WEB-INF/lib. But adding a boolean config variable to 
maven-war-plugin to tell it to look for attached jars and include them seems 
logical as well as trivial to me - though this could just be because of my 
relative newness to all things Maven.

2nd example: in project 4, I again use my obfuscation plugin on the common API 
and generated classes, and then want to take that artifact and, along with the 
rest of the dependencies, use the maven-shade-plugin to build an uberjar. This 
is more problematic than example 1 above because the shade plugin only looks at 
artifacts and has no hook to include anything directly from the filesystem. So 
what I did was modify the shade 2.1 source to include a new configuration 
parameter. For simplicity's sake I called it alternativeInputClassifier. When 
present, it instructs shade to look for an attached artifact with the artifact 
name of the project, of type jar, and with the specified classifier, in lieu of 
processing only the main project as input. It was a very minor change and I'm 
happy to provide the diff for it. The result is that my pom.xml for project 4 
looks like below and works beautifully. One pom with two profiles:

    <dependencies>
        <dependency>
            <groupId>com.idfconnect.myproject</groupId>
            <artifactId>common-tools</artifactId>
        </dependency>
    </dependencies>
    <profiles>
        <profile>
            <!-- REGULAR (unobfuscated) PROFILE -->
            <id>regular</id>
            <activation>
                <activeByDefault>true</activeByDefault>
            </activation>
            <build>
                <plugins>
                    <!-- BEGIN SHADE -->
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-shade-plugin</artifactId>
                        <version>2.1-idfc1</version>
                        <executions>
                            <execution>
                                <phase>package</phase>
                                <goals>
                                    <goal>shade</goal>
                                </goals>
                                <configuration>
                                    <minimizeJar>true</minimizeJar>
                                    <shadedArtifactAttached>true</shadedArtifactAttached>
                                    <shadedClassifierName>full</shadedClassifierName>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                    <!-- END SHADE -->
                </plugins>
            </build>
        </profile>

        <!-- OBFUSCATION PROFILE -->
        <profile>
            <id>obfuscate</id>
            <build>
                <plugins>
                    <!-- BEGIN OBFUSCATE -->
                    <plugin>
                        <groupId>com.idfconnect.devtools</groupId>
                        <artifactId>idfc-proguard-maven-plugin</artifactId>
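
The diff itself isn't reproduced here, but the lookup described above might 
look roughly like the sketch below inside the shade mojo. This is illustrative 
only - alternativeInputClassifier is the only name taken from the proposal; 
the class and method names are hypothetical:

    // Sketch only -- not the actual shade 2.1 diff. Finds an attached jar
    // carrying the configured classifier, in lieu of the main artifact.
    import java.io.File;

    import org.apache.maven.artifact.Artifact;
    import org.apache.maven.plugin.MojoExecutionException;
    import org.apache.maven.project.MavenProject;

    class AlternativeInputResolver {
        static File resolveInput(MavenProject project, String alternativeInputClassifier)
                throws MojoExecutionException {
            if (alternativeInputClassifier == null) {
                // default behaviour: shade the main project artifact
                return project.getArtifact().getFile();
            }
            for (Artifact artifact : project.getAttachedArtifacts()) {
                if ("jar".equals(artifact.getType())
                        && alternativeInputClassifier.equals(artifact.getClassifier())) {
                    return artifact.getFile();
                }
            }
            throw new MojoExecutionException("No attached jar artifact with classifier "
                    + alternativeInputClassifier);
        }
    }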
 

RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-28 Thread Richard Sand
I thought about it, but it's very complicated to obfuscate in stages because
some of the classes to be obfuscated in projects 3 and 4 rely on the
obfuscated entrypoints in project 2. 

Basically obfuscation has to be the last thing done before final testing and
assembly.

-Original Message-
From: Stephen Connolly [mailto:stephen.alan.conno...@gmail.com] 
Sent: Wednesday, August 28, 2013 3:12 PM
To: Maven Developers List
Subject: Re: artifact attached by plugin not appearing in subsequent plugins

Sounds like you should be obfuscating *in* module 2 and then adding the
dependency with classifier to module 3 and 4 (or do two obfuscations in
module 2 if you need different flavours)


RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-28 Thread Richard Sand
I always appreciate a good starwars reference. Actually there are 12 maven
modules overall in this particular project, so I don't think I fear multiple
modules. Could we have split the actual Jersey application classes into a
separate project outside of the war project - yes, agreed. 

What I do object to is creating another module with no purpose except to hold
a single build plugin. That way leads to the dreaded Maven Tribbles, and
nobody wants to see that...!

I'm just not following where this aversion to using multiple plugins
together is coming from. I really just don't get it. Can you address that?
I'll concede to you that I can allow the poms to prosper - but I'd like
to approach the conversation from the angle of why *wouldn't* we want the
output artifact of one plugin to be the input artifact to another? So all
kidding aside, if this is contrary to the Tao of Maven I'd really appreciate
understanding why. Thanks!

-Richard

-Original Message-
From: Stephen Connolly [mailto:stephen.alan.conno...@gmail.com] 
Sent: Wednesday, August 28, 2013 4:20 PM
To: Maven Developers List
Subject: Re: artifact attached by plugin not appearing in subsequent plugins

But you shouldn't be having java code in module 3 (as it is a war module) so
that should be split in two to follow best practice... It sounds like you
have multiple-module-phobia... That way is the way of fear. Fear leads to
anger... Anger leads to the dark side. Embrace the many modules on the light
side of the maven-force


RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-27 Thread Richard Sand
I've tried it to the extent possible, yes. I attach with the classifier
"small" to indicate the obfuscated (and shrunken) jar. The maven-war-plugin
will look at MavenProject.getArtifacts() to pull in the scoped
dependencies, but it doesn't look at MavenProject.getAttachedArtifacts().
There just isn't any hook there to get the plugin to process the attached
artifact - the best I can do is write it directly into WEB-INF/lib and let the
packager pick it up.

So my proposal was to add a Boolean to the war plugin to tell it to include
any attached jar artifacts. I could optionally add a classifier name, e.g.
add only an attached artifact matching the artifactid of the project and
with this classifier. Or a list of any arbitrary artifact ID strings which
match attached artifacts... for my specific use case simply the Boolean will
do. 
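
To make the proposal concrete, a rough sketch of the war-plugin side is below.
The parameter name (includeAttachedArtifacts) and the copy step are my own
illustration, not the actual MWAR-304 patch:

    // Illustrative sketch only -- the flag name is hypothetical.
    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.StandardCopyOption;

    import org.apache.maven.artifact.Artifact;
    import org.apache.maven.project.MavenProject;

    class AttachedArtifactPackager {
        /** Proposed flag: when true, attached jar artifacts are packaged too. */
        private boolean includeAttachedArtifacts;

        void copyAttachedJars(MavenProject project, File webInfLib) throws IOException {
            if (!includeAttachedArtifacts) {
                return;
            }
            for (Artifact artifact : project.getAttachedArtifacts()) {
                // only jars with a real file on disk end up in WEB-INF/lib
                if ("jar".equals(artifact.getType()) && artifact.getFile() != null) {
                    File target = new File(webInfLib, artifact.getFile().getName());
                    Files.copy(artifact.getFile().toPath(), target.toPath(),
                            StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }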

And similarly I want to make the same change in the maven shade plugin. The
shade plugin is very focused on the primary artifact of the project, so for
that I was thinking of just adding the Boolean and classifier - as Stephen
points out, the attached artifact with classifier should have the same
transitive dependencies anyway.

The larger question is how to move generated artifacts from one plugin to
another. The pushback I've gotten on these proposed changes to the plugins
is that attachment isn't supposed to be used for that purpose - attachment
is solely for putting artifacts into the repository, and not for passing
newly generated artifacts between plugins.

New feature perhaps? The idea being the notion of transient artifacts, which
only persist for the life of a maven session. Then other plugins in the same
execution that could use these transient artifacts would either automatically
look for them or be configured to do so, like I suggested above for the war
and shade plugins. The only difference between transient and attached
artifacts is that the transient ones wouldn't get persisted to a repository.
And if you want to attach one, a plugin like the mojo buildhelper could have a
goal to attach any (or specific) transient artifacts.
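
To sketch the shape such a feature might take - purely a thought experiment,
nothing like this exists in Maven and every name below is hypothetical:

    // Thought experiment only: no such API exists in Maven.
    import java.io.File;
    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    /** A generated artifact that lives only for the current Maven session. */
    final class TransientArtifact {
        final String type;        // e.g. "jar"
        final String classifier;  // e.g. "small"
        final File file;          // somewhere under target/, never deployed

        TransientArtifact(String type, String classifier, File file) {
            this.type = type;
            this.classifier = classifier;
            this.file = file;
        }
    }

    /** Session-lifetime registry: producers register, later plugins look up. */
    final class TransientArtifactRegistry {
        // keyed by module id (groupId:artifactId), insertion-ordered
        private final Map<String, List<TransientArtifact>> byModule =
                new LinkedHashMap<String, List<TransientArtifact>>();

        void register(String moduleId, TransientArtifact artifact) {
            List<TransientArtifact> list = byModule.get(moduleId);
            if (list == null) {
                list = new ArrayList<TransientArtifact>();
                byModule.put(moduleId, list);
            }
            list.add(artifact);
        }

        List<TransientArtifact> lookup(String moduleId) {
            List<TransientArtifact> list = byModule.get(moduleId);
            return list != null ? list : new ArrayList<TransientArtifact>();
        }
    }

A buildhelper-style goal could then walk lookup() results and call
MavenProjectHelper.attachArtifact() for anything that should outlive the
session.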

If I'm talking crazy, please forgive me - school started today for my 2nd
grader and I haven't had my coffee yet. :-)

Thanks!

-Richard

-Original Message-
From: Stephen Connolly [mailto:stephen.alan.conno...@gmail.com] 
Sent: Tuesday, August 27, 2013 3:32 AM
To: Maven Users List
Subject: Re: artifact attached by plugin not appearing in subsequent plugins

On Tuesday, 27 August 2013, Barrie Treloar wrote:

 On 21 August 2013 00:42, Richard Sand rs...@idfconnect.com wrote:
  Is there any merit to the idea of having a configuration option in
 maven-war-plugin to include attached artifacts in the webapp in the 
 same way it includes dependent artifacts?

 The MavenProjectHelper.attachArtifact requires a type or classifier in 
 order to be used.

 The default artifact and the attached artifact are different 
 dependencies, and so you need to specify them twice for them to be 
 included in the webapp.
 One for the default artifact and one for the attached artifact (which 
 has the type or classifier specified so that you can tell the 
 difference).

 I don't use the war plugin, but specifying the dependency on the 
 attached artifact should mean that it appears in your war.


And better still, since the obfuscated artifact *should* have the same
transitive dependencies, it would be the one case where two artifacts from
the same module would be ok.

OTOH if obfuscation adds shade-like dependency reduction *then* it needs to
get a separate GAV



 Have you tried this?



--
Sent from my phone



-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-26 Thread Richard Sand
Sorry just correcting myself, I was referring to the maven shade plugin not 
mojo shade plugin.

-Richard


RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-26 Thread Richard Sand
Hi all,

Mirko thanks for your reply. I think the plugin API should have *some* simple 
mechanism for transferring a generated artifact from one plugin to the next.

In my example, the project creates a standard named artifact, i.e. 
${project.build.finalName}.war, and the obfuscation plugin also attaches a new 
(generated) jar artifact with classifier "small", i.e. 
${project.build.finalName}-small.jar, containing specifically the classes it 
has processed.

To get the war file to include this new generated artifact, I have to place it 
directly into the WEB-INF/lib folder, whereas by default it would go into 
${project.build.directory}. Obviously this isn't burdensome. But my question 
was whether it would make sense to have the war plugin consider attached 
artifacts. I actually created issue MWAR-304 with a simple patch to do this, 
but the developer agreed with your point that attaching an artifact should only 
be used for placing said artifact into a repository, not as a mechanism for 
propagating the artifact to other plugins 
(https://jira.codehaus.org/browse/MWAR-304).

However this general use case (propagating a generated artifact between 
plugins) continues to vex me. Example - I tried using the mojo shade plugin to 
generate a jar to include in a web application. Same type of use case - I want 
the shade plugin to use the artifact generated by a previous plugin. The shade 
plugin only accepts an artifact as input; I cannot give it a path to a 
filename to include as I did above with the war plugin. So either way I need to 
modify the shade plugin - either to accept filesystem paths to include, or a 
Boolean to tell it to check for dynamically attached artifacts. Since Maven 3.0 
includes MavenProject.getAttachedArtifacts(), it seems silly not to use it. If 
a plugin is going to accept artifacts as input, why shouldn't an attached 
artifact be considered? It seems like the natural and transparent mechanism to 
propagate generated, attached artifacts between plugins. Just choose your 
classifier and go. 

Granted, the shade plugin should also have a parameter to include filesystem 
paths.

Best regards,

Richard

-Original Message-
From: Mirko Friedenhagen [mailto:mfriedenha...@gmail.com] 
Sent: Tuesday, August 20, 2013 11:40 AM
To: Maven Users List; Maven Developers List
Subject: RE: artifact attached by plugin not appearing in subsequent plugins

Hello Richard,

x-posted to dev, as the war-plugin is a core-plugin:
- IMO attaching would be the wrong term as it has another meaning.
- This is more of a generated jar (as generated sources,  classes etc.)
- IMO packages should go in Maven modules of their own.

Regards Mirko
--
Sent from my mobile
On Aug 20, 2013 5:13 PM, Richard Sand rs...@idfconnect.com wrote:

 Is there any merit to the idea of having a configuration option in 
 maven-war-plugin to include attached artifacts in the webapp in the 
 same way it includes dependent artifacts?

 -Richard

 -Original Message-
 From: Mirko Friedenhagen [mailto:mfriedenha...@gmail.com]
 Sent: Tuesday, August 20, 2013 6:20 AM
 To: Maven Users List
 Subject: RE: artifact attached by plugin not appearing in subsequent 
 plugins

 Richard,

 AFAIK attachArtifact just tells Maven to install an additional binary 
 to its local cache resp. to deploy it to the distribution repository.

 What you want, as far as I understand, is to create an artifact which 
 will be picked up later on and included in a war? You should probably 
 create a separate module project, which creates the jar and just 
 include this jar as a runtime dependency in your war project.

 Regards Mirko
 --
 Sent from my mobile
 On Aug 20, 2013 7:42 AM, Richard Sand rs...@idfconnect.com wrote:

  I concluded that this was a missing feature of maven-war-plugin, 
  where it simply wasn't looking to see if there were attached resources.
 
  I supplied a simple patch to the handleArtifacts() method to have 
  that method also handle attached artifacts. You can see the report here:
  https://jira.codehaus.org/browse/MWAR-304
 
  -Richard
 
  -Original Message-
  From: Richard Sand [mailto:rs...@idfconnect.com]
  Sent: Monday, August 19, 2013 6:19 PM
  To: 'Maven Users List'
  Subject: artifact attached by plugin not appearing in subsequent 
  plugins
 
  Hi all - I've been stuck for a while trying to get an artifact 
  injected by a plugin to apply to subsequent plugins/goals in a 
  project.
 
  I have a project which generates a web application. My use case here 
  is the obfuscator plugin which I wrote, which creates a jar file 
  called projectname-small.jar. The plugin creates the jar file using 
  MavenProjectHelper.attachArtifact(). The plugin executes during the 
  packaging phase, before the maven-war-plugin. The jar file is 
  created successfully, and the call to attachArtifact() returns with 
  no errors, but the maven-war-plugin does not see the jar file and 
  therefore doesn't include it in the results. When I turn on 
  debugging
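
For reference, the attachment call under discussion is essentially a
one-liner from a mojo; a sketch, assuming the plexus-style javadoc-tag
injection plugins used at the time (the field and method names are
illustrative):

    import java.io.File;

    import org.apache.maven.plugin.AbstractMojo;
    import org.apache.maven.project.MavenProject;
    import org.apache.maven.project.MavenProjectHelper;

    public abstract class ObfuscationMojoSketch extends AbstractMojo {
        /** @component */
        private MavenProjectHelper projectHelper;

        /** @parameter default-value="${project}" @readonly */
        private MavenProject project;

        protected void attachSmallJar(File obfuscatedJar) {
            // registers the jar as an attached artifact: install/deploy will
            // pick it up, but (as described above) war/shade will not
            projectHelper.attachArtifact(project, "jar", "small", obfuscatedJar);
        }
    }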

RE: artifact attached by plugin not appearing in subsequent plugins

2013-08-26 Thread Richard Sand
Hi Wayne - that seems a very inefficient approach, having 5 or 6 separate 
modules to manage to achieve a single assembly. The point is that maven does 
have phases, goals, lifecycles - why not use them? The MavenProject object 
already provides the mechanism for one plugin to see the attachments from a 
previous plugin, so again it’s a why-not question.

The way you present the choice of having a single maven module certainly makes 
it sound distasteful, but I think dividing the construction of a single atomic 
component into multiple modules because the plugins cannot be chained together 
is more unappealing. Especially when the maven architecture and API make it so 
easy to do.

BTW I haven't touched Ant in at least 6 years, so I doubt I'm an Ant-oriented 
person.  :-)

-Richard

-Original Message-
From: Wayne Fay [mailto:wayne...@gmail.com]
Sent: Monday, August 26, 2013 5:59 PM
To: Maven Users List
Subject: Re: artifact attached by plugin not appearing in subsequent plugins

 Mirko thanks for your reply. I think the plugin API should have *some* 
 simple mechanism for transferring a generated artifact from one plugin to 
 the next.

One way people on this list implement this is by viewing their various Maven 
modules as being a series of transformations which are applied to successive 
artifacts.

lib, common, etc
depended on by war
depended on by obfuscator
depended on by shader
depended on by assembly
eventually produces an obfuscated, shaded war file in a tar.gz along with the 
latest compatible version of Tomcat in an OS-specific package ready for simple 
deployment.

An Ant-oriented person might put all this in a single Maven module, binding 
the various plugins to different phases and hard-coding the configuration to 
force this pipeline into a single-module build. A Maven-oriented person 
probably would not.

Wayne





-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org