Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Christian Schulte
Am 08/31/16 um 22:51 schrieb Stephen Connolly:
> So for the tests-jar.., that would have a declared dependency on the
> regular .jar and provide the tree for the test-runtime scope... It may even
> be that there are therefore closer overrides of the main artifact's
> dependencies in the test jar's tree... That should all be fine.

So the test jar's tree is different from the main jar's tree. The selector
used to distinguish the two in that scenario is not the scope, but the
artifact coordinates (jar vs. test-jar), correct? Where I say "scope",
you say "coordinates", correct? I think I've got it now.
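For reference, the coordinates-based selection shows up in a consumer POM as a dependency on the test-jar type rather than on a scope (groupId/artifactId below are hypothetical):

```xml
<!-- Depends on the test jar; selected by coordinates (type), not scope. -->
<dependency>
  <groupId>org.example</groupId>        <!-- hypothetical coordinates -->
  <artifactId>example-core</artifactId>
  <version>1.0</version>
  <type>test-jar</type>                 <!-- picks the test-jar artifact -->
  <scope>test</scope>
</dependency>
```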

Regards,
-- 
Christian


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Re: Questions about how to contribute to Plexus

2016-08-31 Thread Gabriel Belingueres
Hi Robert:

Regarding Plexus components, I created a couple of pull requests on
plexus-interpolator (and some related maven-filtering patches).
I have a few free hours this week that I could use to merge those PRs to
speed up a new release. Is it possible to assign me write permission to
that repo?
In addition, I saw that the Plexus components are not configured in the
public SonarQube instance (https://sonarqube.com/), so I could ask for
them to be included there too.

Gabriel

2016-08-30 12:20 GMT-03:00 Robert Scholte :

> Hi,
>
> all development on Plexus and Modello is done at
> https://github.com/codehaus-plexus
> Since Maven is probably one of the few projects still using this, we
> haven't invested in all the infra-related stuff like mailing lists, etc.
> So we have the sources and the documentation; that's about it.
> This project is maintained by a subset of the Apache Maven team, so we can
> fix our own issues.
> The best way is to create pull requests there and ping us if you're waiting
> too long for a response.
>
> thanks,
> Robert
>
>
> On Mon, 29 Aug 2016 19:57:19 +0200, Plamen Totev 
> wrote:
>
> Hi,
>>
>> Recently I was granted write access to the plexus-io and plexus-archiver
>> GitHub repositories. I'm not sure where I can find any guides for
>> contributors. Also, I can't find the Plexus project mailing list, so I'm
>> not sure where to write if I have questions related to the project. I
>> would love to contribute to the project and help with its maintenance,
>> but I'm not sure what to do, so I'd be very grateful if you could give me
>> some directions on where I can get more info or help.
>>
>> Thanks,
>>
>> Plamen Totev
>>
>
>


Re: JUnit categories in surefire/failsafe

2016-08-31 Thread Christopher
On Wed, Aug 31, 2016 at 4:43 AM Tibor Digana  wrote:

> >>One thing we noticed was that the user property is the same for both
> plugins.
> We are aware of this, and we will break backwards compatibility.
> This change is planned for Surefire 3.0.
> I am thinking of a temporary fix, but it's better not to do it.
> Please reference and use these user properties, since they will be in 3.0:
> ${surefire.groups}
> ${failsafe.groups}
>
>
Thanks. We'll do that. We had created our own, but it's better to use the
property names which will appear in future versions of the plugins.


> Do you still need "_ALL_, _NONE_, _UNCATEGORIZED_"?
>
>
>
Not at the moment, and these were secondary. After thinking about it, "ALL"
and "NONE" aren't needed, so long as it's easy enough to set the
groups/excludedGroups properties to empty with an override. My main concern
about those two options was that it might be hard to set a property to
"empty" to override a default non-empty value, because Maven might just
treat it as though it weren't set at all, and ignore it. I'd have to
experiment to see if that's the case, or if it properly sets it to null or
the empty string when it's in the DOM without a value or passed as a system
property without a value.

The most useful of the three would be UNCATEGORIZED, so it can be combined
with other categories, and I'd expect that would require JUnit support
underneath.
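The _UNCATEGORIZED_ semantics can be sketched in plain Java (the annotation below is a hypothetical stand-in, not JUnit's real `@Category`): select classes whose declared categories intersect the requested set, treating classes with no annotation as uncategorized.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.Arrays;

public class CategoryFilterSketch {
    // Hypothetical stand-in for JUnit's @Category annotation.
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Categories { Class<?>[] value(); }

    public interface Slow {}                 // a marker category

    @Categories(Slow.class)
    public static class SlowIT {}            // a categorized test class

    public static class PlainTest {}         // an uncategorized test class

    /** True if testClass declares one of the wanted categories, or if
     *  includeUncategorized is set and it declares none at all. */
    public static boolean matches(Class<?> testClass, Class<?>[] wanted,
                                  boolean includeUncategorized) {
        Categories ann = testClass.getAnnotation(Categories.class);
        if (ann == null) {
            return includeUncategorized;     // the _UNCATEGORIZED_ case
        }
        for (Class<?> declared : ann.value()) {
            if (Arrays.asList(wanted).contains(declared)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Class<?>[] wanted = { Slow.class };
        // groups = Slow,_UNCATEGORIZED_ would select both classes:
        System.out.println(matches(SlowIT.class, wanted, true));
        System.out.println(matches(PlainTest.class, wanted, true));
        // groups = Slow alone excludes the uncategorized class:
        System.out.println(matches(PlainTest.class, wanted, false));
    }
}
```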


>
> On Wed, Aug 31, 2016 at 10:23 AM, Tibor Digana 
> wrote:
>
> > Hi Christopher,
> >
> > Something off-topic first. I will answer your email, but first I have a
> > question for you and the Accumulo project.
> > I visited Accumulo circa one week ago. Why does the build [1] hang on an
> > IT test executed by maven-failsafe-plugin:2.19.1?
> > I asked the INFRA team to display the Thread Dump button. Do you see
> > this button, and can you grab the thread dump? I would like to see
> > what's going on in the Failsafe plugin. We have such an issue [3], but
> > now it is most probably a user error, because the developer overrides
> > stdout via System.setOut(); this hypothesis has not been confirmed yet
> > because it's too early.
> >
> > [1] https://builds.apache.org/job/Accumulo-master-IT/
> > [2]
> https://wiki.jenkins-ci.org/display/JENKINS/Thread+Dump+Action+Plugin
> > [3] https://issues.apache.org/jira/browse/SUREFIRE-1193
> >
> >
> >
> > On Wed, Aug 31, 2016 at 3:00 AM, Christopher [via Maven] <
> > ml-node+s40175n5879500...@n5.nabble.com> wrote:
> >
> > > tl;dr - A proposal for config independence for groups/excludeGroups
> > > properties and some special keywords for ALL, NONE, and UNCATEGORIZED
> > > groups.
> > >
> > > ***
> > >
> > > In the Apache Accumulo project, we're currently in the process of
> > > trying to make use of JUnit categories to separate different classes
> > > of tests.
> > >
> > > So, we're using the maven-surefire-plugin and maven-failsafe-plugin
> > > configuration properties: groups, excludeGroups
> > >
> > > One thing we noticed was that the user property is the same for both
> > > plugins. This is a problem, because one cannot pass in a system
> > > property on the command line to affect one without affecting the
> > > other.
> > >
> > > I propose that maven-surefire-plugin and maven-failsafe-plugin
> > > deprecate the existing groups/excludeGroups properties, and replace
> > > them with tests.groups/tests.excludeGroups and
> > > it.groups/it.excludeGroups. (This should probably be done for other
> > > shared properties as well.)
> > >
> > > Users can simulate this by doing something like this:
> > > <groups>${it.groups}</groups>
> > >
> > > However, this may cause problems if the property is not defined; I
> > > haven't tested to be sure.
> > >
> > > ***
> > >
> > > That leads me to a question and a second proposal:
> > > Is there a way to specify uncategorized test classes? Or all test
> > classes?
> > > Or none?
> > >
> > > If not, I would like to propose that some special keywords be created
> > > which
> > > can represent:
> > > _ALL_, _NONE_, _UNCATEGORIZED_ (or similar)
> > >
> > > That way, users can do things like:
> > > <groups>my.special.Category,_UNCATEGORIZED_</groups>
> > > <excludedGroups>_NONE_</excludedGroups>
> > > or
> > > <excludedGroups>_NONE_</excludedGroups>
> > > or
> > > <groups>_ALL_</groups>
> > >
> > > These keywords may require some support from the underlying test
> > > framework,
> > > like JUnit, so I can understand if these keywords cannot happen.
> > >
> > > Even if the keywords cannot be made to work, I still think it'd be good
> > to
> > > deprecate-and-separate the properties for the two plugins, so they can
> be
> > > controlled independently with user properties.
> > >
> > > Thanks.
> > >
> > >

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
I agree with Andreas; I can see MRJars' usefulness too. APIs with lambdas
come to mind as a good example, or interfaces with default implementations;
those are just a couple of Java 8 features I've avoided adopting because of
needing backward compatibility.
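As an illustration of the kind of feature being held back, a default method lets an API grow without breaking existing implementors, which is exactly what cannot appear in the Java 6/7 variant of a jar (a minimal sketch; all names below are hypothetical):

```java
// A Java 8 interface-evolution example: adding a method with a default
// body does not break classes compiled against the old interface.
public class DefaultMethodSketch {
    interface Greeter {
        String name();

        // Added in "version 2" of the API; older implementors inherit it.
        default String greet() {
            return "Hello, " + name();
        }
    }

    // An implementor written against "version 1": only name() is defined.
    static class World implements Greeter {
        public String name() { return "World"; }
    }

    public static void main(String[] args) {
        System.out.println(new World().greet()); // prints "Hello, World"
    }
}
```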


Eclipse's Maven support in general is very subpar, and it may be difficult
to avoid having multiple projects for different runtimes. But I could
envision a scenario where I'd simply have my project configured in Eclipse
as Java 9, but in my POM configured for other builds. Eclipse wouldn't be
able to tell me whether a particular piece of code was compatible with the
other runtimes, but Maven sure would, and maybe Eclipse will add that
capability. In any event, I wouldn't want the limitations of a particular
IDE to drive the POM design, and in particular to require multi-module
projects when a single project could suffice. If I as the end user can live
with the limitations of the IDE (e.g. not being multi-runtime aware), then
I should be allowed to do so.


Richard Sand | CEO
IDF Connect, Inc.
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651




-- Original Message --
From: "Andreas Gudian" 
To: "Maven Developers List" 
Sent: 8/31/2016 4:52:16 PM
Subject: Re: Building jar targeting multiple Java versions, including 9


2016-08-31 22:02 GMT+02:00 Tibor Digana :

 >> So, we can try to have different source folders for different
 >> runtimes.

 No. If you are developers you would not say this.
 No developer would develop and merge the same code multiple times!
 Look, now let the maven-compiler-plugin download jars of the same
 artifact which were released for versions 1.7 and 1.8, and now let the
 compiler wrap those two versions together with the current version 1.9,
 which is ready to be released. Now you have a fat jar, an MRJar.
 This means any Java version of the artifact can be chosen on top of
 JDK 9.

 >> Most other plugins would need to be executed for each version as
 >> well - javadoc, junit

 No, because the previous versions were already tested and processed.

 >> Again, if they *don't* need a separate execution, then why is MRJar
 >> needed?

 Exactly, and now I cannot imagine a company which is going to
 complicate their project with a stupid MRJar. And why would they? If
 they have WildFly 8.2.1, which supports Java 8, do you think their EE
 project would struggle with an MRJar? Of course not. If they have
 WildFly 10.x supporting Java 9, then again no developer would build an
 MRJar.

 I think the MRJar is suitable only for JDK internals, and only because
 Oracle has a problem with the JCP: the JCP does not let Oracle break
 backwards compatibility and introduce dynamic language features. So
 Oracle is trying for a compromise.
 Oracle is trying to let you build the java.* JDK with javac,
 interchange internal modules, etc.
 Nothing for other non-JDK projects.



That's a bit off-topic, as this thread is about the _how_ and not the
_why_, but I'd like to point out that I disagree with you here ;).
MRJars would be great for a lot of frameworks and libraries that you
would _use_ in your end product. Frameworks that offer different
alternatives or extended features requiring a newer JDK now have to
build and maintain artifacts either with different versions or with
different artifactIds (e.g. funky-annotations and
funky-annotations-jdk8, where the -jdk8 variant contains annotations
with @Repeatable) -- there are a lot of examples out there. And that's
not only cumbersome for maintainers, but also for users (which is the
bad thing).

So go MRJars! Making them easily consumable for users is the top
priority, and it looks like a solid approach.

Making them easy to build for the maintainers is way less important (as
that's only a tiny percentage of the Maven users out there), but it
sure needs to work with different IDEs, and especially Eclipse would
have problems handling different target JDKs in the same project.

So for me as an Eclipse user, I'd have to take the road with the
multi-module projects. But that's okay, as I'm just building a
framework, and separating the different variants for different JDKs
also has a lot of benefits, as already described above: testing,
generating javadocs, and all that other stuff is already there and can
be used without having to change anything. For me, a project collecting
the internally separate artifacts and combining them into an MRJar
would be totally fine.

Andreas





 --
 Cheers
 Tibor









Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Robert Scholte
On Wed, 31 Aug 2016 22:02:04 +0200, Tibor Digana  
 wrote:



>> So, we can try to have different source folders for different runtimes.

No. If you are developers you would not say this.
No developer would develop and merge the same code multiple times!
Look, now let the maven-compiler-plugin download jars of the same
artifact which were released for versions 1.7 and 1.8, and now let the
compiler wrap those two versions together with the current version 1.9,
which is ready to be released. Now you have a fat jar, an MRJar.
This means any Java version of the artifact can be chosen on top of JDK9.

>> Most other plugins would need to be executed for each version as well -
>> javadoc, junit

No, because the previous versions were already tested and processed.

>> Again, if they *don't* need a separate execution, then why is MRJar
>> needed?

Exactly, and now I cannot imagine a company which is going to complicate
their project with a stupid MRJar. And why would they? If they have
WildFly 8.2.1, which supports Java 8, do you think their EE project would
struggle with an MRJar? Of course not. If they have WildFly 10.x
supporting Java 9, then again no developer would build an MRJar.

I think the MRJar is suitable only for JDK internals, and only because
Oracle has a problem with the JCP: the JCP does not let Oracle break
backwards compatibility and introduce dynamic language features. So
Oracle is trying for a compromise.
Oracle is trying to let you build the java.* JDK with javac, interchange
internal modules, etc.
Nothing for other non-JDK projects.



maven-shared-utils is a very good candidate for a multi-release jar.
It has a class called Java7Support [1] which uses reflection to take
advantage of Java 7 features, and otherwise falls back to Java 6. Thanks
to reflection, this code is compatible with Java 6.

There are probably more cases, and with this introduction I can imagine
there are projects that will consider using it.
Even if this is mainly introduced to solve JDK-internal issues (I doubt
that this is really true), it is great to see that such a feature is also
exposed and available to the Java community.
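A minimal sketch of the Java7Support pattern (a hypothetical class, not the real maven-shared-utils code): probe once for a Java 7 API via reflection, and guard every use behind the probe so the class still loads on Java 6.

```java
import java.lang.reflect.Method;

// Sketch of the Java7Support pattern: code compiled against Java 6 that
// uses java.nio.file.Files (a Java 7 API) only when it is present.
public class Java7ProbeSketch {
    private static final boolean IS_JAVA7;
    private static Method isSymbolicLink;

    static {
        boolean available = false;
        try {
            // Both lookups throw on a Java 6 runtime; then we fall back.
            Class<?> files = Class.forName("java.nio.file.Files");
            Class<?> path = Class.forName("java.nio.file.Path");
            isSymbolicLink = files.getMethod("isSymbolicLink", path);
            available = true;
        } catch (Exception reflectionFailed) {
            // Running on Java 6: leave available = false.
        }
        IS_JAVA7 = available;
    }

    public static boolean isAtLeastJava7() {
        return IS_JAVA7;
    }

    public static void main(String[] args) {
        System.out.println("Java 7 NIO.2 available: " + isAtLeastJava7());
    }
}
```

In a multi-release jar, the `META-INF/versions` variant could drop the reflection entirely and call the API directly.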


Robert

[1]  
https://maven.apache.org/shared/maven-shared-utils/xref/org/apache/maven/shared/utils/io/Java7Support.html


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Re: JUnit categories in surefire/failsafe

2016-08-31 Thread Christopher
On Wed, Aug 31, 2016 at 4:23 AM Tibor Digana  wrote:

> Hi Christopher,
>
> Something off-topic first. I will answer your email, but first I have a
> question for you and the Accumulo project.
> I visited Accumulo circa one week ago. Why does the build [1] hang on an
> IT test executed by maven-failsafe-plugin:2.19.1?
>

I wasn't aware it was hanging. That particular Jenkins job is an
experimental attempt by one of our developers to try to run all of our
tests in parallel, and it's quite spammy when there's a failure. I don't
track it. I'm tracking some of our other jobs.

Usually, though, our problem is with our tests, not necessarily failsafe,
but I can't be sure unless you point me directly to the failure, instead of
the job page.


> I asked the INFRA team to display the Thread Dump button. Do you see this
> button, and can you grab the thread dump? I would like to see what's
> going on in the Failsafe plugin. We have such an issue [3], but now it is
> most probably a user error, because the developer overrides stdout via
> System.setOut(); this hypothesis has not been confirmed yet because it's
> too early.
>
>
I don't see the button, but our tests do set the Surefire/Maven option to
redirect test output to a file. I didn't think we were doing
System.setOut() at all, but it looks like we are doing it in one failsafe
IT and one surefire test. I'll have to investigate these a bit more. I
think there's probably a better way to do those tests.

I'm adding our dev list to the distro. Let's continue this conversation
there, if necessary, and bring it back to the Maven list if we can figure
out what's going on.


> [1] https://builds.apache.org/job/Accumulo-master-IT/
> [2] https://wiki.jenkins-ci.org/display/JENKINS/Thread+Dump+Action+Plugin
> [3] https://issues.apache.org/jira/browse/SUREFIRE-1193
>
>
>
> On Wed, Aug 31, 2016 at 3:00 AM, Christopher [via Maven] <
> ml-node+s40175n5879500...@n5.nabble.com> wrote:
>
> > tl;dr - A proposal for config independence for groups/excludeGroups
> > properties and some special keywords for ALL, NONE, and UNCATEGORIZED
> > groups.
> >
> > ***
> >
> > In the Apache Accumulo project, we're currently in the process of
> > trying to make use of JUnit categories to separate different classes of
> > tests.
> >
> > So, we're using the maven-surefire-plugin and maven-failsafe-plugin
> > configuration properties: groups, excludeGroups
> >
> > One thing we noticed was that the user property is the same for both
> > plugins. This is a problem, because one cannot pass in a system
> > property on the command line to affect one without affecting the other.
> >
> > I propose that maven-surefire-plugin and maven-failsafe-plugin deprecate
> > the existing groups/excludeGroups properties, and replace them with
> > tests.groups/tests.excludeGroups, and it.groups/it.excludeGroups. (This
> > should probably be done for other shared properties as well.)
> >
> > Users can simulate this by doing something like this:
> > <groups>${it.groups}</groups>
> >
> > However, this may cause problems if the property is not defined; I
> > haven't tested to be sure.
> >
> > ***
> >
> > That leads me to a question and a second proposal:
> > Is there a way to specify uncategorized test classes? Or all test
> classes?
> > Or none?
> >
> > If not, I would like to propose that some special keywords be created
> > which
> > can represent:
> > _ALL_, _NONE_, _UNCATEGORIZED_ (or similar)
> >
> > That way, users can do things like:
> > <groups>my.special.Category,_UNCATEGORIZED_</groups>
> > <excludedGroups>_NONE_</excludedGroups>
> > or
> > <excludedGroups>_NONE_</excludedGroups>
> > or
> > <groups>_ALL_</groups>
> >
> > These keywords may require some support from the underlying test
> > framework,
> > like JUnit, so I can understand if these keywords cannot happen.
> >
> > Even if the keywords cannot be made to work, I still think it'd be good
> to
> > deprecate-and-separate the properties for the two plugins, so they can be
> > controlled independently with user properties.
> >
> > Thanks.
> >
> >

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Andreas Gudian
2016-08-31 22:02 GMT+02:00 Tibor Digana :

> >> So, we can try to have different source folders for different runtimes.
> No. If you are developers you would not say this.
> No developer would develop and merge the same code multiple times!
> Look, now let the maven-compiler-plugin download jars of the same
> artifact which were released for versions 1.7 and 1.8, and now let the
> compiler wrap those two versions together with the current version 1.9,
> which is ready to be released. Now you have a fat jar, an MRJar.
> This means any Java version of the artifact can be chosen on top of
> JDK9.
>
> >> Most other plugins would need to be executed for each version as well -
> >> javadoc, junit
> No, because the previous versions were already tested and processed.
>
> >> Again, if they *don't* need a separate execution, then why is MRJar
> >> needed?
> Exactly, and now I cannot imagine a company which is going to complicate
> their project with a stupid MRJar. And why would they? If they have
> WildFly 8.2.1, which supports Java 8, do you think their EE project
> would struggle with an MRJar? Of course not. If they have WildFly 10.x
> supporting Java 9, then again no developer would build an MRJar.
>
> I think the MRJar is suitable only for JDK internals, and only because
> Oracle has a problem with the JCP: the JCP does not let Oracle break
> backwards compatibility and introduce dynamic language features. So
> Oracle is trying for a compromise.
> Oracle is trying to let you build the java.* JDK with javac, interchange
> internal modules, etc.
> Nothing for other non-JDK projects.
>
>

That's a bit off-topic, as this thread is about the _how_ and not the
_why_, but I'd like to point out that I disagree with you here ;). MRJars
would be great for a lot of frameworks and libraries that you would _use_
in your end product. Frameworks that offer different alternatives or
extended features requiring a newer JDK now have to build and maintain
artifacts either with different versions or with different artifactIds
(e.g. funky-annotations and funky-annotations-jdk8, where the -jdk8 variant
contains annotations with @Repeatable) -- there are a lot of examples out
there. And that's not only cumbersome for maintainers, but also for users
(which is the bad thing).

So go MRJars! Making them easily consumable for users is the top priority,
and it looks like a solid approach.

Making them easy to build for the maintainers is way less important (as
that's only a tiny percentage of the Maven users out there), but it sure
needs to work with different IDEs, and especially Eclipse would have
problems handling different target JDKs in the same project.

So for me as an Eclipse user, I'd have to take the road with the
multi-module projects. But that's okay, as I'm just building a framework,
and separating the different variants for different JDKs also has a lot of
benefits, as already described above: testing, generating javadocs, and all
that other stuff is already there and can be used without having to change
anything. For me, a project collecting the internally separate artifacts
and combining them into an MRjar would be totally fine.

Andreas



>
> --
> Cheers
> Tibor
>
>
>


Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Stephen Connolly
On Wednesday 31 August 2016, Robert Scholte  wrote:

> On Wed, 31 Aug 2016 19:35:02 +0200, Stephen Connolly <
> stephen.alan.conno...@gmail.com> wrote:
>
> On Wednesday 31 August 2016, Christian Schulte  wrote:
>>
>> Am 08/31/16 um 18:39 schrieb Christian Schulte:
>>> > Am 08/31/16 um 07:52 schrieb Stephen Connolly:
>>> >> I've been thinking about what to call the "consumer Pom"...
>>> >>
>>> >> I think this is actually not a project object model, but the project
>>> >> dependency trees
>>> >>
>>> >> It should list each side artifact and their dependency trees...
>>> >>
>>> >> So for example:
>>> >>
>>> >> * the java doc artifacts should depend on the corresponding dependency
>>> java
>>> >> doc artifacts (in an ideal world) because we expect {@link} references
>>> >>
>>> >> * the source artifacts do not depend on anything else (normally) but
>>> for an
>>> >> über jar (which yes is a bad pattern) you would actually be correct to
>>> >> depend on the bundled artifacts source jars... So the concept still
>>> makes
>>> >> sense
>>> >>
>>> >> * the test jar artifact would have the full test dependency tree
>>> exposed as
>>> >> this would allow for test reuse
>>> >
>>> > +1
>>> >
>>> > Sounds like dependency trees by scope. The compile scope tree, the
>>> > runtime scope tree, the test scope tree, the documentation scope tree,
>>> > the source code scope tree, the invented by a 3rd party scope tree,
>>> etc.
>>> >
>>>
>>>
>>> public interface DependencyTreeProvider
>>> {
>>>
>>>   DependencyNode getDependencyTree( String logicalScopeName )
>>> throws IOException;
>>>
>>> }
>>>
>>
>>
>> Not `String logicalScopeName` but `Artifact artifact`
>>
>> Because... Otherwise we have to pick a defined list of scope names and
>> their meaning... Any defined list of scope names will be different... So
>> better is to just publish the one tree per artifact
>>
>> Exposing the logical scope is a mistake IMHO
>>
>> There is one gotcha though... Sometimes you need a transitive dependency
>> to
>> compile but do not need it at runtime... How to handle that needs some
>> thought... But right now it is a problem anyway, and it is probably safer
>> to just force it onto the "effective" tree and let the consumer who *knows
>> they don't really need it* strip it from their tree (at the final point of
>> use... Because it would also need to be on their compile path)
>>
>> I think the original view we had with maven where compile scope would not
>> be transitive is no longer compatible with JavaC and basically compile
>> scope will always need to be transitive.
>>
>> So I am -1 on exposing details like "scope" to consumers
>>
>> We do need to expose "optional" so the consumer can resolve version
>> conflicts with the optional dependency
>>
>> HTH
>>
>>
>>
> I agree that scope is something to filter on, it should not be part of the
> resolution result.
> The challenge I see is to select the proper artifact in case of multiple
> scopes.
> For example: compiler:compile wants all dependencies with the scopes
> 'compile' and 'provided' (and 'system'). If the same GA has different
> versions per scope, it must be clear which one to choose.
> I hope we can resolve this without the "nearest wins" strategy; the
> answer should somehow be in the .pdt.


Well I am not seeing a case for *consumers* to need different trees for the
same artifact.

The only case I can imagine is our *old* vision for compile scope... But I
think that need can be handled by optional in the worst case... My
preference is to let the consumer of the dependency remove the bits they
don't want...

In essence, each jar artifact's dependency tree would be that artifact's
effective runtime tree.

So for the tests-jar.., that would have a declared dependency on the
regular .jar and provide the tree for the test-runtime scope... It may even
be that there are therefore closer overrides of the main artifact's
dependencies in the test jar's tree... That should all be fine.

If you are compiling against the dependency and want to use a different
version of the dependencies then you *as consumer* can override the tree by
configuration... I think that is the best compromise.



>
> Robert
>
>
>>> The implementation of that method is broken down to just reading/parsing
>>> an XML file that way. You can implement that easily with every
>>> programming language of interest. There is no scope logic to implement.
>>> That's done at build time.
>>>
>>> When all those different trees have been fetched, the 'only' thing left
>>> to implement is conflict resolution / building the effective dependency
>>> tree. Since scopes have already been resolved, that could possibly also
>>> be broken down to just one very simple rule: nearest wins - no
>>> exceptions. Version ranges are only there to be able to check for
>>> conflicting ranges or hard requirements, right?
>>>
>>> Regards,
>>> --
>>> Christian
>>>
>>>
>>> 

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Tibor Digana
>> So, we can try to have different source folders for different runtimes.
No. If you are developers you would not say this.
No developer would develop and merge the same code multiple times!
Look, now let the maven-compiler-plugin download jars of the same
artifact which were released for versions 1.7 and 1.8, and now let the
compiler wrap those two versions together with the current version 1.9,
which is ready to be released. Now you have a fat jar, an MRJar.
This means any Java version of the artifact can be chosen on top of JDK9.

>> Most other plugins would need to be executed for each version as well -
>> javadoc, junit
No, because the previous versions were already tested and processed.

>> Again, if they *don't* need a separate execution, then why is MRJar
>> needed?
Exactly, and now I cannot imagine a company which is going to complicate
their project with a stupid MRJar. And why would they? If they have
WildFly 8.2.1, which supports Java 8, do you think their EE project would
struggle with an MRJar? Of course not. If they have WildFly 10.x
supporting Java 9, then again no developer would build an MRJar.

I think the MRJar is suitable only for JDK internals, and only because
Oracle has a problem with the JCP: the JCP does not let Oracle break
backwards compatibility and introduce dynamic language features. So
Oracle is trying for a compromise.
Oracle is trying to let you build the java.* JDK with javac, interchange
internal modules, etc.
Nothing for other non-JDK projects.

-- 
Cheers
Tibor




Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Robert Scholte
On Wed, 31 Aug 2016 19:35:02 +0200, Stephen Connolly  
 wrote:



On Wednesday 31 August 2016, Christian Schulte  wrote:


Am 08/31/16 um 18:39 schrieb Christian Schulte:
> Am 08/31/16 um 07:52 schrieb Stephen Connolly:
>> I've been thinking about what to call the "consumer Pom"...
>>
>> I think this is actually not a project object model, but the project
>> dependency trees
>>
>> It should list each side artifact and their dependency trees...
>>
>> So for example:
>>
>> * the java doc artifacts should depend on the corresponding dependency
>> java doc artifacts (in an ideal world) because we expect {@link}
>> references
>>
>> * the source artifacts do not depend on anything else (normally) but
>> for an über jar (which yes is a bad pattern) you would actually be
>> correct to depend on the bundled artifacts source jars... So the
>> concept still makes sense
>>
>> * the test jar artifact would have the full test dependency tree
>> exposed as this would allow for test reuse
>
> +1
>
> Sounds like dependency trees by scope. The compile scope tree, the
> runtime scope tree, the test scope tree, the documentation scope tree,
> the source code scope tree, the invented by a 3rd party scope tree,
> etc.


public interface DependencyTreeProvider
{

  DependencyNode getDependencyTree( String logicalScopeName )
throws IOException;

}



Not `String logicalScopeName` but `Artifact artifact`

Because... Otherwise we have to pick a defined list of scope names and
their meaning... Any defined list of scope names will be different... So
better is to just publish the one tree per artifact

Exposing the logical scope is a mistake IMHO

There is one gotcha though... Sometimes you need a transitive dependency
to compile but do not need it at runtime... How to handle that needs
some thought... But right now it is a problem anyway, and it is probably
safer to just force it onto the "effective" tree and let the consumer
who *knows they don't really need it* strip it from their tree (at the
final point of use... Because it would also need to be on their compile
path)

I think the original view we had with maven where compile scope would
not be transitive is no longer compatible with JavaC and basically
compile scope will always need to be transitive.

So I am -1 on exposing details like "scope" to consumers

We do need to expose "optional" so the consumer can resolve version
conflicts with the optional dependency

HTH




I agree that scope is something to filter on; it should not be part of the
resolution result.
The challenge I see is to select the proper artifact in case of multiple
scopes.
For example: compiler:compile wants all dependencies with the scopes
'compile' and 'provided' (and 'system'). If the same GA has different
versions per scope, it must be clear which one to choose.
I hope we can resolve this without a "nearest wins" strategy; the answer
should somehow be in the .pdt.


Robert



The implementation of that method is broken down to just reading/parsing
an XML file that way. You can implement that easily with every
programming language of interest. There is no scope logic to implement.
That's done at build time.

When all those different trees have been fetched, the 'only' thing left
to implement is conflict resolution / building the effective dependency
tree. Since scopes have already been resolved, that could possibly also
be broken down to just one very simple rule: nearest wins - no
exceptions. Version ranges are only there to be able to check for
conflicting ranges or hard requirements, right?

Regards,
--
Christian











Re: maven-cucumber-reporting Build failure

2016-08-31 Thread Kashif BHATTI
I was able to dig up a solution, so please ignore this email for now. If
I have any further questions I will reach back out again... hopefully
not.

Thanks for your help =)

On Mon, Aug 29, 2016 at 5:07 PM, Kashif BHATTI  wrote:

> Hello everyone,
>
> I am running a Selenium JAVA test with maven and when I run the project as
> a Maven test it runs all the tests practically just fine. However there are
> two major issues that I need help with
>
> 1. at the end of every run I get this... no clue why:
>
> Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2,257.786 sec 
> - in RunnerClass.CucumberRunnerTest
> Results: Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
>
> 2. I keep getting a BUILD failure, and I believe the reason is that it says 
> "Could not parse build number: ${${build.number}}" when it is about to 
> generate a Cucumber report: "Failed to execute goal 
> net.masterthought:maven-cucumber-reporting:2.0.0:generate (execution)..."
>
>
> I also see the report getting generated but the Build keeps failing which 
> throws everyone on the team off. I'm also attaching my POM.xml here. Any help 
> on these 2 issues would be greatly appreciated.
>
>
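[Editor's note: the POM was not archived, so the following is only a guess
at the usual cause of this failure - the reporting plugin being handed a
property that is never defined, so the placeholder survives interpolation
and the plugin sees the literal `${${build.number}}`. A hedged sketch of a
fallback (the `build.number` property name is taken from the error message;
the `buildNumber` parameter name is an assumption about the plugin):]

```xml
<properties>
  <!-- Hypothetical fallback: define build.number so the placeholder
       interpolates to a literal value when no CI server supplies it. -->
  <build.number>0</build.number>
</properties>

<!-- ...and pass the single-interpolated value to the plugin, e.g.: -->
<configuration>
  <buildNumber>${build.number}</buildNumber>
</configuration>
```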


Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Christian Schulte
Am 08/31/16 um 19:35 schrieb Stephen Connolly:
> On Wednesday 31 August 2016, Christian Schulte  wrote:
>>
>>
>> public interface DependencyTreeProvider
>> {
>>
>>   DependencyNode getDependencyTree( String logicalScopeName )
>> throws IOException;
>>
>> }
> 
> 
> Not `String logicalScopeName` but `Artifact artifact`
> 
> Because... Otherwise we have to pick a defined list of scope names and
> their meaning... Any defined list of scope names will be different... So
> better is to just publish the one tree per artifact

Are you referring to interface 'org.apache.maven.artifact.Artifact'?
Choose the tree to return based on scope? There is more than just one
tree per artifact. We need to provide many trees per artifact and then
need a way to select the tree of interest based on something. There have
been requests like "prioritize test dependencies over compile
dependencies when running tests". That's two different trees we need to
be able to read from the pdt files. Those pdt files need to support that
without the build tool having to apply any tree building logic. The pdt
files may not have been built using Maven, for example. Different tools
could apply different logic building those files. Consumer component is
not affected by this. If build tool A does not provide dependency tree
NAME of an artifact, build tool B gets no NAME tree. If build tool A
provides features like optional dependencies or dependency management
build tool B does not support, build tool B still can process the result
published by build tool A.

> 
> Exposing the logical scope is a mistake IMHO

[...]

> So I am -1 on exposing details like "scope" to consumers

Maybe scope is misleading. It's more a logical name of a dependency tree.

> 
> We do need to expose "optional" so the consumer can resolve version
> conflicts with the optional dependency

We need to keep the software processing those pdt files as dumb as
possible. Do as much as possible during building that pdt file. No need
for optional or provided in those pdt files. Optional already is a
feature of the build tool. Set it to true and the build tool will not
add that dependency to the pdt. Same with dependency management. Build
tool feature. pdt is all about pre-built final dependency trees the
consumer does not need to post-process in any way (build different
compile/test classpaths e.g.).
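[Editor's note: as a sketch of how dumb such a consumer could be - the .pdt
format below is entirely made up (no schema has been defined in this
thread), but it illustrates reading a pre-built named tree with nothing
beyond a stock XML parser, and no scope, optional, or dependency-management
logic on the reading side:]

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Hypothetical flat .pdt layout:
//   <pdt><tree name="compile"><dep ga="g:a" version="1"/>...</tree>...</pdt>
class PdtReader {
    // Returns "groupId:artifactId:version" strings of the requested tree.
    static List<String> dependencies(String pdtXml, String treeName) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(
                            pdtXml.getBytes(StandardCharsets.UTF_8)));
            List<String> result = new ArrayList<>();
            NodeList trees = doc.getElementsByTagName("tree");
            for (int i = 0; i < trees.getLength(); i++) {
                Element tree = (Element) trees.item(i);
                if (!treeName.equals(tree.getAttribute("name"))) {
                    continue; // only the requested pre-built tree
                }
                NodeList deps = tree.getElementsByTagName("dep");
                for (int j = 0; j < deps.getLength(); j++) {
                    Element dep = (Element) deps.item(j);
                    result.add(dep.getAttribute("ga") + ":"
                            + dep.getAttribute("version"));
                }
            }
            return result;
        } catch (Exception e) {
            throw new IllegalStateException("unreadable .pdt", e);
        }
    }
}
```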

Regards,
-- 
Christian





Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
Hi Paul - I certainly believe you that you've thought this through a lot 
further than I have. :-) I'm just speaking about a general sense of 
striving towards keeping projects as simple as possible. I think 
that sometimes it's a bit too easy to state that some build pattern 
should just be done with a multi-module project, when a small fix or 
improvement will neatly accomplish the same in a single POM. So I'm just 
making a general sweeping statement, not one specifically about MRJars.


Now I'm just typing out loud for discussion...

Java9 is definitely a paradigm shift, more so than almost any other 
release of Java. I see your point - the MRJar isn't just about having 
source and target versions in the compiler plugin. If the MRJar doesn't 
leverage new runtime features, e.g. lambdas in Java8, then what's the 
point of even having the MRJar? Just set the target to the lowest common 
denominator and be done with it.
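[Editor's note: for context, the mechanism behind a multi-release JAR
(JEP 238) is a versioned directory tree plus a manifest flag; on a Java 9+
runtime the versioned class shadows the baseline one, while older JREs
ignore the extra directory entirely:]

```
my-library.jar
├── META-INF/MANIFEST.MF           (contains: Multi-Release: true)
├── META-INF/versions/9/
│   └── com/example/Impl.class     (variant loaded on JRE 9+)
└── com/example/Impl.class         (baseline, used by JRE 8 and earlier)
```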


So the actual source code will be different for each Java runtime - 
otherwise there's no point. So we can try to have different source 
folders for different runtimes. Of course, most IDEs aren't going to 
support enforcing different compiler settings etc. for different source 
folders within a single project, but let's set that aside for now - 
just thinking about the Maven implications:

- Maven itself would have to be run with Java9
- The compiler plugin would need a different configuration for each Java 
version
- Most other plugins would need to be executed for each version as well - 
javadoc, junit, surefire etc. Again, if they *don't* need a separate 
execution, then why is MRJar needed?
- At the very least, each execution would have unique attached artifacts, 
either with classifiers or new artifact ids
- State would need to be maintained from plugin to plugin about the 
specific JRE in use for that invocation

That certainly sounds like a multi-module project - at least that may be 
the easiest solution. But if I wanted to use profiles and have a 
separate profile for each JRE, and at the very end of the POM aggregate 
the artifacts together - I can see that working as well.


You know, I've had similar struggles with multi-artifact projects as 
well, regardless of Java9. For example, for one particular project I build 
multiple WAR files, each with a different classifier, with some 
packaging customizations for different deployment models, e.g. I have one 
war file that includes slf4j+logback, and another war file that has the 
slf4j+jul bridge. There are other permutations too, e.g. some packages 
have obfuscated code while others do not. So I have multiple executions 
of maven-war-plugin, and in some cases I need to explicitly invoke the 
install & deploy plugins too. But I have resisted breaking this up into 
multiple projects with multiple poms simply because it's easier (for me) 
to work with all of the packaging and distribution logic in one place, 
and maven provides the capabilities with profiles, executions, etc. to 
give the flexibility I need. But there have certainly been times that 
I've thought about just going multi-module.


I guess my long rambling point is that I can see MRJar being done both 
ways, and wouldn't want to pigeon-hole anyone into doing it one way vs 
another.


Richard Sand | CEO
IDF Connect, Inc. 
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651




-- Original Message --
From: "Paul Benedict" 
To: "Maven Developers List" ; "Richard Sand" 


Sent: 8/31/2016 11:50:26 AM
Subject: Re: Building jar targeting multiple Java versions, including 9

Richard, I share your sentiment. I've given this subject some thought and
I've come to the conclusion that a full project life cycle for each Java
version is necessary for good measure. You will want to write main and test
code for each class. IDEs treat individual projects as wholly contained so
that means their own IDE project preferences too (code style, compiler
version, etc.). I believe a mental shift is necessary here (not directed at
you, per se, but directed toward anyone wanting to do a Multi-Release JAR)
to accept that these really are individual projects -- not just subsets.

However, I am completely willing to hear the opposite and learn why my
opinion is wrong too. Feel free to tell me why it's better as one project.
MRJAR feature is so new I am bound to learn much from others.

Cheers,
Paul

On Wed, Aug 31, 2016 at 10:46 AM, Richard Sand wrote:

 Understood. I guess it wouldn't be horrible if it required a multi-module
 maven project but I would still prefer to avoid introducing a requirement
 for multi-module projects anywhere.

 Richard Sand | CEO
 IDF Connect, Inc.
 2207 Concord Ave, #359
 Wilmington | Delaware 19803 | USA
 Office: +1 888 765 1611 | Fax: +1 866 765 7284
 Mobile: +1 267 984 3651





Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Stephen Connolly
On Wednesday 31 August 2016, Christian Schulte  wrote:

> Am 08/31/16 um 18:39 schrieb Christian Schulte:
> > Am 08/31/16 um 07:52 schrieb Stephen Connolly:
> >> I've been thinking about what to call the "consumer Pom"...
> >>
> >> I think this is actually not a project object model, but the project
> >> dependency trees
> >>
> >> It should list each side artifact and their dependency trees...
> >>
> >> So for example:
> >>
> >> * the java doc artifacts should depend on the corresponding dependency
> java
> >> doc artifacts (in an ideal world) because we expect {@link} references
> >>
> >> * the source artifacts do not depend on anything else (normally) but
> for an
> >> über jar (which yes is a bad pattern) you would actually be correct to
> >> depend on the bundled artifacts source jars... So the concept still
> makes
> >> sense
> >>
> >> * the test jar artifact would have the full test dependency tree
> exposed as
> >> this would allow for test reuse
> >
> > +1
> >
> > Sounds like dependency trees by scope. The compile scope tree, the
> > runtime scope tree, the test scope tree, the documentation scope tree,
> > the source code scope tree, the invented by a 3rd party scope tree, etc.
> >
>
>
> public interface DependencyTreeProvider
> {
>
>   DependencyNode getDependencyTree( String logicalScopeName )
> throws IOException;
>
> }


Not `String logicalScopeName` but `Artifact artifact`

Because... Otherwise we have to pick a defined list of scope names and
their meaning... Any defined list of scope names will be different... So
better is to just publish the one tree per artifact

Exposing the logical scope is a mistake IMHO

There is one gotcha though... Sometimes you need a transitive dependency to
compile but do not need it at runtime... How to handle that needs some
thought... But right now it is a problem anyway, and it is probably safer
to just force it onto the "effective" tree and let the consumer who *knows
they don't really need it* strip it from their tree (at the final point of
use... Because it would also need to be on their compile path)

I think the original view we had with maven where compile scope would not
be transitive is no longer compatible with JavaC and basically compile
scope will always need to be transitive.

So I am -1 on exposing details like "scope" to consumers

We do need to expose "optional" so the consumer can resolve version
conflicts with the optional dependency

HTH


>
> The implementation of that method is broken down to just reading/parsing
> an XML file that way. You can implement that easily with every
> programming language of interest. There is no scope logic to implement.
> That's done at build time.
>
> When all those different trees have been fetched, the 'only' thing left
> to implement is conflict resolution / building the effective dependency
> tree. Since scopes have already been resolved, that could possibly also
> be broken down to just one very simple rule: nearest wins - no
> exceptions. Version ranges are only there to be able to check for
> conflicting ranges or hard requirements, right?
>
> Regards,
> --
> Christian
>
>
>
>

-- 
Sent from my phone


Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Christian Schulte
Am 08/31/16 um 18:39 schrieb Christian Schulte:
> Am 08/31/16 um 07:52 schrieb Stephen Connolly:
>> I've been thinking about what to call the "consumer Pom"...
>>
>> I think this is actually not a project object model, but the project
>> dependency trees
>>
>> It should list each side artifact and their dependency trees...
>>
>> So for example:
>>
>> * the java doc artifacts should depend on the corresponding dependency java
>> doc artifacts (in an ideal world) because we expect {@link} references
>>
>> * the source artifacts do not depend on anything else (normally) but for an
>> über jar (which yes is a bad pattern) you would actually be correct to
>> depend on the bundled artifacts source jars... So the concept still makes
>> sense
>>
>> * the test jar artifact would have the full test dependency tree exposed as
>> this would allow for test reuse
> 
> +1
> 
> Sounds like dependency trees by scope. The compile scope tree, the
> runtime scope tree, the test scope tree, the documentation scope tree,
> the source code scope tree, the invented by a 3rd party scope tree, etc.
> 


public interface DependencyTreeProvider
{

  DependencyNode getDependencyTree( String logicalScopeName )
throws IOException;

}

The implementation of that method is broken down to just reading/parsing
an XML file that way. You can implement that easily with every
programming language of interest. There is no scope logic to implement.
That's done at build time.

When all those different trees have been fetched, the 'only' thing left
to implement is conflict resolution / building the effective dependency
tree. Since scopes have already been resolved, that could possibly also
be broken down to just one very simple rule: nearest wins - no
exceptions. Version ranges are only there to be able to check for
conflicting ranges or hard requirements, right?
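[Editor's note: as an illustration of how little the consumer would then
have to do, "nearest wins - no exceptions" is just a breadth-first walk
where the first occurrence of a groupId:artifactId is kept. The `Dep` type
below is a made-up stand-in, not Maven's or Aether's DependencyNode:]

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Made-up stand-in for a resolved dependency node; not a real Maven type.
class Dep {
    final String ga;          // groupId:artifactId
    final String version;
    final List<Dep> children;

    Dep(String ga, String version, List<Dep> children) {
        this.ga = ga;
        this.version = version;
        this.children = children;
    }
}

class NearestWins {
    // Breadth-first walk: shallower nodes are visited before deeper ones,
    // so the first version seen for a groupId:artifactId is the nearest.
    static Map<String, String> resolve(Dep root) {
        Map<String, String> winners = new LinkedHashMap<>();
        Deque<Dep> queue = new ArrayDeque<>(root.children);
        while (!queue.isEmpty()) {
            Dep d = queue.removeFirst();
            winners.putIfAbsent(d.ga, d.version);
            queue.addAll(d.children);
        }
        return winners;
    }
}
```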

Regards,
-- 
Christian





Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Christian Schulte
Am 08/31/16 um 07:52 schrieb Stephen Connolly:
> I've been thinking about what to call the "consumer Pom"...
> 
> I think this is actually not a project object model, but the project
> dependency trees
> 
> It should list each side artifact and their dependency trees...
> 
> So for example:
> 
> * the java doc artifacts should depend on the corresponding dependency java
> doc artifacts (in an ideal world) because we expect {@link} references
> 
> * the source artifacts do not depend on anything else (normally) but for an
> über jar (which yes is a bad pattern) you would actually be correct to
> depend on the bundled artifacts source jars... So the concept still makes
> sense
> 
> * the test jar artifact would have the full test dependency tree exposed as
> this would allow for test reuse

+1

Sounds like dependency trees by scope. The compile scope tree, the
runtime scope tree, the test scope tree, the documentation scope tree,
the source code scope tree, the invented by a 3rd party scope tree, etc.

> 
> Now I guess the question is if .pdt or .adt (artifact dependency trees) are
> too entrenched in some other domain that we'd want to avoid using one
> of those extensions
> 
> Next steps:
> 
> * start fleshing out a schema for the .pdt files
> * start fleshing out a spec for the repository layout (should be "parsable"
> by modelVersion 4.0.0 aware clients, but need to decide how to expose new
> features)

+1

Regards,
-- 
Christian





Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Paul Benedict
Richard, I share your sentiment. I've given this subject some thought and
I've come to the conclusion that a full project life cycle for each Java
version is necessary for good measure. You will want to write main and test
code for each class. IDEs treat individual projects as wholly contained so
that means their own IDE project preferences too (code style, compiler
version, etc.). I believe a mental shift is necessary here (not directed at
you, per se, but directed toward anyone wanting to do a Multi-Release JAR)
to accept that these really are individual projects -- not just subsets.

However, I am completely willing to hear the opposite and learn why my
opinion is wrong too. Feel free to tell me why it's better as one project.
MRJAR feature is so new I am bound to learn much from others.

Cheers,
Paul

On Wed, Aug 31, 2016 at 10:46 AM, Richard Sand  wrote:

> Understood. I guess it wouldn't be horrible if it required a multi-module
> maven project but I would still prefer to avoid introducing a requirement
> for multi-module projects anywhere.
>
> Richard Sand | CEO
> IDF Connect, Inc.
> 2207 Concord Ave, #359
> Wilmington | Delaware 19803 | USA
> Office: +1 888 765 1611 | Fax: +1 866 765 7284
> Mobile: +1 267 984 3651
>
>
>
>
> -- Original Message --
> From: "Paul Benedict" 
> To: "Richard Sand" 
> Cc: "ZML-Apache-Maven-Developers" 
> Sent: 8/31/2016 11:10:33 AM
> Subject: Re: Building jar targeting multiple Java versions, including 9
>
> To be clear, I was purely addressing the concern of a Multi-Release JAR.
>>
>> Cheers,
>> Paul
>>
>> On Wed, Aug 31, 2016 at 10:09 AM, Richard Sand 
>> wrote:
>>
>>>  I definitely concur with Robert's point: "I don't think we make developers
>>>  very happy if they are advised to have a multimodule project just to be
>>>  able to compile the module-info file.". I can live with the requirement
>>>  that I must run Maven with Java9 to support creating module-info and having
>>>  the additional modules created by a separate plugin. Simplicity wherever
>>>  possible.
>>>
>>>  Best regards,
>>>
>>>  Richard Sand | CEO
>>>  IDF Connect, Inc.
>>>  2207 Concord Ave, #359
>>>  Wilmington | Delaware 19803 | USA
>>>  Office: +1 888 765 1611 | Fax: +1 866 765 7284
>>>  Mobile: +1 267 984 3651
>>>
>>>
>>>  -- Original Message --
>>>  From: "Robert Scholte" 
>>>  To: "ZML-Apache-Maven-Developers" ; "Paul
>>> Benedict"
>>>  
>>>  Sent: 8/31/2016 10:39:52 AM
>>>  Subject: Re: Building jar targeting multiple Java versions, including 9
>>>
>>>  Hi Paul,
>>>

  no problem to move it to this thread. It is indeed about "The Maven
  Way", although we may need to start from the beginning, explaining the
  issue we're facing.

  Let's use ASM as an example, their 6.0_ALPHA has been built like this,
  although without Maven.
  ASM is an all purpose Java bytecode manipulation and analysis framework.
  IIRC their code is compatible with Java 1.2, but they've also added a
  module-info for those who want to use this dependency within a
  Java9/Jigsaw project.
  module-info MUST be built with Java9 with source/target or release = 9.
  Other sources must be built with an older JDK with source/target 1.2

  There are several ways to solve this:
  - multi-module project and multi release jar like Paul suggests. However,
  IIRC the specs say that the module-info MUST exist in the root.
  - 2 source folders, src/main/java and src/main/jigsaw, both writing to
  target/classes. Here it is quite clear what happens per source-folder.
  - 1 source folder and all the magic of calling javac twice in the
  maven-compiler-plugin. I started with this, but I don't like it. Details
  are below.
  - 1 source folder and 2 execution blocks (one excluding module-info, one
  only including module-info).

  We shouldn't be looking at Maven alone, but also at IDE support. AFAIK
  Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due to
  the m2eclipse extensions.

  Now back to Paul's suggestion. I don't think we make developers very
  happy if they are advised to have a multimodule project just to be able
  to compile the module-info file.
  I am aware that this is mainly an issue for library builders, end users
  simply build everything with Java9, no problems there. From library
  builders you can expect that they don't mind adding extra configuration to
  build their jar, but forcing them to set up a Maven multimodule project is
  not really nice.
  I would expect the module-info close to the related sourcefiles, so I
  would prefer in the same (single) MavenProject.

  *Unless* IDEs are so strong with handling multi release jars that it
  looks like I'm adjusting the module-info, even though it is actually
  located somewhere else.

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
Understood. I guess it wouldn't be horrible if it required a 
multi-module maven project but I would still prefer to avoid introducing 
a requirement for multi-module projects anywhere.


Richard Sand | CEO
IDF Connect, Inc.
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651




-- Original Message --
From: "Paul Benedict" 
To: "Richard Sand" 
Cc: "ZML-Apache-Maven-Developers" 
Sent: 8/31/2016 11:10:33 AM
Subject: Re: Building jar targeting multiple Java versions, including 9

To be clear, I was purely addressing the concern of a Multi-Release JAR.

Cheers,
Paul

On Wed, Aug 31, 2016 at 10:09 AM, Richard Sand wrote:


 I definitely concur with Robert's point: "I don't think we make developers
 very happy if they are advised to have a multimodule project just to be
 able to compile the module-info file.". I can live with the requirement
 that I must run Maven with Java9 to support creating module-info and having
 the additional modules created by a separate plugin. Simplicity wherever
 possible.

 Best regards,

 Richard Sand | CEO
 IDF Connect, Inc.
 2207 Concord Ave, #359
 Wilmington | Delaware 19803 | USA
 Office: +1 888 765 1611 | Fax: +1 866 765 7284
 Mobile: +1 267 984 3651


 -- Original Message --
 From: "Robert Scholte" 
 To: "ZML-Apache-Maven-Developers" ; "Paul 
Benedict"

 
 Sent: 8/31/2016 10:39:52 AM
  Subject: Re: Building jar targeting multiple Java versions, including 9

  Hi Paul,

  no problem to move it to this thread. It is indeed about "The Maven
  Way", although we may need to start from the beginning, explaining the
  issue we're facing.

  Let's use ASM as an example, their 6.0_ALPHA has been built like this,
  although without Maven.
  ASM is an all purpose Java bytecode manipulation and analysis framework.
  IIRC their code is compatible with Java 1.2, but they've also added a
  module-info for those who want to use this dependency within a
  Java9/Jigsaw project.
  module-info MUST be built with Java9 with source/target or release = 9.
  Other sources must be built with an older JDK with source/target 1.2

  There are several ways to solve this:
  - multi-module project and multi release jar like Paul suggests. However,
  IIRC the specs say that the module-info MUST exist in the root.
  - 2 source folders, src/main/java and src/main/jigsaw, both writing to
  target/classes. Here it is quite clear what happens per source-folder.
  - 1 source folder and all the magic of calling javac twice in the
  maven-compiler-plugin. I started with this, but I don't like it. Details
  are below.
  - 1 source folder and 2 execution blocks (one excluding module-info, one
  only including module-info).

  We shouldn't be looking at Maven alone, but also at IDE support. AFAIK
  Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due to
  the m2eclipse extensions.

  Now back to Paul's suggestion. I don't think we make developers very
  happy if they are advised to have a multimodule project just to be able
  to compile the module-info file.
  I am aware that this is mainly an issue for library builders, end users
  simply build everything with Java9, no problems there. From library
  builders you can expect that they don't mind adding extra configuration to
  build their jar, but forcing them to set up a Maven multimodule project is
  not really nice.
  I would expect the module-info close to the related sourcefiles, so I
  would prefer in the same (single) MavenProject.

  *Unless* IDEs are so strong with handling multi release jars that it
  looks like I'm adjusting the module-info, even though it is actually
  located somewhere else.

  So let's see the opinions from others.

  thanks,
  Robert

  On Wed, 31 Aug 2016 15:59:04 +0200, Paul Benedict wrote:

  Robert, I'm responding to dev@maven so we can discuss Maven
  philosophies...

  I believe the pattern should be based on a multi-module project. Each
  module should target the expected JDK version. Then introduce a new
  "mrjar" type for the parent that knows how to bind them all together
  into a Multi-Release JAR.

  Cheers,
  Paul

  On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte wrote:

  I've been working on the implementation of this in the
  maven-compiler-plugin, but I'm not really pleased with the result.
  The problem is that in the worst case scenario we have to work with 3
  different versions of Java:
  - The Maven Runtime (set as JAVA_HOME)
  - JDK for the module-info.java
  - JDK for all other source files.

  The example below worked because all three were set to JDK9.
  But based on the source/target of 1.6 I cannot predict which JDK is used,
  only that it is at least JDK6. Should the plugin switch to another JDK?
 

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Paul Benedict
To be clear, I was purely addressing the concern of a Multi-Release JAR.

Cheers,
Paul

On Wed, Aug 31, 2016 at 10:09 AM, Richard Sand  wrote:

> I definitely concur with Robert's point: "I don't think we make developers
> very happy  if they are advised to have a multimodule project just to be
> able to compile the module-info file.". I can live with the requirement
> that I must run Maven with Java9 to support creating module-info and having
> the additional modules created by a separate plugin. Simplicity wherever
> possible.
>
> Best regards,
>
> Richard Sand | CEO
> IDF Connect, Inc.
> 2207 Concord Ave, #359
> Wilmington | Delaware 19803 | USA
> Office: +1 888 765 1611 | Fax: +1 866 765 7284
> Mobile: +1 267 984 3651
>
>
> -- Original Message --
> From: "Robert Scholte" 
> To: "ZML-Apache-Maven-Developers" ; "Paul Benedict"
> 
> Sent: 8/31/2016 10:39:52 AM
> Subject: Re: Building jar targeting multiple Java versions, including 9
>
> Hi Paul,
>>
>> no problem to move it to this thread. It is indeed about "The Maven
>> Way",  although we may need to start from the beginning, explaining the
>> issue  we're facing.
>>
>> Let's use ASM as an example, their 6.0_ALPHA has been built like this,
>> although without Maven.
>> ASM is an all purpose Java bytecode manipulation and analysis framework.
>> IIRC their code is compatible with Java 1.2, but they've also added a
>> module-info for those who want to use this dependency within a
>> Java9/Jigsaw project.
>> module-info MUST be built with Java9 with source/target or release = 9.
>> Other sources must be built with an older JDK with source/target 1.2
>>
>> There are several ways to solve this:
>> - multi-module project and multi release jar like Paul suggest. However,
>> IIRC the specs say that the module-info MUST exist in the root.
>> - 2 source folders, src/main/java and src/main/jigsaw, both writing to
>> target/classes. Here it is quite clear what happens per source-folder.
>> - 1 source folder and all the magic of calling javac twice in the
>> maven-compiler-plugin. I started with this, but I don't like it. Details
>> are below.
>> - 1 source folder and 2 execution blocks (one excluding module-info, one
>> only including module-info).
>>
>> We shouldn't be looking at Maven alone, but also at IDE support. AFAIK
>> Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due to
>> the m2eclipse extensions.
>>
>> Now back to Pauls suggestion. I don't think we make developers very
>> happy  if they are advised to have a multimodule project just to be able
>> to  compile the module-info file.
>> I am aware that this is mainly an issue for library builders, end users
>> simply build everything with Java9, no problems there. From library
>> builders you can expect that they don't mind adding extra configuration to
>> build their jar, but forcing them to set up a Maven multimodule project is
>> not really nice.
>> I would expect the module-info close to the related sourcefiles, so I
>> would prefer in the same (single) MavenProject.
>>
>> *Unless* IDEs are so strong with handling multi release jars that it
>> looks  like I'm adjusting the module-info, even though it is actually
>> located  somewhere else.
>>
>> So let's see the opinions from others.
>>
>> thanks,
>> Robert
>>
>> On Wed, 31 Aug 2016 15:59:04 +0200, Paul Benedict 
>> wrote:
>>
>> Robert, I'm responding to dev@maven so we can discuss Maven
>>> philosophies...
>>>
>>> I believe the pattern should be based on a multi-module project. Each
>>> module should target the expected JDK version. Then introduce a new
>>> "mrjar"
>>> type for the parent that knows how to bind them all together into a
>>> Multi-Release JAR.
>>>
>>> Cheers,
>>> Paul
>>>
>>> On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte 
>>> wrote:
>>>
>>> I've been working on the implementation of this in the
 maven-compiler-plugin, but I'm not really pleased with the result.
 The problem is that in the worst case scenario we have to work with 3
 different versions of Java:
 - The Maven Runtime (set as JAVA_HOME)
 - JDK for the module-info.java
 - JDK for all other source files.

 The example below worked because all three were set to JDK9.
 But based on the source/target of 1.6 I cannot predict which JDK is
 used,
 only that it is at least JDK6. Should the plugin switch to another JDK?
 And if you want to compile with source/target 1.5 or less, you're in
 trouble. There's something called toolchain, where you can specify the
 JavaHome per version, but in case of maven-compiler-plugin it assumes
 that
 all java-related plugins and execution blocks want to use the same
 toolchain through the whole Maven project.
 The good news is that for the maven-jdeps-plugin I improved this part in
Maven 3.3.1, since this plugin only works with Java8 and above, which doesn't have to be the same JDK to compile the sources with.

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Richard Sand
I definitely concur with Robert's point: "I don't think we make 
developers very happy if they are advised to have a multimodule project 
just to be able to compile the module-info file." I can live with the 
requirement that I must run Maven with Java9 to support creating 
module-info and having the additional modules created by a separate 
plugin. Simplicity wherever possible.


Best regards,

Richard Sand | CEO
IDF Connect, Inc.
2207 Concord Ave, #359
Wilmington | Delaware 19803 | USA
Office: +1 888 765 1611 | Fax: +1 866 765 7284
Mobile: +1 267 984 3651

-- Original Message --
From: "Robert Scholte" 
To: "ZML-Apache-Maven-Developers" ; "Paul 
Benedict" 

Sent: 8/31/2016 10:39:52 AM
Subject: Re: Building jar targeting multiple Java versions, including 9


Hi Paul,

no problem to move it to this thread. It is indeed about "The Maven 
Way",  although we may need to start from the beginning, explaining the 
issue  we're facing.


Let's use ASM as an example, their 6.0_ALPHA has been built like this,  
although without Maven.
ASM is an all-purpose Java bytecode manipulation and analysis 
framework. IIRC their code is compatible with Java 1.2, but they've 
also added a module-info for those who want to use this dependency 
within a Java9/Jigsaw project.
module-info MUST be built with Java9 with source/target or release = 9. 
Other sources must be built with an older JDK with source/target 1.2.


There are several ways to solve this:
- multi-module project and multi-release jar like Paul suggests. 
However, IIRC the specs say that the module-info MUST exist in the 
root.
- 2 source folders, src/main/java and src/main/jigsaw, both writing to  
target/classes. Here it is quite clear what happens per source-folder.
- 1 source folder and all the magic of calling javac twice in the  
maven-compiler-plugin. I started with this, but I don't like it. 
Details  are below.
- 1 source folder and 2 execution blocks (one excluding module-info, 
one  only including module-info).


We shouldn't be looking at Maven alone, but also at IDE support. AFAIK  
Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due 
to  the m2eclipse extensions.


Now back to Paul's suggestion. I don't think we make developers very 
happy if they are advised to have a multimodule project just to be 
able to compile the module-info file.
I am aware that this is mainly an issue for library builders, end users 
 simply build everything with Java9, no problems there. From library  
builders you can expect that they don't mind adding extra configuration 
to  build their jar, but forcing them to set up a Maven multimodule 
project is  not really nice.
I would expect the module-info close to the related source files, so I  
would prefer it in the same (single) MavenProject.


*Unless* IDEs are so strong with handling multi release jars that it 
looks  like I'm adjusting the module-info, even though it is actually 
located  somewhere else.


So let's see the opinions from others.

thanks,
Robert

On Wed, 31 Aug 2016 15:59:04 +0200, Paul Benedict 
  wrote:


Robert, I'm responding to dev@maven so we can discuss Maven  
philosophies...


I believe the pattern should be based on a multi-module project. Each
module should target the expected JDK version. Then introduce a new  
"mrjar"

type for the parent that knows how to bind them all together into a
Multi-Release JAR.

Cheers,
Paul

On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte 
wrote:


I've been working on the implementation of this in the
maven-compiler-plugin, but I'm not really pleased with the result.
The problem is that in the worst case scenario we have to work with 3
different versions of Java:
- The Maven Runtime (set as JAVA_HOME)
- JDK for the module-info.java
- JDK for all other source files.

The example below worked because all three were set to JDK9.
But based on the source/target of 1.6 I cannot predict which JDK is  
used,
only that it is at least JDK6. Should the plugin switch to another 
JDK?

And if you want to compile with source/target 1.5 or less, you're in
trouble. There's something called toolchain, where you can specify 
the
JavaHome per version, but in case of maven-compiler-plugin it assumes 
 that

all java-related plugins and execution blocks want to use the same
toolchain through the whole Maven project.
The good news is that for the maven-jdeps-plugin I improved this part 
in

Maven 3.3.1, since this plugin only works with Java8 and above, which
doesn't have to be the same JDK to compile the sources with. Now you 
can
simple say: I want the toolchain for version X. This feature needs to 
be

added to the plugin.

That said I think I will write a recipe for this. This is first of 
all  an

issue for library writers who want to have their jar compatible with
multiple Java versions for their end users.
Result: One javac call per execution block as it was meant to be.

Re: POM Model version 4.1.0 in 3.4.0-SNAPSHOTs

2016-08-31 Thread Robert Scholte
On Wed, 31 Aug 2016 07:52:20 +0200, Stephen Connolly  
 wrote:



I've been thinking about what to call the "consumer Pom"...

I think this is actually not a project object model, but the project
dependency trees

It should list each side artifact and their dependency trees...

So for example:

* the java doc artifacts should depend on the corresponding dependency  
java

doc artifacts (in an ideal world) because we expect {@link} references

* the source artifacts do not depend on anything else (normally) but for  
an

über jar (which yes is a bad pattern) you would actually be correct to
depend on the bundled artifacts source jars... So the concept still makes
sense

* the test jar artifact would have the full test dependency tree exposed  
as

this would allow for test reuse

Now I guess the question is if .pdt or .adt (artifact dependency trees)  
are

too entrenched in some other domain that we'd want to avoid using one
of those extensions

Next steps:

* start fleshing out a schema for the .pdt files
* start fleshing out a spec for the repository layout (should be  
"parsable"

by modelVersion 4.0.0 aware clients, but need to decide how to expose new
features)



+1 for .pdt and the described next steps

Robert

On Tuesday 30 August 2016, Stephen Connolly  


wrote:


On 29 August 2016 at 23:27, Christian Schulte wrote:


Am 08/30/16 um 00:16 schrieb Paul Benedict:
> I see a deployed faulty "consumer pom" to be more harmful than
> generating it locally on demand. At least with the local one I can
upgrade
> my client to fix a dependency calculation. There will be no such  
relief

in
> the case of your proposal.

It's not my proposal but I agree to what is proposed. This whole
discussion started because users have requested to revert commits due  
to
compatibility issues. They want to keep such "faulty" behaviour. If  
they

want to fix it, they can deploy a new version using a more recent Maven
version. The older Maven version will then also see this new behaviour.
If the consumer pom contains the complete resolved dependency tree, the
code interpreting that data is not much more than downloading some  
files

from some repository. Yes. Repository information needs to be part of
that consumer pom as well. So the resolved dependency tree including
repository information from where to get the resolved artifacts. And we
also need to find a way to handle version ranges.
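A consumer pom along those lines might look roughly like this (a sketch only; the element names and layout are illustrative assumptions, since no schema has been agreed yet):

```xml
<consumerPom modelVersion="4.1.0">
  <!-- coordinates of the artifact being described -->
  <groupId>org.example</groupId>
  <artifactId>lib</artifactId>
  <version>1.0</version>

  <!-- repositories from which the resolved artifacts can be fetched -->
  <repositories>
    <repository>
      <id>central</id>
      <url>https://repo.maven.apache.org/maven2</url>
    </repository>
  </repositories>

  <!-- dependency tree fully resolved at build time -->
  <dependencies>
    <dependency>
      <groupId>org.example</groupId>
      <artifactId>dep-a</artifactId>
      <version>2.3</version>
    </dependency>
  </dependencies>
</consumerPom>
```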



So the way I see this, different types of artifacts have different tree
requirements.

I may have one type of artifact (i.e. jar) where it really is not
supported to have multiple versions of the same artifact on the same
classpath at the same time.

We currently view artifact dependency resolution as building a linear  
path
from the dependency tree based on the assumption of single version of  
any

dependency.

In the JavaScript world... and even in the OSGi world... that assumption
is simply not true.

This implies - to me - that the consumer pom should contain the tree  
that

was used at build time... *as a tree*

By all means, each node could include the version range as well as the
resolved version, e.g.


  
...
  
  ...
  


The child elements are the dependencies of the resolved version.
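The XML example above was stripped by the list archive; based on the surrounding text it may have looked roughly like this (element and attribute names are guesses for illustration, not Stephen's actual schema):

```xml
<dependency groupId="org.example" artifactId="dep-a"
            versionRange="[2.0,3.0)" resolvedVersion="2.3">
  <dependency groupId="org.example" artifactId="dep-b"
              versionRange="[1.1,)" resolvedVersion="1.4">
    ...
  </dependency>
  ...
</dependency>
```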

Now the consumer of the consumer pom has all the dependency information
used at build time as well as the information to perform  
substitutions...


This means that if I - as a consumer - am already pulling in a different
resolvedVersion (but valid within the advertised range) of a child
dependency and I am using the tree to produce a single-version  
classpath, I

can prune the tree and know I have removed the unrequired dependencies.

If I - as a consumer - need to produce a multi-version classpath -  
because
I am producing for an OSGi container - I can build the correct tree  
rather
than being forced to look at a flattened classpath that may not be  
aligned

with the requirements of the dependency system I am using

If I - as a consumer - decide that I want to push the ranges to newer,  
we

still have the range information to allow for range validation... but by
default I will be using the versions that were resolved at build time  
and

consequently tested with by the builder.

HTH

-Stephen




Regards,
--
Christian


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org

For additional commands, e-mail: dev-h...@maven.apache.org







-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Robert Scholte

Hi Paul,

no problem to move it to this thread. It is indeed about "The Maven Way",  
although we may need to start from the beginning, explaining the issue  
we're facing.


Let's use ASM as an example, their 6.0_ALPHA has been built like this,  
although without Maven.
ASM is an all-purpose Java bytecode manipulation and analysis framework.  
IIRC their code is compatible with Java 1.2, but they've also added a  
module-info for those who want to use this dependency within a  
Java9/Jigsaw project.
module-info MUST be built with Java9 with source/target or release = 9.  
Other sources must be built with an older JDK with source/target 1.2.


There are several ways to solve this:
- multi-module project and multi-release jar like Paul suggests. However,  
IIRC the specs say that the module-info MUST exist in the root.
- 2 source folders, src/main/java and src/main/jigsaw, both writing to  
target/classes. Here it is quite clear what happens per source-folder.
- 1 source folder and all the magic of calling javac twice in the  
maven-compiler-plugin. I started with this, but I don't like it. Details  
are below.
- 1 source folder and 2 execution blocks (one excluding module-info, one  
only including module-info).


We shouldn't be looking at Maven alone, but also at IDE support. AFAIK  
Netbeans and IntelliJ simply call Maven. Eclipse is probably hard due to  
the m2eclipse extensions.


Now back to Paul's suggestion. I don't think we make developers very happy  
if they are advised to have a multimodule project just to be able to  
compile the module-info file.
I am aware that this is mainly an issue for library builders, end users  
simply build everything with Java9, no problems there. From library  
builders you can expect that they don't mind adding extra configuration to  
build their jar, but forcing them to set up a Maven multimodule project is  
not really nice.
I would expect the module-info close to the related source files, so I  
would prefer it in the same (single) MavenProject.


*Unless* IDEs are so strong with handling multi release jars that it looks  
like I'm adjusting the module-info, even though it is actually located  
somewhere else.


So let's see the opinions from others.

thanks,
Robert

On Wed, 31 Aug 2016 15:59:04 +0200, Paul Benedict   
wrote:


Robert, I'm responding to dev@maven so we can discuss Maven  
philosophies...


I believe the pattern should be based on a multi-module project. Each
module should target the expected JDK version. Then introduce a new  
"mrjar"

type for the parent that knows how to bind them all together into a
Multi-Release JAR.

Cheers,
Paul

On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte 
wrote:


I've been working on the implementation of this in the
maven-compiler-plugin, but I'm not really pleased with the result.
The problem is that in the worst case scenario we have to work with 3
different versions of Java:
- The Maven Runtime (set as JAVA_HOME)
- JDK for the module-info.java
- JDK for all other source files.

The example below worked because all three were set to JDK9.
But based on the source/target of 1.6 I cannot predict which JDK is  
used,

only that it is at least JDK6. Should the plugin switch to another JDK?
And if you want to compile with source/target 1.5 or less, you're in
trouble. There's something called toolchain, where you can specify the
JavaHome per version, but in case of maven-compiler-plugin it assumes  
that

all java-related plugins and execution blocks want to use the same
toolchain through the whole Maven project.
The good news is that for the maven-jdeps-plugin I improved this part in
Maven 3.3.1, since this plugin only works with Java8 and above, which
doesn't have to be the same JDK to compile the sources with. Now you can
simply say: I want the toolchain for version X. This feature needs to be
added to the plugin.
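Requesting a toolchain per version relies on the JDKs being declared in ~/.m2/toolchains.xml, for example (the jdkHome paths are illustrative):

```xml
<toolchains>
  <toolchain>
    <type>jdk</type>
    <provides>
      <version>9</version>
    </provides>
    <configuration>
      <jdkHome>/opt/jdk-9</jdkHome>
    </configuration>
  </toolchain>
  <toolchain>
    <type>jdk</type>
    <provides>
      <version>1.6</version>
    </provides>
    <configuration>
      <jdkHome>/opt/jdk1.6.0</jdkHome>
    </configuration>
  </toolchain>
</toolchains>
```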

That said I think I will write a recipe for this. This is first of all  
an

issue for library writers who want to have their jar compatible with
multiple Java versions for their end users.
Result: One javac call per execution block as it was meant to be.

thanks,
Robert

On Fri, 26 Aug 2016 15:31:07 +0200, Oliver Gondža 
wrote:

Thank you all for your suggestions. I managed to get the project to  
build

with the following Maven setup:

```
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.5.1</version>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
      </configuration>
      <executions>
        <execution>
          <id>default-compile</id>
          <configuration>
            <excludes>
              <exclude>**/module-info.java</exclude>
            </excludes>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

<profiles>
  <profile>
    <id>jigsaw</id>
    <activation>
      <jdk>[1.9,)</jdk>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.5.1</version>
          <configuration>
            <source>1.9</source>
            <target>1.9</target>
          </configuration>
          <executions>
            <execution>
              <id>module-infos</id>
              <phase>compile</phase>
              <goals>
                <goal>compile</goal>
              </goals>
              <configuration>
                <includes>
                  <include>**/module-info.java</include>
                </includes>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Paul Benedict
Robert, I'm responding to dev@maven so we can discuss Maven philosophies...

I believe the pattern should be based on a multi-module project. Each
module should target the expected JDK version. Then introduce a new "mrjar"
type for the parent that knows how to bind them all together into a
Multi-Release JAR.
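Paul's idea could be sketched as a parent with one module per JDK level and a hypothetical "mrjar" packaging that assembles them (this packaging type does not exist today; the module names and layout are purely illustrative):

```xml
<project>
  <groupId>org.example</groupId>
  <artifactId>lib</artifactId>
  <version>1.0</version>
  <!-- hypothetical packaging that merges the modules into a Multi-Release JAR -->
  <packaging>mrjar</packaging>
  <modules>
    <module>lib-java8</module> <!-- classes for the JAR root -->
    <module>lib-java9</module> <!-- classes for META-INF/versions/9 -->
  </modules>
</project>
```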

Cheers,
Paul

On Wed, Aug 31, 2016 at 6:10 AM, Robert Scholte 
wrote:

> I've been working on the implementation of this in the
> maven-compiler-plugin, but I'm not really pleased with the result.
> The problem is that in the worst case scenario we have to work with 3
> different versions of Java:
> - The Maven Runtime (set as JAVA_HOME)
> - JDK for the module-info.java
> - JDK for all other source files.
>
> The example below worked because all three were set to JDK9.
> But based on the source/target of 1.6 I cannot predict which JDK is used,
> only that it is at least JDK6. Should the plugin switch to another JDK?
> And if you want to compile with source/target 1.5 or less, you're in
> trouble. There's something called toolchain, where you can specify the
> JavaHome per version, but in case of maven-compiler-plugin it assumes that
> all java-related plugins and execution blocks want to use the same
> toolchain through the whole Maven project.
> The good news is that for the maven-jdeps-plugin I improved this part in
> Maven 3.3.1, since this plugin only works with Java8 and above, which
> doesn't have to be the same JDK to compile the sources with. Now you can
> simply say: I want the toolchain for version X. This feature needs to be
> added to the plugin.
>
> That said I think I will write a recipe for this. This is first of all an
> issue for library writers who want to have their jar compatible with
> multiple Java versions for their end users.
> Result: One javac call per execution block as it was meant to be.
>
> thanks,
> Robert
>
> On Fri, 26 Aug 2016 15:31:07 +0200, Oliver Gondža 
> wrote:
>
> Thank you all for your suggestions. I managed to get the project to build
>> with the following Maven setup:
>>
>> ```
>> <build>
>>   <plugins>
>>     <plugin>
>>       <groupId>org.apache.maven.plugins</groupId>
>>       <artifactId>maven-compiler-plugin</artifactId>
>>       <version>3.5.1</version>
>>       <configuration>
>>         <source>1.6</source>
>>         <target>1.6</target>
>>       </configuration>
>>       <executions>
>>         <execution>
>>           <id>default-compile</id>
>>           <configuration>
>>             <excludes>
>>               <exclude>**/module-info.java</exclude>
>>             </excludes>
>>           </configuration>
>>         </execution>
>>       </executions>
>>     </plugin>
>>   </plugins>
>> </build>
>>
>> <profiles>
>>   <profile>
>>     <id>jigsaw</id>
>>     <activation>
>>       <jdk>[1.9,)</jdk>
>>     </activation>
>>     <build>
>>       <plugins>
>>         <plugin>
>>           <groupId>org.apache.maven.plugins</groupId>
>>           <artifactId>maven-compiler-plugin</artifactId>
>>           <version>3.5.1</version>
>>           <configuration>
>>             <source>1.9</source>
>>             <target>1.9</target>
>>           </configuration>
>>           <executions>
>>             <execution>
>>               <id>module-infos</id>
>>               <phase>compile</phase>
>>               <goals>
>>                 <goal>compile</goal>
>>               </goals>
>>               <configuration>
>>                 <includes>
>>                   <include>**/module-info.java</include>
>>                 </includes>
>>               </configuration>
>>             </execution>
>>           </executions>
>>         </plugin>
>>       </plugins>
>>     </build>
>>   </profile>
>> </profiles>
>> ```
>>
>> It does compile with older javac versions as a bonus. Given this is
>> nothing else than using `-source 9 -target 9` for module-info.java if
>> present, I dare to say maven-compiler-plugin can be adapted to figure this
>> out on its own.
>>
>


Re: JUnit categories in surefire/failsafe

2016-08-31 Thread Tibor Digana
>>One thing we noticed was that the user property is the same for both
plugins.
We are aware of this and we will break the backwards compatibility.
This change is planned in Surefire 3.0.
I am thinking of a temporary fix, but it is better not to do it.
Please reference and use these user/properties since they will be in 3.0:
${surefire.groups}
${failsafe.groups}

Do you still need "_ALL_, _NONE_, _UNCATEGORIZED_"?
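With separate user properties, the two plugins could then be wired independently, along these lines (a sketch of the intended wiring; the property names are the planned 3.0 names mentioned above, not yet released behaviour):

```xml
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- overridable on the command line via -Dsurefire.groups=... -->
    <groups>${surefire.groups}</groups>
  </configuration>
</plugin>
<plugin>
  <artifactId>maven-failsafe-plugin</artifactId>
  <configuration>
    <!-- overridable on the command line via -Dfailsafe.groups=... -->
    <groups>${failsafe.groups}</groups>
  </configuration>
</plugin>
```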



On Wed, Aug 31, 2016 at 10:23 AM, Tibor Digana 
wrote:

> Hi Christopher,
>
> Some offtopic. I will answer your email but first I have a question for you
> and Accumulo project.
> I visited Accumulo about one week ago. Why does the build [1] hang on an IT
> test executed by maven-failsafe-plugin:2.19.1?
> I asked INFRA team to display Thread Dump button. Do you see this button,
> can you grab the thread dump? I would like to see what's going on in
> Failsafe plugin. We have such an issue [3], but now it is most probably a user
> error, because the developer overrides stdout via System.setOut(); this
> hypothesis has not been confirmed yet because it's too early.
>
> [1] https://builds.apache.org/job/Accumulo-master-IT/
> [2] https://wiki.jenkins-ci.org/display/JENKINS/Thread+Dump+Action+Plugin
> [3] https://issues.apache.org/jira/browse/SUREFIRE-1193
>
>
>
> On Wed, Aug 31, 2016 at 3:00 AM, Christopher [via Maven] <
> ml-node+s40175n5879500...@n5.nabble.com> wrote:
>
> > tl;dr - A proposal for config independence for groups/excludeGroups
> > properties and some special keywords for ALL, NONE, and UNCATEGORIZED
> > groups.
> >
> > ***
> >
> > In the Apache Accumulo project, we're currently in the process of trying to
> > make use of JUnit categories to separate different classes of tests.
> >
> > So, we're using the maven-surefire-plugin and maven-failsafe-plugin
> > configuration properties: groups, excludeGroups
> >
> > One thing we noticed was that the user property is the same for both
> > plugins. This is a problem, because one cannot pass in a system property
> > on
> > the command-line to affect one without affecting the other.
> >
> > I propose that maven-surefire-plugin and maven-failsafe-plugin deprecate
> > the existing groups/excludeGroups properties, and replace them with
> > tests.groups/tests.excludeGroups, and it.groups/it.excludeGroups. (This
> > should probably be done for other shared properties as well.)
> >
> > Users can simulate this by doing something like this:
> > <groups>${it.groups}</groups>
> >
> > However, this may cause problems if the property is not defined... I
> > haven't tested to be sure.
> >
> > ***
> >
> > That leads me to a question and a second proposal:
> > Is there a way to specify uncategorized test classes? Or all test
> classes?
> > Or none?
> >
> > If not, I would like to propose that some special keywords be created
> > which
> > can represent:
> > _ALL_, _NONE_, _UNCATEGORIZED_ (or similar)
> >
> > That way, users can do things like:
> > <groups>my.special.Category,_UNCATEGORIZED_</groups>
> > <excludeGroups>_NONE_</excludeGroups>
> > or
> > <groups>_NONE_</groups>
> > or
> > <groups>_ALL_</groups>
> >
> > These keywords may require some support from the underlying test
> > framework,
> > like JUnit, so I can understand if these keywords cannot happen.
> >
> > Even if the keywords cannot be made to work, I still think it'd be good
> to
> > deprecate-and-separate the properties for the two plugins, so they can be
> > controlled independently with user properties.
> >
> > Thanks.
> >
> >
> > --
> > If you reply to this email, your message will be added to the discussion
> > below:
> > http://maven.40175.n5.nabble.com/JUnit-categories-in-
> > surefire-failsafe-tp5879500.html
> > To start a new topic under Maven Developers, email
> > ml-node+s40175n142166...@n5.nabble.com
> > To unsubscribe from Maven Developers, click here
> > .
> > NAML
> >
>
>
>
>
> --
> View this message in context: http://maven.40175.n5.nabble.
> com/JUnit-categories-in-surefire-failsafe-tp5879500p5879510.html
> Sent from the Maven Developers mailing list archive at Nabble.com.


Re: JUnit categories in surefire/failsafe

2016-08-31 Thread Tibor Digana
Hi Christopher,

Some offtopic. I will answer your email but first I have a question for you
and Accumulo project.
I visited Accumulo about one week ago. Why does the build [1] hang on an IT
test executed by maven-failsafe-plugin:2.19.1?
I asked INFRA team to display Thread Dump button. Do you see this button,
can you grab the thread dump? I would like to see what's going on in
Failsafe plugin. We have such an issue [3], but now it is most probably a user
error, because the developer overrides stdout via System.setOut(); this
hypothesis has not been confirmed yet because it's too early.

[1] https://builds.apache.org/job/Accumulo-master-IT/
[2] https://wiki.jenkins-ci.org/display/JENKINS/Thread+Dump+Action+Plugin
[3] https://issues.apache.org/jira/browse/SUREFIRE-1193



On Wed, Aug 31, 2016 at 3:00 AM, Christopher [via Maven] <
ml-node+s40175n5879500...@n5.nabble.com> wrote:

> tl;dr - A proposal for config independence for groups/excludeGroups
> properties and some special keywords for ALL, NONE, and UNCATEGORIZED
> groups.
>
> ***
>
> In the Apache Accumulo project, we're currently in the process of trying to
> make use of JUnit categories to separate different classes of tests.
>
> So, we're using the maven-surefire-plugin and maven-failsafe-plugin
> configuration properties: groups, excludeGroups
>
> One thing we noticed was that the user property is the same for both
> plugins. This is a problem, because one cannot pass in a system property
> on
> the command-line to affect one without affecting the other.
>
> I propose that maven-surefire-plugin and maven-failsafe-plugin deprecate
> the existing groups/excludeGroups properties, and replace them with
> tests.groups/tests.excludeGroups, and it.groups/it.excludeGroups. (This
> should probably be done for other shared properties as well.)
>
> Users can simulate this by doing something like this:
> <groups>${it.groups}</groups>
>
> However, this may cause problems if the property is not defined... I
> haven't tested to be sure.
>
> ***
>
> That leads me to a question and a second proposal:
> Is there a way to specify uncategorized test classes? Or all test classes?
> Or none?
>
> If not, I would like to propose that some special keywords be created
> which
> can represent:
> _ALL_, _NONE_, _UNCATEGORIZED_ (or similar)
>
> That way, users can do things like:
> <groups>my.special.Category,_UNCATEGORIZED_</groups>
> <excludeGroups>_NONE_</excludeGroups>
> or
> <groups>_NONE_</groups>
> or
> <groups>_ALL_</groups>
>
> These keywords may require some support from the underlying test
> framework,
> like JUnit, so I can understand if these keywords cannot happen.
>
> Even if the keywords cannot be made to work, I still think it'd be good to
> deprecate-and-separate the properties for the two plugins, so they can be
> controlled independently with user properties.
>
> Thanks.
>
>
> --
> If you reply to this email, your message will be added to the discussion
> below:
> http://maven.40175.n5.nabble.com/JUnit-categories-in-
> surefire-failsafe-tp5879500.html
> To start a new topic under Maven Developers, email
> ml-node+s40175n142166...@n5.nabble.com
> To unsubscribe from Maven Developers, click here
> 
> .
> NAML
> 
>




--
View this message in context: 
http://maven.40175.n5.nabble.com/JUnit-categories-in-surefire-failsafe-tp5879500p5879510.html
Sent from the Maven Developers mailing list archive at Nabble.com.