Re: maven-assembly-plugin:2.4 The following patterns were never triggered in this artifact inclusion filter

2014-02-12 Thread Ron Wheeler

On 11/02/2014 9:06 PM, Barrie Treloar wrote:

On 12 February 2014 11:43, Ron Wheeler rwhee...@artifact-software.com wrote:

I am trying to build a time-saving module that will build tar files to be
installed on a server.
I have defined a bunch of dependencies in the pom and want to build 2 tar
files by invoking 2 instances of the plug-in.
In each one I want to specify a list of dependencies to be included from the
repository.
In the assembly descriptor I have:

   <dependencySets>
     <dependencySet>
       <includes>
         <include>com.artifact-software.taw:taw-localized-iumessage-ws</include>
         <include>com.artifact-software.taw:taw-dataccess-ws:</include>

I would like it to take just these 2 modules for this tar file.
There are a lot more dependencies that will go in the second invocation of
the assembly plug-in.


This gets the following message

[INFO] --- maven-assembly-plugin:2.4:single (default) @
taw-webapps-assembler ---
[INFO] Reading assembly descriptor: src/assembly/webapps-tar.xml
[WARNING] The following patterns were never triggered in this artifact
inclusion filter:
o  'com.artifact-software.taw:taw-localized-iumessage-ws'
o  'com.artifact-software.taw:taw-dataccess-ws:'

[INFO]

[INFO] BUILD FAILURE
[INFO]

[INFO] Total time: 9.658s
[INFO] Finished at: Tue Feb 11 19:54:13 EST 2014
[INFO] Final Memory: 10M/148M
[INFO]

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-assembly-plugin:2.4:single (default) on
project taw-webapps-assembler: Failed to create
assembly: Error creating assembly archive webapps: You must set at least one
file. - [Help 1]

What have I misunderstood about the assembly?

I'm assuming you've read the docs.

From 
http://maven.apache.org/plugins/maven-assembly-plugin/examples/single/including-and-excluding-artifacts.html
I think the shortened form assumes a jar classifier.
I'm guessing from your assembly name webapps-tar.xml that these might be wars?
Have you tried using the long form of the dependency conflict id?

I am not sure what you mean by this.
The dependency specification in the pom has the right type.
I tried adding the version and type in the assembly file and just got an 
error message with a more specific file name.


I have not specified a specific phase but I assume that the plug-in is 
being triggered since the error comes from the plug-in.


I did read the docs but there is no example showing the inclusion of
specific dependencies.


Ron




From 
http://maven.apache.org/plugins/maven-assembly-plugin/examples/multimodule/module-binary-inclusion-simple.html
What phase to did you attach the plugin to?
It may be possible you haven't bound it to a phase that has resolved
the dependencies and so there are none for it to find.
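For illustration, binding the assembly plugin's single goal to the package phase (which runs after dependency resolution) would look something like the sketch below; the execution id is invented for the example and the descriptor path is taken from the thread:

```xml
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <executions>
    <execution>
      <id>make-webapps-tar</id>
      <!-- package runs after the project's dependencies are resolved -->
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/assembly/webapps-tar.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>
</plugin>
```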




--
Ron Wheeler
President
Artifact Software Inc
email: rwhee...@artifact-software.com
skype: ronaldmwheeler
phone: 866-970-2435, ext 102


-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org



Re: Adding dependency management to parent pom causes errors

2014-02-12 Thread geoffh
Laird, Barrie

Thanks for taking the time with some useful explanations
I'd like to put more detail into my problem in the hope that you may be able
to spot something I am doing wrong:

I started off with parent and child poms with no dependency management -
along the lines of:

Parent pom (without dependencyManagement)

<dependencies>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>${log4j-version}</version>
    </dependency>
    ...
</dependencies>

Child pom (without dependencyManagement)

<dependencies>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>${log4j-version}</version>
    </dependency>
    ...
</dependencies>

As I said, 'mvn clean deploy' completed successfully with no compilation
errors.


I then changed the parent and child poms to use dependency management -
along the lines of:

Parent pom (with dependencyManagement)
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>${log4j-version}</version>
        </dependency>
        ...
    </dependencies>
</dependencyManagement>


Child pom (with dependencyManagement)

<dependencies>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
    </dependency>
    ...
</dependencies>

I arranged that each child pom only referenced the dependencies it needed,
with the parent pom having all dependencies required by all children.
After that I got the compilation error.
Interestingly, I got the same error trace when invoking 'mvn
dependency:analyze'.

The compilation error you're talking about looks like something more
substantial than a bad dependency--you're getting a compiler error from
within javac itself.

However, the compilation error did not occur in the situation without
dependencyManagement.
If there is a genuine compilation error, would it be expected to show in
both cases?

Is that the correct way to use dependencyManagement?

Thanks for reading and any help / suggestions
Geoff



--
View this message in context: 
http://maven.40175.n5.nabble.com/Adding-dependency-management-to-parent-pom-causes-errors-tp5784035p5784252.html
Sent from the Maven - Users mailing list archive at Nabble.com.




Re: Adding dependency management to parent pom causes errors

2014-02-12 Thread Adrien Rivard
Hi,

dependencyManagement declares dependencies (mostly to fix versions) and
dependencies uses them, so in general you should have only
dependencyManagement in the parent poms and only dependencies in the
children.

Now there are exceptions; for example, junit is generally used in all child
projects, so you can add it to the dependencies of the parent pom and it
will be available in all children.
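A minimal sketch of that arrangement (the log4j version is illustrative):

```xml
<!-- Parent pom: pins the version but adds nothing to child classpaths -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- Child pom: opts in and inherits the version from the parent -->
<dependencies>
  <dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
  </dependency>
</dependencies>
```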




On Wed, Feb 12, 2014 at 10:33 AM, geoffh hartnel...@yahoo.co.uk wrote:

 [snip]




-- 
Adrien Rivard


Re: Why is dependency:analyze lying to me?

2014-02-12 Thread Mark H. Wood
On Wed, Feb 12, 2014 at 08:11:29AM +1030, Barrie Treloar wrote:
[snip]
 I think Maven is missing a scope, it needs to break up test into two
 phases; testCompile and testRuntime instead of having one scope which
 means both.
 This means that the analyze code can't tell what stuff is needed for
 tests at compile time and what is needed at runtime.

Picky terminology tweak:  I went off to check my memory because I was
*sure* that Maven has a full set of test-* phases parallel to what we
might call the main phases.  It has.  Then I came back here and saw
that what you want is parallel *scopes* [test]Compile, [test]Runtime.

Having sorted that (I hope), I think I agree with you.

-- 
Mark H. Wood, Lead System Programmer   mw...@iupui.edu
Machines should not be friendly.  Machines should be obedient.




Re: maven-assembly-plugin:2.4 The following patterns were never triggered in this artifact inclusion filter

2014-02-12 Thread Barrie Treloar
 From
 http://maven.apache.org/plugins/maven-assembly-plugin/examples/single/including-and-excluding-artifacts.html
 I think the shortened form assumes a jar classifier.
 I'm guessing from your assembly name webapps-tar.xml that these might be
 wars?
 Have you tried using the long form of the dependency conflict id?

 I am not sure what you mean by this.
 The dependency specification in the pom has the right type.
 I tried adding the version and type in the assembly file and just got an
 error message with a more specific file name.

http://maven.apache.org/plugins/maven-assembly-plugin/examples/single/including-and-excluding-artifacts.html
defines the spec for dependency conflict id and it does not include
version.

It is either
* short form = groupId:artifactId
* long form = groupId:artifactId:type:classifier

Can you paste in your pom values for the dependency, as well as the
snippet from your assembly file.

I'm suggesting that you may not be able to use
<include>com.artifact-software.taw:taw-localized-iumessage-ws</include>
but instead need
<include>com.artifact-software.taw:taw-localized-iumessage-ws:war:</include>

 I have not specified a specific phase but I assume that the plug-in is being
 triggered since the error comes from the plug-in.

 I did read the docs but there is no example showing including of specific
 dependencies.

http://maven.apache.org/plugins/maven-assembly-plugin/examples/multimodule/module-binary-inclusion-simple.html
Has the example you want to look at.
It has the assembly descriptor, the pom that describes the plugin
management, and the actual pom that defines the dependencies and
invokes the assembly plugin.




Re: Why is dependency:analyze lying to me?

2014-02-12 Thread Barrie Treloar
On 13 February 2014 00:20, Mark H. Wood mw...@iupui.edu wrote:
 On Wed, Feb 12, 2014 at 08:11:29AM +1030, Barrie Treloar wrote:
 [snip]
 I think Maven is missing a scope, it needs to break up test into two
 phases; testCompile and testRuntime instead of having one scope which
 means both.
[del]
 Picky terminology tweak:

Yes, my bad.
I meant break that one scope into two scopes (not phases).
Much like there are two scopes for compilation (compile and runtime),
there need to be two scopes for test (compile and runtime).




Re: Adding dependency management to parent pom causes errors

2014-02-12 Thread Barrie Treloar
On 12 February 2014 20:12, Adrien Rivard adrien.riv...@gmail.com wrote:
 Hi,

 DependencyManagement is declaring dependencies(mostly for versions) and
 dependencies is using them,
 So in general you should have only dependencyManagement in the parent poms
 and only dependencies in the childs.

 Now there are exceptions, for example junit is generally used in all child
 projects , so you can add it in dependencies of the parent pom and it will
 be avaible in all childs.

Adding dependencies into the pom with modules is a bad idea.
There will always be an exception that breaks that rule.
Then you will be asking how to remove a dependency from a child pom -
you can't; don't define it in the first place.

Typical exceptions would be poms whose sole purpose is to build an
assembly (zip file) for package management, or EAR poms, etc.

A best practice is to never have any dependencies in a pom that has modules.
If you want to share common dependencies between projects, you can look at
using an inheritance-type relationship, but that pom would be a standalone
pom with no module declarations.
Your top-level project is then an aggregation pom to make running Maven easier.
See https://maven.apache.org/pom.html#Aggregation
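As a sketch of that separation (coordinates invented for illustration; modelVersion, version, etc. omitted for brevity):

```xml
<!-- Aggregator pom: modules only, no dependencies -->
<project>
  <groupId>com.example</groupId>
  <artifactId>build-all</artifactId>
  <packaging>pom</packaging>
  <modules>
    <module>module-a</module>
    <module>module-b</module>
  </modules>
</project>

<!-- Standalone parent pom: shared settings, no <modules> element -->
<project>
  <groupId>com.example</groupId>
  <artifactId>company-parent</artifactId>
  <packaging>pom</packaging>
  <dependencyManagement>
    ...
  </dependencyManagement>
</project>
```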




Re: maven-assembly-plugin:2.4 The following patterns were never triggered in this artifact inclusion filter

2014-02-12 Thread Ron Wheeler

Thanks for the help.
Here are more details from the pom and assembly.

I did a debug run and I think that the log shows where the includes get 
excluded


[DEBUG] com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2 
was removed by one or more filters.
[DEBUG] com.artifact_software.taw:taw-dataaccess-ws:war:1.0.2 was 
removed by one or more filters.


You can see this in context below.
They make it through the scope filter but are getting rejected in 
another filter that does not get applied when includes are not specified.



BTW:
1) If I comment out the includes, I get all the dependencies from the
pom packed into a giant tar.
It appears that when the includes are added, the plug-in recognizes the
includes and thinks that I might want them to be used, so it warns me.
2) If I add xxx in the middle of the name
(com.artifact-software.taw:taw-localized-iumessagexxx-ws:war) in the
include, I get the same "not triggered" message, so it appears that the
plug-in suspects that I want to do an include but that I have missed a
required directive; it never gets to the point of referencing the pom to
find the dependencies, since it never flags the bogus artifact name as a
problem.
We use the assembly plug-in a lot, so I would like to stick with it for
consistency, but I may try shade or a simple Ant script.


<dependency>
    <groupId>com.artifact_software.taw</groupId>
    <artifactId>taw-localized-uimessage-ws</artifactId>
    <version>${project.version}</version>
    <type>war</type>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>com.artifact_software.taw</groupId>
    <artifactId>taw-dataaccess-ws</artifactId>
    <version>${project.version}</version>
    <type>war</type>
    <scope>runtime</scope>
</dependency>


<id>webapps</id>
<formats>
    <format>tar</format>
</formats>
<includeBaseDirectory>false</includeBaseDirectory>
<dependencySets>
    <dependencySet>
        <outputDirectory>/target</outputDirectory>
        <useProjectArtifact>false</useProjectArtifact>
        <unpack>false</unpack>
        <scope>runtime</scope>
        <includes>
            <include>com.artifact-software.taw:taw-localized-iumessage-ws:war</include>
            <include>com.artifact-software.taw:taw-dataccess-ws:war</include>
        </includes>
    </dependencySet>
</dependencySets>

<plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptors>
            <descriptor>src/assembly/webapps-tar.xml</descriptor>
        </descriptors>
        <appendAssemblyId>true</appendAssemblyId>
        <attach>true</attach>
    </configuration>
</plugin>


Log
lots of stuff then
[DEBUG] Resolving project dependencies transitively.
[DEBUG] com.artifact_software.taw:taw-webapps-assembler:jar:1.0.2 
(selected for null)
[DEBUG] Using mirror nexus 
(http://repo.artifact-software.com:8081/nexus/content/groups/public) for 
central (http://repo1.maven.org/maven2).
[DEBUG] 
com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2:runtime 
(selected for runtime)
[DEBUG] Using mirror nexus 
(http://repo.artifact-software.com:8081/nexus/content/groups/public) for 
central (http://repo1.maven.org/maven2).
[DEBUG] com.artifact_software.taw:taw-dataaccess-ws:war:1.0.2:runtime 
(selected for runtime)
[DEBUG] While resolving dependencies of 
com.artifact_software.taw:taw-webapps-assembler:jar:1.0.2:
[DEBUG] Statistics for Scope filter [null-scope=true, compile=true, 
runtime=true, test=false, provided=false, system=false]

... many lines that look ok
... cannot find something that it would like to have

[DEBUG] Cannot find ArtifactResolver with hint: project-cache-aware
org.codehaus.plexus.component.repository.exception.ComponentLookupException: 
java.util.NoSuchElementException

  role: org.apache.maven.artifact.resolver.ArtifactResolver
  roleHint: project-cache-aware


... stack dump


* THIS LOOKS LIKE THE POINT WHERE THE INCLUDES 
GET DROPPED


[DEBUG] Processing DependencySet (output=/target)
[DEBUG] Filtering dependency artifacts WITHOUT transitive dependency 
path information.
[DEBUG] com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2 
was removed by one or more filters.
[DEBUG] com.artifact_software.taw:taw-dataaccess-ws:war:1.0.2 was 
removed by one or more filters.

... rest of dependencies also removed



[DEBUG] Statistics for Includes filter:
o 'com.artifact-software.taw:taw-localized-iumessage-ws:war'
o 'com.artifact-software.taw:taw-dataccess-ws:war'

[WARNING] The following patterns were never triggered in this artifact 
inclusion filter:

o  'com.artifact-software.taw:taw-localized-iumessage-ws:war'
o  'com.artifact-software.taw:taw-dataccess-ws:war'

[DEBUG] The following artifacts were removed by this artifact inclusion 
filter:

com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2
com.artifact_software.taw:taw-dataaccess-ws:war:1.0.2
.
. list of other dependencies that I will add once I get this to work
.

Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Benoît Berthonneau
Hi all,

 

I need your opinion/way to tackle the following problem:

In many projects we use a Logger (doesn't matter which implementation). It
is often recommended to test whether the debug level is activated before
logging a debug trace, like the following:

if (logger.isDebugEnabled()) {
    logger.debug("blah " + i + " in the loop that contains " + max);
}

 

Now when you run unit tests on this kind of code you need to make a choice:
run tests at INFO level or run tests with ALL traces activated. I chose the
second option in order to:

* Check that debug traces don't throw unwanted exceptions (like NPE)

* Have better code coverage in terms of covered lines

 

But in terms of branch coverage we can never reach 100% :(

 

To me the only way to cover this is to run the test suite twice: once with
INFO traces configured, and once with ALL traces activated.

Did you face this issue, and how did you solve it?

 

Thanks,

Benoît.



Re: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Paul Benedict
IIRC, there should be an option in Emma/Cobertura that allows you to
exclude coverage on certain classes. So if you can exclude your log4j
classes (you don't really want to test your logging, do you?), then you
should be able to raise your percentage.


On Wed, Feb 12, 2014 at 2:30 PM, Benoît Berthonneau
ben...@berthonneau.comwrote:

 [snip]




-- 
Cheers,
Paul


Re: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Reto Hablützel
To follow up on Paul's answer, I would go as far as to say that I test my
code in order to feel comfortable that it actually does what it should do.
Thus, I can very well live with 90% coverage if I know what the missing 10%
are rather than testing for the sake of making a good impression with a
100% report.

- Reto
On Feb 12, 2014 9:37 PM, Paul Benedict pbened...@apache.org wrote:

 [snip]



RE: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Benoît Berthonneau
Hi Paul,

 

I don't think that I could play with exclusions. Here is an example:

 

A Unit Test :



 

The tested class with ALL traces activated:



 

And the same tested class with INFO traces activated:



 

 

-Message d'origine-
De : paulus.benedic...@gmail.com [mailto:paulus.benedic...@gmail.com] De la
part de Paul Benedict
Envoyé : mercredi 12 février 2014 21:36
À : Maven Users List
Objet : Re: Code coverage with debug logs: 100% branch coverage not
possible?...

 

[snip]



RE: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Kevin Krumwiede
It does matter which implementation.  The main reason it was recommended to
check the logging level was that string concatenation can be expensive, and
you want to avoid doing it for a message that won't be logged.  But if
you're using a logging API like slf4j, which uses parameter-replacement
tokens in the message string, then if the message isn't logged the
replacement won't be performed and the call will be cheap.
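A toy illustration of that point (this is not slf4j's actual implementation; the class and method names are invented for the sketch): with placeholder substitution, all string building happens after the level check, so a disabled logger pays essentially nothing.

```java
// Toy logger showing why "{}" substitution is cheap when the level is off.
// NOT the real slf4j implementation; names invented for illustration.
public class LazyLogDemo {
    private final boolean debugEnabled;

    public LazyLogDemo(boolean debugEnabled) {
        this.debugEnabled = debugEnabled;
    }

    // slf4j-style: "{}" placeholders are substituted only after the
    // level check, so a disabled logger does no string building.
    public String debug(String pattern, Object... args) {
        if (!debugEnabled) {
            return null; // no formatting cost at all
        }
        StringBuilder sb = new StringBuilder();
        int argIndex = 0;
        int from = 0;
        int brace;
        while ((brace = pattern.indexOf("{}", from)) >= 0 && argIndex < args.length) {
            sb.append(pattern, from, brace).append(args[argIndex++]);
            from = brace + 2;
        }
        sb.append(pattern.substring(from));
        return sb.toString();
    }
}
```

With concatenation the caller builds the message before the call is even made, whether or not it is logged; here the disabled case returns immediately.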
On Feb 12, 2014 1:57 PM, Benoît Berthonneau ben...@berthonneau.com
wrote:

 [snip]



RE: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Benoît Berthonneau
Hi Reto,

I understand your point of view, but to me the problem I'm talking about
could easily be solved, giving better branch coverage and a report showing
only the genuinely missed branches.

Benoît.


-Message d'origine-
De : Reto Hablützel [mailto:ret...@rethab.ch] 
Envoyé : mercredi 12 février 2014 21:47
À : Maven Users List
Objet : Re: Code coverage with debug logs: 100% branch coverage not
possible?...

To follow up on Paul's answer, I would go as far as to say that I test my
code in order to feel comfortable that it actually does what it should do.
Thus, I can very well live with 90% coverage if I know what the missing 10%
are rather than testing for the sake of making a good impression with a 100%
report.

- Reto
On Feb 12, 2014 9:37 PM, Paul Benedict pbened...@apache.org wrote:

 [snip]






RE: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Benoît Berthonneau
With the pictures attached…

 

 

De : Benoît Berthonneau [mailto:ben...@berthonneau.com] 
Envoyé : mercredi 12 février 2014 21:57
À : 'Maven Users List'
Objet : RE: Code coverage with debug logs: 100% branch coverage not
possible?...

 

[snip]

Re: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Mirko Friedenhagen
Hello Benoit,

Kevin is right, using slf4j[0] one would use sth. like:

logger.debug("blah {} in the loop that contains {}", i, max);

No need for iffing :-).

[0] http://www.slf4j.org/manual.html
Regards Mirko
--
http://illegalstateexception.blogspot.com/
https://github.com/mfriedenhagen/ (http://osrc.dfm.io/mfriedenhagen)
https://bitbucket.org/mfriedenhagen/


On Wed, Feb 12, 2014 at 10:10 PM, Kevin Krumwiede kjk...@gmail.com wrote:
 It does matter which implementation.  The main reason it was recommended to
 check the logging level was because string concatenation can be expensive,
 and you want to avoid doing it for a message that won't be logged.  But if
 you're using a logging API like slf4j that uses parameter replacement
 tokens in the message string, if the message isn't logged, the replacement
 won't be performed and the call will be cheap.
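Kevin's point can be sketched with a toy logger (purely an illustration: `MiniLogger` is made up, not the SLF4J API — it just shows that `{}` substitution is skipped entirely when the level is off, so no hand-written guard is needed):

```java
// Toy logger illustrating deferred formatting (not the real SLF4J API).
public class DeferredFormattingDemo {

    static final class MiniLogger {
        private final boolean debugEnabled;

        MiniLogger(boolean debugEnabled) { this.debugEnabled = debugEnabled; }

        // slf4j-style call: "{}" tokens are only substituted after the
        // level check, so a disabled call does no formatting work at all.
        String debug(String template, Object... args) {
            if (!debugEnabled) {
                return null; // cheap: no string was built
            }
            String msg = template;
            for (Object arg : args) {
                msg = msg.replaceFirst("\\{\\}", String.valueOf(arg));
            }
            return msg; // a real logger would write this to an appender
        }
    }

    public static void main(String[] args) {
        int i = 3, max = 10;
        MiniLogger off = new MiniLogger(false);
        MiniLogger on = new MiniLogger(true);

        // Disabled level: returns before any formatting happens.
        System.out.println(off.debug("blah {} in the loop that contains {}", i, max));
        // Enabled level: same text the guarded concatenation would produce.
        System.out.println(on.debug("blah {} in the loop that contains {}", i, max));
    }
}
```

The branch that coverage tools flag in the hand-written `if (logger.isDebugEnabled())` guard disappears from the caller's code; the level check moves inside the logger.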
 On Feb 12, 2014 1:57 PM, Benoît Berthonneau ben...@berthonneau.com
 wrote:

 Hi Paul,

 Don't think that I could play with exclusions. Here is an example:

 *A unit test:*

 *The tested class with ALL traces activated:*

 *And the same tested class with INFO traces activated:*

 -Original message-
 From: paulus.benedic...@gmail.com [mailto:paulus.benedic...@gmail.com] On
 behalf of Paul Benedict
 Sent: Wednesday, February 12, 2014 21:36
 To: Maven Users List
 Subject: Re: Code coverage with debug logs: 100% branch coverage not
 possible?...

 IIRC, there should be an option in Emma/Cobertura that allows you to
 exclude coverage on certain classes. So if you can exclude your log4j
 classes (you don't really want to test your logging, do you?), then you
 should be able to raise your percentage.

 On Wed, Feb 12, 2014 at 2:30 PM, Benoît Berthonneau
 ben...@berthonneau.com wrote:

  Hi all,

  I need your opinion on how to tackle the following problem:

  In many projects we use a logger (it doesn't matter which
  implementation). It is often recommended to test whether the debug level
  is activated before logging a debug trace, like the following:

  if (logger.isDebugEnabled()) {
      logger.debug("blah " + i + " in the loop that contains " + max);
  }

  Now when you run unit tests on this kind of code you need to make a
  choice: run tests with INFO level, or run tests with ALL traces
  activated. I chose the second option in order to:

  * check that debug traces don't throw unwanted exceptions (like NPE)

  * have better code coverage in terms of covered lines

  But in terms of branch coverage we can never reach 100% :(

  To me the only way to cover this is to run the test suite 2 times:
  once with INFO traces configured, and once with ALL traces activated.

  Did you face this issue, and how did you solve it?

  Thanks,
  Benoît.

 --

 Cheers,

 Paul


-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org



Re: [scala-ide-user] Can't Find Scala Archetype

2014-02-12 Thread David Bernard
Hi,

What do you mean by "it cannot find"?
You can't see it in the list of archetypes when you create a new Maven
project (via m2e)? On my desktop it takes a few seconds to retrieve the
list of archetypes from the Default Catalogs.

  m2e - Maven Integration for Eclipse  1.4.0.20130601-0317  org.eclipse.m2e.feature.feature.group  Eclipse.org - m2e
  Scala IDE for Eclipse  4.0.0.m1-2_10-201311011355-8ed264f  org.scala-ide.sdt.feature.feature.group  scala-ide.org

/davidB


On Wed, Feb 12, 2014 at 3:44 AM, Eric Kolotyluk eric.koloty...@gmail.com wrote:

 I am using the Typesafe Eclipse IDE, and for some reason it cannot find
 the archetype for

 <groupId>net.alchim31.maven</groupId>
 <artifactId>scala-archetype-simple</artifactId>
 <version>1.5</version>

 At one time it could, but now it can't, and I have no idea why. The
 diagnostics are completely useless in understanding the problem.

 Can anyone suggest anything that might be wrong and how to fix it?

 Cheers, Eric

 --
 You received this message because you are subscribed to the Google Groups
 Scala IDE User group.
 To unsubscribe from this group and stop receiving emails from it, send an
 email to scala-ide-user+unsubscr...@googlegroups.com.
 To view this discussion on the web visit
 https://groups.google.com/d/msgid/scala-ide-user/52FAE00E.2030206%40gmail.com.
 For more options, visit https://groups.google.com/groups/opt_out.



Re: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Ron Wheeler



Not really a Maven issue, but if you do your logging like this:

package com.myco.testapp;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyClass {

    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    public void doSomething(int i, int max) {
        logger.debug("blah {} in the loop that contains {}", i, max);
    }
}

you can sort out the enabling of logs and the destination of your logging by
severity and class (I think by package as well) in the log configuration
at run-time.
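The run-time configuration Ron describes might look like this in Logback syntax (a sketch only: the package name follows his example, and the appender layout is made up — the same per-package override exists in log4j's own configuration format):

```xml
<!-- Sketch: DEBUG for one package, INFO everywhere else, chosen by
     editing this file at deployment time rather than changing code. -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- per-package override -->
  <logger name="com.myco.testapp" level="DEBUG"/>

  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```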


Ron

On 12/02/2014 4:20 PM, Mirko Friedenhagen wrote:

Hello Benoit,

Kevin is right, using slf4j[0] one would use sth. like:

logger.debug("blah {} in the loop that contains {}", i, max);

No need for iffing :-).

[0] http://www.slf4j.org/manual.html
Regards Mirko
--
http://illegalstateexception.blogspot.com/
https://github.com/mfriedenhagen/ (http://osrc.dfm.io/mfriedenhagen)
https://bitbucket.org/mfriedenhagen/


On Wed, Feb 12, 2014 at 10:10 PM, Kevin Krumwiede kjk...@gmail.com wrote:

It does matter which implementation.  The main reason it was recommended to
check the logging level was because string concatenation can be expensive,
and you want to avoid doing it for a message that won't be logged.  But if
you're using a logging API like slf4j that uses parameter replacement
tokens in the message string, if the message isn't logged, the replacement
won't be performed and the call will be cheap.
On Feb 12, 2014 1:57 PM, Benoît Berthonneau ben...@berthonneau.com
wrote:


Hi Paul,



Don't think that I could play with exclusions. Here is an example :



*A Unit Test :*



*The tested class with ALL traces activated:*



*And the same tested class with INFO traces activated:*





-Original message-
From: paulus.benedic...@gmail.com [mailto:paulus.benedic...@gmail.com] On
behalf of Paul Benedict
Sent: Wednesday, February 12, 2014 21:36
To: Maven Users List
Subject: Re: Code coverage with debug logs: 100% branch coverage not
possible?...



IIRC, there should be an option in Emma/Cobertura that allows you to
exclude coverage on certain classes. So if you can exclude your log4j
classes (you don't really want to test your logging, do you?), then you
should be able to raise your percentage.





On Wed, Feb 12, 2014 at 2:30 PM, Benoît Berthonneau

ben...@berthonneau.com wrote:




Hi all,
I need your opinion on how to tackle the following problem:
In many projects we use a logger (it doesn't matter which
implementation). It is often recommended to test whether the debug level is
activated before logging a debug trace, like the following:
if (logger.isDebugEnabled()) {
    logger.debug("blah " + i + " in the loop that contains " + max);
}
Now when you run unit tests on this kind of code you need to make a
choice:
run tests with INFO level, or run tests with ALL traces activated. I
chose the second option in order to:
* check that debug traces don't throw unwanted exceptions (like NPE)
* have better code coverage in terms of covered lines
But in terms of branch coverage we can never reach 100% :(
To me the only way to cover this is to run the test suite 2 times:
once with INFO traces configured, and once with ALL traces activated.
Did you face this issue, and how did you solve it?
Thanks,
Benoît.





--

Cheers,

Paul


-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org





--
Ron Wheeler
President
Artifact Software Inc
email: rwhee...@artifact-software.com
skype: ronaldmwheeler
phone: 866-970-2435, ext 102


-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org



Re: maven-assembly-plugin:2.4 The following patterns were never triggered in this artifact inclusion filter

2014-02-12 Thread Barrie Treloar
On 13 February 2014 02:24, Ron Wheeler rwhee...@artifact-software.com wrote:
 I did a debug run and I think that the log shows where the includes get
 <id>webapps</id>
 <formats>
   <format>tar</format>
 </formats>
 <includeBaseDirectory>false</includeBaseDirectory>
 <dependencySets>
   <dependencySet>
     <outputDirectory>/target</outputDirectory>
     <useProjectArtifact>false</useProjectArtifact>
     <unpack>false</unpack>
     <scope>runtime</scope>
     <includes>
       <include>com.artifact-software.taw:taw-localized-iumessage-ws:war</include>
       <include>com.artifact-software.taw:taw-dataccess-ws:war</include>
     </includes>
   </dependencySet>
 </dependencySets>
[del]
 [DEBUG] com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2 was 
 removed by one or more filters.
 [DEBUG] com.artifact_software.taw:taw-dataaccess-ws:war:1.0.2 was removed by 
 one or more filters.

 [WARNING] The following patterns were never triggered in this artifact
 inclusion filter:
 o  'com.artifact-software.taw:taw-localized-iumessage-ws:war'
 o  'com.artifact-software.taw:taw-dataccess-ws:war'
[del]
 http://maven.apache.org/plugins/maven-assembly-plugin/examples/single/including-and-excluding-artifacts.html
 defines the spec for dependency conflict id and it does not include
 version.

 It is either
 * short form = groupId:artifactId
 * long form = groupId:artifactId:type:classifier

I'm guiding you blind since I've not done this before.
The output is telling us that the pattern
'com.artifact-software.taw:taw-localized-iumessage-ws:war'
does not match the artifact
com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2

And there is your problem.
You have an _ in your groupid for the artifact, but not your include filter.

I've been looking at maven-common-artifact-filters-1.4
PatternIncludesArtifactFilter.matchAgainst(), since I thought that maybe
you needed a trailing ':' on the long id.

But it does the right thing with as much of the pattern as you provide,
and wildcards the rest.
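Barrie's description of the matching can be sketched as follows (an illustration only — this is not the maven-common-artifact-filters code, just a simplified segment-by-segment match on the `groupId:artifactId:type:classifier` id that wildcards any omitted trailing segments):

```java
// Simplified include-pattern matching, illustrating why the thread's
// filter never triggered: '-' vs '_' in the groupId, and the transposed
// 'iumessage' vs 'uimessage' in the artifactId.
public class IncludePatternDemo {

    // Compare pattern and artifact id segment by segment; a shorter
    // pattern (e.g. the short form groupId:artifactId) leaves the
    // remaining segments wildcarded.
    static boolean matches(String pattern, String artifactId) {
        String[] p = pattern.split(":");
        String[] a = artifactId.split(":");
        if (p.length > a.length) {
            return false;
        }
        for (int i = 0; i < p.length; i++) {
            if (!p[i].equals("*") && !p[i].equals(a[i])) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // The artifact as it appears in the debug output.
        String artifact = "com.artifact_software.taw:taw-localized-uimessage-ws:war:1.0.2";

        // The filter from the thread: no match, hence the warning.
        System.out.println(matches(
            "com.artifact-software.taw:taw-localized-iumessage-ws:war", artifact));

        // Spelled exactly as deployed: the short form matches and the
        // trailing segments are wildcarded.
        System.out.println(matches(
            "com.artifact_software.taw:taw-localized-uimessage-ws", artifact));
    }
}
```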

-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org



Re: Code coverage with debug logs: 100% branch coverage not possible?...

2014-02-12 Thread Benoît Berthonneau
Ron, Mirko, Kevin,

Thanks for your feedback: you're right about the SLF4J implementation.
Unfortunately, ours is not SLF4J. It is a home-made logger interface
implemented with Log4j.

Benoît
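Since the home-made interface rules out SLF4J-style parameter tokens, the run-the-suite-twice approach Benoît describes earlier in the thread could be wired into the build roughly like this (a sketch: the execution ids and the log4j-*.properties file names are assumptions; maven-surefire-plugin's `systemPropertyVariables` and log4j's `log4j.configuration` system property are real):

```xml
<!-- Sketch: run the unit tests twice, once per logging configuration,
     so both branches of every isDebugEnabled() guard get covered. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <executions>
    <execution>
      <id>tests-info-traces</id>
      <goals><goal>test</goal></goals>
      <configuration>
        <systemPropertyVariables>
          <log4j.configuration>log4j-info.properties</log4j.configuration>
        </systemPropertyVariables>
      </configuration>
    </execution>
    <execution>
      <id>tests-all-traces</id>
      <goals><goal>test</goal></goals>
      <configuration>
        <systemPropertyVariables>
          <log4j.configuration>log4j-all.properties</log4j.configuration>
        </systemPropertyVariables>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that the coverage tool would still need to merge the two runs' reports for the combined branch coverage to show up.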

 Le 12 févr. 2014 à 23:25, Ron Wheeler rwhee...@artifact-software.com a 
 écrit :
 
 
 
 Not really a Maven issue, but if you do your logging like this:
 
 package com.myco.testapp;
 
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 public class MyClass {
 
     private final Logger logger = LoggerFactory.getLogger(this.getClass());
 
     public void doSomething(int i, int max) {
         logger.debug("blah {} in the loop that contains {}", i, max);
     }
 }
 
 You can sort out the enabling of logs and destination of your logging by 
 severity and class(I think by package as well) in the log configuration at 
 run-time.
 
 Ron
 
 On 12/02/2014 4:20 PM, Mirko Friedenhagen wrote:
 Hello Benoit,
 
 Kevin is right, using slf4j[0] one would use sth. like:
 
 logger.debug("blah {} in the loop that contains {}", i, max);
 
 No need for iffing :-).
 
 [0] http://www.slf4j.org/manual.html
 Regards Mirko
 --
 http://illegalstateexception.blogspot.com/
 https://github.com/mfriedenhagen/ (http://osrc.dfm.io/mfriedenhagen)
 https://bitbucket.org/mfriedenhagen/
 
 
 On Wed, Feb 12, 2014 at 10:10 PM, Kevin Krumwiede kjk...@gmail.com wrote:
 It does matter which implementation.  The main reason it was recommended to
 check the logging level was because string concatenation can be expensive,
 and you want to avoid doing it for a message that won't be logged.  But if
 you're using a logging API like slf4j that uses parameter replacement
 tokens in the message string, if the message isn't logged, the replacement
 won't be performed and the call will be cheap.
 On Feb 12, 2014 1:57 PM, Benoît Berthonneau ben...@berthonneau.com
 wrote:
 
 Hi Paul,
 
 
 
 Don't think that I could play with exclusions. Here is an example :
 
 
 
 *A Unit Test :*
 
 
 
 *The tested class with ALL traces activated:*
 
 
 
 *And the same tested class with INFO traces activated:*
 
 
 
 
 
 -Original message-
 From: paulus.benedic...@gmail.com [mailto:paulus.benedic...@gmail.com] On
 behalf of Paul Benedict
 Sent: Wednesday, February 12, 2014 21:36
 To: Maven Users List
 Subject: Re: Code coverage with debug logs: 100% branch coverage not
 possible?...
 
 
 
 IIRC, there should be an option in Emma/Cobertura that allows you to
 exclude coverage on certain classes. So if you can exclude your log4j
 classes (you don't really want to test your logging, do you?), then you
 should be able to raise your percentage.
 
 
 
 
 
 On Wed, Feb 12, 2014 at 2:30 PM, Benoît Berthonneau
 
 ben...@berthonneau.com wrote:
 
 
 
 Hi all,
 I need your opinion on how to tackle the following problem:
 In many projects we use a logger (it doesn't matter which
 implementation). It is often recommended to test whether the debug level is
 activated before logging a debug trace, like the following:
 if (logger.isDebugEnabled()) {
     logger.debug("blah " + i + " in the loop that contains " + max);
 }
 Now when you run unit tests on this kind of code you need to make a
 choice:
 run tests with INFO level, or run tests with ALL traces activated. I
 chose the second option in order to:
 * check that debug traces don't throw unwanted exceptions (like NPE)
 * have better code coverage in terms of covered lines
 But in terms of branch coverage we can never reach 100% :(
 To me the only way to cover this is to run the test suite 2 times:
 once with INFO traces configured, and once with ALL traces activated.
 Did you face this issue, and how did you solve it?
 Thanks,
 Benoît.
 
 
 
 
 --
 
 Cheers,
 
 Paul
 -
 To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
 For additional commands, e-mail: users-h...@maven.apache.org
 
 
 -- 
 Ron Wheeler
 President
 Artifact Software Inc
 email: rwhee...@artifact-software.com
 skype: ronaldmwheeler
 phone: 866-970-2435, ext 102
 
 
 -
 To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
 For additional commands, e-mail: users-h...@maven.apache.org
 

-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org