Re: Reasonable use of profiles?

2010-12-14 Thread fhomasp

After reading a bit of the debate I wonder a few things.  I read "stay away
from profiles" a lot, but I do find them to be very useful.

So what's the alternative to profiles?  Assume there is a modular project
with several jars, several wars and several ears.  Each of those artifacts
can be built for a different environment (development, test (1, 2, 3),
staging, validation, ...).

Then an ear/war can be deployed using Maven to those different environments,
be it from a local machine or Hudson or some other continuous integration
tool.

How would one automate such situations without profiles and without a huge
amount of redundant Maven XML?


-- 
View this message in context: 
http://maven.40175.n5.nabble.com/Reasonable-use-of-profiles-tp3300650p3304188.html
Sent from the Maven - Users mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org



Re: Execution specific assembly identifiers and suppressing jar-with-depends output

2010-12-14 Thread Ron Wheeler

What are you trying to build?
The project will build one artifact and the assembly plug-in will build 
another.

Are you sure you need 3 artifacts to be built?

We use Maven to build web services which include both a client jar and a 
war for the service, but we only need one execution.



Ron


On 14/12/2010 1:36 AM, Danny Thomas wrote:

Hi,

I'm using the assembly plugin to package my project. I currently have two 
assembly descriptors with two executions for my project which generates 
distributions of the project for two different audiences:

   <!-- Client distribution -->
   <execution>
     <id>client</id>
     <phase>package</phase>
     <goals>
       <goal>single</goal>
     </goals>
     <configuration>
       <descriptors>
         <descriptor>src/main/assembly/dist-client.xml</descriptor>
       </descriptors>
     </configuration>
   </execution>
   <!-- Developer distribution -->
   <execution>
     <id>developer</id>
     <phase>package</phase>
     <goals>
       <goal>single</goal>
     </goals>
     <configuration>
       <descriptors>
         <descriptor>src/main/assembly/dist-developer.xml</descriptor>
       </descriptors>
     </configuration>
   </execution>
 </executions>
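One possible consolidation, sketched here with heavy caveats: the assembly plugin's single goal accepts a per-execution filters list of properties files, so each execution could point one shared descriptor at its own filter file (e.g. one defining a dist.type property). Whether such properties reach descriptor interpolation depends on the plugin version in use, and the file names and the dist.type property below are made up for illustration:

```xml
<execution>
  <id>client</id>
  <phase>package</phase>
  <goals>
    <goal>single</goal>
  </goals>
  <configuration>
    <!-- hypothetical filter file defining e.g. dist.type=client -->
    <filters>
      <filter>src/main/assembly/filter-client.properties</filter>
    </filters>
    <descriptors>
      <!-- one shared descriptor referencing ${dist.type} -->
      <descriptor>src/main/assembly/dist.xml</descriptor>
    </descriptors>
  </configuration>
</execution>
```

A second execution would then differ only in its id and filter file.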

I'm using filters to substitute in the distributable jar filename, and they're 
definitely part of the solution, but I also need the final filename, the 
destination shell script names, etc. to reflect the execution, so I need 
expressions that are visible to the assembly. It's a command-line app and I 
display different available actions depending on the distribution type passed 
in as a system parameter, so the filtered shell script parameters and filenames 
don't have to differ at all except for the 'client' vs 'developer' designator.

I'm currently maintaining two descriptors and two sets of distribution 
resources, which I want to avoid, but I can't seem to find a way of getting an 
expression containing the execution id, or another parameter unique to the 
execution, through to the descriptor. I'd assume that if the expression is 
visible to the assembly, the plugin will also be able to substitute it into my 
shell scripts, consolidating the build into one assembly descriptor with two 
executions.

I've also had problems suppressing output when using the jar-with-dependencies 
assembly: I'd like to suppress the 'skipping' and Plexus configuration-merge 
related messages. Is this possible?

On the former, I have a Stack Overflow question if you're a member and you'd 
like to earn yourself some reputation:

http://stackoverflow.com/questions/4406660/how-do-i-include-maven-assembly-execution-specific-expressions-in-an-assembly-des

Cheers,
Danny


This email and any attachments may contain confidential and proprietary 
information of Blackboard that is for the sole use of the intended recipient. 
If you are not the intended recipient, disclosure, copying, re-distribution or 
other use of any of this information is strictly prohibited. Please immediately 
notify the sender and delete this transmission if you received this email in 
error.







Re: Reasonable use of profiles?

2010-12-14 Thread Stephen Connolly
On 14 December 2010 08:06, fhomasp thomas.peet...@realdolmen.com wrote:

 After reading a bit of the debate I wonder a few things.  I read stay away
 from profiles a lot but I do find them to be very useful.

 So what's the alternative on profiles?  Assuming there is a modular project
 with several jars, several wars and several ears.  Each of those artifacts
 can be built for a different environment (development, test (1,2,3),
 staging, validation,...)


Here is your issue.

The Maven Way is to build one artifact that works in any environment.
You don't build environment-specific artifacts the Maven Way.

You use things like JNDI, or properties files added to the classpath, to
provide the environment customisations.
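The classpath-properties approach can be sketched as follows. The property names, URLs and environment names are made up for illustration, and the per-environment override is inlined so the sketch is self-contained; in a real application it would be a config-<env>.properties resource on the classpath:

```java
import java.io.StringReader;
import java.util.Properties;

public class EnvConfig {
    // Resolve per-environment settings at runtime instead of baking them
    // into the artifact. The environment name would come from e.g. -Denv=staging.
    static Properties load(String env) {
        try {
            Properties defaults = new Properties();
            defaults.load(new StringReader("service.url=http://localhost:8080\n"));
            // In a real application this would read a classpath resource such as
            // getClass().getResourceAsStream("/config-" + env + ".properties");
            // the override is inlined here so the sketch is self-contained.
            Properties p = new Properties(defaults);
            if ("staging".equals(env)) {
                p.load(new StringReader("service.url=http://staging.example.com\n"));
            }
            return p;
        } catch (java.io.IOException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(load("development").getProperty("service.url"));
        System.out.println(load("staging").getProperty("service.url"));
    }
}
```

The same artifact then behaves differently per environment purely through which properties resource is on the classpath (or which -Denv is passed), never through a rebuild.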

In situations like branding, you produce a brand artifact for each of
the customer-specific brands and load that into your application, e.g.
by loading it from the classpath, or by installing it into the deployed
application.

If you have an existing application that is not designed this way,
then I can see that you might find it hard to avoid profiles but
you will have a better application if it is designed for the kind of
pluggable customizations I describe.

The Maven Way is about best practices, and one of the best practices
there is, is ensuring that you only build an artifact once and re-use
that tested artifact... it reduces the scope of testing (i.e. you only
have to test that the JNDI names exist, or that the branding is
correctly applied, rather than having to retest the entire application
because you rebuilt it with an alternate profile).

-Stephen


 Then an ear/war can be deployed using Maven to those different environments,
 be it from a local machine or Hudson or some other continuous integration
 tool.

 How would one automate such situations without profiles and without a huge
 amount of redundant maven xml?


 --
 View this message in context: 
 http://maven.40175.n5.nabble.com/Reasonable-use-of-profiles-tp3300650p3304188.html
 Sent from the Maven - Users mailing list archive at Nabble.com.




Re: Reasonable use of profiles?

2010-12-14 Thread Ron Wheeler


On 14/12/2010 3:06 AM, fhomasp wrote:

After reading a bit of the debate I wonder a few things.  I read stay away
from profiles a lot but I do find them to be very useful.

So what's the alternative on profiles?  Assuming there is a modular project
with several jars, several wars and several ears.  Each of those artifacts
can be built for a different environment (development, test (1,2,3),
staging, validation,...)



Separate operations (deployment configurations) from development (code, 
JARs and WARs).
Use JNDI or other configuration methods to set up your environments 
rather than code.


We use a project to hold our configurations but there is no Maven build 
required since we use JNDI in a single file for each of the environments 
that we support and have not achieved the level of complexity where a 
build would be helpful.
When we get to automating their construction, I wonder if Ant with XSLT 
might be a better tool for assembling our environments than Maven.


The other 70+ projects are environment-neutral and each one builds one 
or two artifacts (JARs or WARs) or contains a POM with no code.
We are building a portal that runs on Tomcat, with web services, portals 
and standalone batch jobs.
We support two client versions with different functionality and numerous 
test and development environments. We maintain production versions while 
developing one or more new versions.  We have a small team that has 
ranged from 3 to 5 people.


I have never used profiles, but I see a lot of people get into really 
complex situations with lots of frustration over profiles, and I have a 
sense that they are really easy to misuse and lead people away from 
simple, sensible solutions to their problems.


I also see conversations from people whose opinions I respect saying 
that they can be useful.
I have concluded that they are not a good thing to start with but may be 
helpful later, once the Maven environment is up and running, to optimize 
some functions.



Ron


Then an ear/war can be deployed using Maven to those different environments,
be it from a local machine or Hudson or some other continuous integration
tool.

How would one automate such situations without profiles and without a huge
amount of redundant maven xml?








Re: Reasonable use of profiles?

2010-12-14 Thread fhomasp



I didn't mean huge changes for the different platforms.  The usual changes
switched with profiles for the specific environments are exactly those
property files you're talking about.
So the codebase remains the same, but assuming I build my ear using the
property files for the validation platform, then the property files for the
jar dependencies in that ear would be automatically customized for the
validation platform as well.  This means that some language properties
change; also, the validation software needs to connect to another server on
the validation platform, so that URL and its properties need to be specific
as well.  And then there are the property files for JUnit and for the
different databases.  This way there is almost no room for human error: the
artifact is the same for all platforms, but with different config settings,
etc.

Or am I missing something? 



stephenconnolly wrote:
 
 On 14 December 2010 08:06, fhomasp thomas.peet...@realdolmen.com wrote:

 After reading a bit of the debate I wonder a few things.  I read stay
 away
 from profiles a lot but I do find them to be very useful.

 So what's the alternative on profiles?  Assuming there is a modular
 project
 with several jars, several wars and several ears.  Each of those
 artifacts
 can be built for a different environment (development, test (1,2,3),
 staging, validation,...)
 
 
 Here is your issue.
 
 The Maven Way is to build one artifact that works in any environment.
 You don't go building environment specific artifacts on the Maven Way.
 
 You use things like JNDI or properties files added to the classpath to
 provide the environment customisations
 
 In situations like branding, you produce a brand artifact for each of
 the customer specific brands and you would load that into your
 application, by e.g. loading from the classpath, or by installing into
 the deployed application.
 
 If you have an existing application that is not designed this way,
 then I can see that you might find it hard to avoid profiles but
 you will have a better application if it is designed for the kind of
 pluggable customizations I describe.
 
 The Maven Way is about best practices and one of the best
 practices there is is ensuring that you only build an artifact once
 and re-use that tested artifact... it reduces the scope of testing
 (i.e. you only have to test the JNDI names exist, or you only have to
 test that the branding is correctly applied, rather than have to
 retest the entire application because you have rebuilt it with the
 alternate profile.
 
 -Stephen
 

 Then an ear/war can be deployed using Maven to those different
 environments,
 be it from a local machine or Hudson or some other continuous integration
 tool.

 How would one automate such situations without profiles and without a
 huge
 amount of redundant maven xml?


 --
 View this message in context:
 http://maven.40175.n5.nabble.com/Reasonable-use-of-profiles-tp3300650p3304188.html
 Sent from the Maven - Users mailing list archive at Nabble.com.

 
 
 

-- 
View this message in context: 
http://maven.40175.n5.nabble.com/Reasonable-use-of-profiles-tp3300650p3304241.html
Sent from the Maven - Users mailing list archive at Nabble.com.




maven dependency plugin exclude test dependencies

2010-12-14 Thread Erwin Mueller
Hello,

Can I exclude test dependencies in the maven-dependency-plugin? I'm 
using this plugin to copy all dependencies to a specific directory so the 
izpack plugin can package them into an installation application. However, it 
seems the dependency plugin is also copying the test dependencies of the 
project, like junit, fest-swing-junit, etc. I'd like to exclude them without 
specifying each dependency explicitly in the excludeGroupIds tag.
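For the copy-dependencies goal, scope filtering avoids listing dependencies one by one: includeScope set to runtime selects compile- and runtime-scoped dependencies and leaves test-only ones out. A sketch (the execution id and output directory name are arbitrary):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-runtime-deps</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- runtime scope = compile + runtime; test-only deps are skipped -->
        <includeScope>runtime</includeScope>
        <outputDirectory>${project.build.directory}/izpack-deps</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```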

Thank you, Erwin.
-- 
Erwin Mueller, erwin.muel...@deventm.org
http://www.global-scaling-institute.de/




Re: Reasonable use of profiles?

2010-12-14 Thread Stephen Connolly
On 14 December 2010 08:50, fhomasp thomas.peet...@realdolmen.com wrote:



 I didn't mean huge changes for the different platform.  The usual changes
 for the specific environments switched with profiles are usually those
 property files you're talking about.
 So the codebase remains the same, but assuming I build my ear using property
 files for the validation platform, then the property files for the jar
 dependencies in that ear would be automatically customized for the
 validation platform as well.  This means that some language properties
 change, also the validation software needs to connect to another server,
 also on the validation platform, so that url and its properties need to be
 specific as well.  And then also the property files for JUnit and for the
 different databases.  This way there is almost no human error possible, the
 artifact is the same for all platforms, but with different config settings,
 etc.

 Or am I missing something?

OK, I used to do consulting for the pharma sector.  The FDA would hang
you for a short-cut if they heard you were rebuilding the artifact and
not doing a full regression test on the modified artifact ;-)

[OK, so there are ways and means you could make it work and not be
hung by the FDA, but that's a lot of process you'd be putting in place
and a lot of extra validation and a lot of extra consultant hours to
spend... (why did I leave the pharma sector... oh yeah I hate the
paperwork!)]

You are REBUILDING your artifact for each use case... that is not the
Maven Way...

Just to be clear, if you want to rebuild the artifact for each
environment, then profiles is the solution I would choose... but I
would NEVER want to rebuild the artifact for each environment...

The Maven Way is to find a way of building, packaging and deploying
your application such that you _never_ need to rebuild your artifact
to run it in a different environment... that way you can build the
artifact, test it in your test environment, qualify it for release in
your qualification environment, and deploy it in your customer
environment, and the whole time the SHA1 and MD5 of your artifact is
the exact same, each step of the process.
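The checksum argument can be illustrated with a minimal sketch, where the byte arrays stand in for a real EAR: a byte-identical artifact always yields the same digest, so a recorded digest proves the promoted artifact is the one that was tested.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class ChecksumDemo {
    // Hex-encoded SHA-1 of a byte array; identical bytes always give an
    // identical digest, so a promoted artifact can be proven to be the
    // exact one that was tested.
    static String sha1(byte[] bytes) {
        try {
            StringBuilder sb = new StringBuilder();
            for (byte b : MessageDigest.getInstance("SHA-1").digest(bytes)) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (java.security.NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-1 is available on every JRE
        }
    }

    public static void main(String[] args) {
        byte[] tested = "the one artifact".getBytes(StandardCharsets.UTF_8);
        byte[] promoted = "the one artifact".getBytes(StandardCharsets.UTF_8);
        // Identical bytes give an identical digest.
        System.out.println(sha1(tested).equals(sha1(promoted))); // prints "true"
    }
}
```

Conversely, any rebuild (different timestamps inside the archive, different compiler output) changes the bytes and therefore the digest, which is exactly what a strict quality process would flag.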

If you rebuild at any point, then a strict quality process would
mandate one of two things:
1. Restart testing back at the start
or
2. Do a diff of the two artifacts, identify all changes, make a risk
assessment of the impact of each change, and then, on the basis of the
assessed risk, do strategic testing in the environment you are deploying
to... and also put in place a process of review for the risk assessments,
to ensure that they are being performed correctly and that the additional
testing is not missing issues that would have been caught by just
restarting testing from the start.

Now if quality is not your thing, I can see why you might think that
REBUILDING for each environment might be OK... but I happen to
have been 'tainted' by working in heavily regulated industries and
trained in GMP, GLP, cGCP... in short, I have had quality drilled into
me, at what some suspect is a cellular level, and every fiber of my
being screams... never ever rebuild an artifact just so you can deploy
it in a different environment... rebuild to fix bugs... rebuild to add
features... rebuild because the artifact will/should be better

[/rant]

-Stephen




 stephenconnolly wrote:

 On 14 December 2010 08:06, fhomasp thomas.peet...@realdolmen.com wrote:

 After reading a bit of the debate I wonder a few things.  I read stay
 away
 from profiles a lot but I do find them to be very useful.

 So what's the alternative on profiles?  Assuming there is a modular
 project
 with several jars, several wars and several ears.  Each of those
 artifacts
 can be built for a different environment (development, test (1,2,3),
 staging, validation,...)


 Here is your issue.

 The Maven Way is to build one artifact that works in any environment.
 You don't go building environment specific artifacts on the Maven Way.

 You use things like JNDI or properties files added to the classpath to
 provide the environment customisations

 In situations like branding, you produce a brand artifact for each of
 the customer specific brands and you would load that into your
 application, by e.g. loading from the classpath, or by installing into
 the deployed application.

 If you have an existing application that is not designed this way,
 then I can see that you might find it hard to avoid profiles but
 you will have a better application if it is designed for the kind of
 pluggable customizations I describe.

 The Maven Way is about best practices and one of the best
 practices there is is ensuring that you only build an artifact once
 and re-use that tested artifact... it reduces the scope of testing
 (i.e. you only have to test the JNDI names exist, or you only have to
 test that the branding is correctly applied, rather than have to
 retest the entire application because you have rebuilt it 

Re: Reasonable use of profiles?

2010-12-14 Thread Anders Hammar
One other issue with rebuilding the artifacts is that you're going to have a
problem if you want to do it the Maven way by deploying to a repo. As there
can only be one flavor of a specific version of an artifact, you can't
rebuild without bumping the version. (And yes: you should never ever
replace/remove/alter a released artifact.)

/Anders

On Tue, Dec 14, 2010 at 10:11, Stephen Connolly 
stephen.alan.conno...@gmail.com wrote:

 On 14 December 2010 08:50, fhomasp thomas.peet...@realdolmen.com wrote:
 
 
 
  I didn't mean huge changes for the different platform.  The usual changes
  for the specific environments switched with profiles are usually those
  property files you're talking about.
  So the codebase remains the same, but assuming I build my ear using
 property
  files for the validation platform, then the property files for the jar
  dependencies in that ear would be automatically customized for the
  validation platform as well.  This means that some language properties
  change, also the validation software needs to connect to another server,
  also on the validation platform, so that url and its properties need to
 be
  specific as well.  And then also the property files for JUnit and for the
  different databases.  This way there is almost no human error possible,
 the
  artifact is the same for all platforms, but with different config
 settings,
  etc.
 
  Or am I missing something?

 OK, I used to do consulting for the pharma sector.  The FDA would hang
 you for a short-cut if they heard you were rebuilding the artifact and
 not doing a full regression test on the modified artifact ;-)

 [OK, so there are ways and means you could make it work and not be
 hung by the FDA, but that's a lot of process you'd be putting in place
 and a lot of extra validation and a lot of extra consultant hours to
 spend... (why did I leave the pharma sector... oh yeah I hate the
 paperwork!)]

 You are REBUILDING your artifact for each use case... that is not the
 Maven Way...

 Just to be clear, if you want to rebuild the artifact for each
 environment, then profiles is the solution I would choose... but I
 would NEVER want to rebuild the artifact for each environment...

 The Maven Way is to find a way of building, packaging and deploying
 your application such that you _never_ need to rebuild your artifact
 to run it in a different environment... that way you can build the
 artifact, test it in your test environment, qualify it for release in
 your qualification environment, and deploy it in your customer
 environment, and the whole time the SHA1 and MD5 of your artifact is
 the exact same, each step of the process.

 If you rebuild at any point, then a strict quality process would
 mandate one of two things:
 1. Restart testing back at the start
 or
 2. Do a diff of the two artifacts, identify all changes, make a risk
 assessment of the impact of each change, and then on the basis of the
 assessed risk do strategic testing in the environment you are
 deploying to... and also put in place a process of review for the risk
 assessments to ensure that they are being performed correctly and that
 the additional testing is not missing issues that would have been
 caught by just Restarting testing back at the start

 Now if quality is not your thing I can see why you might think that
 REBUILDING for each environment might be ok... but I happen to
 have been 'tainted' by working in heavily regulated industries and
 trained in GMP, GLP, cGCP... in short I have had quality drilled into
 me, at what some suspect is a cellular level, and every fiber of my
 being screams... never ever rebuild an artifact just so you can deploy
 it in a different environment... rebuild to fix bugs... rebuild to add
 features rebuild because the artifact will/should be better

 [/rant]

 -Stephen

 
 
 
  stephenconnolly wrote:
 
  On 14 December 2010 08:06, fhomasp thomas.peet...@realdolmen.com
 wrote:
 
  After reading a bit of the debate I wonder a few things.  I read stay
  away
  from profiles a lot but I do find them to be very useful.
 
  So what's the alternative on profiles?  Assuming there is a modular
  project
  with several jars, several wars and several ears.  Each of those
  artifacts
  can be built for a different environment (development, test (1,2,3),
  staging, validation,...)
 
 
  Here is your issue.
 
  The Maven Way is to build one artifact that works in any environment.
  You don't go building environment specific artifacts on the Maven Way.
 
  You use things like JNDI or properties files added to the classpath to
  provide the environment customisations
 
  In situations like branding, you produce a brand artifact for each of
  the customer specific brands and you would load that into your
  application, by e.g. loading from the classpath, or by installing into
  the deployed application.
 
  If you have an existing application that is not designed this way,
  then I can see that you might find it hard to avoid 

Re: Reasonable use of profiles?

2010-12-14 Thread fhomasp

Your rant makes sense Stephen and I'm already glad I asked :-)

I honestly have never been in a development environment where rebuilding an
artifact was a problem.  But OK, I see the idea of keeping the build.  But
if everything is the same except for property files, what's the problem
with rebuilding?  OK, the SHA1 and MD5 are indeed different after a rebuild,
but I never thought that kind of thing could be a source of failures.
Although I have to admit that I'm far from a specialist on that matter.
Perhaps someone or something should 'taint' me?  Though that sounds more
awkward than intended ^^

I have yet to find a project where rebuilding an artifact is considered a
problem.  Usually testing is a combination of test frameworks applied to
the codebase, not the release; hence the idea of having unit tests as part
of the success vs. failure of a build.
Then a lot of project managers would want me to rebuild the artifact on a
machine with the same kind of environment as the target machine, which is
why some of my clients used Hudson: to be able to do just that and choose
the properties, with a profile.

Quality should always be a thing, imho.  But at this point I don't really
see a reason why rebuilding an artifact would break quality, though of
course, as I said, I'm not a specialist.  I've been a pro developer for 3
years.

-- 
View this message in context: 
http://maven.40175.n5.nabble.com/Reasonable-use-of-profiles-tp3300650p3304312.html
Sent from the Maven - Users mailing list archive at Nabble.com.




RE: Execution specific assembly identifiers and suppressing jar-with-depends output

2010-12-14 Thread Danny Thomas
Distributable zip files including readmes, shell scripts and a single jar with 
all dependencies. I've got a jar coming from the project, then the assembly 
plugin creates a jar with dependencies, and then the two distributions. The only 
difference between the two distributions is the names of the shell scripts, 
and a distribution-type parameter that changes the available 'actions' (i.e. 
command [action] options) when the application is invoked.

Danny

From: Ron Wheeler [rwhee...@artifact-software.com]
Sent: Tuesday, 14 December 2010 7:16 PM
To: users@maven.apache.org
Subject: Re: Execution specific assembly identifiers and suppressing 
jar-with-depends output

What are you trying to build?
The project will build one artifact and the assembly plug-in will build
another.
Are you sure you need 3 artifacts to be built?

We use maven to build web services which include both a client jar and a
war for the service but we only need 1 execution.


Ron


On 14/12/2010 1:36 AM, Danny Thomas wrote:
 Hi,

 I'm using the assembly plugin to package my project. I currently have two 
 assembly descriptors with two executions for my project which generates 
 distributions of the project for two different audiences:

    <!-- Client distribution -->
    <execution>
      <id>client</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/main/assembly/dist-client.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
    <!-- Developer distribution -->
    <execution>
      <id>developer</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/main/assembly/dist-developer.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>

 I'm using filters to substitute in the distributable jar filename, and 
 they're definitely part of the solution, but I also need the final filename, 
 the destination shell script names etc to represent the execution, so need 
 expressions that are visible to the assembly. It's a command line app and I 
 display different available actions depending on the distribution type passed 
 in on a system parameter, so the filtered shell script parameters and 
 filenames don't have to differ at all except for the 'client' vs 'developer' 
 designator.

 I'm currently maintaining two descriptors and set of distribution resources 
 which I want to avoid, but I can't seem to find a way of getting an 
 expression containing the execution id or another parameter unique to the 
 execution to the descriptor. I'd assume that if the expression is visible to 
 the assembly, the plugin will also be able to substitute it into my shell 
 scripts to consolidate the build into one assembly descriptor with two 
 executions.

 I've also had problems  suppressing output using the jar-with-dependencies 
 assembly:  I'd like to suppress the 'skipping' and Plexus configuration merge 
 related messages. Is this possible?

 On the former, I have a Stack Overflow question if you're a member and you'd 
 like to earn yourself some reputation:

 http://stackoverflow.com/questions/4406660/how-do-i-include-maven-assembly-execution-specific-expressions-in-an-assembly-des

 Cheers,
 Danny










Re: Reasonable use of profiles?

2010-12-14 Thread Stephen Connolly
On 14 December 2010 09:45, fhomasp thomas.peet...@realdolmen.com wrote:

 Your rant makes sense Stephen and I'm already glad I asked :-)

 I honestly have never been in a development environment where rebuilding an
 artifact was a problem.

But did you ever ask why it was OK to rebuild the artifact in those
environments?

Perhaps Joe was working for ABC Ltd, developing software.

ABC Ltd hires some Quality people because customers want a quality
product... the quality people say "Hey, you can't keep rebuilding the
artifact for each customer and saying all you did was change the
branding and therefore we don't have to retest"... there are many, many
meetings... eventually it becomes clear that Joe has not got the time
to refactor the build process so that ABC Ltd can release one artifact
and re-use it for each customer, and anyway Joe has some
technical/architectural perspective where he believes that this is not
possible for the type of software that ABC Ltd develops.

The Quality people run off in a sulk back to their offices... after a
while they come back and say "OK, you can rebuild the artifact for
each customer, but only if you add additional testing that shows that
there is no risk in rebuilding this artifact, as a result of your build
process and the nature of the changes between builds."

Joe does a small amount of work to help devise the tests, and the
Testing people have more work (but less than if they had to test every
build for every customer)...

Everything is going swimmingly...

Joe has a bright idea... he decides to leave ABC Ltd to form a
start-up, DEF Ltd... It's his idea, he knows the architecture, he sets
things up so that the build has low risk building for different
environments (after all, the Quality fight at ABC Ltd didn't really
affect him... he just took on board the lesson that you need to add
testing to show that rebuilding is low risk)...

Joe hires Fred to develop software...

Fred sees Joe rebuilding artifacts all the time... Fred sees the
Quality people Joe hired being ok with this... When Fred moves on,
what will he think? How will he feel about rebuilding artifacts?

Fred is missing the context of Joe's experience... he only has the
context of working for a company where rebuilding was OK (because they
had already validated rebuilding for their context before Fred
started).

 But ok, I see the idea of keeping the build.  But
 if everything is the same, except for property files, what's the problem
 with rebuilding?

There may not be any problem... but how can you be sure? Show me the
evidence that for your use case rebuilding is OK... if you have the
evidence then fine... you are in the clear... but in the absence of
evidence that rebuilding is OK then from a quality perspective you
have to assume that it is not ok.

 Ok, the SHA1 and MD5 are indeed different after a rebuild,
 but I never thought that kind of thing to be a source of possible failures.

How do you know that other SCM changes have not crept in? The contents
of that EAR have different checksums too, so now we have to do a diff
on the wars and jars in the ear... oh, and the jars in the wars are
also different... so now we have to do a diff of all the .class files
to show that they are the same, and you have code that references
File.separator and you have optimization turned on in your compiler,
and this build was on windows while that build was on linux... oops!
these artifacts behave differently... the one on windows won't work on
linux, but the one on linux works on windows except when it launches
the bat file that it writes... eeek!!! if only we had a repeatable,
consistent build that we just passed through all the environments, we
would have caught this earlier and either turned off the optimization
in javac, or used System.getProperty("file.separator").

 Although I have to admit that I'm far from a specialist on that matter.
 Perhaps someone or something should 'taint' me?  Though that sounds more
 awkward than intended ^^

 I have yet to find a project where rebuilding an artifact is considered a
 problem.

Correction: you have yet to work on a project where you were aware of
the work that went into showing why rebuilding was OK, or where a
quality team understood the risk.

 Usually the idea of the testing is a combination of test
 frameworks applied on the codebase, not the release.  Hence the idea of
 having unit tests as part of the success vs. fail of a build.
 Then a lot of project managers would want me to rebuild the artifact on a
 machine with the same kind of environment as the target machine.

Sounds like you've worked for a lot of Freds

 Which is
 why some of my clients used Hudson, to be able to do just that and choose
 the properties, with a profile.

 Quality should always be a thing, imho.  But at this point I don't really
 see a reason why rebuilding an artifact would break the quality, but of
 course as I said, I'm not a specialist.  I've been a pro developer for 3
 years.

I've been at this for 20...

Re: maven dependency plugin exclude test dependencies

2010-12-14 Thread Stephen Connolly
http://maven.apache.org/plugins/maven-dependency-plugin/copy-dependencies-mojo.html#excludeTransitive

On 14 December 2010 01:35, Erwin Mueller erwin.muel...@deventm.org wrote:
 Hello,

        Can I exclude test dependencies in the maven-dependency-plugin? I'm
 using this plugin to copy all dependencies to a specific directory so the
 izpack plugin can package them into an installation application. However, it
 seems the dependency plugin is also copying the test dependencies of the
 project, like junit, fest-swing-junit, etc. I'd like to exclude them without
 specifying each dependency explicitly in the excludeGroupIds tag.

 Thank you, Erwin.
 --
 Erwin Mueller, erwin.muel...@deventm.org
 http://www.global-scaling-institute.de/

 -
 To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
 For additional commands, e-mail: users-h...@maven.apache.org






Re: Reasonable use of profiles?

2010-12-14 Thread fhomasp

Ok...  Thanks!
FYI, the questions I ask are not posed to doubt your answers and
observations, merely so I can understand things better.


stephenconnolly wrote:
 
 
 But did you ever ask why it was OK to rebuild the artifact in those
 environments?  
 

Actually, yes.  The answer to that usually has something to do with building
on the target platform.  I've worked on one project where we had a
professional 'build manager', yet he worked with unit tests and integration
tests using TeamCity with build-test-fail/success principles.


stephenconnolly wrote:
 
 
 How do you know that other SCM changes have not crept in? The contents
 of that EAR have different checksums too, so now we have to do a diff
 on the wars and jars in the ear... oh and the jars in the wars are
 also different... so now we have to do a diff of all the .class files
 to show that they are the same and you have code that references
 File.separator and you have optimization turned on in your compiler
 and this build was on windows while that build was on linux... oops!
 these artifacts behave differently... the one on windows won't work on
 linux, but the one on linux works on windows except when it launches
 the bat file that it writes... eeek!!! if only we had a repeatable,
 consistent build that we just passed through all the environments, we
 would have caught this earlier and either turned off the optimization
 in javac, or used System.getProperty("file.separator").
 

I thought that was the whole point of pulling a tag and rebuilding that tag
using profiles on a machine with a continuous build tool such as Hudson or
TeamCity.  At any rate, that way we are absolutely sure that other SCM
changes have not crept in, provided the dependencies in the local/remote
repo are OK.
Though I do have to admit that I've seen charset problems rather often.
That might have something to do with this kind of building strategy, right?


stephenconnolly wrote:
 
 Correction, you have yet to work on a project where you were aware of
 the work that went into showing why rebuilding was OK, or where a
 quality team understood the risk
 

Fair enough, although I do doubt the amount of work that's being put in on
that account.


-- 
View this message in context: 
http://maven.40175.n5.nabble.com/Reasonable-use-of-profiles-tp3300650p3304419.html
Sent from the Maven - Users mailing list archive at Nabble.com.




War project - Links folder

2010-12-14 Thread tiggy

Hello everybody,

I don't see any way to import a maven project of type war with a linked
folder pointing to JSPs.

M2Eclipse takes my definition of maven-war-plugin as input:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <configuration>
            <webResources>
                <resource>
                    <directory>../shared-jsps/src/main/webapp</directory>
                    <targetPath>WEB-INF</targetPath>
                </resource>
            </webResources>
        </configuration>
    </plugin>

It should be translated like this in my .project (views, tags, layout, and
messages are subdirectories of shared-jsps/src/main/webapp):

<linkedResources>
    <link>
        <name>src/main/webapp/WEB-INF/layout</name>
        <type>2</type>
        <locationURI>workspace/shared-jsps/src/main/webapp/WEB-INF/layout</locationURI>
    </link>
    <link>
        <name>src/main/webapp/WEB-INF/messages</name>
        <type>2</type>
        <locationURI>workspace/shared-jsps/src/main/webapp/WEB-INF/messages</locationURI>
    </link>
    <link>
        <name>src/main/webapp/WEB-INF/tags</name>
        <type>2</type>
        <locationURI>workspace/eng-common-pub/src/main/webapp/WEB-INF/tags</locationURI>
    </link>
    <link>
        <name>src/main/webapp/WEB-INF/views</name>
        <type>2</type>
        <locationURI>workspace/shared-jsps/src/main/webapp/WEB-INF/views</locationURI>
    </link>
</linkedResources>

Currently what it does is copy all the files contained in
shared-jsps/src/main/webapp... I can't find a solution to this issue.

Any ideas would be appreciated?

Tiggy


-- 
View this message in context: 
http://maven.40175.n5.nabble.com/War-project-Links-folder-tp3304454p3304454.html
Sent from the Maven - Users mailing list archive at Nabble.com.




Handling native dependencies via classifier(s)

2010-12-14 Thread Peter Bridge
My project depends on a 3rd party library for OpenGL + Java called JOGL
(http://jogamp.org/jogl/www/)

The library is currently not maven-ized and I'm trying to find the best
way to achieve that.  My progress is documented at:
http://jogamp.762907.n3.nabble.com/maven2-artifacts-tp1935908p1935908.html

To get started I'm just manually putting the JOGL jars into a local
repository.  Currently this is 4 jar files, and each jar has a dependency
on a couple of native libs (.dll, .jnilib, .so, etc.) which I'm zipping and
then unpacking again for my project with:

    <!-- Unpack native libs -->
    <plugin>
        <artifactId>maven-dependency-plugin</artifactId>
        <executions>
            <execution>
                <id>unpack-dependencies</id>
                <phase>generate-sources</phase>
                <goals>
                    <goal>unpack-dependencies</goal>
                </goals>
                <configuration>
                    <outputDirectory>${basedir}/lib</outputDirectory>
                    <includeTypes>zip</includeTypes>
                </configuration>
            </execution>
        </executions>
    </plugin>


For each jar I have a block like this:

<dependency>
    <groupId>com.jogamp.jogl</groupId>
    <artifactId>jogl-all</artifactId>
    <version>${jogl.version}</version>
</dependency>
<dependency>
    <groupId>com.jogamp.jogl</groupId>
    <artifactId>jogl-natives</artifactId>
    <version>${jogl.version}</version>
    <classifier>osx-universal</classifier>
    <type>zip</type>
</dependency>

Now this is going to get ugly quite fast, i.e. once I add all 5 supported
native platforms.  It also feels wrong, although I'm very new to
maven2.  Somehow I expected there would be a way to tie the native
dependencies together with the jar in a more elegant way, maybe
something like:

<dependency>
    <groupId>com.jogamp.jogl</groupId>
    <artifactId>jogl-all</artifactId>
    <version>${jogl.version}</version>
    <classifier>sources</classifier>
    <classifier>javadoc</classifier>
    <classifier>osx-universal</classifier>
    <classifier>windows</classifier>
    <classifier>linux</classifier>
    ...
</dependency>

I considered some kind of wrapper pom for each jar, but it also feels
like a work-around, i.e. the jar can't be used without the natives, so
somehow they should be configured as a single artifact/entity/dependency.

The next issue I trip over with native handling is eclipse:eclipse.
Looking at the sources, it seems to take account of classifiers for
'sources' and 'javadoc', but I don't see anything for handling of native
directories, i.e. the EclipseClasspathWriter and AbstractIdeSupportMojo
would need to output something like this:

<classpathentry kind="var"
    path="M2_REPO/com/jogamp/jogl/jogl-all/2.0.251/jogl-all-2.0.251.jar">
    <attributes>
        <attribute
            name="org.eclipse.jdt.launching.CLASSPATH_ATTR_LIBRARY_PATH_ENTRY"
            value="client/lib"/>
    </attributes>
</classpathentry>


Now I'm thinking I can't be the first person to trip over this, but
google isn't giving much help.  So I'm wondering how other people are
dealing with this?

Or is it just that maven currently isn't designed to work with
native dependencies?







Use error to start a new process?

2010-12-14 Thread gorgophol

Hi,

I'm trying to do the following:

My POM uses the exec-maven-plugin to start an external process. 
This is used to install software after the build process created an
installer. 
If this process can't finish the installation, my maven build fails. 
That is how it should be. 

In this case I need to start another process, which re-installs the old
version, so people can do their tests. 

Is there any possibility to configure the exec-maven-plugin (or any other
existing plugin) in a way that I can catch the maven build error of the
install-execute process and use it as a trigger that starts the
re-install process?

Thanks.
Ben

-- 
View this message in context: 
http://maven.40175.n5.nabble.com/Use-error-to-start-a-new-process-tp3304508p3304508.html
Sent from the Maven - Users mailing list archive at Nabble.com.




Re: maven dependency plugin exclude test dependencies

2010-12-14 Thread Erwin Mueller
Hello,

I did know about this, but it's not what I want. I want to exclude all
dependencies in the test scope; I don't want to exclude transitive
dependencies.

But you pointed me to the documentation, which I had overlooked. I need
excludeScope. Thank you.
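For the archives, a minimal copy-dependencies configuration along these lines should leave test-only dependencies out (the output directory is illustrative; note that some plugin versions reject excludeScope=test outright, in which case includeScope=runtime achieves the same effect, since "runtime" covers only the compile and runtime scopes):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-dependencies</id>
            <phase>package</phase>
            <goals>
                <goal>copy-dependencies</goal>
            </goals>
            <configuration>
                <!-- illustrative output directory for the izpack packaging -->
                <outputDirectory>${project.build.directory}/izpack-lib</outputDirectory>
                <!-- include only compile- and runtime-scoped dependencies,
                     leaving out junit, fest-swing-junit, etc. -->
                <includeScope>runtime</includeScope>
            </configuration>
        </execution>
    </executions>
</plugin>
```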


-- 
Erwin Mueller, erwin.muel...@deventm.org
http://www.global-scaling-institute.de/




RE: Handling native dependencies via classifier(s)

2010-12-14 Thread Niels B Nielsen
Well, I can tell you how I deal with it...

I have all editions of our native libraries bundled inside a single jar
file that I created:
./win32/libsomething.dll
./win64/libsomething.dll
./unix/x32/something.so
etc.

and unpack that into lib (similar to what you did).
During test and execution we navigate to the right structure and set
java.library.path.
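As a sketch of that last step (the directory name win32 below is just one of the unpacked platform folders described above, so treat it as a placeholder), the surefire plugin can pass java.library.path to the test JVM:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <!-- "win32" is illustrative; point at the folder matching the build platform -->
        <argLine>-Djava.library.path=${basedir}/lib/win32</argLine>
    </configuration>
</plugin>
```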

The eclipse plugin surely has some shortcomings, especially in the native
areas. When setting up for eclipse, I use 'mvn eclipse:eclipse
-DdownloadSources -DdownloadJavadocs', and then just reference the library
in the launch configuration (java.library.path).

Hope that helps.

P.S. I would never add a dependency on sources and javadocs etc., as the
main resolution would transitively include these in the classpath.

Niels B Nielsen | Lead Engineer
J.P. Morgan | IBTech Global Rates Derivatives





provided scope and compilation error

2010-12-14 Thread El Arbi ABOUSSOROR
Hi

 

I have a multi-module project with a jar module and a war one.

Actually I don't want to deploy 2 of my jar module's dependencies to the war,
so I put scope "provided" in the jar module pom. But when I run mvn
install I get the following error:

 

[INFO] ------------------------------------------------------------------------
[INFO] Building Unnamed - plpm:plpm.example-web:war:0.0.1-SNAPSHOT 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.4.1:resources (default-resources) @ plpm.example-web ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ plpm.example-web ---
[WARNING] File encoding has not been set, using platform encoding Cp1252, i.e. build is platform dependent!
[INFO] Compiling 8 source files to D:\tamed\produit\workspace\plpm.example\plpm.example-web\target\classes
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] Failure executing javac, but could not parse the error:
An exception has occurred in the compiler (1.6.0_23). Please file a bug at
the Java Developer Connection (http://java.sun.com/webapps/bugreport) after
checking the Bug Parade for duplicates. Include your program and the
following diagnostic in your report. Thank you.
com.sun.tools.javac.code.Symbol$CompletionFailure: class file for
javax.persistence.GenerationType not found
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Unnamed - plpm:plpm.example:pom:0.0.1-SNAPSHOT  SUCCESS [0.203s]
[INFO] Unnamed - plpm:plpm.example-model:jar:0.0.1-SNAPSHOT  SUCCESS [0.876s]
[INFO] Unnamed - plpm:plpm.example-domain:jar:0.0.1-SNAPSHOT  SUCCESS [1.562s]
[INFO] Unnamed - plpm:plpm.example-web:war:0.0.1-SNAPSHOT  FAILURE [0.516s]
[INFO] Unnamed - plpm:plpm.example-generator:jar:0.0.1-SNAPSHOT  SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.235s
[INFO] Finished at: Tue Dec 14 16:25:43 CET 2010
[INFO] Final Memory: 10M/24M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile
(default-compile) on project plpm.example-web: Compilation failure
Failure executing javac, but could not parse the error:
An exception has occurred in the compiler (1.6.0_23). Please file a bug at
the Java Developer Connection (http://java.sun.com/webapps/bugreport) after
checking the Bug Parade for duplicates. Include your program and the
following diagnostic in your report. Thank you.
com.sun.tools.javac.code.Symbol$CompletionFailure: class file for
javax.persistence.GenerationType not found
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :plpm.example-web

 

 

 

 

 

 



RE: Depending on POM using inheritance breaks build

2010-12-14 Thread cowwoc


Eric Haszlakiewicz wrote:
 
-Original Message-
From: cowwoc [mailto:cow...@bbs.darktech.org]

Wayne Fay wrote:

 What do you recommend I do to work around this problem?

 Don't use the ${swt.classifier} for now...

 You may notice it was reported 5 years ago so if you actually want it
 fixed, you'll probably need to help on MNG-1388.


How do you deal with artifacts that need a different native library
depending on the build platform without the use of ${swt.classifier}?
 
 This feels to me like the old problem that there is no distinction between
 the pom file used to control the build, and the pom file that describes
 the artifact.  
 In a general sense, the properties that exist at the time that an artifact
 gets created (in this case swt.classifier, which was set by a profile)
 aren't remembered in the deployed artifact.
 
 The workaround I've seen suggested for this is to throw another layer of
 indirection in there and have a separate pom for each possible value of
 your variable.  
 In your case, I think you'd split project B into a B-windows-x86/pom.xml,
 B-whateverelse/pom.xml, etc... modules that each refer to the
 swt.classifier with a fixed value.  
 Then you'd need to set up an overall pom.xml that lists different modules
 based on the profile, 
 and you'll probably want some kind of common pom that all of the
 sub-modules depend on to pull in all the other normal dependencies.
 Obviously, this doesn't work well if you have more than just one variable
 you're dealing with, 
 and it seems (to me anyway) to be an unnecessarily complex and confusing way
 to set things up,
 but supposedly that's the maven way.
 
 eric
 

Gentlemen,

Do any of you care to comment on this problem? Is this convoluted approach
really the best available way of solving the problem?

Gili
-- 
View this message in context: 
http://maven.40175.n5.nabble.com/POM-inheritance-breaks-build-tp3263869p3304793.html
Sent from the Maven - Users mailing list archive at Nabble.com.




Re: provided scope and compilation error

2010-12-14 Thread Wayne Fay
 [ERROR] Failure executing javac, but could not parse the error:

 An exception has occurred in the compiler (1.6.0_23). Please file a bug at
 the Java Developer Connection (http://java.sun.com/webapps/bugreport)  after
 checking the Bug Parade for duplicates. Include your program and the
 following diagnostic in your report.  Thank you.

 com.sun.tools.javac.code.Symbol$CompletionFailure: class file for
 javax.persistence.GenerationType not found

Your error appears to have nothing to do with Maven but instead is
simply an error in the JDK. Follow the directions that were in your
build log and file a bug report after checking if it is already
posted, etc.

Alternatively, you could try adding a provided dependency on JPA --
perhaps it would resolve your issue?
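For example, a provided-scoped dependency on the JPA 1.0 API could look something like this (the exact coordinates vary with your JPA provider, so treat these as a guess):

```xml
<dependency>
    <groupId>javax.persistence</groupId>
    <artifactId>persistence-api</artifactId>
    <version>1.0</version>
    <!-- provided: on the compile classpath, supplied by the container at runtime -->
    <scope>provided</scope>
</dependency>
```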

Wayne




Re: War project - Links folder

2010-12-14 Thread Wayne Fay
 I don't see any way to be able to import a maven project of type war with
 linked folder pointing to jsp.

 directory../shared-jsps/src/main/webapp/directory

 Currently what it does is to copy all the files contained in
 shared-jsps/src/main/webapp... I don't find a solution to this issue.

Move the shared stuff to a separate project and depend on it in the
various projects you need to use it, and use war overlays to put the
files where they need to be in those projects.
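As a sketch (the coordinates for the extracted shared-JSP project are hypothetical), depending on the shared war is enough to trigger the war plugin's default overlay behaviour:

```xml
<dependency>
    <!-- hypothetical coordinates for the extracted shared-jsps project -->
    <groupId>com.example</groupId>
    <artifactId>shared-jsps</artifactId>
    <version>1.0-SNAPSHOT</version>
    <!-- war-type dependencies are overlaid onto the current webapp by maven-war-plugin -->
    <type>war</type>
</dependency>
```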

Wayne




Re: Use error to start a new process?

2010-12-14 Thread Wayne Fay
 Is there any possibility to configure the exec-maven-plugin (or any other
 existing plugin) in a way, that I can catch the maven build error of the
 install-execute-process and use it as a trigger that starts the
 re-install-process?

I doubt this functionality exists in any current plugin. But take a
look at the failsafe plugin and maybe you'll get some ideas on how you
can modify the exec-m-p to accommodate this requirement.

Otherwise I'd just handle this in a shell script/batch file that calls
Maven, checks the results, and then calls Maven again if necessary.

Wayne




Webstart: Setting the store and pass using system properties

2010-12-14 Thread Johannes Schneider

Hi guys,

at the moment I am trying to figure out how to use the WebStart plugin.
Everything works fine when I add the path to the keystore location and
the keystore password into the pom. But of course I'd prefer to store
those values within settings.xml or use command line parameters.

But the webstart plugin does not pick up the values when called like this:
mvn package -Dstorepass=mypass

Any ideas?


Thanks,

Johannes
-- 
Johannes Schneider - blog.cedarsoft.com




Re: Webstart: Setting the store and pass using system properties

2010-12-14 Thread Johannes Schneider

Think I have solved it:

<sign>
    <keystore>${keystore.location}</keystore>
    <storepass>${keystore.pass}</storepass>
    <alias>${keystore.alias}</alias>
</sign>
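The matching properties can then live in settings.xml (the values below are placeholders) or be passed on the command line with -Dkeystore.pass=... etc.:

```xml
<settings>
  <profiles>
    <profile>
      <id>keystore</id>
      <properties>
        <!-- placeholder values -->
        <keystore.location>/home/johannes/.keystore</keystore.location>
        <keystore.pass>mypass</keystore.pass>
        <keystore.alias>myalias</keystore.alias>
      </properties>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>keystore</activeProfile>
  </activeProfiles>
</settings>
```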





-- 
Johannes Schneider - blog.cedarsoft.com




Re: Reasonable use of profiles?

2010-12-14 Thread Brian Topping
These replies have been incredibly helpful, thanks especially to Ron and 
Stephen for your investment in them!

Very accidentally, I have fallen into the MV-SOA pattern.  I am using Mule 
for the services container with the VM connector for speed and ease of 
iterative development, knowing that I can trade out the connector once the site 
scales into the datacenter and needs it.

At the webapp level, what I've done is focus all command dispatches to a small 
class that either a) knows how to reach Mule or b) returns some mock objects.  
I've found developers who are good with both backend architecture and user 
interfaces are very rare.  In agile environments, UI folks often are at the 
front line of working with business stakeholders though, and if they can't see 
the results of their work without also wiring the backend, they are quickly 
stuck.  In small team situations, stakeholders respond to this by insisting 
that the architect is smoking something bad, and the architect is given an 
ultimatum to get the UI folks productive again or be replaced.

In response to this, I've found that step one is making the command messages 
to the SOA core a separate project shared between these realms.  Next, the 
web build is set up with a default implementation of the dispatcher that 
returns these command messages with mocked results.  Practically, this is 
managed as a named Spring bean.  

Things change when a developer includes a single library that has the alternate 
implementation of the command dispatcher.  When this library is loaded, its 
alternate definition of the dispatcher overrides the mock dispatcher and is 
injected into the webapp accordingly.  Now, when commands are dispatched, they 
transparently go to the SOA core instead of returning mocks.  This could also 
be done with JNDI registrations and solves the same problem.

This works at an organizational level because UI devs can create mock 
implementations in their own sandbox, then throw them over the fence to the 
core developers with very little documentation.  A picture speaks a thousand 
words. 

All these artifacts are automatically deployed to Nexus by CI with sources 
after every checkin, so a developer who needs to trace source in the 
complementary project need not have privileges on both projects.  More 
restrictive environments are obviously possible.  Along with Nexus, my svn repo 
is also slaved to LDAP, so authorization is easy to manage.

One of the last things to note here: Spring or JNDI is not necessary for 
smaller projects; just use the classpath, whose namespace is naturally the sum 
of all loaded artifacts.  If a build includes a new artifact and that artifact 
includes additional configuration, use it.  Using the same name for the 
configuration file in all artifacts helps here: when doing a getResources() for 
a single filename, you'll be given a list of all classpath files that exist 
across your various jars.

Brian





Re: Reasonable use of profiles?

2010-12-14 Thread Ron Wheeler

On 14/12/2010 4:10 PM, Brian Topping wrote:

These replies have been incredibly helpful, thanks especially to Ron and 
Stephen for your investment in them!

Very accidentally, I have fallen into the MV-SOA pattern.  I am using Mule 
for the services container with the VM connector for speed and ease of iterative 
development, knowing that I can trade out the connector once the site scales into the 
datacenter and needs it.

At the webapp level, what I've done is focus all command dispatches to a small class that 
either a) knows how to reach Mule or b) returns some mock objects.  I've found developers 
who are good with both backend architecture and user interfaces are very rare.  In 
agile environments, UI folks often are at the front line of working with 
business stakeholders though, and if they can't see the results of their work without 
also wiring the backend, they are quickly stuck.  In small-team situations, stakeholders 
respond by insisting that the architect is smoking something bad, and the architect is 
given an ultimatum: get the UI folks productive again or be replaced.

As a response to this, step one is making the command messages to the SOA 
core a separate project shared between these realms.  Next, the 
web build is set up with a default implementation of the dispatcher that 
returns these command messages with mocked results.  Practically, this is 
managed as a named Spring bean.

Things change when a developer includes a single library that has the alternate 
implementation of the command dispatcher.  When this library is loaded, its 
alternate definition of the dispatcher overrides the mock dispatcher and is 
injected into the webapp accordingly.  Now, when commands are dispatched, they 
transparently go to the SOA core instead of returning mocks.  JNDI 
registrations could also solve the same problem.

This works at an organizational level because UI devs can create mock 
implementations in their own sandbox, then throw them over the fence to the 
core developers with very little documentation.  A picture is worth a thousand 
words.

All these artifacts are automatically deployed to Nexus by CI with sources 
after every check-in, so a developer who needs to trace source in the 
complementary project need not have privileges on both projects.  More 
restrictive environments are obviously possible.  Along with Nexus, my svn repo 
is also slaved to LDAP, so authorization is easy to manage.

One last thing to note here: Spring or JNDI is not necessary for 
smaller projects; just use the classpath, whose namespace is naturally the sum 
of all loaded artifacts.  If a build includes a new artifact and that artifact 
includes additional configuration, use it.  Using the same name for the 
configuration file in all artifacts helps here: when doing a getResources() for 
a single filename, you'll be given a list of all classpath files that exist 
across your various jars.
We were so lucky to get Spring early by hiring a strong architect to 
help get us started.
I would recommend that managers of new projects look at Spring. It takes 
a bit of getting used to if you have a history of development with 
traditional tools, but it is really a great approach.


JNDI is so simple that I would recommend getting site-specific stuff in there 
as quickly as possible. We discovered the real use of JNDI late in the 
project. We wasted a lot of time and energy developing alternative 
solutions to simple configuration problems that JNDI and Spring solved 
rather effortlessly. We read the JNDI documentation and spent a lot of 
time wondering why we could not find the complex details when, in fact, 
there weren't any.
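The lookup side really is that small; a sketch follows. The entry name is hypothetical, and in a container such as Tomcat it would be declared in the webapp's context configuration:

```java
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

// Reading a site-specific setting from JNDI is a two-line lookup;
// the container owns the value, the code only names it.
public class SiteConfig {
    static String lookup(String name) throws NamingException {
        Context env = (Context) new InitialContext().lookup("java:comp/env");
        return (String) env.lookup(name);
    }

    public static void main(String[] args) {
        try {
            System.out.println(lookup("config/siteName"));  // hypothetical entry
        } catch (NamingException outsideContainer) {
            // Outside a container there is no initial context to consult.
            System.out.println("no JNDI context available: " + outsideContainer);
        }
    }
}
```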


Ron


Brian










Re: Use error to start a new process?

2010-12-14 Thread Stephen Connolly
antrun anyone

On 14 December 2010 17:15, Wayne Fay wayne...@gmail.com wrote:
 Is there any possibility to configure the exec-maven-plugin (or any other
 existing plugin) in a way, that I can catch the maven build error of the
 install-execute-process and use it as a trigger that starts the
 re-install-process?

 I doubt this functionality exists in any current plugin. But take a
 look at the failsafe plugin and maybe you'll get some ideas on how you
 can modify the exec-m-p to accommodate this requirement.

 Otherwise I'd just handle this in a shell script/batch file that calls
 Maven, checks the results, and then calls Maven again if necessary.
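A rough POSIX-shell sketch of that wrapper idea; the Maven goal and profile names in the comment are placeholders, not a real project's commands:

```shell
#!/bin/sh
# Run the build; if it fails, run a recovery command once, then retry.
run_with_retry() {
    build="$1"
    recover="$2"
    if $build; then
        return 0                  # first attempt succeeded
    fi
    echo "build failed; running recovery step" >&2
    $recover || return 1          # recovery itself failed: give up
    $build                        # second and final attempt
}

# A real invocation might look like (goal and profile names hypothetical):
#   run_with_retry "mvn install" "mvn -Pre-install install"
```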

 Wayne







Re: POM inheritance breaks build

2010-12-14 Thread Zac Thompson
I posted a reply to this on stackoverflow, since that's where you
included more details.  The bottom line was that the properties are
*not* defined in A or B, only in *profiles* in A and B.  And there's
the rub: those profiles do not apply when building C, and the property
is not defined anywhere in C's pom.
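To illustrate the point (the property name here is made up for the example): a property declared only inside a profile in parent A is invisible to C unless that profile happens to be active, whereas a top-level property is part of the effective POM of every inheriting project:

```xml
<!-- In parent A: this property exists only while profile 'dev' is active,
     so project C, which merely depends on B, never sees it. -->
<profiles>
  <profile>
    <id>dev</id>
    <properties>
      <lib.version>1.2</lib.version>
    </properties>
  </profile>
</profiles>

<!-- Declaring it in A's top-level properties instead makes it resolvable
     for every project that inherits from A: -->
<properties>
  <lib.version>1.2</lib.version>
</properties>
```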

On Sat, Nov 13, 2010 at 11:42 AM, cowwoc cow...@bbs.darktech.org wrote:

 Hi,

 I've posted a simple testcase at
 http://stackoverflow.com/questions/4171222/maven-depending-on-inheriting-artifact-causes-build-error
 that demonstrates how POM inheritance causes builds to break.

 Given three POM files:

    * C depends on B.
    * B inherits from A.
    * I can build A and B
    * C fails to resolve properties defined in A or B, and as a result fails
 to locate its transitive dependencies. The build breaks.

 Please help!

 Thanks,
 Gili




Re: provided scope and compilation error

2010-12-14 Thread Anders Hammar
It's a strange error, but possibly you have a direct dependency on JPA from
your classes in the war that you haven't declared (you instead relied on a
transitive dependency on JPA for compilation, and now that it has been
removed, compilation fails).

Always declare your (direct) dependencies.
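In POM terms that advice would look something like the following in the war module; the JPA API coordinates shown are one common choice for this era and may differ in your setup:

```xml
<!-- Declare the JPA API that the war's own classes compile against, instead
     of relying on a transitive path another module may remove. -->
<dependency>
  <groupId>org.hibernate.javax.persistence</groupId>
  <artifactId>hibernate-jpa-2.0-api</artifactId>
  <version>1.0.0.Final</version>
  <!-- 'provided': needed at compile time, supplied by the container at runtime -->
  <scope>provided</scope>
</dependency>
```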

/Anders

On Tue, Dec 14, 2010 at 16:27, El Arbi ABOUSSOROR 
el-arbi.abousso...@sogeti.com wrote:

 Hi



 I have a multi modules project with a jar module and war one.

 Actually I don't want two of my jar module's dependencies deployed in the war,
 so I gave them provided scope in the jar module's pom. But when I run mvn
 install I get the following error:



 [INFO]
 

 [INFO] Building Unnamed - plpm:plpm.example-web:war:0.0.1-SNAPSHOT
 0.0.1-SNAPSHOT

 [INFO]
 

 [INFO]

 [INFO] --- maven-resources-plugin:2.4.1:resources (default-resources) @
 plpm.example-web ---

 [WARNING] Using platform encoding (Cp1252 actually) to copy filtered
 resources, i.e. build is platform dependent!

 [INFO] Copying 1 resource

 [INFO]

 [INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @
 plpm.example-web ---

 [WARNING] File encoding has not been set, using platform encoding Cp1252,
 i.e. build is platform dependent!

 [INFO] Compiling 8 source files to
 D:\tamed\produit\workspace\plpm.example\plpm.example-web\target\classes

 [INFO] -

 [ERROR] COMPILATION ERROR :

 [INFO] -

 [ERROR] Failure executing javac, but could not parse the error:

 An exception has occurred in the compiler (1.6.0_23). Please file a bug at
 the Java Developer Connection (http://java.sun.com/webapps/bugreport)
  after
 checking the Bug Parade for duplicates. Include your program and the
 following diagnostic in your report.  Thank you.

 com.sun.tools.javac.code.Symbol$CompletionFailure: class file for
 javax.persistence.GenerationType not found



 [INFO] 1 error

 [INFO] -

 [INFO]
 

 [INFO] Reactor Summary:

 [INFO]

 [INFO] Unnamed - plpm:plpm.example:pom:0.0.1-SNAPSHOT  SUCCESS [0.203s]

 [INFO] Unnamed - plpm:plpm.example-model:jar:0.0.1-SNAPSHOT  SUCCESS
 [0.876s]

 [INFO] Unnamed - plpm:plpm.example-domain:jar:0.0.1-SNAPSHOT  SUCCESS
 [1.562s]

 [INFO] Unnamed - plpm:plpm.example-web:war:0.0.1-SNAPSHOT  FAILURE [0.516s]

 [INFO] Unnamed - plpm:plpm.example-generator:jar:0.0.1-SNAPSHOT  SKIPPED

 [INFO]
 

 [INFO] BUILD FAILURE

 [INFO]
 

 [INFO] Total time: 3.235s

 [INFO] Finished at: Tue Dec 14 16:25:43 CET 2010

 [INFO] Final Memory: 10M/24M

 [INFO]
 

 [ERROR] Failed to execute goal
 org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile
 (default-compile) on project plpm.example-web: Compilation failure

 Failure executing javac, but could not parse the error:

 An exception has occurred in the compiler (1.6.0_23). Please file a bug at
 the Java Developer Connection (http://java.sun.com/webapps/bugreport)
  after
 checking the Bug Parade for duplicates. Include your program and the
 following diagnostic in your report.  Thank you.

 com.sun.tools.javac.code.Symbol$CompletionFailure: class file for
 javax.persistence.GenerationType not found

 - [Help 1]

 [ERROR]

 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e
 switch.

 [ERROR] Re-run Maven using the -X switch to enable full debug logging.

 [ERROR]

 [ERROR] For more information about the errors and possible solutions,
 please
 read the following articles:

 [ERROR] [Help 1]
 http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

 [ERROR]

 [ERROR] After correcting the problems, you can resume the build with the
 command

 [ERROR]   mvn <goals> -rf :plpm.example-web