Re: Repository scanning problem in 1.0?

2007-12-03 Thread ArneD


Brett Porter wrote:
 
 On 30/11/2007, ArneD [EMAIL PROTECTED] wrote:
 Do you think the repository scanning problem as such, as described in my
 original mail, should be filed as a second issue?
 
 Yes, it looks that way.
 
 - Brett
 
 

I filed http://jira.codehaus.org/browse/MRM-612

In my opinion, this should have a high priority - Archiva should always be
able to synchronize itself to the filesystem contents, shouldn't it?

Arne
-- 
View this message in context: 
http://www.nabble.com/Repository-scanning-problem-in-1.0--tf4897121.html#a14134939
Sent from the archiva-users mailing list archive at Nabble.com.



Re: Repository scanning problem in 1.0?

2007-11-29 Thread ArneD

I don't know if this is related, but now I see an NPE in the logs when using
"Update Database Now":

jvm 1| 2007-11-29 14:16:03,340 [Thread-6] ERROR org.codehaus.plexus.taskqueue.execution.TaskQueueExecutor:database-update - Error executing task
jvm 1| edu.emory.mathcs.backport.java.util.concurrent.ExecutionException: java.lang.NullPointerException
jvm 1|  at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.getResult(FutureTask.java:299)
jvm 1|  at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.get(FutureTask.java:118)
jvm 1|  at org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.waitForTask(ThreadedTaskQueueExecutor.java:159)
jvm 1|  at org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.run(ThreadedTaskQueueExecutor.java:127)
jvm 1| Caused by: java.lang.NullPointerException
jvm 1|  at org.apache.maven.archiva.database.updater.ProcessArchivaArtifactClosure.execute(ProcessArchivaArtifactClosure.java:56)
jvm 1|  at org.apache.commons.collections.CollectionUtils.forAllDo(CollectionUtils.java:388)
jvm 1|  at org.apache.maven.archiva.database.updater.JdoDatabaseUpdater.updateProcessed(JdoDatabaseUpdater.java:170)
jvm 1|  at org.apache.maven.archiva.database.updater.JdoDatabaseUpdater.updateAllProcessed(JdoDatabaseUpdater.java:111)
jvm 1|  at org.apache.maven.archiva.scheduled.executors.ArchivaDatabaseUpdateTaskExecutor.executeTask(ArchivaDatabaseUpdateTaskExecutor.java:78)
jvm 1|  at org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable$1.run(ThreadedTaskQueueExecutor.java:116)
jvm 1|  at edu.emory.mathcs.backport.java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:442)
jvm 1|  at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.run(FutureTask.java:176)
jvm 1|  at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:665)
jvm 1|  at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:690)
jvm 1|  at java.lang.Thread.run(Thread.java:801)





ArneD wrote:
 
 Hi all,
 
 first of all, congratulations to all Archiva developers for releasing the
 1.0 version!
 
 I started playing around with it a little bit, and ran into the following
 problem:
 
 After starting up a fresh instance with default configuration, I copied
 parts of my existing repository to Archiva's default internal repository.
 Then I used the "Scan Repository Now" button on the repository
 administration page. Afterwards, all artifacts were visible on the
 "Browse" page. So far so good.
 
 I then copied some more groups of my existing repo to Archiva's default
 internal repository, and used "Scan Repository Now" once again. But this
 time, the "Browse" page did not show the newly added groups. This was
 quite surprising, as the log output of RepositoryScanner clearly showed
 that Archiva DID walk over these new artifacts, without errors or
 warnings. I tried to use the "Update Database Now" button, but still no
 effect. 
 
 I finally touched one of the POM files in the missing groups, i.e. I gave
 the POM file a new timestamp. After doing "Scan Repository Now" again, the
 artifact appeared on the "Browse" page. When browsing into the artifact,
 however, I get the error message "Unable to find project model for [...]".
 
 Seems like a bug to me? I know that my playground scenario may be a little
 bit unusual, but I would expect that Archiva should always be able to
 synchronize itself to the file system contents, regardless of the files'
 timestamps.
 
 Thanks
 - Arne
 
 

-- 
View this message in context: 
http://www.nabble.com/Repository-scanning-problem-in-1.0--tf4897121.html#a14025508
Sent from the archiva-users mailing list archive at Nabble.com.



Re: Repository scanning problem in 1.0?

2007-11-29 Thread ArneD


Wendy Smoak-3 wrote:
 
 On Nov 29, 2007 6:07 AM, ArneD [EMAIL PROTECTED] wrote:
 
 After starting up a fresh instance with default configuration, I copied
 parts of my existing repository to Archiva's default internal repository.
 Then I used the "Scan Repository Now" button on the repository
 administration page. Afterwards, all artifacts were visible on the
 "Browse" page. So far so good.

 I then copied some more groups of my existing repo to Archiva's default
 internal repository, and used "Scan Repository Now" once again. But this
 time, the "Browse" page did not show the newly added groups. This was
 quite surprising, as the log output of RepositoryScanner clearly showed
 that Archiva DID walk over these new artifacts, without errors or
 warnings. I tried to use the "Update Database Now" button, but still no
 effect.

 I finally touched one of the POM files in the missing groups, i.e. I gave
 the POM file a new timestamp. After doing "Scan Repository Now" again,
 the artifact appeared on the "Browse" page. When browsing into the
 artifact, however, I get the error message "Unable to find project model
 for [...]".
 
 I've seen "Unable to find project model for [...]" in recent versions
 of Archiva, also when using existing repository contents, but have not
 been able to isolate it with publicly available repo data.
 
 Can you come up with steps to reproduce this, and attach zipped
 repository contents to a JIRA issue?  It shouldn't take more than a
 few artifacts to provoke the problem, once you know the cause.
 http://jira.codehaus.org/browse/MRM
 
 Thanks,
 -- 
 Wendy
 
 


I filed http://jira.codehaus.org/browse/MRM-608 for the "unable to find
project model" problem.

Do you think the repository scanning problem as such, as described in my
original mail, should be filed as a second issue?

Thanks
- Arne

-- 
View this message in context: 
http://www.nabble.com/Repository-scanning-problem-in-1.0--tf4897121.html#a14026445
Sent from the archiva-users mailing list archive at Nabble.com.



Repository scanning problem in 1.0?

2007-11-29 Thread ArneD

Hi all,

first of all, congratulations to all Archiva developers for releasing the
1.0 version!

I started playing around with it a little bit, and ran into the following
problem:

After starting up a fresh instance with default configuration, I copied
parts of my existing repository to Archiva's default internal repository.
Then I used the "Scan Repository Now" button on the repository
administration page. Afterwards, all artifacts were visible on the "Browse"
page. So far so good.

I then copied some more groups of my existing repo to Archiva's default
internal repository, and used "Scan Repository Now" once again. But this
time, the "Browse" page did not show the newly added groups. This was quite
surprising, as the log output of RepositoryScanner clearly showed that
Archiva DID walk over these new artifacts, without errors or warnings. I
tried to use the "Update Database Now" button, but still no effect. 

I finally touched one of the POM files in the missing groups, i.e. I gave
the POM file a new timestamp. After doing "Scan Repository Now" again, the
artifact appeared on the "Browse" page. When browsing into the artifact,
however, I get the error message "Unable to find project model for [...]".

Seems like a bug to me? I know that my playground scenario may be a little
bit unusual, but I would expect that Archiva should always be able to
synchronize itself to the file system contents, regardless of the files'
timestamps.
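My guess at what is going on (purely an assumption on my side, I have not checked Archiva's sources): if the scanner only queues files whose modification time is newer than the last scan, then artifacts copied in with their original, older timestamps get skipped silently. A minimal sketch of that suspected logic:

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the SUSPECTED behaviour (my assumption, not Archiva's actual code):
// a scanner that only processes files modified after the last scan will
// silently skip artifacts that were copied in with their original, older
// timestamps.
public class TimestampScanSketch {

    record ArtifactFile(String path, long lastModified) {}

    // Keep only the files such a scanner would consider "new".
    static List<ArtifactFile> newerThan(List<ArtifactFile> files, long lastScan) {
        return files.stream()
                .filter(f -> f.lastModified() > lastScan)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        long lastScan = 1000L;
        List<ArtifactFile> repo = List.of(
                new ArtifactFile("new-group/foo-1.0.pom", 500L),  // copied in, keeps old timestamp
                new ArtifactFile("touched/bar-1.0.pom", 2000L));  // touched after the last scan
        // Only the touched POM is picked up; foo-1.0.pom is skipped even
        // though it is physically present in the repository.
        System.out.println(newerThan(repo, lastScan).size()); // prints 1
    }
}
```

That would at least explain why touching a POM (giving it a new timestamp) makes the artifact show up.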

Thanks
- Arne

-- 
View this message in context: 
http://www.nabble.com/Repository-scanning-problem-in-1.0--tf4897121.html#a14025501
Sent from the archiva-users mailing list archive at Nabble.com.



Re: Building project root as recursive build

2007-11-20 Thread ArneD


Emmanuel Venisse wrote:
 
 
 ArneD wrote:
 I noticed two problems with building the root of a project as a recursive
 build. I'm still on 1.1-beta-3, but at least from the release notes I
 could not see anything indicating that this has changed for 1.1-beta-4 or
 the 1.1 RC.
 
 First of all, when adding a Maven 2.0+ project using a POM URL, with the
 checkbox "For multi modules project, load only root as recursive build"
 enabled, all sub-modules still get added to Continuum and must be removed
 manually afterwards.
 
 I can't remember if it was a problem in beta-3, but I tested with beta-4
 and 1.1 final (which should be released in a few days) and I have only the
 parent pom added.
 

Thanks, I'll try again with 1.1 after the release, and will file a bug if
it occurs again.



 
 
 The second problem is regarding automatic builds in case one of the
 dependencies has changed. Only the dependencies of the root module seem
 to
 be considered. The project information page on multi-module projects
 also
 only shows the dependencies of the root module.
 
 If you use the recursive mode without sub-modules, Continuum doesn't know
 the sub-modules, so it doesn't know their dependencies.
 
 In future versions, you'll be able to add dependencies between Continuum
 projects to create links between them, but not in 1.1.
 
 

Wouldn't it be possible and desirable for Continuum to evaluate the modules
section of the root POM and then the POMs of all relevant sub-modules in
order to determine the correct dependencies?
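The idea in sketch form (the Pom model below is made up for illustration, not Continuum's actual API): recursively union the dependencies of the root POM with those of every POM listed in its modules section.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

// Hypothetical sketch of the suggestion above. The Pom record is invented
// for illustration; Continuum's real project model looks different.
public class ModuleDeps {

    record Pom(List<String> modules, Set<String> dependencies) {}

    // Union of the root's own dependencies and those of all its modules,
    // recursing into nested multi-module builds.
    static Set<String> effectiveDependencies(Pom root, Map<String, Pom> modulePoms) {
        Set<String> all = new TreeSet<>(root.dependencies());
        for (String module : root.modules()) {
            Pom sub = modulePoms.get(module);
            if (sub != null) {
                all.addAll(effectiveDependencies(sub, modulePoms));
            }
        }
        return all;
    }

    public static void main(String[] args) {
        Pom root = new Pom(List.of("core", "web"), Set.of("junit"));
        Map<String, Pom> subs = Map.of(
                "core", new Pom(List.of(), Set.of("commons-lang")),
                "web", new Pom(List.of(), Set.of("servlet-api", "commons-lang")));
        System.out.println(effectiveDependencies(root, subs));
        // prints [commons-lang, junit, servlet-api]
    }
}
```

With the full dependency set known, a change to any sub-module dependency could trigger a build of the root project.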

Thanks
- Arne


-- 
View this message in context: 
http://www.nabble.com/Building-project-root-as-recursive-build-tf4841777.html#a13855880
Sent from the Continuum - Users mailing list archive at Nabble.com.



Building project root as recursive build

2007-11-19 Thread ArneD

I noticed two problems with building the root of a project as a recursive
build. I'm still on 1.1-beta-3, but at least from the release notes I could
not see anything indicating that this has changed for 1.1-beta-4 or the 1.1
RC.

First of all, when adding a Maven 2.0+ project using a POM URL, with the
checkbox "For multi modules project, load only root as recursive build"
enabled, all sub-modules still get added to Continuum and must be removed
manually afterwards.

The second problem concerns automatic builds when one of the dependencies
has changed. Only the dependencies of the root module seem to be
considered. The project information page for multi-module projects also
only shows the dependencies of the root module.

Am I missing something? Should I file an issue?

Thanks
- Arne
-- 
View this message in context: 
http://www.nabble.com/Building-project-root-as-recursive-build-tf4841777.html#a13852345
Sent from the Continuum - Users mailing list archive at Nabble.com.



Re: Rights management for Proxy Connectors

2007-11-06 Thread ArneD


Brett Porter wrote:
 
 I like this idea - can you file a feature request?
 

I filed http://jira.codehaus.org/browse/MRM-579

Cheers
- Arne
-- 
View this message in context: 
http://www.nabble.com/Rights-management-for-Proxy-Connectors-tf4752902.html#a13602241
Sent from the archiva-users mailing list archive at Nabble.com.



Rights management for Proxy Connectors

2007-11-05 Thread ArneD

Hello all,

is it possible to configure Archiva in a way that access to repo1 through a
proxy connector is available only to a few super users (= repository
managers)?

In my scenario, normal users should only be able to access the artifacts
available in the internal repository. In case they need a new artifact from
repo1, they ask the repository manager to add it. The repository manager
checks whether the artifact complies with e.g. company licensing rules,
whether the metadata on repo1 is ok (which is mostly but not always the
case), and so on. If everything is ok, he downloads the artifact including
all dependencies. Afterwards the artifact should be available to everybody
inside the company.

The "repository observer" role seems to allow access to the proxied
repositories as well, right?

Thanks
- Arne
-- 
View this message in context: 
http://www.nabble.com/Rights-management-for-Proxy-Connectors-tf4752902.html#a13590815
Sent from the archiva-users mailing list archive at Nabble.com.



Continuum 1.1 release?

2007-02-16 Thread ArneD

Hi Emmanuel and others,

now that Maven 2.0.5 has been released, is there any chance that we'll see
a Continuum 1.1 release soon? That would be great.

Thanks
- Arne
-- 
View this message in context: 
http://www.nabble.com/Continuum-1.1-release--tf3238534.html#a9001442
Sent from the Continuum - Users mailing list archive at Nabble.com.



Download artifacts from one repo and upload to another

2007-02-06 Thread ArneD

I'm looking for an easy way to download an artifact from one repository and
deploy it to another, including all transitive dependencies. Ideally in just
one step. Mirroring the whole repository is not a solution.

How would you do it? Any hints what to look at? 

Thanks
- Arne
-- 
View this message in context: 
http://www.nabble.com/Download-artifacts-from-one-repo-and-upload-to-another-tf3181645s177.html#a8829481
Sent from the Maven - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Lib dependencies in .war artifact

2006-10-09 Thread ArneD


Morgovsky, Alexander (US - Glen Mills) wrote:
 
 Hi.  I need some .jar's for compilation of a .war artifact, but I want
 to only include a subset of these in the WEB-INF/lib.  If I set the
 subset of the ones for inclusion with the scope of runtime for each,
 they will not be used for compilation, and the compilation will fail.
 Thus, the question is, how do I include a subset of .jar's which both
 are for compilation and runtime, but exclude all the ones which are for
 compilation only?  Thanks for your help. 
 

Try scope "provided" for those that you need for compilation only,
and scope "compile" for those that you need for compilation and runtime.
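For example (a sketch with illustrative coordinates - adapt the groupId/artifactId/version values to your own artifacts):

```xml
<dependencies>
  <!-- Needed to compile, but must NOT end up in WEB-INF/lib:
       "provided" dependencies are available at compile time only. -->
  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>servlet-api</artifactId>
    <version>2.4</version>
    <scope>provided</scope>
  </dependency>

  <!-- Needed at compile time AND at runtime: "compile" is the default
       scope, and these jars are packaged into WEB-INF/lib. -->
  <dependency>
    <groupId>commons-lang</groupId>
    <artifactId>commons-lang</artifactId>
    <version>2.1</version>
    <scope>compile</scope>
  </dependency>
</dependencies>
```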

Regards,
Arne

-- 
View this message in context: 
http://www.nabble.com/Lib-dependencies-in-.war-artifact-tf2408437.html#a6717565
Sent from the Maven - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Occassional NPEs while checking out through scm-local in scheduled b

2006-09-22 Thread ArneD

Hello Emmanuel,

I've found another NPE bug in DefaultContinuumScm which is easy to fix:

http://jira.codehaus.org/browse/CONTINUUM-937

Regards,
Arne


Emmanuel Venisse wrote:
 
 
 I've seen that you closed issue CONTINUUM-780 in the meantime. Is there
 any chance to replace just one module in my Continuum 1.0.3 installation
 with a snapshot version to get the fix? Thanks for a short hint about
 which module you fixed it in.
 
 I don't think you can change the jar in 1.0.3 with a 1.1-SNAPSHOT.
 
 You can checkout the 1.0.3 tag, apply my patch on continuum-core and build
 it.
 
 

-- 
View this message in context: 
http://www.nabble.com/Occassional-NPEs-while-checking-out-through-scm-local-in-scheduled-build-tf2289437.html#a6447722
Sent from the Continuum - Users mailing list archive at Nabble.com.



Re: Occassional NPEs while checking out through scm-local in scheduled b

2006-09-20 Thread ArneD

Hello Emmanuel,

I've seen that you closed issue CONTINUUM-780 in the meantime. Is there any
chance to replace just one module in my Continuum 1.0.3 installation with a
snapshot version to get the fix? Thanks for a short hint about which module
you fixed it in.

BTW, did you have a look at SCM-231?

Regards,
Arne


ArneD wrote:
 
 Hello Emmanuel,
 
 I added a comment to the existing issue at
 http://jira.codehaus.org/browse/CONTINUUM-780 because it's the same
 stacktrace.
 
 Regards,
 Arne
 
 
 Emmanuel Venisse wrote:
 
 Can you file an issue?
 
 Emmanuel
 
 ArneD wrote:
 It's Continuum 1.0.3, but I've updated maven-scm-api and
 maven-scm-provider-local (snapshot version
 maven-scm-provider-local-1.0-20060616.154545-4.jar).
 
 Arne
 
 
 
 
 Emmanuel Venisse wrote:
 What is your continuum version and your scm?

 Emmanuel

 ArneD wrote:
 Hi,

 occasionally some of our builds are failing during scheduled builds.
 Forced
 builds are working fine. I see the following stacktrace:

 org.apache.maven.continuum.scm.ContinuumScmException: Cannot checkout
 sources.
   at
 org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:276)
   at
 org.apache.maven.continuum.core.action.UpdateWorkingDirectoryFromScmContinuumAction.execute(UpdateWorkingDirectoryFromScmContinuumAction.java:58)
   at
 org.apache.maven.continuum.buildcontroller.DefaultBuildController.build(DefaultBuildController.java:166)
   at
 org.apache.maven.continuum.buildcontroller.BuildProjectTaskExecutor.executeTask(BuildProjectTaskExecutor.java:47)
   at
 org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.run(ThreadedTaskQueueExecutor.java:103)
   at java.lang.Thread.run(Thread.java:534)
 Caused by: java.lang.NullPointerException
   at
 org.apache.maven.continuum.scm.DefaultContinuumScm.convertScmResult(DefaultContinuumScm.java:414)
   at
 org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:244)
   ... 5 more

 It seems to be the following issue. A difference is that not all
 builds
 are
 failing, and the problems do not occur every time.

 http://jira.codehaus.org/browse/CONTINUUM-780

 Any ideas how to resolve it yet? Is somebody working on it?

 Regards, Arne


 
 
 
 
 
 

-- 
View this message in context: 
http://www.nabble.com/Occassional-NPEs-while-checking-out-through-scm-local-in-scheduled-build-tf2289437.html#a6412758
Sent from the Continuum - Users mailing list archive at Nabble.com.



Re: [m2] Include project version into generated site

2006-09-19 Thread ArneD


Alan Mosely wrote:
 
 Hi,
 
 On a related note I would like to have the version included in the url for
 further transparency.
 Any ideas?
 
 

We use the following distributionManagement entry in the project's pom.xml:

  <distributionManagement>
    <site>
      <id>my-site</id>
      <url>file://\\myserver\projectsites\${project.artifactId}-${project.version}</url>
    </site>
  </distributionManagement>

Regards,
Arne
-- 
View this message in context: 
http://www.nabble.com/-m2--Include-project-version-into-generated-site-tf2291271.html#a6386759
Sent from the Maven - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [M2] Problem with surefire report

2006-09-19 Thread ArneD


Luis Perea wrote:
 
 Surefire's report shows me the same details for each different test in
 my project. I've already looked at the *.txt and *.xml files generated
 by Surefire, and the *.txt seems correct (it shows the right log for the
 executed TestCase), but the *.xml files have the same info for all the
 tests.
 

That's probably the same issue as described here:
http://jira.codehaus.org/browse/MSUREFIRE-114

Regards,
Arne
-- 
View this message in context: 
http://www.nabble.com/-M2--Problem-with-surefire-report-tf2295962.html#a6386812
Sent from the Maven - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Occassional NPEs while checking out through scm-local in scheduled build

2006-09-19 Thread ArneD

Hello Emmanuel,

I added a comment to the existing issue at
http://jira.codehaus.org/browse/CONTINUUM-780 because it's the same
stacktrace.

Regards,
Arne


Emmanuel Venisse wrote:
 
 Can you file an issue?
 
 Emmanuel
 
 ArneD wrote:
 It's Continuum 1.0.3, but I've updated maven-scm-api and
 maven-scm-provider-local (snapshot version
 maven-scm-provider-local-1.0-20060616.154545-4.jar).
 
 Arne
 
 
 
 
 Emmanuel Venisse wrote:
 What is your continuum version and your scm?

 Emmanuel

 ArneD wrote:
 Hi,

 occasionally some of our builds are failing during scheduled builds.
 Forced
 builds are working fine. I see the following stacktrace:

 org.apache.maven.continuum.scm.ContinuumScmException: Cannot checkout
 sources.
at
 org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:276)
at
 org.apache.maven.continuum.core.action.UpdateWorkingDirectoryFromScmContinuumAction.execute(UpdateWorkingDirectoryFromScmContinuumAction.java:58)
at
 org.apache.maven.continuum.buildcontroller.DefaultBuildController.build(DefaultBuildController.java:166)
at
 org.apache.maven.continuum.buildcontroller.BuildProjectTaskExecutor.executeTask(BuildProjectTaskExecutor.java:47)
at
 org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.run(ThreadedTaskQueueExecutor.java:103)
at java.lang.Thread.run(Thread.java:534)
 Caused by: java.lang.NullPointerException
at
 org.apache.maven.continuum.scm.DefaultContinuumScm.convertScmResult(DefaultContinuumScm.java:414)
at
 org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:244)
... 5 more

 It seems to be the following issue. A difference is that not all builds
 are
 failing, and the problems do not occur every time.

 http://jira.codehaus.org/browse/CONTINUUM-780

 Any ideas how to resolve it yet? Is somebody working on it?

 Regards, Arne


 
 
 
 

-- 
View this message in context: 
http://www.nabble.com/Occassional-NPEs-while-checking-out-through-scm-local-in-scheduled-build-tf2289437.html#a6388904
Sent from the Continuum - Users mailing list archive at Nabble.com.



Occassional NPEs while checking out through scm-local in scheduled build

2006-09-18 Thread ArneD

Hi,

occasionally some of our builds are failing during scheduled builds. Forced
builds are working fine. I see the following stacktrace:

org.apache.maven.continuum.scm.ContinuumScmException: Cannot checkout
sources.
at
org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:276)
at
org.apache.maven.continuum.core.action.UpdateWorkingDirectoryFromScmContinuumAction.execute(UpdateWorkingDirectoryFromScmContinuumAction.java:58)
at
org.apache.maven.continuum.buildcontroller.DefaultBuildController.build(DefaultBuildController.java:166)
at
org.apache.maven.continuum.buildcontroller.BuildProjectTaskExecutor.executeTask(BuildProjectTaskExecutor.java:47)
at
org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.run(ThreadedTaskQueueExecutor.java:103)
at java.lang.Thread.run(Thread.java:534)
Caused by: java.lang.NullPointerException
at
org.apache.maven.continuum.scm.DefaultContinuumScm.convertScmResult(DefaultContinuumScm.java:414)
at
org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:244)
... 5 more

It seems to be the following issue. A difference is that not all builds are
failing, and the problems do not occur every time.

http://jira.codehaus.org/browse/CONTINUUM-780

Any ideas how to resolve it yet? Is somebody working on it?

Regards, Arne
-- 
View this message in context: 
http://www.nabble.com/Occassional-NPEs-while-checking-out-through-scm-local-in-scheduled-build-tf2289437.html#a6358643
Sent from the Continuum - Users forum at Nabble.com.



Re: Occassional NPEs while checking out through scm-local in scheduled build

2006-09-18 Thread ArneD

It's Continuum 1.0.3, but I've updated maven-scm-api and
maven-scm-provider-local (snapshot version
maven-scm-provider-local-1.0-20060616.154545-4.jar).

Arne




Emmanuel Venisse wrote:
 
 What is your continuum version and your scm?
 
 Emmanuel
 
 ArneD wrote:
 Hi,
 
 occasionally some of our builds are failing during scheduled builds.
 Forced
 builds are working fine. I see the following stacktrace:
 
 org.apache.maven.continuum.scm.ContinuumScmException: Cannot checkout
 sources.
  at
 org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:276)
  at
 org.apache.maven.continuum.core.action.UpdateWorkingDirectoryFromScmContinuumAction.execute(UpdateWorkingDirectoryFromScmContinuumAction.java:58)
  at
 org.apache.maven.continuum.buildcontroller.DefaultBuildController.build(DefaultBuildController.java:166)
  at
 org.apache.maven.continuum.buildcontroller.BuildProjectTaskExecutor.executeTask(BuildProjectTaskExecutor.java:47)
  at
 org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.run(ThreadedTaskQueueExecutor.java:103)
  at java.lang.Thread.run(Thread.java:534)
 Caused by: java.lang.NullPointerException
  at
 org.apache.maven.continuum.scm.DefaultContinuumScm.convertScmResult(DefaultContinuumScm.java:414)
  at
 org.apache.maven.continuum.scm.DefaultContinuumScm.updateProject(DefaultContinuumScm.java:244)
  ... 5 more
 
 It seems to be the following issue. A difference is that not all builds
 are
 failing, and the problems do not occur every time.
 
 http://jira.codehaus.org/browse/CONTINUUM-780
 
 Any ideas how to resolve it yet? Is somebody working on it?
 
 Regards, Arne
 
 
 

-- 
View this message in context: 
http://www.nabble.com/Occassional-NPEs-while-checking-out-through-scm-local-in-scheduled-build-tf2289437.html#a6362234
Sent from the Continuum - Users mailing list archive at Nabble.com.



[m2] Include project version into generated site

2006-09-18 Thread ArneD

Hi,

I would like to have the project version included in the generated site,
e.g. in the title, to make this information transparent to the user
directly on the welcome page.

I tried to use ${project.version} in the site.xml descriptor, but this is
not replaced. Only ${project.name} seems to be working.

Any hints?

Thanks,
Arne


-- 
View this message in context: 
http://www.nabble.com/-m2--Include-project-version-into-generated-site-tf2291271.html#a6363605
Sent from the Maven - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Update over scm-local does not delete files removed from source dir

2006-09-14 Thread ArneD


Emmanuel Venisse wrote:
 
 
 Do you think the suggested enhancement for scm-local would be useful and
 make sense? Not only in combination with ClearCase dynamic views but for
 other usage as well, e.g. testing.
 
 yes. Do you want to implement it?
 
 

Yes, I can try to do so and then provide a patch.

Regards,
Arne
-- 
View this message in context: 
http://www.nabble.com/Update-over-scm-local-does-not-delete-files-removed-from-source-dir-tf2257460.html#a6303945
Sent from the Continuum - Users forum at Nabble.com.



Re: Update over scm-local does not delete files removed from source dir

2006-09-14 Thread ArneD

Thanks for your answer, David.


continuum-3 wrote:
 
 my Continuum installation is accessing a file system directory using
 scm-local provider to get sources (BTW, the directory is on a ClearCase
 dynamic view). Updates are working fine as long as files are only changed
 or
 added. But when files are removed from the source directory, they still
 exist in the target directory. Especially after refactoring activities,
 this
 leads to build errors.
 
 Make sure you are running the clean goal in Maven.  When you do a clean it
 should delete this info from the target directories.  And that will fix
 your problem.
 

This is what I am doing. The problem is: My project source dir resides on a
dynamic view on drive Z:\VOB\some\where. My Continuum working directory is
e.g. D:\continuum-work\99.

What Continuum does (with help of scm-local) is to copy all files from
Z:\VOB\some\where to D:\continuum-work\99 before running the build. If
someone removes an outdated class from ClearCase, it won't be on
Z:\VOB\some\where any longer. As scm-local currently does not delete
anything from the checkout directory, the outdated class will still be there
at D:\continuum-work\99. The clean goal will only delete
D:\continuum-work\99\target. 



 BTW, you'll speed up your build if you get maven to put the target
 directories somewhere else, outside of the dynamic view, as dynamic
 views can be slow.
 

Did you manage to let Continuum operate directly on a dynamic view, in my
example on Z:\VOB\some\where, without copying to a working directory?

Probably you are only talking about Maven stand-alone usage. Then, I agree,
it is no problem, as long as you are running the clean goal.

Regards,
Arne
-- 
View this message in context: 
http://www.nabble.com/Update-over-scm-local-does-not-delete-files-removed-from-source-dir-tf2257460.html#a6309477
Sent from the Continuum - Users forum at Nabble.com.



Re: Update over scm-local does not delete files removed from source dir

2006-09-13 Thread ArneD


Emmanuel Venisse wrote:
 
 
 thanks a lot for your answer. A ClearCase dynamic view looks like a
 normal filesystem, so I think scm-local in principle is a fine solution.
 I think there wouldn't be much that dedicated dynamic-view support in
 the ClearCase SCM provider could add.
 
 Maybe it looks like a normal filesystem, but I think the ClearCase server
 knows if files are deleted and updates your local copy, right?
 

ClearCase of course has the information. But the problem is that a ClearCase
dynamic view resides on a virtual network share, e.g. \\view\some_view. You
can map it to a drive letter under Windows, e.g. Z:, but I don't see a way
to use it as Continuum's working directory. So the files have to be copied
from the dynamic view to the working directory - and that's what scm-local
is doing.



 
 Couldn't the scm-local adapter consider all files that are in the
 checkout
 dir but not in the source dir as deleted? This should be easy to
 implement
 and do the job.
 
 We can't, because some users (or the build) add files to the checkout
 directory, like the target directory, and they don't want to remove them
 at each build.
 

That's true. Maybe we could enhance scm-local to keep its own metadata? In
particular, 
scm-local could maintain a simple file, say .maven-scm-local, that contains
as plain text the list of files in the source directory, as seen during the
last checkout or update operation:

- During checkout, the file .maven-scm-local is created in the checkout base
directory. It's just a plain text file containing the list of files that have
been checked out.
- The update command looks for the file. If it is there, it compares the
contents of that file to the current source directory contents (including
subdirs). All files that are in .maven-scm-local but are no longer in the
source dir, have been deleted in the source dir. The update command
therefore removes them from the checkout dir.
- If for whatever reason .maven-scm-local is not there, the update command
won't delete any files. That way, we're backwards compatible.
- After completing the update process, the update command rewrites the
.maven-scm-local metadata file.
- Even the changelog command could interpret .maven-scm-local.
- For add and checkin commands, I don't think that changes are needed.
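The core of the update step would be a simple set difference. Just to sketch the idea (this is NOT the actual Maven SCM API, only an illustration of the proposal):

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of the proposed update step: compare the file list recorded in
// .maven-scm-local at the last checkout/update with the current source
// directory contents; whatever was recorded but is now gone from the
// source dir should be deleted from the checkout dir.
public class ScmLocalDiff {

    static Set<String> filesToDelete(Set<String> recordedAtLastUpdate, Set<String> currentSourceDir) {
        Set<String> deleted = new HashSet<>(recordedAtLastUpdate);
        deleted.removeAll(currentSourceDir); // recorded, but no longer present
        return deleted;
    }

    public static void main(String[] args) {
        Set<String> recorded = Set.of("A.java", "B.java", "C.java");
        Set<String> current = Set.of("A.java", "C.java", "D.java");
        // B.java was removed from the source dir, so the update would delete
        // it from the checkout dir; D.java is new and is simply copied over.
        System.out.println(filesToDelete(recorded, current)); // prints [B.java]
    }
}
```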

What do you think?

Regards,
Arne

-- 
View this message in context: 
http://www.nabble.com/Update-over-scm-local-does-not-delete-files-removed-from-source-dir-tf2257460.html#a6281593
Sent from the Continuum - Users forum at Nabble.com.



Re: Update over scm-local does not delete files removed from source dir

2006-09-13 Thread ArneD


Emmanuel Venisse wrote:
 
 Are you sure it isn't possible to checkout sources in a specific folder
 with cleartool?
 

It is possible with snapshot views. My ClearCase know-how is limited, but I
am quite sure it is not possible with dynamic views.

Do you think the suggested enhancement for scm-local would be useful and
make sense? Not only in combination with ClearCase dynamic views but for
other usage as well, e.g. testing.

Regards,
Arne



-- 
View this message in context: 
http://www.nabble.com/Update-over-scm-local-does-not-delete-files-removed-from-source-dir-tf2257460.html#a6289204
Sent from the Continuum - Users forum at Nabble.com.



Update over scm-local does not delete files removed from source dir

2006-09-12 Thread ArneD

Hello everybody,

my Continuum installation is accessing a file system directory using
scm-local provider to get sources (BTW, the directory is on a ClearCase
dynamic view). Updates are working fine as long as files are only changed or
added. But when files are removed from the source directory, they still
exist in the target directory. Especially after refactoring activities, this
leads to build errors.

Seems like a bug to me... Or am I missing something?

Many thanks,
Arne





-- 
View this message in context: 
http://www.nabble.com/Update-over-scm-local-does-not-delete-files-removed-from-source-dir-tf2257460.html#a6261871
Sent from the Continuum - Users forum at Nabble.com.



Re: Update over scm-local does not delete files removed from source dir

2006-09-12 Thread ArneD


Emmanuel Venisse wrote:
 
 scm-local was developed for tests, and it's used in some cases when an SCM
 provider isn't available, like dynamic-view support for ClearCase.
 
 It's a very simple provider that copies files from the sources directory,
 but it can't know whether some files were deleted, because it doesn't have
 any metadata to inform it.
 
 The only possibility you have for the moment is to remove the checkout
 directory and run a new build. But the best way would be to add dynamic-view
 support in the ClearCase provider. Only ClearCase users can add this
 feature, because we don't know ClearCase and we don't have access to a
 ClearCase server.
 

Hello Emmanuel,

thanks a lot for your answer. A ClearCase dynamic view looks like a normal
filesystem, so I think scm-local is in principle a fine solution. I think
there wouldn't be much that dedicated dynamic-view support in the
ClearCase SCM provider could add.

Couldn't the scm-local adapter consider all files that are in the checkout
dir but not in the source dir as deleted? This should be easy to implement
and do the job.

Best regards,
Arne

-- 
View this message in context: 
http://www.nabble.com/Update-over-scm-local-does-not-delete-files-removed-from-source-dir-tf2257460.html#a6267099
Sent from the Continuum - Users forum at Nabble.com.



Re: [POLL] Why switch to Maven?

2006-08-31 Thread ArneD


Eric Redmond wrote:
 
 Any more reasons? Care to expand these ideas?
 

Hi Eric,

for corporate users, I believe there are some additional issues beyond the
ones already named.

Especially in large companies, it is often unacceptable to let users
download artifacts directly from an Internet repository, not even through a
proxy. Companies need to have full control over all artifacts used in their
build processes, which means that only artifacts that have been explicitly
released by someone should be used. So when you start using Maven, you want
to immediately set up an internal repository, which, I believe, is still
too complicated a task.

This is especially true for setting up an internal plugin repository. We did
not find a comfortable way to initially fill an internal plugin repository.
You should be able to say "fill my plugin repository with all required JARs
for plugins x, y, and z". Our workaround was to set up a Maven reference
installation zip that already has all required plugin JARs in its local
repository, so that users who unzip that archive have everything they need.

Another point is, plugin JARs and production JARs should be separated. This
is already possible for remote repositories, but it should be possible for
the local repository as well. Tool JARs like xerces etc. are used both by
plugins and by projects. But just because some plugin needs a JAR file,
corporate users do not want to make that JAR available for builds. Corporate
users just don't want to get tools mixed up with production software.

Anyway, Maven is a great piece of software, full of great ideas. But some
things still need to be improved before Maven has chances to achieve a wide
acceptance.

Arne

-- 
View this message in context: 
http://www.nabble.com/-POLL--Why-switch-to-Maven--tf2185174.html#a6078097
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [POLL] Why switch to Maven?

2006-08-31 Thread ArneD


Eric Redmond wrote:
 
 Especially in large companies, it is often unacceptable to let users
 download artefacts directly from an Internet repository, even not through
 a
 proxy. Companies need to have full control over all artifacts used in
 their
 build processes, that means that only artefacts that have been explicitly
 released by someone should be used.
 
 
 I don't understand why a proxy is unacceptable. The administrator of the
 proxy can - at any time - disconnect the proxy from the Internet.
 

In the company where I helped set up a Maven-based build environment, a
public site like ibiblio.org is considered a potentially unsafe source - like
it or not. Only JARs that have been approved internally may be used for
production. (BTW, this was in the finance industry, which can be quite
sensitive about such points.)

Practically, that meant we had to set up an internal remote repository and
deploy all existing JARs to it. This worked fine in the end, but felt
difficult in the beginning - probably due to the lack of good documentation
and examples.

The bigger problem was setting up an internal **plugin** repository:



 This is especially true for setting up an internal plugin repository. We
 did
 not find a comfortable way to initially fill an internal plugin
 repository.
 You should be able to say fill my plugin repository with all required
 JARs
 for the plugins x, y and z. Our workaround was to set up a Maven
 reference
 installation zip, that already has all required plugin JARs in its local
 repository, so that users who unzip that archive have everything they
 need.
 
 
 Why can't you solve this with an in-house remote repository, rather than
 zipping up jars for local repos?
 

You can do it with an in-house remote plugin repository, but how would you
populate it with all plugins that you need? Ok, you can set up a proxy for
this purpose. Then, you somehow have to make sure that Maven downloads all
relevant plugins (this is not trivial either). Afterwards you can disconnect
the proxy from Internet, or use the proxy's cache as your internal plugin
repository. (Keeping the proxy alive and connected to the Internet might be
unacceptable because you want to evaluate new plugins before you release
them for internal usage.)

When I wanted to use Maven-proxy for this purpose, I soon encountered
problems because it did not support NTLM authentication, so I could not get
through the firewall. So we helped us with zipping up a local repo as a
workaround.

Anyway, I think the whole task of setting up an internal plugin
repository is too complicated. We don't want to use a proxy, so why should
we have to set one up?

When talking about tools, companies are used to downloading the tool from
the vendor's site or installing it from CD and then just using it. This is
not possible with Maven: you need an Internet connection at run-time, at
least until you have managed to set up an internal plugin repository.



 Another point is, plugin JARs and production JARs should be separated.
 This
 is already possible for remote repositories, but it should be possible
 for
 the local repository as well. Tool JARs like xerces etc. are used both by
 plugins and by projects. But just because some plugin needs a JAR file,
 corporate users do not want to make that JAR available for builds.
 Corporate
 users just don't want to get tools mixed up with production software.
 
 Why does your build software get mixed up with your production software?
 Using the release or dependency plugins will not bundle build tools with
 the
 jars to be released.
 

Maybe it becomes clearer with an example: Let's say your build uses plugin
X, which has a dependency on Xerces. So you will have Xerces in your local
repository after running your build, because it is downloaded from the
internal plugin repository to your local repository.

Thereafter, you are able to declare a dependency on Xerces in your project,
and the build will run through - even if Xerces has not been released to the
internal remote repository, and there's no connection to the Internet.

Builds should only be able to use artifacts that have explicitly been
released to the internal remote repository. Nothing else. Not even JARs from
an internal plugin repository.



 Anyway, Maven is a great piece of software, full of great ideas. But some
 things still need to be improved before Maven has chances to achieve a
 wide
 acceptance.
 
 
 Unless I'm really misunderstanding the problems, I haven't seen a problem
 you've had that is inherent to Maven (except possibly by creating a mirror
 repository with a set of dependencies).
 

Did my points get a little bit clearer now?

But don't get me wrong, I do not consider these points real showstoppers. We
did set up a Maven-based build environment successfully in the end, and it
works quite well.

Regards,
Arne

-- 
View this message in context: 
http://www.nabble.com/-POLL--Why-switch-to-Maven--tf2185174.html#a6084903
Sent from the Maven - Users forum at Nabble.com.

Re: [POLL] Why switch to Maven?

2006-08-31 Thread ArneD


cstamas wrote:
 
 Right. Take Proximity for example, since it is not JUST a proxy. If you
 visit the demo site, you will see that Proximity is able to PROXY
 repositories but also just to HOST them. Furthermore, you can take a repo
 offline (offline == do not touch the remote peer!) or make it unavailable
 (unavailable == refuse all requests) in Proximity with a switch.
 

I agree, Proximity is promising, and I will definitely take a further look.
I first heard about it after initially setting up the Maven build
environment.

Eric's question was what is preventing people from using Maven. The points
that I named did not prevent me from using Maven, nor from recommending
its usage. I just want to point out things that corporate users are
concerned about. All these things are solvable. The question is how much you
get out-of-the-box and how easy it is to use.



 And for the Xerces example: IF you have a developer who declares an
 obviously existing (it's in the repo, downloaded from the plugin repo)
 artifact which is obviously forbidden (it is not in the company repo),
 then you have an HR problem. No software will ever overcome human
 stupidity. Nor IT rigidness.
 

The reality in many companies is that you have people with very different
skills and experience in IT departments. The idea of build automation is
not only to make life simpler, but also to help people avoid mistakes.
Build automation should be easy to use even for people using it for the
first time. So, in the Xerces example, one might just not be aware of the
fact that Xerces is not in the company repository, but only in the plugin
repository.

Regards,
Arne
-- 
View this message in context: 
http://www.nabble.com/-POLL--Why-switch-to-Maven--tf2185174.html#a6088427
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: [POLL] Why switch to Maven?

2006-08-31 Thread ArneD



 After all, if you can't trust your team to stick to approved
 versions of artifacts, how can you trust them to write your precious
 business code?
 

I think it's not a question of mistrusting people, but a question of how you
can help people avoid mistakes. Even if they use a Maven-based build for
the first time and don't have an M.Sc. in Computer Science or x years of
experience.

If the team *wants* to use unapproved versions of artifacts, of course,
they will be able to do it. But you should help them not to do it by
mistake.



 So, how do you verify instead of lock?
 You have a parent pom that declares and defines all versions of
 artifacts and plugins and in your module poms you declare that you
 want a plugin but you provide no version information.  Then your
 configuration team only needs to check the parent pom against the
 internal standards.
 

Nice suggestion, I'll think about it for the customer's environment. Anyway,
if you talk about ease-of-use and quick adoption, this is not simple enough.
I am sure that these issues prevent some people from using Maven - and that
was Eric's question.



 NTLM support would help but hey, it's a closed proprietary
 undocumented authentication scheme from Microsoft, you can't expect
 everyone to support it. 
 

Again, Eric's question was what is preventing people from using it. The
reality in many companies is that they do use NTLM.



 As a workaround you can use NTLMAPS on
 sourceforge as a local proxy for any apps that don't know how to
 support NTLM.
 

Possible, but unnecessarily complicated, and far from out-of-the-box.

Regards,
Arne
-- 
View this message in context: 
http://www.nabble.com/-POLL--Why-switch-to-Maven--tf2185174.html#a6088629
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



maven-dependency-plugin - (Beta-)Release planned?

2006-08-29 Thread ArneD

Are there plans for releasing maven-dependency-plugin? The last snapshot
version is from April:

http://people.apache.org/maven-snapshot-repository/org/apache/maven/plugins/maven-dependency-plugin/2.0-SNAPSHOT/
 

If 2.0 is not yet ready, a beta release would be great.

Thank you,
Arne


-- 
View this message in context: 
http://www.nabble.com/maven-dependency-plugin---%28Beta-%29Release-planned--tf2182224.html#a6035134
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



maven-dependency-plugin - Why is scope parameter read-only?

2006-08-29 Thread ArneD

I am trying to use maven-dependency-plugin (latest snapshot version) using
the following configuration:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>2.0-SNAPSHOT</version>
    <executions>
      <execution>
        <id>copy_jars_to_webinf_lib</id>
        <phase>initialize</phase>
        <goals>
          <goal>copy-dependencies</goal>
        </goals>
        <configuration>
          <outputDirectory>${basedir}/WebContent/WEB-INF/lib</outputDirectory>
          <scope>compile</scope>
        </configuration>
      </execution>
    </executions>
  </plugin>

However, I get the error message:

[INFO] Error configuring: org.apache.maven.plugins:maven-dependency-plugin.
Reason: ERROR: Cannot override read-only parameter: scope in goal:
dependency:copy-dependencies

Why is scope read-only? (See AbstractDependencyFilterMojo.scope).

Thanks,
Arne
-- 
View this message in context: 
http://www.nabble.com/maven-dependency-plugin---Why-is-%22scope%22-parameter-read-only--tf2182289.html#a6035345
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[M2] maven-clean-plugin and multi-module build

2006-08-28 Thread ArneD

I have a multi-module build. In one of the sub-modules I use the
maven-clean-plugin in the pom.xml as follows:

  <plugin>
    <artifactId>maven-clean-plugin</artifactId>
    <configuration>
      <filesets>
        <fileset>
          <directory>WebContent/WEB-INF/lib</directory>
          <includes>
            <include>*</include>
          </includes>
        </fileset>
      </filesets>
    </configuration>
  </plugin>

The WebContent/WEB-INF/lib folder gets cleaned when I call mvn clean on the
sub-module, but it does not get cleaned when I call mvn clean on the main
module.

Should I file a JIRA issue? Or am I making a mistake? Is there any
workaround available?

Thanks,
Arne
-- 
View this message in context: 
http://www.nabble.com/-M2--maven-clean-plugin-and-multi-module-build-tf2177477.html#a6020540
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: [m2] Updates of transitive dependencies not working?

2006-08-24 Thread ArneD


Jörg Schaible wrote:
 
 ArneD wrote on Thursday, August 24, 2006 2:43 PM:
 
 I defined two repositories in the settings.xml, both with
 updatePolicy set to always.
 
 My project A has a dependency to version 5.0-SNAPSHOT of a
 JAR B. That JAR B
 has a dependency to version 1.6 of another JAR C. In my local
 repository there's an outdated version 1.6 of JAR C (i.e. version 1.6
 has been redeployed after a bug has been found).
 
 The problem is: During my build of project A Maven is looking
 for an update
 of JAR B, but NOT of JAR C.
 
 Is this a bug or am I missing some setting?
 
 Yes. You cannot upgrade a final version! Therefore it is final. Maven will
 *never* look for an update. That's what snapshots are for 
 

I know that you normally should use snapshot versions for that. Anyway,
Maven *does* allow overwriting an existing version in the repository by
re-deploying it. To make builds repeatable, I believe, Maven has to look for
updates even for released versions, not only snapshot versions.

Isn't that what the <releases><updatePolicy>always</updatePolicy></releases>
setting on a repository is for?



 (although it does also not work because of a bug).
 

Do you know the bug number?

Thank you!
Arne

-- 
View this message in context: 
http://www.nabble.com/-m2--Updates-of-transitive-dependencies-not-working--tf2158398.html#a5963829
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: [m2] Updates of transitive dependencies not working?

2006-08-24 Thread ArneD

Thank you, Jörg. I filed issue MNG-2528.

Arne
-- 
View this message in context: 
http://www.nabble.com/-m2--Updates-of-transitive-dependencies-not-working--tf2158398.html#a5964861
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Maven 2 and JUnit-Tests for Eclipse Plugins

2006-07-14 Thread ArneD

Has anybody set up Maven 2 builds for Eclipse (PDE) plugin projects and has
managed to integrate JUnit in-container tests for Eclipse-Plugins into the
Maven build lifecycle?

I would be very much interested in some best practices.

Thank you!
Arne

-- 
View this message in context: 
http://www.nabble.com/Maven-2-and-JUnit-Tests-for-Eclipse-Plugins-tf1942500.html#a5324188
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: ClearCase SCM and Continuum

2006-06-21 Thread ArneD


Wim Deblauwe wrote:
 

 
  I found another workaround by building the super-module recursively:
 Delete
  all sub-modules. Change the super module definition and remove the
  --non-recursive option from the build targets.
 
 
 
 Maybe we need an option in Continuum so the user can select whether he
 wants his submodules as separate modules (the default now) or as just 1
 project without the --non-recursive option?
 

Yes, I believe that would be helpful.



  By the way, do you know whether anybody has planned to implement the
  checkout-by-tag functionality for ClearCase, so that the Maven release
  plugin can be used?

 you can perhaps help Wim and us if you can provide this feature in
 maven-scm.
 
 
 That would be great if you could do that. I started the ClearCase
 implementation because I hoped we would be moving to M2 soon, but it still
 has not happend. Currently, I can't spend any more time on the SCM
 implementation of ClearCase to do that, so if you could help out, it would
 be great!
 

I am not sure if I can do something here. I am currently helping a customer
set up a Maven-based build environment. (They use ClearCase at the moment
but are thinking about moving to SVN, which would make this obsolete.) My
own ClearCase knowledge is very limited, and I also do not have access to a
ClearCase environment when I'm not on-site with the customer. But I'll see
if I can do something. Can't promise, though.




  And, at last, one suggestion. I believe the ClearCase integration would be
  even better if the SCM URL contained all necessary information (like with
  CVS, SVN, ...). Defining the config spec in an external file adds
  unnecessary complexity for users. Anyway, thank you and all Maven /
  Continuum developers! You've done a great job. But for a broad acceptance
  of Maven, I believe, a simplification of some issues would be helpful.

 All proposals are welcome.
 
 
 Yes, how would you do that in 1 line? Most config specs are at least 2
 lines, so that is why we need an external file. But if you can think of
 something, please do.
 

My idea was that maybe the config spec could be generated by the SCM plugin.
That would in any case be necessary to make the checkout-from-a-tag
functionality work.

But, as said, I currently have only limited ClearCase knowledge, and I am
not sure what can be specified in a config spec and how much flexibility
is needed there. Anyway, the trivial cases I've seen so far were mostly the
same two lines that are needed for checking out the main trunk:

element * CHECKEDOUT
element * /main/LATEST

Maybe a config spec can be generated for most cases (for checking out from
the main trunk, from branches, and from tags). The SCM URL could look
similar to an SVN SCM URL in that case. Optionally, a reference to one's own
config spec could be specified with an SCM URL similar to the current
ClearCase SCM URL.
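
To illustrate, a generated spec could fall back to the two trunk lines above
and substitute a label selector when a tag is given. A hypothetical sketch -
the URL forms and the tag-selection line are my assumptions, not actual
maven-scm or ClearCase syntax:

```java
// Hypothetical sketch: derive a ClearCase config spec from a simplified
// SCM URL. The URL forms ("scm:clearcase:main", "scm:clearcase:tag:REL_1_0")
// and the tag-selection rule are illustrative assumptions only.
public class ConfigSpecSketch {
    public static String configSpecFor(String scmUrl) {
        String selector = scmUrl.substring("scm:clearcase:".length());
        if (selector.startsWith("tag:")) {
            // select elements carrying the given label (assumed syntax)
            return "element * " + selector.substring("tag:".length()) + "\n";
        }
        // default: the two-line main-trunk spec quoted above
        return "element * CHECKEDOUT\nelement * /main/LATEST\n";
    }

    public static void main(String[] args) {
        System.out.print(configSpecFor("scm:clearcase:main"));
    }
}
```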

Regards,
Arne
--
View this message in context: 
http://www.nabble.com/ClearCase-SCM-and-Continuum-t1812397.html#a4981893
Sent from the Continuum - Users forum at Nabble.com.



Re: ClearCase SCM and Continuum

2006-06-20 Thread ArneD

Thank you, Wim!


Wim Deblauwe wrote:
 
 1) This is the way that ClearCase works, there is nothing I know we can do
 about it.
 

Maybe something could be changed in Continuum and the POM schema, so that
the POM location relative to the SCM checkout directory could be specified
within the POM.



 2) I guess the workaround here would be to add the submodules
 individually.
 You can remove the view that was created via cleartool or from the
 clearcase
 home base.
 

I found another workaround by building the super-module recursively: Delete
all sub-modules. Change the super module definition and remove the
--non-recursive option from the build targets.


By the way, do you know whether anybody has planned to implement the
checkout-by-tag functionality for ClearCase, so that the Maven release
plugin can be used?

And, at last, one suggestion. I believe the ClearCase integration would be
even better if the SCM URL contained all necessary information (like with
CVS, SVN, ...). Defining the config spec in an external file adds
unnecessary complexity for users. Anyway, thank you and all Maven /
Continuum developers! You've done a great job. But for a broad acceptance of
Maven, I believe, a simplification of some issues would be helpful.

Regards,
Arne
--
View this message in context: 
http://www.nabble.com/ClearCase-SCM-and-Continuum-t1812397.html#a4956443
Sent from the Continuum - Users forum at Nabble.com.



Automatically deploy snapshot builds to remote repo

2006-06-16 Thread ArneD

Hi,
 
is there a way of telling Continuum to make nightly builds only for snapshot
versions?
 
Particularly, I want to have the nightly snapshot builds automatically
deployed to the snapshot repository. This is why I would change the build
goals from clean install to clean deploy. Projects that have a
non-snapshot version (i.e. projects that are currently not under
development) should not be deployed.
 
Thanks!
Arne
--
View this message in context: 
http://www.nabble.com/Automatically-deploy-snapshot-bulds-to-remote-repo-t1798808.html#a4902069
Sent from the Continuum - Users forum at Nabble.com.



Re: Automatically deploy snapshot builds to remote repo

2006-06-16 Thread ArneD

Thank you, Emmanuel.


Emmanuel Venisse wrote:
 
 yes, goals to use is clean deploy, but continuum don't know what is a
 project in dev and a 
 released project. Generally, when we release a project, the code is tagged
 and the code in trunk is 
 updated to an incremented snapshot version.
 

clean deploy works well if you always have a snapshot version under
development. But that's not the case for every project. Sometimes you
release a version and do not have plans for a future release yet. You do
not know: will the next release be a major or a minor release? Will there
be a next release at all? Etc.

At the moment, I only see one solution, but I do not like it very much:
- Only keep snapshot versions in Continuum.
- Set build goals to clean deploy
- Make sure that Continuum only deploys snapshot versions by defining only
the snapshot repository in the distributionManagement section of the pom.
The release repository would be defined within a profile section that has to
explicitly be enabled, e.g. by parameter -Drelease=true.

That way the deployment of the release version can only be done manually,
when the user explicitly sets the -Drelease=true parameter. If, by mistake,
a release version is in Continuum, the build will fail because of the
missing distributionManagement information.
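
The workaround could look roughly like this in the pom (a sketch only; the
ids, URLs, and profile name are placeholders):

```xml
<!-- Sketch of the workaround: only the snapshot repository is always
     visible; the release repository lives in a profile that must be
     enabled explicitly with -Drelease=true. -->
<distributionManagement>
  <snapshotRepository>
    <id>internal-snapshots</id>
    <url>http://repo.example.com/snapshots</url>
  </snapshotRepository>
</distributionManagement>

<profiles>
  <profile>
    <id>release-deploy</id>
    <activation>
      <property>
        <name>release</name>
        <value>true</value>
      </property>
    </activation>
    <distributionManagement>
      <repository>
        <id>internal-releases</id>
        <url>http://repo.example.com/releases</url>
      </repository>
    </distributionManagement>
  </profile>
</profiles>
```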

Do you have a better idea?

Thanks again
Arne

--
View this message in context: 
http://www.nabble.com/Automatically-deploy-snapshot-bulds-to-remote-repo-t1798808.html#a4903614
Sent from the Continuum - Users forum at Nabble.com.



Maven 2 and RAD 6

2006-05-31 Thread ArneD

I am trying to integrate my Mavenized web projects with RAD 6. Even though I
have searched the list archives up and down, I still find this difficult,
because the information on the list is very fragmented and partly outdated.
I believe a kind of how-to documentation would be very helpful.

I tried to follow the instructions at:
http://www.nabble.com/Best+Practice+-+Maven+with+WSAD+or+RAD6-t966147.html#a2509735

All the JARs should be located in the EAR project's lib dir - not in the WAR
project's WEB-INF/lib.

Questions:
1. Even though I specified
<warSourceExcludes>WEB-INF/lib/*</warSourceExcludes> in the pom.xml as
described in the thread above, all the JARs are still copied to WEB-INF/lib.

2. How can I achieve that, during the build of the EAR project, all the JARs
are automatically copied to the EAR's *source* directory, so that RAD can
find them afterwards? I am looking for something like mvn war:inplace, but
for the EAR project.

Any help would greatly be appreciated!

Thanks,
Arne

--
View this message in context: 
http://www.nabble.com/Maven+2+and+RAD+6-t1711035.html#a4645535
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Internal plugin repository - how to populate?

2006-05-29 Thread ArneD

In many corporate environments it is unacceptable to let users download
Maven plugins automatically and directly from the Internet (not even through
a proxy). Therefore, it is necessary to set up an internal plugin repository.
 
Question: Is there an easy way how to initially populate the internal
repository with all the maven plugins needed?
 
I considered the following ways:
1. mvn deploy:deploy-file for all the plugin JARs. Problem: This seems to be
a rather inconvenient approach with lots of manual work.
2. First populate a local repository by running all desired Maven goals, and
then copy the local repository contents to the internal remote repository.
Problem: As far as I understood, this wouldn't work, because a remote
repository contains some extra XML metadata files that are not available
in the local repository.
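
For approach 1, the repetitive part could at least be scripted by generating
the deploy:deploy-file command lines. A hypothetical sketch - repository
URL, repositoryId, and groupId are placeholder assumptions:

```java
// Hypothetical sketch for approach 1: build the mvn deploy:deploy-file
// command line for a single plugin JAR. Repository URL, repositoryId,
// and groupId are placeholder assumptions.
public class DeployCommandSketch {
    public static String deployCommand(String groupId, String artifactId,
                                       String version, String jarPath) {
        return String.join(" ",
            "mvn deploy:deploy-file",
            "-DgroupId=" + groupId,
            "-DartifactId=" + artifactId,
            "-Dversion=" + version,
            "-Dpackaging=jar",
            "-Dfile=" + jarPath,
            "-Durl=http://repo.example.com/internal",
            "-DrepositoryId=internal");
    }

    public static void main(String[] args) {
        System.out.println(deployCommand("org.apache.maven.plugins",
            "maven-clean-plugin", "2.0", "maven-clean-plugin-2.0.jar"));
    }
}
```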
 
Any other ideas?
 
Thanks in advance,
Arne

--
View this message in context: 
http://www.nabble.com/Internal+plugin+repository+-+how+to+populate--t1697509.html#a4606575
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



[2.0.4] Bug? Profile not used when running outside a Maven project

2006-05-29 Thread ArneD

I want to roll out Maven in a corporate environment with an internal plugin
repository. My approach is to define and activate a profile in the global
settings.xml, which normally works fine.
 
Problem: When running Maven in a directory that does not contain a pom.xml,
the profile from settings.xml does not become active: Maven does not use the
pluginRegistry setting and connects to the public repository at
repo1.maven.org instead.
 
Seems like a bug to me.

Thanks!
Arne

--
View this message in context: 
http://www.nabble.com/-2.0.4-+Bug-+Profile+not+used+when+running+outside+a+Maven+project-t1699843.html#a4612873
Sent from the Maven - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]