On Dec 12, 2007, at 4:18 PM, Dan Fabulich wrote:

John Casey wrote:

First things first. maven-invoker and maven-invoker-plugin are not separate things. The maven-invoker-plugin uses maven-invoker, but maven-invoker is meant to be a reusable library, not just a plugin.

I find this remark quite confusing... if one is a library, and one is a plugin, and they have two separate POMs, one for the library JAR ("maven-invoker") and one for the plugin ("maven-invoker-plugin"), doesn't that make them two things that share code?

Actually, I think our "disagreement" is mostly a misunderstanding, and I think it mostly turns on this point of whether maven-invoker and maven-invoker-plugin are "two things."

In particular, I said that we should not use maven-invoker-plugin for integration testing. But I didn't mean that we shouldn't use maven-invoker!

To me, "shared code" implies a mutual dependency of some sort, whether that's cyclic or a dependency on the same third-party library. Sorry for the confusion. Having said that, I definitely disagree with the approach you outline below. See my comments inline.


The maven-verifier (not the verifier plugin, I know it's confusing) gets away from this, in that it requires a project directory in src/test/resources AND a JUnit test case to orchestrate the test. IMO, this makes it extremely difficult to run a single core integration test from the command line, so you can capture logs for offline examination, for instance.

With that, I definitely disagree. Can you say more about what's difficult about running a single test?

Certainly it's trivial to run a single JUnit test from the command line using Surefire: "mvn test -Dtest=FooTest" does the job nicely; I use that all the time. And it's REALLY easy to run a single test from Eclipse/IDEA.

Maybe you meant that you think maven-verifier tests are harder to WRITE than writing goals.txt + beanshell tests? (I disagree with that, too, but it's worth clarifying what we're talking about.)

Not at all; I mean running the test. In order to run one of these tests (which are orchestrated by something akin to the maven-verifier from a JUnit or other Java-driven test case), you must run JUnit or whatever, so you can be sure you have the same options enabled, the same environment present, and the same assertions tested as in the test-case code. For instance, simply cd into src/test/resources/it0105 in core-integration-tests, and see if you can figure out what it tests, and how. You can't, not without looking at the JUnit code that drives it, to determine what the criteria for success are, and which flags and goals/lifecycle phases to invoke. If I need to re-run this test to actually diagnose a failure (which is the whole point here), I have to dig around in source code that's completely outside the it0105 directory, then come back and replicate that Maven command, with the addition of the -X flag and a pipe to the tee command so I can analyze the output outside of the build.

This is much harder than it needs to be, and the same is true for plugin integration tests.


Again, the idea behind test builds driven by the invoker is to provide a test suite with a low barrier to entry, and with all the potential for configuration present in the mainstream user's experience with Maven. It's not a panacea, but neither is writing JUnit tests that orchestrate and verify test-build resources in a separate directory structure, which can really only be run properly through the JUnit API.

You call it the "JUnit API" like it's this big hassle to run JUnit tests... but it's really easy to run JUnit tests, both from the command line and from an IDE; certainly 99% of our users know how to do it, and I hope most of them do it frequently! :-)

Furthermore, you seem to imply here that you can't just cd into src/test/resources and run the projects there by hand directly... but of course you can do that when you want/need to. I often do that when I'm first setting up an integration test, before I've written the JUnit assertions for that test. Occasionally I do that when a test fails just so I can make sure I can reproduce manually what the test is doing automatically.

Again, it's not just about running the tests, but about being able to actually debug a failing test effectively. Tests work best when they're easy to understand and work with, and when a Maven core-integration-test fails, you can definitely see how this setup falls down. Running and re-running the same test without change from the IDE isn't useful for debugging, and running the invoker from this kind of code with remote debugging enabled is virtually impossible. Incidentally, if you've figured out how to do it, I'd be interested in learning.
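For contrast, the invoker-driven layout I'm advocating keeps everything inside the test directory itself. A minimal sketch of what one test case looks like (file names follow maven-invoker-plugin conventions; the contents are illustrative, not copied from a real test):

```
src/it/my-test/
  pom.xml      <- the real POM under test, unmodified
  goals.txt    <- the goals/phases to invoke, one per line, e.g.:
                    clean
                    package
  verify.bsh   <- optional post-build assertions (BeanShell script)
```

Anyone can cd into that directory, read goals.txt, and re-run the exact same build by hand, -X flag and all, without consulting any Java source elsewhere in the tree.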



However, it's also critical to allow a project's POM to remain unmodified for the purposes of orchestrating or verifying the build. Modifying the POM introduces the potential to taint the test results, and can limit flexibility in terms of verification. For instance, if you need to simply scan the console output for a particular message, doing so is much more difficult while the build is still running.

I don't understand this remark, because I don't think either strategy requires anyone to "modify the POM"...? I hope it's just an artifact of the earlier misunderstanding.

This remark refers to one alternative to the JUnit/maven-verifier approach that I've heard in the past, which is to inject the assertions directly into the POM via something like the maven-verifier-plugin, and always run to the verify phase. Even though I now know you're against that approach, it's worth expanding a little. This approach limits flexibility to test things like running a multimodule build to the package or compile phases, not to mention the log-checking that I mentioned before. It could also interfere with other things bound to the verify phase, especially when those are the items under test (thinking of integration tests for the verifier plugin itself here).
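To make the log-checking point concrete: once the build has completed and its output has been captured to a file, asserting "the log contains message X" is a few lines of ordinary code, with no need to hook into the running build at all. A hedged sketch (the class and method names here are made up for illustration, not part of any Maven API):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class LogScan {

    // Illustrative helper: scan a completed build log for an expected message.
    static boolean logContains(Path log, String message) throws IOException {
        for (String line : Files.readAllLines(log)) {
            if (line.contains(message)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a captured build log (e.g. from 'mvn ... | tee build.log').
        Path log = Files.createTempFile("build", ".log");
        Files.write(log, Arrays.asList(
                "[INFO] Building foo 1.0",
                "[WARNING] Using platform encoding to copy filtered resources"));

        System.out.println(logContains(log, "[WARNING] Using platform encoding"));

        Files.delete(log);
    }
}
```

The same check is awkward to express from inside the build, which is exactly why scanning a finished, captured log is preferable.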

In my opinion, there should certainly be hooks available to generate a JUnit wrapper around an integration test, but that wrapper should not carry information that exists only outside the test-project directory. I'd favor something more like an orchestrator POM that calls something like the invoker to run the real build using the real POM, then verifies the results of that build. All assertions would then be contained within that orchestrator POM, and anyone could step into the directory and either run something like 'mvn -f test-pom.xml test', or else simply crack open the test-pom.xml - which sits right alongside the rest of the test resources for that case - and read what it's doing.

This makes it simpler to debug a failing integration test without stepping outside the test-case directory, and it also provides a degree of documentation for the test. Best of all, it could be written as a very simple archetype that generates a portable test case, one that can live in almost any directory structure within the integration-test aggregator build, making it easier to organize test cases according to functionality.
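To give the idea a rough shape, a hypothetical test-pom.xml might look something like the following. This is a sketch of the concept only; the plugin parameters shown are illustrative, not a finished design:

```xml
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.maven.its</groupId>
  <artifactId>it0105-orchestrator</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-invoker-plugin</artifactId>
        <configuration>
          <!-- run the real build, using the real, unmodified POM
               that sits alongside this orchestrator POM -->
          <pom>pom.xml</pom>
          <goals>
            <goal>package</goal>
          </goals>
        </configuration>
      </plugin>
      <!-- a second plugin execution would verify the results here:
           files produced, log messages emitted, and so on -->
    </plugins>
  </build>
</project>
```

Everything a maintainer needs to understand, run, and debug the test lives in the one directory.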

-john


-Dan



---
John Casey
Committer and PMC Member, Apache Maven
mail: jdcasey at commonjava dot org
blog: http://www.ejlife.net/blogs/john
rss: http://feeds.feedburner.com/ejlife/john

