On 10-Aug-08, at 5:26 PM, Ralph Goers wrote:

Jason van Zyl wrote:


The other issues identify a problem that is a little harder to fix, only because I haven't figured out how it could be done without being incompatible, even though what is currently happening (deploying POMs with a variable in the version element) is just wrong.

Not necessarily. A property that refers to a profile activated on a particular platform, where it was expected to be evaluated during the build, would cause a problem. I don't think it's as straightforward as interpolating prior to deployment. This is part of the overall process we need a spec for. This is not a very concrete answer, but there is probably a range of things that can safely be interpolated and make sense, while any properties associated with profiles activated by OS, JDK, or some other ad-hoc property referring to an environment will probably cause problems. Should people do this? Probably not. But we never told them not to. So how do you categorize what's allowed and what's not? No idea.

I think Brian wanted to cut from 2.0.9 instead of 2.0.10, but you guys should go for it. We just need to make sure the integration tests actually mean something, so that when I try to match capabilities with a potentially different implementation/solution it doesn't whack everyone.


This is somewhat off topic from versioning. I was specifically calling out using a property for the artifact version; leaving that in the POM when it is deployed is just wrong. I can think of all kinds of other properties that shouldn't be replaced, so full interpolation is clearly unacceptable.

Sure, like I said above, there are things I would agree with you on. But I know of places I've seen (ClearCase setups) that use properties left in the POM to be interpreted by the system. Think of the versions specified in the dependencyManagement section being retrieved from an external source. You can deploy with the variables in there, provided the external system (like a set of envvars) or a custom version resolver (something I hacked into 2.1 for a client) supplies the values at resolution time. Just telling you what I've seen. If we want to cut off this behavior, then 2.1 would be a good place to define that.
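To make that concrete, a resolver along these lines is the kind of thing I mean. This is only a rough sketch of the idea; the class name, placeholder, and environment-variable convention here are made up for illustration, not what I actually hacked into 2.1:

    // Hypothetical sketch only: resolve ${...} version placeholders from
    // environment variables at resolution time, instead of interpolating
    // them into the POM before deployment.
    import java.util.Map;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class EnvVersionResolver {

        private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

        /**
         * If the declared version is a placeholder like ${commons.version},
         * look it up in the given environment map (e.g. COMMONS_VERSION);
         * otherwise return the literal version unchanged.
         */
        public String resolve(String declaredVersion, Map<String, String> env) {
            Matcher m = PLACEHOLDER.matcher(declaredVersion);
            if (!m.matches()) {
                return declaredVersion;
            }
            String key = m.group(1).replace('.', '_').toUpperCase();
            String resolved = env.get(key);
            if (resolved == null) {
                throw new IllegalStateException("No value for version placeholder " + declaredVersion);
            }
            return resolved;
        }

        public static void main(String[] args) {
            // e.g. with COMMONS_VERSION=1.2.3 set in the environment
            System.out.println(new EnvVersionResolver().resolve("${commons.version}", System.getenv()));
        }
    }

The point is just that the placeholder stays in the deployed POM and something outside Maven supplies the real value when the dependency is actually resolved.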



I'm also a little unclear on the integration tests. I see a couple of branches that appear to be for specific JIRA issues, but nothing else. How are different tests run against Maven trunk versus 2.0.x?

There is only one body of integration tests, and when a feature is added an integration test should be added for it. If it's version-specific you can say so in the constructor of the test, and the harness will know whether or not to run it based on the version of Maven you're testing with. We run the same body of integration tests against the branches and trunk all the time. This way we can detect when things work in a branch but don't in trunk.
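For reference, a version-specific test in the core IT suite looks roughly like the following; the project directory and test name are placeholders, and the details are from memory, so treat them as approximate:

    import java.io.File;

    import org.apache.maven.it.Verifier;
    import org.apache.maven.it.util.ResourceExtractor;

    /**
     * Hypothetical example; lives alongside AbstractMavenIntegrationTestCase
     * in the IT suite. The /mng-0000 project is a placeholder.
     */
    public class MavenITmng0000ExampleTest extends AbstractMavenIntegrationTestCase {

        public MavenITmng0000ExampleTest() {
            // Tell the harness this test only applies to Maven versions newer
            // than 2.0.8; when testing an older version it will be skipped.
            super("(2.0.8,)");
        }

        public void testitExample() throws Exception {
            File testDir = ResourceExtractor.simpleExtractResources(getClass(), "/mng-0000");

            Verifier verifier = new Verifier(testDir.getAbsolutePath());
            verifier.executeGoal("validate");
            verifier.verifyErrorFreeLog();
            verifier.resetStreams();
        }
    }

The version range passed to the super constructor is what lets the harness decide whether to run the test against the Maven build under test.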



Ralph


Thanks,

Jason

----------------------------------------------------------
Jason van Zyl
Founder,  Apache Maven
jason at sonatype dot com
----------------------------------------------------------

We know what we are, but know not what we may be.

  -- Shakespeare

