GitHub user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/6243#issuecomment-103238397
  
    So, the only explanation I have is: `mvn package` does not install anything 
into the local repository, but the subsequent `mvn test` picks up artifacts from 
the local repository. So you're running the tests against a different set of 
artifacts than the ones that were just built.
    
    I checked by running `mvn test -X` and the classpaths all point to files in 
`~/.m2/repository`. No references to the `classes` directory under the `target` 
dir, except for the particular module being currently built (spark-sql in my 
case).
    
    That would actually imply that the Maven builds are technically broken: 
tests are being run against whatever artifacts happen to be under the Jenkins 
account's home dir, not against what was just built. IMO the correct way to do 
things would be to have a per-build local repository and use `mvn install` 
instead of `mvn package`; I'm not sure how you'd do that (I think in our 
internal builds we use `chroot` for that).
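
    A per-build local repository doesn't necessarily require `chroot`; Maven's 
`-Dmaven.repo.local` property can point each build at its own throwaway 
repository. A rough sketch (the repository path is illustrative, not what 
Jenkins actually uses):

    ```shell
    # Use a repository private to this build so `mvn test` resolves the
    # artifacts installed by the preceding `mvn install`, not whatever is
    # sitting in the shared ~/.m2/repository.
    REPO="$PWD/.m2-build-repo"   # hypothetical per-build location
    mvn -Dmaven.repo.local="$REPO" install -DskipTests
    mvn -Dmaven.repo.local="$REPO" test
    ```

    The same property can also be set once in `MAVEN_OPTS` or a per-job 
settings file rather than on every invocation.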
