Github user harishreedharan commented on the pull request:
https://github.com/apache/spark/pull/2882#issuecomment-60183932
(1) is not an issue at all. Maven will exclude any dependencies whose versions
are hard-coded, so under any profile the minicluster pulls in the correct
versions, and we don't bundle it since we never actually package it. The
exclusions don't get in the way, which is why the Maven build works fine. See
Flume for example: we build against any arbitrary HDFS version and use the
minicluster in our tests without issues, and other projects that build against
arbitrary HDFS versions still use the minicluster as well.
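A minimal sketch of the arrangement being described, assuming the minicluster
comes in as a test-scoped dependency keyed to a per-profile hadoop.version
property (the exact coordinates and property name in the PR may differ):

    <!-- Hypothetical POM fragment: the version resolves from whichever
         Hadoop profile is active, and test scope keeps the minicluster
         out of the packaged assembly entirely. -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-minicluster</artifactId>
      <version>${hadoop.version}</version>
      <scope>test</scope>
    </dependency>

Since test-scoped artifacts are never packaged, the exclusions applied
elsewhere in the build don't affect it.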
I don't like (2): these tests caught fixes for issues that didn't show up in
local testing. I'd rather keep them.
This definitely seems like an sbt-vs-Maven resolver issue. Since the tests
are fine under Maven, it looks like the new top-level dependencies are
somehow not getting pulled in by sbt.