SBT and Maven resolution rules do differ. I thought SBT was generally
latest-first though, which should make 3.0 take priority. Maven is more
like closest-first, which means you can pretty much always override things
in your own build. An exclusion is the right way to go in this case because
the de…
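For reference, the exclusion would look something like this in SBT (a sketch against a hypothetical build.sbt; the scalatest artifact suffix depends on your Scala binary version):

```scala
// build.sbt — hypothetical sketch: keep spark-core but drop its transitive
// scalatest, so your own 3.0.0 declaration is the only one on the classpath
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1" exclude("org.scalatest", "scalatest_2.10")

libraryDependencies += "org.scalatest" % "scalatest_2.10" % "3.0.0" % "test"
```

With the transitive copy excluded, the resolver no longer has two versions to arbitrate between, regardless of whether it is latest-first or nearest-first.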
Well, recent experience tells me that at least SBT does not behave this
way. Given a `test` dependency on scalatest 3.0.0, my tests were still
being compiled against 2.2.6, which caused a combination of compile errors
and runtime errors (given that apparently 3.0.0 was still present at
runtime duri…
On 29 Oct 2016, at 10:50, Sean Owen <so...@cloudera.com> wrote:
> Declare your scalatest dependency as test scope (which is correct anyway).
> That would override it I think as desired?
I'm not sure about that, but then mvn dependencies are one of those graph-theory
problems. It may just add…
Declare your scalatest dependency as test scope (which is correct anyway).
That would override it I think as desired?
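In pom terms, that suggestion would look something like the following sketch (coordinates taken from elsewhere in this thread; the 3.0.0 version is for illustration). A direct declaration sits nearer in the graph than the transitive one, so Maven's nearest-wins rule should let it take precedence:

```xml
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.10</artifactId>
  <version>3.0.0</version>
  <scope>test</scope>
</dependency>
```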
On Fri, Oct 28, 2016, 21:22 Shixiong(Ryan) Zhu wrote:
> This is my test pom:
>
>     <project>
>       <modelVersion>4.0.0</modelVersion>
>       <groupId>foo</groupId>
>       <artifactId>bar</artifactId>
>       <version>1.0</version>
>       <dependencies>
>         <dependency>
>           <groupId>org.apache.spark</groupId>
>           <artifactId>spark-core_2.10</artifactId>
>           <version>2.0.1</version>
>         </dependency>
>       </dependencies>
>     </project>
Hmm. Yes, that makes sense. Spark's root pom does not affect your
application's pom, and your pom will pick compile over test if
there are conflicting scope declarations.
Perhaps spark-tags should override it to provided instead of compile...
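If that change were made, the tags module's pom might carry something like this sketch (not the actual Spark pom; the `scala.binary.version` property name is an assumption):

```xml
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_${scala.binary.version}</artifactId>
  <scope>provided</scope>
</dependency>
```

A provided-scope dependency is on the compile classpath of the module itself but is not propagated transitively to downstream consumers, which would keep scalatest out of applications that depend on spark-core.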
On Fri, Oct 28, 2016 at 1:22 PM, Shixiong(Ryan) Zhu wrote:
This is my test pom:

    <project>
      <modelVersion>4.0.0</modelVersion>
      <groupId>foo</groupId>
      <artifactId>bar</artifactId>
      <version>1.0</version>
      <dependencies>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.10</artifactId>
          <version>2.0.1</version>
        </dependency>
      </dependencies>
    </project>
scalatest is in the compile scope:
[INFO] bar:foo:jar:1.0
[INFO] \- org.apache.spark:spark-core_2.10:jar:2.0.1:compile
[INFO]    +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:compile
[INFO]    …
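A tree like the one above comes from Maven's dependency plugin; adding -Dverbose also prints the conflict/management annotations (the "scope managed from compile; omitted for duplicate" lines seen elsewhere in this thread), and -Dincludes narrows the output:

```shell
# show the full tree, including nodes Maven resolved away
mvn dependency:tree -Dverbose -Dincludes=org.scalatest
```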
The root pom declares scalatest explicitly with test scope. It's added
by default to all sub-modules, so every one should get it in test
scope unless the module explicitly overrides that, like the tags
module does.
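Mechanically, that kind of inheritance is done with a dependencyManagement block in the root pom — something like this sketch (illustrative, not the literal Spark pom):

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_2.11</artifactId>
      <version>2.2.6</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Managed entries set the version and scope for every sub-module that declares the dependency, which is what produces the "scope managed from compile" annotation in a verbose dependency tree.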
If you look at the "blessed" dependency list in dev/deps, there's no scalatest.
Th…
Yes, but scalatest doesn't end up in compile scope, says Maven?
...
[INFO] +- org.apache.spark:spark-tags_2.11:jar:2.1.0-SNAPSHOT:compile
[INFO] |  +- (org.scalatest:scalatest_2.11:jar:2.2.6:test - scope managed from compile; omitted for duplicate)
[INFO] |  \- (org.spark-project.spark:unused:j…
You can just exclude scalatest from Spark.
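Concretely, the exclusion would look something like this in the application's pom (a sketch using the coordinates from this thread):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>2.0.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_2.10</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

An exclusion cuts the transitive edge entirely, so whatever scalatest version you declare yourself (if any) is the only one on the classpath.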
On Fri, Oct 28, 2016 at 12:51 PM, Jeremy Smith wrote:
> spark-core depends on spark-launcher (compile)
> spark-launcher depends on spark-tags (compile)
> spark-tags depends on scalatest (compile)
>
> To be honest I'm not all that familiar with the project structure - should
> I just exclude spark-launcher if I'm not using it?
spark-tags is in the compile scope of spark-core...
On Fri, Oct 28, 2016 at 12:27 PM, Sean Owen wrote:
> It's required because the tags module uses it to define annotations for
> tests. I don't see it in compile scope for anything but the tags module,
> which is then in test scope for other modules.
spark-core depends on spark-launcher (compile)
spark-launcher depends on spark-tags (compile)
spark-tags depends on scalatest (compile)
To be honest I'm not all that familiar with the project structure - should
I just exclude spark-launcher if I'm not using it?
On Fri, Oct 28, 2016 at 12:27 PM, Sean Owen wrote:
It's required because the tags module uses it to define annotations for
tests. I don't see it in compile scope for anything but the tags module,
which is then in test scope for other modules. What are you seeing that
makes you say it's in compile scope?
On Fri, Oct 28, 2016 at 8:19 PM Jeremy Smith wrote:
Hey everybody,
Just a heads up that currently Spark 2.0.1 has a compile dependency on
Scalatest 2.2.6. It comes from spark-core's dependency on spark-launcher,
which has a transitive dependency on spark-tags, which has a compile
dependency on Scalatest.
This makes it impossible to use any other v…