Did a quick search and found the following; I haven't tested it myself.

Add the following to your build.sbt:

libraryDependencies += "com.holdenkarau" % "spark-testing-base_2.10" % "1.5.0_1.4.0_1.4.1_0.1.2"

Create a test class extending com.holdenkarau.spark.testing.SharedSparkContext.

You should then be able to use the shared SparkContext (sc) it provides in your tests.
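For example, a minimal ScalaTest suite could look like this (untested sketch; it assumes scalatest is also on your test classpath, and the class name and test body are just illustrative):

import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// Mixing in SharedSparkContext gives every test in this suite
// access to a shared SparkContext named sc.
class WordCountSpec extends FunSuite with SharedSparkContext {

  test("parallelize and count") {
    val rdd = sc.parallelize(Seq("a", "b", "a"))
    assert(rdd.count() === 3L)
  }
}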

Thanks
Best Regards

On Mon, Oct 12, 2015 at 2:18 PM, Fengdong Yu <fengdo...@everstring.com>
wrote:

> Hi,
> How do I add a dependency in build.sbt if I want to use SharedSparkContext?
>
> I've added spark-core, but it doesn't work (cannot find SharedSparkContext).
>
