We are using a local HiveContext in order to run unit tests. Our unit
tests run perfectly fine if we run them one by one with sbt, for
example:

>sbt "test-only com.company.pipeline.scalers.ScalerSuite"
>sbt "test-only com.company.pipeline.labels.ActiveUsersLabelsSuite"

However, if we try to run them as:

>sbt "test-only com.company.pipeline.*"

we start running into issues. It appears that the HiveContext is not
properly shut down after the first test finishes. Does anyone know how
to attack this problem? The test-related part of my build.sbt file
looks like:

libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0" % "test"

parallelExecution in Test := false

fork := true

javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M",
  "-XX:+CMSClassUnloadingEnabled")
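One direction we have been looking at (just a sketch, not working code: the `Context` class below is a hypothetical stand-in for the real HiveContext, since only one SparkContext can live in a JVM at a time) is to share a single lazily initialized context across all suites instead of letting each suite create and stop its own:

```scala
// Sketch of a shared per-JVM fixture. `Context` is a hypothetical
// stand-in for `new HiveContext(new SparkContext(conf))`.
object SharedContext {
  final class Context(val name: String) {
    @volatile var stopped = false
    def stop(): Unit = stopped = true // stands in for sc.stop()
  }

  // Initialized at most once per JVM: every suite that reads
  // SharedContext.context gets the same live instance, so no suite
  // can leave a half-shut-down context behind for the next one.
  lazy val context: Context = new Context("shared-hive")
}
```

With real Spark code, each suite's afterAll would then skip stopping the context and leave the final shutdown to a JVM shutdown hook, but we are not sure this is the intended pattern.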

We are working under Spark 1.3.0


Thanks
-- 
Cesar Flores
