Thanks, I will look into the classpaths and check.
On Mon, Nov 21, 2016 at 3:28 PM, Jakob Odersky wrote:
> The issue I was having had to do with missing classpath settings; in
> sbt it can be solved by setting `fork := true` to run tests in new JVMs
> with appropriate classpaths.
>
> Mohit, from the looks of the error message, it also appears to be some
> classpath issue.
The issue I was having had to do with missing classpath settings; in
sbt it can be solved by setting `fork := true` to run tests in new JVMs
with appropriate classpaths.
Mohit, from the looks of the error message, it also appears to be some
classpath issue. This typically happens when there are libr
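For readers following along, the forked-test setting Jakob mentions lives in the project's build definition. A minimal sketch (sbt 0.13-era syntax; scoping the settings to `Test` and the memory value are assumptions, not something stated in the thread):

```scala
// build.sbt -- fork a fresh JVM for test runs, so tests are launched
// with the full dependency classpath instead of sbt's own loader
fork in Test := true

// if the forked JVM needs extra headroom for an embedded REPL,
// options can be passed through as well (value is illustrative)
javaOptions in Test += "-Xmx2g"
```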
Trying it out locally gave me an NPE. I'll look into it in more
detail; however, the SparkILoop.run() method is dead code. It is used
nowhere in Spark and can be removed without any issues.
On Thu, Nov 17, 2016 at 11:16 AM, Mohit Jaggi wrote:
> Thanks Holden. I did post to the user list but since t
Moving to user list
So this might be a better question for the user list, but is there a
reason you are trying to use SparkILoop for tests?
On Thu, Nov 17, 2016 at 5:47 PM Mohit Jaggi wrote:
>
> I am trying to use SparkILoop to write some tests (shown below) but the
> test hangs with the following stack trace. Any idea what is going on?
I am trying to use SparkILoop to write some tests (shown below) but the test
hangs with the following stack trace. Any idea what is going on?
import org.apache.log4j.{Level, LogManager}
import org.apache.spark.repl.SparkILoop
import org.scalatest.{BeforeAndAfterAll, FunSuite}
class SparkReplSpec extends FunSuite with BeforeAndAfterAll